WorldWideScience

Sample records for microarray experiments designed

  1. Design issues in toxicogenomics using DNA microarray experiment

    International Nuclear Information System (INIS)

    Lee, Kyoung-Mu; Kim, Ju-Han; Kang, Daehee

    2005-01-01

    The methods of toxicogenomics can be classified into omics studies (e.g., genomics, proteomics, and metabolomics) and population studies focusing on risk assessment and gene-environment interaction. In omics studies, the microarray is the most popular approach. Genes falling into several categories (e.g., xenobiotic metabolism, cell cycle control, DNA repair, etc.) can be selected, up to 20,000, according to an a priori hypothesis. The appropriate type of sample and species should be selected in advance. Multiple doses and varied exposure durations are suggested to identify the genes clearly linked to toxic response. Microarray experiments can be affected by numerous nuisance variables, including the experimental design, sample extraction, and type of scanner. The number of slides can be determined from the magnitude and variance of the expression change, the false-positive rate, and the desired power; pooling samples is an alternative. Online databases on chemicals with known exposure-disease outcomes and genetic information can aid the interpretation of the normalized results. Gene function can be inferred from microarray data analyzed by bioinformatics methods such as cluster analysis. Population studies often adopt hospital-based or nested case-control designs. Biases in subject selection and exposure assessment should be minimized, and confounding should be controlled for by stratified or multiple regression analysis. Optimal sample sizes depend on the statistical test for gene-environment or gene-gene interaction. The design issues addressed in this mini-review are crucial to conducting a toxicogenomics study. In addition, an integrative approach combining exposure assessment, epidemiology, and clinical trials is required.

  2. Direct calibration of PICKY-designed microarrays

    Directory of Open Access Journals (Sweden)

    Ronald Pamela C

    2009-10-01

    Background: Few microarrays have been quantitatively calibrated to identify optimal hybridization conditions, because it is difficult to precisely determine the hybridization characteristics of a microarray using biologically variable cDNA samples. Results: Using synthesized samples with known concentrations of specific oligonucleotides, a series of microarray experiments was conducted to evaluate microarrays designed by PICKY, an oligo microarray design software tool, and to test a direct microarray calibration method based on the PICKY-predicted, thermodynamically closest nontarget information. The complete set of microarray experiment results is archived in the GEO database under series accession number GSE14717. Additional data files and Perl programs described in this paper can be obtained from http://www.complex.iastate.edu under the PICKY Download area. Conclusion: PICKY-designed microarray probes are highly reliable over a wide range of hybridization temperatures and sample concentrations. The calibration method reported here allows researchers to experimentally optimize their hybridization conditions. Because the method is straightforward and uses existing microarrays and relatively inexpensive synthesized samples, it can be used by any lab that uses PICKY-designed microarrays. In addition, other microarrays can be reanalyzed by PICKY to obtain the thermodynamically closest nontarget information for calibration.

  3. Teolenn: an efficient and customizable workflow to design high-quality probes for microarray experiments

    Science.gov (United States)

    Jourdren, Laurent; Duclos, Aurélie; Brion, Christian; Portnoy, Thomas; Mathis, Hugues; Margeot, Antoine; Le Crom, Stéphane

    2010-01-01

    Despite the development of new high-throughput sequencing techniques, microarrays are still attractive tools for studying small-genome organisms, thanks to sample multiplexing and high feature densities. However, oligonucleotide design remains a delicate step for most users. A vast array of software is available to deal with this problem, but each program follows its own strategy, which makes choosing the best solution difficult. Here we describe Teolenn, a universal probe design workflow developed with a flexible and customizable module organization allowing fixed- or variable-length oligonucleotide generation. In addition, our software supplies quality scores for each designed probe. In order to assess the relevance of these scores, we performed a real hybridization using a tiling array designed against the genome of the fungus Trichoderma reesei. We show that our scoring pipeline correlates with signal quality for 97.2% of all designed probes, allowing a posteriori comparisons between quality scores and signal intensities. This result is useful for discarding low-scoring probes during the design step in order to obtain high-quality microarrays. Teolenn is available at http://transcriptome.ens.fr/teolenn/. PMID:20176570

  4. A newly designed 45 to 60 mer oligonucleotide Agilent platform microarray for global gene expression studies of Synechocystis PCC6803: example salt stress experiment

    NARCIS (Netherlands)

    Aguirre von Wobeser, E.; Huisman, J.; Ibelings, B.; Matthijs, H.C.P.; Matthijs, H.C.P.

    2005-01-01

    Eneas Aguirre-von-Wobeser(1), Jef Huisman(1), Bas Ibelings(2) and Hans C.P. Matthijs(1); (1) Universiteit van Amsterdam, Amsterdam, The Netherlands.

  5. Normalization for triple-target microarray experiments

    Directory of Open Access Journals (Sweden)

    Magniette Frederic

    2008-04-01

    Background: Most microarray studies use labelling with one or two dyes, which allows the hybridization of one or two samples on the same slide. In such experiments, the most frequently used dyes are Cy3 and Cy5. Recent improvements in the technology (dye-labelling, scanners and image analysis) allow hybridization of up to four samples simultaneously; the two additional dyes are Alexa488 and Alexa494. The triple-target or four-target technology is very promising, since it allows more flexibility in the design of experiments, an increase in statistical power when comparing gene expression induced by different conditions, and a scaled-down number of slides. However, few methods have been proposed for the statistical analysis of such data. Moreover, the lowess correction of the global dye effect is available only for two-color experiments, and even if its application can be derived, it does not allow simultaneous correction of the raw data. Results: We propose a two-step normalization procedure for triple-target experiments. First, dye bleeding is evaluated and corrected if necessary. Then, the signal in each channel is normalized using a generalized lowess procedure to correct a global dye bias. The normalization procedure is validated using triple-self experiments and by comparing the results of triple-target and two-color experiments. Although the focus is on triple-target microarrays, the proposed method can be used to normalize p differently labelled targets co-hybridized on the same array, for any value of p greater than 2. Conclusion: The proposed normalization procedure is effective: the technical biases are reduced, the number of false positives is under control in the analysis of differentially expressed genes, and the triple-target experiments are more powerful than the corresponding two-color experiments. There is room for improving microarray experiments by simultaneously hybridizing more than two samples.
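
    As a point of reference for the generalized procedure described above, here is a minimal sketch of the ordinary two-color lowess dye-bias correction on an MA plot, assuming background-corrected intensity arrays red and green; the function and its statsmodels-based implementation are illustrative, not the authors' code.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        def lowess_normalize(red, green, frac=0.3):
            """Remove the intensity-dependent dye bias from two-color log-ratios."""
            m = np.log2(red) - np.log2(green)          # M: log-ratio per spot
            a = 0.5 * (np.log2(red) + np.log2(green))  # A: mean log-intensity
            # Fit the dye-bias trend M ~ f(A), then subtract it from every spot.
            trend = lowess(m, a, frac=frac, return_sorted=False)
            return m - trend

    Extending this to three or four channels, as the paper does, requires correcting all dye biases simultaneously rather than channel pair by channel pair.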

  6. Universal Reference RNA as a standard for microarray experiments

    Directory of Open Access Journals (Sweden)

    Fero Michael

    2004-03-01

    Background: Obtaining reliable and reproducible two-color microarray gene expression data is critically important for understanding the biological significance of perturbations made on a cellular system. Microarray design, RNA preparation and labeling, hybridization conditions, and data acquisition and analysis are variables that are difficult to control simultaneously. A useful tool for monitoring and controlling intra- and inter-experimental variation is Universal Reference RNA (URR), developed with the goal of providing hybridization signal at each microarray probe location (spot). Measuring signal at each spot as the ratio of experimental RNA to reference RNA targets, rather than relying on absolute signal intensity, decreases variability by normalizing signal output in any two-color hybridization experiment. Results: Human, mouse and rat URR (UHRR, UMRR and URRR, respectively) were prepared from pools of RNA derived from individual cell lines representing different tissues. A variety of microarrays were used to determine the percentage of spots hybridizing with URR and producing signal above a user-defined threshold (microarray coverage). Microarray coverage was consistently greater than 80% for all arrays tested. We confirmed that individual cell lines contribute their own unique sets of genes to URR, arguing for a pool of RNA from several cell lines, rather than a single cell line, as the better configuration for URR. Microarray coverage comparing two separately prepared batches each of UHRR, UMRR and URRR was highly correlated (Pearson's correlation coefficients of 0.97). Conclusion: The results of this study demonstrate that large quantities of pooled RNA from individual cell lines can be reproducibly prepared and possess diverse gene representation. This type of reference provides a standard for reducing variation in microarray experiments and allows more reliable comparison of gene expression data within and between experiments.
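
    The following toy simulation illustrates why ratios to a common reference stabilize two-color comparisons, and mimics the batch-agreement check above with a Pearson correlation; all distributions and numbers are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        true_signal = rng.lognormal(8, 1, size=1000)               # per-spot abundance
        batch1 = true_signal * rng.lognormal(0, 0.05, size=1000)   # URR batch 1
        batch2 = true_signal * rng.lognormal(0, 0.05, size=1000)   # URR batch 2
        sample = true_signal * rng.lognormal(0.5, 0.3, size=1000)  # experimental RNA

        # Ratios to the reference cancel the spot-specific component shared by
        # both channels, so two batches give nearly identical normalized values.
        ratio1 = np.log2(sample / batch1)
        ratio2 = np.log2(sample / batch2)
        print(f"Pearson r between batches: {np.corrcoef(ratio1, ratio2)[0, 1]:.2f}")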

  7. Design of a covalently bonded glycosphingolipid microarray

    DEFF Research Database (Denmark)

    Arigi, Emma; Blixt, Klas Ola; Buschard, Karsten

    2012-01-01

    …the major classes of plant and fungal GSLs. In this work, a prototype "universal" GSL-based covalent microarray has been designed, and a preliminary evaluation of its potential utility in assaying protein-GSL binding interactions performed. An essential step in development involved the enzymatic release… of the fatty acyl moiety of the ceramide aglycone of selected mammalian GSLs with sphingolipid N-deacylase (SCDase). Derivatization of the free amino group of a typical lyso-GSL, lyso-GM1, with a prototype linker assembled from succinimidyl-[(N-maleimidopropionamido)-diethyleneglycol] ester and 2…

  8. Development and application of a microarray meter tool to optimize microarray experiments

    Directory of Open Access Journals (Sweden)

    Rouse Richard JD

    2008-07-01

    Background: Successful microarray experimentation requires a complex interplay between the slide chemistry, the printing pins, the nucleic acid probes and targets, and the hybridization milieu. Optimization of these parameters and a careful evaluation of emerging slide chemistries are a prerequisite to any large-scale array fabrication effort. We have developed a 'microarray meter' tool which assesses the inherent variations associated with microarray measurement prior to embarking on large-scale projects. Findings: The microarray meter consists of nucleic acid targets (reference and dynamic range control) and probe components. Different plate designs containing identical probe material were formulated to accommodate different robotic and pin designs. We examined the variability in probe quality and quantity (as judged by the amount of DNA printed and remaining post-hybridization) using three robots equipped with capillary printing pins. Discussion: The generation of microarray data with minimal variation requires consistent quality control of the microarray manufacturing and experimental processes. Spot reproducibility is primarily a measure of the variations associated with printing. The microarray meter assesses array quality by measuring the DNA content of every feature. It provides a post-hybridization analysis of array quality by scoring probe performance using three metrics: (a) a measure of variability in the signal intensities, (b) a measure of the signal dynamic range, and (c) a measure of variability of the spot morphologies.
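
    A minimal sketch of the three metrics above, assuming per-spot intensity and spot-area arrays from image analysis; the abstract does not specify the exact formulas, so these coefficient-of-variation and percentile-based definitions are plausible stand-ins only.

        import numpy as np

        def spot_quality_metrics(intensities, spot_areas):
            """Return illustrative (a) signal CV, (b) dynamic range, (c) morphology CV."""
            cv_signal = np.std(intensities) / np.mean(intensities)      # (a)
            dyn_range = np.log10(np.percentile(intensities, 99)
                                 / np.percentile(intensities, 1))       # (b), in decades
            cv_morphology = np.std(spot_areas) / np.mean(spot_areas)    # (c)
            return cv_signal, dyn_range, cv_morphology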

  9. Design of an Enterobacteriaceae Pan-genome Microarray Chip

    DEFF Research Database (Denmark)

    Lukjancenko, Oksana; Ussery, David

    2010-01-01

    A high-density microarray chip has been designed using 116 Enterobacteriaceae genome sequences, taking into account the enteric pan-genome. Probes for the microarray were checked in silico, and the performance of the chip, assessed with experimental strains from four different genera, demonstrates a relatively high ability… to distinguish those strains at the genus, species, and pathotype/serovar levels. Additionally, the microarray performed well when investigating which genes were found in a given strain of interest. The Enterobacteriaceae pan-genome microarray, based on 116 genomes, provides a valuable tool for determination…

  10. A Java-based tool for the design of classification microarrays.

    Science.gov (United States)

    Meng, Da; Broschat, Shira L; Call, Douglas R

    2008-08-04

    Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays, because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, the use of data from expression microarray experiments demonstrated the generality of PLASMID. In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays, and mixed-plasmid microarrays in particular, it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). Weights generated using stepwise discriminant analysis can be stored for…
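
    PLASMID's probe selection rests on stepwise discriminant analysis; a rough scikit-learn analogue is sketched below, using greedy forward selection with an LDA classifier. This illustrates the general technique only and is not PLASMID's actual algorithm.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.feature_selection import SequentialFeatureSelector

        def select_probes(X, y, n_probes=10):
            """X: arrays-by-probes hybridization matrix; y: class label per array.

            Greedily adds the probe that most improves cross-validated LDA
            accuracy until n_probes are chosen, then returns their indices.
            """
            sfs = SequentialFeatureSelector(LinearDiscriminantAnalysis(),
                                            n_features_to_select=n_probes,
                                            direction="forward", cv=3)
            sfs.fit(X, y)
            return np.flatnonzero(sfs.get_support())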

  11. Shared probe design and existing microarray reanalysis using PICKY

    Directory of Open Access Journals (Sweden)

    Chou Hui-Hsien

    2010-04-01

    Background: Large genomes contain families of highly similar genes that cannot be individually identified by microarray probes. This limitation is due to thermodynamic restrictions and cannot be resolved by any computational method. Since gene annotations are updated more frequently than microarrays, another common issue facing microarray users is that existing microarrays must be routinely reanalyzed to determine which probes are still useful with respect to the updated annotations. Results: PICKY 2.0 can design shared probes for sets of genes that cannot be individually identified using unique probes. PICKY 2.0 uses novel algorithms to track sharable regions among genes and to strictly distinguish them from other highly similar but nontarget regions during thermodynamic comparisons; therefore, PICKY does not sacrifice the quality of shared probes when choosing them. The latest PICKY 2.1 adds the capability to reanalyze existing microarray probes against updated gene sets to determine which probes are still valid to use. In addition, more precise nonlinear salt-effect estimates and other improvements make PICKY 2.1 more versatile for microarray users. Conclusions: Shared probes allow expressed gene family members to be detected; this capability is generally more desirable than knowing nothing about these genes. Shared probes also enable the design of cross-genome microarrays, which facilitate multiple-species identification in environmental samples. The new nonlinear salt-effect calculation significantly increases the precision of probes at lower buffer salt concentrations, and the probe reanalysis function improves the interpretation of existing microarray results.

  12. Systematic interpretation of microarray data using experiment annotations

    Directory of Open Access Journals (Sweden)

    Frohme Marcus

    2006-12-01

    Background: Up to now, microarray data have mostly been assessed in the context of only one or a few parameters characterizing the experimental conditions under study. More explicit experiment annotations, however, are highly useful for interpreting microarray data when available in a statistically accessible format. Results: We provide means to preprocess these additional data and to extract relevant traits corresponding to the transcription patterns under study. We found correspondence analysis particularly well suited for mapping such extracted traits. It visualizes associations both among and between the traits, the hereby annotated experiments, and the genes, revealing how they are all interrelated. Here, we apply our methods to the systematic interpretation of radioactive (single-channel) and two-channel data, stemming from model organisms such as yeast and Drosophila up to complex human cancer samples. Inclusion of technical parameters allows for identification of artifacts and flaws in experimental design. Conclusion: Biological and clinical traits can act as landmarks in transcription space, systematically mapping the variance of large datasets from the predominant changes down to intricate details.

  13. Uropathogenic Escherichia coli virulence genes: invaluable approaches for designing DNA microarray probes.

    Science.gov (United States)

    Jahandeh, Nadia; Ranjbar, Reza; Behzadi, Payam; Behzadi, Elham

    2015-01-01

    The pathotypes of uropathogenic Escherichia coli (UPEC) cause different types of urinary tract infections (UTIs). The presence of a wide range of virulence genes in UPEC enables us to design appropriate DNA microarray probes. These probes, used in DNA microarray technology, provide an accurate and rapid diagnosis and definitive treatment of UTIs caused by UPEC pathotypes. The main goal of this article is to introduce the UPEC virulence genes as invaluable resources for designing DNA microarray probes. Main search engines such as Google Scholar and databases like NCBI were searched to find and study original literature, review articles, and DNA gene sequences. In parallel with these in silico studies, the experience of the authors guided the selection of appropriate sources for this review. There is significant variety in the virulence genes found among UPEC strains. The DNA sequences of virulence genes are excellent templates for designing microarray probes. The location of virulence genes and their sequence lengths influence the quality of probes. The use of selected virulence genes for designing microarray probes gives us a wide range of choices from which the best probe candidates can be chosen. DNA microarray technology provides an accurate, rapid, cost-effective, sensitive, and specific molecular diagnostic method, facilitated by well-designed probes. With these tools, we can achieve accurate diagnosis and definitive treatment of UTIs caused by UPEC pathotypes.

  14. Advanced spot quality analysis in two-colour microarray experiments

    Directory of Open Access Journals (Sweden)

    Vetter Guillaume

    2008-09-01

    Background: Image analysis of microarrays and, in particular, spot quantification and spot quality control, is one of the most important steps in the statistical analysis of microarray data. Current methods of spot quality control are still at an early stage of development, often leading to underestimation of true positive microarray features and, consequently, to loss of important biological information. Therefore, improving and standardizing the statistical approaches to spot quality control is essential to facilitate the overall analysis of microarray data and the subsequent extraction of biological information. Findings: We evaluated the performance of two image analysis packages, MAIA and GenePix (GP), using two complementary experimental approaches with a focus on the statistical analysis of spot quality factors. First, we developed control microarrays with a priori known fluorescence ratios to verify the accuracy and precision of the ratio estimation of signal intensities. Next, we developed advanced semi-automatic protocols of spot quality evaluation in MAIA and GP and compared their performance with the quantitative spot-filtering facilities available in GP. We evaluated these algorithms for standardised spot quality analysis in a whole-genome microarray experiment assessing well-characterised transcriptional modifications induced by the transcription regulator SNAI1. Using a set of RT-PCR or qRT-PCR validated microarray data, we found that the semi-automatic protocol of spot quality control we developed with MAIA recovered approximately 13% more spots and 38% more differentially expressed genes (at FDR = 5%) than GP with default spot-filtering conditions. Conclusion: Careful control of spot quality characteristics with advanced spot quality evaluation can significantly increase the amount of confident and accurate data, resulting in more meaningful biological conclusions.

  15. The MGED Ontology: a resource for semantics-based description of microarray experiments.

    Science.gov (United States)

    Whetzel, Patricia L; Parkinson, Helen; Causton, Helen C; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Game, Laurence; Heiskanen, Mervi; Morrison, Norman; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Taylor, Chris; White, Joseph; Stoeckert, Christian J

    2006-04-01

    The generation of large amounts of microarray data and the need to share these data bring challenges for both data management and annotation, and highlight the need for standards. MIAME specifies the minimum information needed to describe a microarray experiment, and the Microarray Gene Expression Object Model (MAGE-OM) and the resulting MAGE-ML provide a mechanism to standardize data representation for data exchange; however, a common terminology for data annotation is needed to support these standards. Here we describe the MGED Ontology (MO) developed by the Ontology Working Group of the Microarray Gene Expression Data (MGED) Society. The MO provides terms for annotating all aspects of a microarray experiment, from the design of the experiment and array layout, through the preparation of the biological sample, to the protocols used to hybridize the RNA and analyze the data. The MO was developed to provide terms for annotating experiments in line with the MIAME guidelines, i.e. to provide the semantics to describe a microarray experiment according to the concepts specified in MIAME. The MO does not attempt to incorporate terms from existing ontologies, e.g. those that deal with anatomical parts or developmental stage terms, but provides a framework to reference terms in other ontologies, and therefore facilitates the use of ontologies in microarray data annotation. The MGED Ontology version 1.2.0 is available as a file in both DAML and OWL formats at http://mged.sourceforge.net/ontologies/index.php. Release notes and annotation examples are provided. The MO is also provided via the NCICB's Enterprise Vocabulary System (http://nciterms.nci.nih.gov/NCIBrowser/Dictionary.do). Contact: Stoeckrt@pcbi.upenn.edu. Supplementary data are available at Bioinformatics online.

  16. A Java-based tool for the design of classification microarrays

    Directory of Open Access Journals (Sweden)

    Broschat Shira L

    2008-08-01

    Background: Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays, because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. Results: The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, the use of data from expression microarray experiments demonstrated the generality of PLASMID. Conclusion: In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays, and mixed-plasmid microarrays in particular, it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including jpeg and tiff). Weights generated using stepwise discriminant analysis can be stored for…

  17. Detecting variants with Metabolic Design, a new software tool to design probes for explorative functional DNA microarray development

    Directory of Open Access Journals (Sweden)

    Gravelat Fabrice

    2010-09-01

    Background: Microorganisms display vast diversity, and each one has its own set of genes, cell components and metabolic reactions. To assess their huge unexploited metabolic potential in different ecosystems, we need high-throughput tools, such as functional microarrays, that allow the simultaneous analysis of thousands of genes. However, most classical functional microarrays use specific probes that monitor only known sequences, and so fail to cover the full microbial gene diversity present in complex environments. We have thus developed an algorithm, implemented in the user-friendly program Metabolic Design, to design efficient explorative probes. Results: First, we validated our approach by studying eight enzymes involved in the degradation of polycyclic aromatic hydrocarbons from the model strain Sphingomonas paucimobilis sp. EPA505, using a designed microarray of 8,048 probes. As expected, microarray assays identified the targeted set of genes induced during biodegradation kinetics experiments with various pollutants. We then confirmed the identity of these new genes by sequencing, and corroborated the quantitative discrimination of our microarray by quantitative real-time PCR. Finally, we assessed the metabolic capacities of microbial communities in soil contaminated with aromatic hydrocarbons. The results show that our probe design (sensitivity and explorative quality) can be used to study a complex environment efficiently. Conclusions: We successfully used our microarray to detect the expression of genes encoding enzymes involved in polycyclic aromatic hydrocarbon degradation in the model strain. In addition, DNA microarray experiments performed on soil polluted by organic pollutants, without prior sequence assumptions, demonstrated high specificity and sensitivity for gene detection. Metabolic Design is thus a powerful, efficient tool for designing explorative probes and monitoring metabolic pathways in complex environments.

  18. Design, construction and validation of a Plasmodium vivax microarray for the transcriptome profiling of clinical isolates

    KAUST Repository

    Boopathi, Pon Arunachalam

    2016-10-09

    High-density oligonucleotide microarrays have been used on Plasmodium vivax field isolates to estimate whole-genome expression. However, no microarray platform has been experimentally optimized for studying the transcriptome of field isolates. In the present study, we adopted both bioinformatics and experimental testing approaches to select the best-optimized probes suitable for detecting parasite transcripts in field samples and included them in designing a custom 15K P. vivax microarray. This microarray has long oligonucleotide probes (60-mer) that were synthesized in situ onto glass slides using Agilent SurePrint technology, in an 8×15K format (eight identical arrays on a single slide). Probes in this array were experimentally validated and represent 4,180 P. vivax genes in sense orientation, of which 1,219 genes also have probes in antisense orientation. Validation of the 15K array with field samples (n = 14) showed detection of 99% of parasite transcripts in each of the samples. Correlation analysis between duplicate probes (n = 85) present on the arrays showed a near-perfect correlation (r² = 0.98), indicating reproducibility. Multiple probes representing the same gene exhibited similar expression patterns across the samples (positive correlation, r ≥ 0.6). Comparisons of the hybridization data with previous studies and quantitative real-time PCR experiments were performed to support the microarray validation procedure. This array is unique in its design, and the results indicate that it is sensitive and reproducible. Hence, this microarray could be a valuable functional genomics tool for generating reliable expression data from P. vivax field isolates.

  19. Design, construction and validation of a Plasmodium vivax microarray for the transcriptome profiling of clinical isolates

    KAUST Repository

    Boopathi, Pon Arunachalam; Subudhi, Amit; Middha, Sheetal; Acharya, Jyoti; Mugasimangalam, Raja Chinnadurai; Kochar, Sanjay Kumar; Kochar, Dhanpat Kumar; Das, Ashis

    2016-01-01

    High-density oligonucleotide microarrays have been used on Plasmodium vivax field isolates to estimate whole-genome expression. However, no microarray platform has been experimentally optimized for studying the transcriptome of field isolates. In the present study, we adopted both bioinformatics and experimental testing approaches to select the best-optimized probes suitable for detecting parasite transcripts in field samples and included them in designing a custom 15K P. vivax microarray. This microarray has long oligonucleotide probes (60-mer) that were synthesized in situ onto glass slides using Agilent SurePrint technology, in an 8×15K format (eight identical arrays on a single slide). Probes in this array were experimentally validated and represent 4,180 P. vivax genes in sense orientation, of which 1,219 genes also have probes in antisense orientation. Validation of the 15K array with field samples (n = 14) showed detection of 99% of parasite transcripts in each of the samples. Correlation analysis between duplicate probes (n = 85) present on the arrays showed a near-perfect correlation (r² = 0.98), indicating reproducibility. Multiple probes representing the same gene exhibited similar expression patterns across the samples (positive correlation, r ≥ 0.6). Comparisons of the hybridization data with previous studies and quantitative real-time PCR experiments were performed to support the microarray validation procedure. This array is unique in its design, and the results indicate that it is sensitive and reproducible. Hence, this microarray could be a valuable functional genomics tool for generating reliable expression data from P. vivax field isolates.

  20. Fast gene ontology based clustering for microarray experiments.

    Science.gov (United States)

    Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa

    2008-11-21

    Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes, so it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation using semantic similarity of Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R-based semantic similarity open-source package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
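
    To make the clustering approach concrete, here is a small sketch of Resnik-style term similarity on a toy GO-like DAG, followed by hierarchical clustering; the terms, information-content values, and the similarity-to-distance conversion are invented for illustration and are far simpler than the package's implementation.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        # Toy ontology: child -> parents, plus an information content (IC) per term.
        parents = {"GO:b": {"GO:a"}, "GO:c": {"GO:a"},
                   "GO:d": {"GO:b"}, "GO:e": {"GO:b", "GO:c"}}
        ic = {"GO:a": 0.1, "GO:b": 1.2, "GO:c": 1.0, "GO:d": 2.5, "GO:e": 2.3}

        def ancestors(term):
            """All terms reachable upward from `term`, including itself."""
            seen, stack = {term}, [term]
            while stack:
                for p in parents.get(stack.pop(), ()):
                    if p not in seen:
                        seen.add(p)
                        stack.append(p)
            return seen

        def resnik(t1, t2):
            """IC of the most informative common ancestor."""
            return max(ic[t] for t in ancestors(t1) & ancestors(t2))

        terms = list(ic)
        # Turn similarity into a distance and cluster the terms hierarchically.
        dist = np.array([[max(ic.values()) - resnik(a, b) for b in terms]
                         for a in terms])
        tree = linkage(dist[np.triu_indices(len(terms), 1)], method="average")
        print(fcluster(tree, t=2, criterion="maxclust"))  # cluster label per term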

  21. Optimal designs for one- and two-color microarrays using mixed models: a comparative evaluation of their efficiencies.

    Science.gov (United States)

    Lima Passos, Valéria; Tan, Frans E S; Winkens, Bjorn; Berger, Martijn P F

    2009-01-01

    Comparative studies between one- and two-color microarrays provide supportive evidence for similar results on differential gene expression. So far, however, no design comparisons between the two platforms have been undertaken. With the objective of comparing optimal designs of one- and two-color microarrays in terms of their statistical efficiencies, techniques of design optimization were applied within a mixed-model framework. A- and D-optimal designs for the one- and two-color platforms were sought for a 3 × 3 factorial experiment. The results suggest that the choice of platform will not affect the allocation of subjects to groups, which is concordant between the two designs. However, under financial constraints, the two-color arrays are expected to have a slight upper hand in terms of the efficiency of model parameter estimates whenever arrays are more expensive than subjects. This statement is especially valid for microarray studies envisaging class comparisons.

  22. Fast Gene Ontology based clustering for microarray experiments

    Directory of Open Access Journals (Sweden)

    Ovaska Kristian

    2008-11-01

    Background: Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes, so it is often challenging to interpret GO results and identify novel testable biological hypotheses. Results: We present fast software for advanced gene annotation using semantic similarity of Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Conclusion: Our R-based semantic similarity open-source package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.

  23. Identification of potential biomarkers from microarray experiments using multiple criteria optimization

    International Nuclear Information System (INIS)

    Sánchez-Peña, Matilde L; Isaza, Clara E; Pérez-Morales, Jaileene; Rodríguez-Padilla, Cristina; Castro, José M; Cabrera-Ríos, Mauricio

    2013-01-01

    Microarray experiments can determine the relative expression of tens of thousands of genes simultaneously, thus producing very large databases. Analyzing these databases and extracting biologically relevant knowledge from them are challenging tasks. The identification of potential cancer biomarker genes is one of the most important aims of microarray analysis and, as such, has been widely targeted in the literature. However, identifying a set of these genes consistently across different experiments, research groups, microarray platforms, or cancer types remains an elusive endeavor. Besides the inherent difficulty of the large and nonconstant variability in these experiments and the incommensurability between different microarray technologies, there is the issue of users having to adjust a series of parameters that significantly affect the outcome of the analyses and that have no biological or medical meaning. In this study, the identification of potential cancer biomarkers from microarray data is cast as a multiple criteria optimization (MCO) problem. The efficient solutions to this problem, found here through data envelopment analysis (DEA), are associated with genes that are proposed as potential cancer biomarkers. The method does not require any parameter adjustment by the user, and thus fosters repeatability. The approach also allows the analysis of different microarray experiments, microarray platforms, and cancer types simultaneously. The results include the analysis of three publicly available microarray databases related to cervical cancer. This study points to the feasibility of modeling the selection of potential cancer biomarkers from microarray data as an MCO problem and solving it using DEA. Using MCO brings a new optic to the identification of potential cancer biomarkers, as it requires neither the definition of a threshold value to establish significance for a particular gene nor the selection of a normalization method.
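
    The core multiple-criteria idea can be sketched in a few lines: treat each gene as a point in criterion space and keep only the non-dominated (Pareto-efficient) ones. The two criteria below (absolute fold change to maximize, p-value to minimize) and the brute-force search are illustrative; the study itself solves the MCO problem with data envelopment analysis.

        import numpy as np

        def pareto_efficient(fold_change, p_value):
            """Mark genes not dominated by any other gene on both criteria."""
            fc = np.abs(np.asarray(fold_change))
            p = np.asarray(p_value)
            efficient = np.ones(len(fc), dtype=bool)
            for i in range(len(fc)):
                at_least_as_good = (fc >= fc[i]) & (p <= p[i])
                at_least_as_good[i] = False
                strictly_better = (fc > fc[i]) | (p < p[i])
                if np.any(at_least_as_good & strictly_better):
                    efficient[i] = False  # some other gene dominates gene i
            return efficient

        fc = np.array([2.5, 0.3, 1.8, 2.5])
        p = np.array([1e-4, 0.5, 1e-6, 1e-5])
        print(pareto_efficient(fc, p))  # -> [False False  True  True]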

  24. Rational design of DNA sequences for nanotechnology, microarrays and molecular computers using Eulerian graphs.

    Science.gov (United States)

    Pancoska, Petr; Moravek, Zdenek; Moll, Ute M

    2004-01-01

    Nucleic acids are molecules of choice for both established and emerging nanoscale technologies. These technologies benefit from large functional densities of 'DNA processing elements' that can be readily manufactured. To achieve the desired functionality, polynucleotide sequences are currently designed by a process that involves tedious and laborious filtering of potential candidates against a series of requirements and parameters. Here, we present a complete, novel methodology for the rapid rational design of large sets of DNA sequences. This method allows for the direct implementation of very complex and detailed requirements for the generated sequences, thus avoiding 'brute force' filtering. At the same time, these sequences have narrow distributions of melting temperatures. The molecular part of the design process can be done without computer assistance, using an efficient 'human engineering' approach: drawing a single blueprint graph that represents all generated sequences. Moreover, the method eliminates the necessity for extensive thermodynamic calculations; the melting temperature can be calculated only once (or not at all). In addition, the isostability of the sequences is independent of the selection of a particular set of thermodynamic parameters. Applications are presented for DNA sequence designs for microarrays, universal microarray zip sequences and electron transfer experiments.
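
    The graph idea can be made concrete in a few lines: every Eulerian path through a blueprint graph spells one admissible sequence. The sketch below runs Hierholzer's algorithm on an invented four-edge graph whose nodes are single bases; real designs use far richer node and edge labels.

        from collections import defaultdict

        def eulerian_path(edges):
            """Hierholzer's algorithm on a directed multigraph of (u, v) pairs."""
            adj, out_deg, in_deg = defaultdict(list), defaultdict(int), defaultdict(int)
            for u, v in edges:
                adj[u].append(v)
                out_deg[u] += 1
                in_deg[v] += 1
            # Start at a node with one more outgoing than incoming edge, if any.
            start = edges[0][0]
            for node in out_deg:
                if out_deg[node] - in_deg[node] == 1:
                    start = node
            stack, path = [start], []
            while stack:
                while adj[stack[-1]]:
                    stack.append(adj[stack[-1]].pop())
                path.append(stack.pop())
            return path[::-1]

        # Each edge is a required base adjacency; the path spells the sequence.
        print("".join(eulerian_path([("A", "C"), ("C", "G"), ("G", "A"), ("A", "T")])))
        # -> "ACGAT": one sequence realizing all four required adjacencies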

  25. A Reliable and Distributed LIMS for Efficient Management of the Microarray Experiment Environment

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2007-03-01

    The microarray is a principal technology in molecular biology, generating thousands of gene expression measurements at once. A microarray experiment typically involves many kinds of information, such as gene names, sequences, expression profiles, scanned images, and annotation, so the organization and analysis of vast amounts of data are required. A microarray LIMS (Laboratory Information Management System) provides data management, search, and basic analysis. Recently, joint microarray research efforts, for example on skeletal system disease and anti-cancer medicine, have been widely conducted, and such research requires data sharing among the laboratories within the joint research group. In this paper, we introduce a web-based microarray LIMS, SMILE (Small and solid MIcroarray Lims for Experimenters), designed especially for shared data management. The data sharing function of SMILE is based on Friend-to-Friend (F2F) networking, a form of anonymous P2P (Peer-to-Peer) in which people connect directly with their "friends": it allows only trusted friends, identified by IP addresses or digital signatures, to exchange data directly. In SMILE there are two types of friends: the "service provider", which provides data, and the "client", which is provided with data; the service provider thus shares data only with its clients. SMILE offers useful functions for microarray experiments, such as variant data management, image analysis, normalization, system management, project schedule management, and shared data management. Moreover, it connects to two other systems: ArrayMall, for analyzing microarray images, and GENAW, for constructing genetic networks. SMILE is available at http://neobio.cs.pusan.ac.kr:8080/smile.

  26. Design of oligonucleotides for microarrays and perspectives for design of multi-transcriptome arrays

    DEFF Research Database (Denmark)

    Nielsen, Henrik Bjørn; Wernersson, Rasmus; Knudsen, Steen

    2003-01-01

    Optimal design of oligonucleotides for microarrays involves tedious and laborious work evaluating potential oligonucleotides relative to a series of parameters, and the currently available tools for this purpose are limited in their flexibility and do not present the oligonucleotide designer with an overview of these parameters. We present here a flexible tool named OligoWiz for designing oligonucleotides for multiple purposes. OligoWiz presents a set of parameter scores in a graphical interface to facilitate an overview for the user. Additional custom parameter scores can easily be added to the program to extend the default parameters: homology, ΔTm, low-complexity, position and GATC-only. Furthermore, we present an analysis of the limitations in designing oligonucleotide sets that can detect transcripts from multiple organisms. OligoWiz is available at www.cbs.dtu.dk/services/OligoWiz/.
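
    How per-parameter scores might combine into a single ranking is sketched below; the score names mirror the five default parameters listed above, but the weighting scheme is an assumption, since the abstract does not specify OligoWiz's actual scoring function.

        def combined_score(scores, weights=None):
            """Weighted mean of per-oligo parameter scores, each assumed in [0, 1]."""
            weights = weights or {name: 1.0 for name in scores}
            total_weight = sum(weights[name] for name in scores)
            return sum(scores[name] * weights[name] for name in scores) / total_weight

        oligo = {"homology": 0.92, "delta_tm": 0.80, "low_complexity": 1.0,
                 "position": 0.65, "gatc_only": 1.0}
        print(f"combined score: {combined_score(oligo):.2f}")  # -> 0.87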

  27. Detection of NASBA amplified bacterial tmRNA molecules on SLICSel designed microarray probes

    Directory of Open Access Journals (Sweden)

    Toome Kadri

    2011-02-01

    Background: We present a comprehensive technological solution for bacterial diagnostics using tmRNA as a marker molecule. A robust probe design algorithm for a microbial detection microarray is implemented. The probes were evaluated for specificity and, combined with NASBA (Nucleic Acid Sequence Based Amplification), for sensitivity. Results: We developed a new web-based program, SLICSel, for the design of hybridization probes, based on nearest-neighbor thermodynamic modeling. A SLICSel minimum binding energy difference criterion of 4 kcal/mol was sufficient for the design of Streptococcus pneumoniae tmRNA-specific microarray probes. With lower binding energy difference criteria, additional hybridization specificity tests on the microarray were needed to eliminate non-specific probes. Using SLICSel-designed microarray probes and NASBA, we were able to detect S. pneumoniae tmRNA from a series of total RNA dilutions equivalent to the RNA content of 0.1-10 CFU. Conclusions: The described technological solution, and both of its components SLICSel and NASBA-microarray technology independently, are applicable to many different areas of microbial diagnostics.
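
    A simplified sketch of a SLICSel-style minimum binding-energy-difference filter, using SantaLucia (1998) unified nearest-neighbor ΔG°37 stacking values and ignoring initiation and salt corrections; nontarget_dgs is assumed precomputed for the thermodynamically closest nontargets, and the 4 kcal/mol default mirrors the criterion above.

        # SantaLucia 1998 unified NN stacking free energies, kcal/mol at 37 °C.
        NN_DG37 = {"AA": -1.00, "AT": -0.88, "TA": -0.58, "CA": -1.45, "GT": -1.44,
                   "CT": -1.28, "GA": -1.30, "CG": -2.17, "GC": -2.24, "GG": -1.84}

        def duplex_dg(seq):
            """Approximate ΔG°37 of a perfect-match duplex from stacked base pairs."""
            comp = str.maketrans("ACGT", "TGCA")
            dg = 0.0
            for i in range(len(seq) - 1):
                pair = seq[i:i + 2]
                # The table lists one strand of each stack; fall back to the
                # reverse complement when the pair itself is not listed.
                dg += NN_DG37.get(pair, NN_DG37.get(pair.translate(comp)[::-1], 0.0))
            return dg

        def specific_enough(probe, nontarget_dgs, margin=4.0):
            """Keep the probe only if target binding beats every nontarget by `margin`."""
            return all(duplex_dg(probe) <= dg - margin for dg in nontarget_dgs)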

  28. Detection of NASBA amplified bacterial tmRNA molecules on SLICSel designed microarray probes

    LENUS (Irish Health Repository)

    Scheler, Ott

    2011-02-28

    Background: We present a comprehensive technological solution for bacterial diagnostics using tmRNA as a marker molecule. A robust probe design algorithm for a microbial detection microarray is implemented. The probes were evaluated for specificity and, combined with NASBA (Nucleic Acid Sequence Based Amplification), for sensitivity. Results: We developed a new web-based program, SLICSel, for the design of hybridization probes, based on nearest-neighbor thermodynamic modeling. A SLICSel minimum binding energy difference criterion of 4 kcal/mol was sufficient for the design of Streptococcus pneumoniae tmRNA-specific microarray probes. With lower binding energy difference criteria, additional hybridization specificity tests on the microarray were needed to eliminate non-specific probes. Using SLICSel-designed microarray probes and NASBA, we were able to detect S. pneumoniae tmRNA from a series of total RNA dilutions equivalent to the RNA content of 0.1-10 CFU. Conclusions: The described technological solution, and both of its components SLICSel and NASBA-microarray technology independently, are applicable to many different areas of microbial diagnostics.

  29. Design of modern experiments

    International Nuclear Information System (INIS)

    Park, Sung Hweon

    1984-03-01

    This book, written for researchers and engineers, focuses on the practical design of experiments. It covers the concepts of experimental design, basic statistical theory, one-way designs, two-way layouts with and without repetition, partition, correlation and regression analysis, Latin squares, factorial designs, design of experiments using orthogonal array tables, response surface designs, mixture (compound) designs, EVOP, and the Taguchi method.

  30. The PowerAtlas: a power and sample size atlas for microarray experimental design and research

    Directory of Open Access Journals (Sweden)

    Wang Jelai

    2006-02-01

    Background: Microarrays permit biologists to simultaneously measure the mRNA abundance of thousands of genes. An important issue facing investigators planning microarray experiments is how to estimate the sample size required for good statistical power. What is the projected sample size, or number of replicate chips, needed to address the multiple hypotheses with acceptable accuracy? Statistical methods exist for calculating power based upon a single hypothesis, using estimates of the variability in data from pilot studies. There is, however, a need for methods to estimate power and/or required sample sizes in situations where multiple hypotheses are being tested, such as in microarray experiments. In addition, investigators frequently do not have pilot data from which to estimate the sample sizes required for microarray studies. Results: To address this challenge, we have developed the Microarray PowerAtlas. The atlas enables estimation of statistical power by allowing investigators to plan studies by building upon previous studies with similar experimental characteristics. Currently, it provides sample size and power estimates based on 632 experiments from the Gene Expression Omnibus (GEO). The PowerAtlas also permits investigators to upload their own pilot data and derive power and sample size estimates from these data. This resource will be updated regularly with new datasets from GEO and other databases, such as The Nottingham Arabidopsis Stock Center (NASC). Conclusion: This resource provides a valuable tool for investigators who are planning efficient microarray studies and estimating required sample sizes.
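
    A small sketch of the kind of single-hypothesis computation the PowerAtlas generalizes: the replicate chips needed per group for a two-sample comparison of one gene, given an effect size and within-group SD as would come from pilot data. The numbers are invented, and the stringent alpha is a crude stand-in for multiple-testing control.

        from statsmodels.stats.power import TTestIndPower

        effect = 1.0 / 0.8  # 1.0 log2-fold change over an SD of 0.8 -> Cohen's d
        n_per_group = TTestIndPower().solve_power(effect_size=effect, alpha=0.001,
                                                  power=0.8, alternative="two-sided")
        print(f"replicates per group: {n_per_group:.1f}")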

  31. Mismatch oligonucleotides in human and yeast: guidelines for probe design on tiling microarrays

    Directory of Open Access Journals (Sweden)

    Jee Justin

    2008-12-01

    Background: Mismatched oligonucleotides are widely used on microarrays to differentiate specific from nonspecific hybridization. While many experiments rely on such oligos, the hybridization behavior of various degrees of mismatch (MM) structure has not been extensively studied. Here, we present the results of two large-scale microarray experiments on S. cerevisiae and H. sapiens genomic DNA, exploring MM oligonucleotide behavior with real sample mixtures under tiling-array conditions. Results: We examined all possible nucleotide substitutions at the central position of 36-nucleotide probes, and found that nonspecific binding by MM oligos depends upon the individual nucleotide substitutions they incorporate: C→A, C→G and T→A (yielding purine-purine mispairs) are most disruptive, whereas A→X substitutions were least disruptive. We also quantify a marked GC skew effect: substitutions raising probe GC content exhibit higher intensity, and vice versa. This skew is small in highly expressed regions (±0.5% of the total intensity range) and large (±2% or more) elsewhere. Multiple mismatches per oligo are largely additive in effect: each MM added in a distributed fashion causes an additional 21% intensity drop relative to PM, three-fold more disruptive than adding adjacent mispairs (a 7% drop per MM). Conclusion: We investigate several parameters for oligonucleotide design, including the effects of each central nucleotide substitution on array signal intensity and of multiple MMs per oligo. To avoid GC skew, individual substitutions should not alter probe GC content. RNA sample mixture complexity may increase the amount of nonspecific hybridization, magnify GC skew and boost the intensity of MM oligos at all levels.
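
    The additive rule reported above reduces to two lines of arithmetic; the sketch below simply restates the abstract's percentages (21% per distributed mismatch, 7% per adjacent mismatch, relative to the perfect-match signal) and is not a thermodynamic model.

        def predicted_intensity(pm_intensity, n_distributed, n_adjacent=0):
            """Additive intensity drop per mismatch, using the abstract's percentages."""
            drop = 0.21 * n_distributed + 0.07 * n_adjacent
            return max(pm_intensity * (1.0 - drop), 0.0)

        print(predicted_intensity(10000, n_distributed=2))  # -> 5800.0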

  32. In silico design and performance of peptide microarrays for breast cancer tumour-auto-antibody testing

    Directory of Open Access Journals (Sweden)

    Andreas Weinhäusel

    2012-06-01

    The simplicity and potential of minimally invasive testing using sera from patients make auto-antibody-based biomarkers a very promising tool for use in cancer diagnostics. Protein microarrays have been used for the identification of such auto-antibody signatures. Because high-throughput protein expression and purification is laborious, synthetic peptides might be a good alternative for microarray generation and multiplexed analyses. In this study, we designed 1185 antigenic peptides, deduced from proteins expressed by 642 cDNA expression clones found to be sero-reactive in both breast tumour patients and controls. The sero-reactive proteins and the corresponding peptides were used for the production of protein and peptide microarrays. Serum samples from females with benign and malignant breast tumours and healthy control sera (n = 16 per group) were then analysed. Correct classification of the serum samples on peptide microarrays was 78% for discrimination of 'malignant versus healthy controls', 72% for 'benign versus malignant' and 94% for 'benign versus controls'. On protein arrays, correct classification for these contrasts was 69%, 59% and 59%, respectively. Over-representation analysis of the classifiers derived from class prediction showed enrichment of genes associated with ribosomes, spliceosomes, endocytosis and the pentose phosphate pathway. Sequence analyses of the peptides with the highest sero-reactivity demonstrated enrichment of the zinc-finger domain. Peptide sero-reactivities were found to be negatively correlated with hydrophobicity and positively correlated with positive charge, high inter-residue protein contact energies and a secondary structure propensity bias. This study hints at the possibility of using in silico designed antigenic peptide microarrays as an alternative to protein microarrays for the improvement of tumour auto-antibody based diagnostics.

  33. Position dependent mismatch discrimination on DNA microarrays: experiments and model

    Directory of Open Access Journals (Sweden)

    Michel Wolfgang

    2008-12-01

    Background: The propensity of oligonucleotide strands to form stable duplexes with complementary sequences is fundamental to biological and biotechnological processes as varied as microRNA signalling, microarray hybridization and PCR. Yet our understanding of oligonucleotide hybridization, in particular in the presence of surfaces, is rather limited. Here we use oligonucleotide microarrays, made in-house by optically controlled DNA synthesis, to produce probe sets comprising all possible single-base mismatches and base bulges for each of 20 sequence motifs under study. Results: We observe that mismatch discrimination is mostly determined by the defect position (relative to the duplex ends) as well as by the sequence context. We investigate the thermodynamics of the oligonucleotide duplexes on the basis of a double-ended molecular zipper model. Theoretical predictions of the influence of defect position, as well as of long-range sequence effects, agree well with the experimental results. Conclusion: Molecular zipping at thermodynamic equilibrium explains the binding affinity of mismatched DNA duplexes on microarrays well, and the position-dependent nearest-neighbor model (PDNN) can be inferred from it. A quantitative understanding of microarray experiments from first principles is within reach.

  34. A comparison of alternative 60-mer probe designs in an in-situ synthesized oligonucleotide microarray

    Directory of Open Access Journals (Sweden)

    Fairbanks Benjamin D

    2006-04-01

    Background: DNA microarrays have proven powerful for functional genomics studies. Several technologies exist for the generation of whole-genome arrays. It is well documented that 25-mer probes directed against different regions of the same gene produce variable signal intensity values. However, the extent to which this is true for probes of greater length (60-mers) is not well characterized. Moreover, this information has not previously been reported for whole-genome arrays designed against bacteria, whose genomes may differ substantially in characteristics directly affecting microarray performance. Results: We report here an analysis of alternative 60-mer probe designs for an in-situ synthesized oligonucleotide array for the GC-rich β-proteobacterium Burkholderia cenocepacia. Probes were designed using the ArrayOligoSel3.5 software package, and whole-genome microarrays were synthesized by Agilent, Inc. using their in-situ ink-jet technology platform. We first validated the quality of the microarrays, as demonstrated by an average signal-to-noise ratio of >1000. Next, we determined that the variance of replicate probes of identical sequence (1178 probes examined) was 3.8%, whereas the variance of alternative probe designs (558 probes examined) was 9.5%. We determined that, depending upon the definition, about 2.4% of replicate and 7.8% of alternative probes produced outlier conclusions. Finally, we determined that none of the probe design subscores (GC content, internal repeat, binding energy and self-annealment) produced by ArrayOligoSel3.5 were predictive of probes that produced outlier signals. Conclusion: Our analysis demonstrated that the use of multiple probes per target sequence is not essential for in-situ synthesized 60-mer oligonucleotide arrays designed against bacteria. Although probes producing outlier signals were identified, the use of ratios results in less than 10% of such outlier conclusions. We also determined that…

  15. Methods for interpreting lists of affected genes obtained in a DNA microarray experiment

    DEFF Research Database (Denmark)

    Hedegaard, Jakob; Arce, Christina; Bicciato, Silvio

    2009-01-01

    The aim of this paper was to describe and compare the methods used and the results obtained by the participants in a joint EADGENE (European Animal Disease Genomic Network of Excellence) and SABRE (Cutting Edge Genomics for Sustainable Animal Breeding) workshop focusing on post analysis of microa...... a microarray experiment conducted to study the host reactions in broilers occurring shortly after a secondary challenge with either a homologous or heterologous species of Eimeria...

  16. Methods for interpreting lists of affected genes obtained in a DNA microarray experiment

    Directory of Open Access Journals (Sweden)

    Hedegaard Jakob

    2009-07-01

    Full Text Available Abstract Background The aim of this paper was to describe and compare the methods used and the results obtained by the participants in a joint EADGENE (European Animal Disease Genomic Network of Excellence) and SABRE (Cutting Edge Genomics for Sustainable Animal Breeding) workshop focusing on post analysis of microarray data. The participating groups were provided with identical lists of microarray probes, including test statistics for three different contrasts, and the normalised log-ratios for each array, to be used as the starting point for interpreting the affected probes. The data originated from a microarray experiment conducted to study the host reactions in broilers occurring shortly after a secondary challenge with either a homologous or heterologous species of Eimeria. Results Several conceptually different analytical approaches, using both commercial and publicly available software, were applied by the participating groups. The following tools were used: Ingenuity Pathway Analysis, MAPPFinder, LIMMA, GOstats, GOEAST, GOTM, Globaltest, TopGO, ArrayUnlock, Pathway Studio, GIST and AnnotationDbi. The main focus of the approaches was to utilise the relation between probes/genes and their gene ontology and pathways to interpret the affected probes/genes. The lack of a well-annotated chicken genome did, however, limit the possibilities to fully explore the tools. The main results from these analyses showed that the biological interpretation is highly dependent on the statistical method used, but that some common biological conclusions could be reached. Conclusion It is highly recommended to test different analytical methods on the same data set and compare the results to obtain a reliable biological interpretation of the affected genes in a DNA microarray experiment.
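
    Most of the gene-ontology tools listed above share one core computation: a test for over-representation of a GO term among the affected genes. A simplified sketch of that calculation (hypergeometric test only; the real tools add GO-graph handling and multiple-testing correction):

    ```python
    from scipy.stats import hypergeom

    def go_enrichment(term_genes, affected, universe):
        """One-sided hypergeometric test for over-representation of a GO
        term among affected genes -- the core of tools such as GOstats
        or topGO, stripped of DAG handling and multiplicity correction."""
        term_genes, affected, universe = map(set, (term_genes, affected, universe))
        N = len(universe)                  # annotated genes on the array
        K = len(term_genes & universe)     # genes carrying the term
        n = len(affected & universe)       # affected genes
        k = len(term_genes & affected)     # affected genes carrying the term
        # P(X >= k) when drawing n genes without replacement
        return hypergeom.sf(k - 1, N, K, n)

    universe = {f"g{i}" for i in range(1000)}
    term = {f"g{i}" for i in range(40)}                  # 40 genes in the term
    hits = {f"g{i}" for i in range(10)} | {"g500"}       # 11 affected genes
    print(f"p = {go_enrichment(term, hits, universe):.2e}")
    ```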

  17. Design and evaluation of Actichip, a thematic microarray for the study of the actin cytoskeleton

    Science.gov (United States)

    Muller, Jean; Mehlen, André; Vetter, Guillaume; Yatskou, Mikalai; Muller, Arnaud; Chalmel, Frédéric; Poch, Olivier; Friederich, Evelyne; Vallar, Laurent

    2007-01-01

    Background The actin cytoskeleton plays a crucial role in supporting and regulating numerous cellular processes. Mutations or alterations in the expression levels affecting the actin cytoskeleton system or related regulatory mechanisms are often associated with complex diseases such as cancer. Understanding how qualitative or quantitative changes in expression of the set of actin cytoskeleton genes are integrated to control actin dynamics and organisation is currently a challenge and should provide insights into identifying potential targets for drug discovery. Here we report the development of a dedicated microarray, the Actichip, containing 60-mer oligonucleotide probes for 327 genes selected for transcriptome analysis of the human actin cytoskeleton. Results Genomic data and sequence analysis features were retrieved from GenBank and stored in an integrative database called Actinome. From these data, probes were designed using a home-made program (CADO4MI) allowing sequence refinement and improved probe specificity by combining the complementary information recovered from the UniGene and RefSeq databases. Actichip performance was analysed by hybridisation with RNAs extracted from epithelial MCF-7 cells and human skeletal muscle. Using thoroughly standardised procedures, we obtained microarray images of excellent quality, resulting in high data reproducibility. Actichip displayed a large dynamic range extending over three logs, with a limit of sensitivity between one and ten copies of transcript per cell. The array allowed accurate detection of small changes in gene expression and reliable classification of samples based on the expression profiles of tissue-specific genes. When compared to two other oligonucleotide microarray platforms, Actichip showed similar sensitivity and concordant expression ratios. Moreover, Actichip was able to discriminate the highly similar actin isoforms whereas the two other platforms did not. Conclusion Our data demonstrate that

  18. Relative impact of key sources of systematic noise in Affymetrix and Illumina gene-expression microarray experiments

    Directory of Open Access Journals (Sweden)

    Kitchen Robert R

    2011-12-01

    Full Text Available Abstract Background Systematic processing noise, which includes batch effects, is very common in microarray experiments but is often ignored despite its potential to confound or compromise experimental results. Compromised results are most likely when re-analysing or integrating datasets from public repositories due to the different conditions under which each dataset is generated. To better understand the relative noise-contributions of various factors in experimental design, we assessed several Illumina and Affymetrix datasets for technical variation between replicate hybridisations of Universal Human Reference (UHRR) and individual or pooled breast-tumour RNA. Results A varying degree of systematic noise was observed in each of the datasets; however, in all cases the relative amount of variation between standard control RNA replicates was found to be greatest at earlier points in the sample-preparation workflow. For example, 40.6% of the total variation in reported expressions was attributed to replicate extractions, compared to 13.9% due to amplification/labelling and 10.8% between replicate hybridisations. Deliberate probe-wise batch-correction methods were effective in reducing the magnitude of this variation, although the level of improvement was dependent on the sources of noise included in the model. Systematic noise introduced at the chip, run, and experiment levels of a combined Illumina dataset was found to be highly dependent upon the experimental design. Both UHRR and pools of RNA, which were derived from the samples of interest, modelled technical variation well, although the pools were significantly better correlated (4% average improvement) and better emulated the effects of systematic noise, over all probes, than the UHRRs. The effect of this noise was not uniform over all probes, with low GC-content probes found to be more vulnerable to batch variation than probes with a higher GC-content. Conclusions The magnitude of systematic

  19. Relative impact of key sources of systematic noise in Affymetrix and Illumina gene-expression microarray experiments.

    Science.gov (United States)

    Kitchen, Robert R; Sabine, Vicky S; Simen, Arthur A; Dixon, J Michael; Bartlett, John M S; Sims, Andrew H

    2011-12-01

    Systematic processing noise, which includes batch effects, is very common in microarray experiments but is often ignored despite its potential to confound or compromise experimental results. Compromised results are most likely when re-analysing or integrating datasets from public repositories due to the different conditions under which each dataset is generated. To better understand the relative noise-contributions of various factors in experimental design, we assessed several Illumina and Affymetrix datasets for technical variation between replicate hybridisations of Universal Human Reference (UHRR) and individual or pooled breast-tumour RNA. A varying degree of systematic noise was observed in each of the datasets; however, in all cases the relative amount of variation between standard control RNA replicates was found to be greatest at earlier points in the sample-preparation workflow. For example, 40.6% of the total variation in reported expressions was attributed to replicate extractions, compared to 13.9% due to amplification/labelling and 10.8% between replicate hybridisations. Deliberate probe-wise batch-correction methods were effective in reducing the magnitude of this variation, although the level of improvement was dependent on the sources of noise included in the model. Systematic noise introduced at the chip, run, and experiment levels of a combined Illumina dataset was found to be highly dependent upon the experimental design. Both UHRR and pools of RNA, which were derived from the samples of interest, modelled technical variation well, although the pools were significantly better correlated (4% average improvement) and better emulated the effects of systematic noise, over all probes, than the UHRRs. The effect of this noise was not uniform over all probes, with low GC-content probes found to be more vulnerable to batch variation than probes with a higher GC-content. The magnitude of systematic processing noise in a microarray experiment is variable
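
    The probe-wise batch corrections mentioned in both abstracts can be illustrated with the simplest member of that family: centering every probe within each batch on the log scale, then restoring the overall probe mean. A sketch on invented data (real studies would usually prefer a model-based method such as ComBat):

    ```python
    import numpy as np
    import pandas as pd

    def probewise_batch_center(expr, batch):
        """Probe-wise removal of additive batch effects.
        expr  : DataFrame, probes x samples, log scale
        batch : Series mapping sample name -> batch label"""
        grand_mean = expr.mean(axis=1)
        centered = expr.copy()
        for b in batch.unique():
            cols = batch.index[batch == b]
            centered[cols] = expr[cols].sub(expr[cols].mean(axis=1), axis=0)
        return centered.add(grand_mean, axis=0)

    rng = np.random.default_rng(1)
    samples = [f"s{i}" for i in range(6)]
    batch = pd.Series(["run1"] * 3 + ["run2"] * 3, index=samples)
    expr = pd.DataFrame(rng.normal(8, 1, (4, 6)),
                        index=[f"p{i}" for i in range(4)], columns=samples)
    expr.loc[:, batch == "run2"] += 0.7       # inject a chip/run shift
    corrected = probewise_batch_center(expr, batch)
    ```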

  20. A permutation-based multiple testing method for time-course microarray experiments

    Directory of Open Access Journals (Sweden)

    George Stephen L

    2009-10-01

    Full Text Available Abstract Background Time-course microarray experiments are widely used to study the temporal profiles of gene expression. Storey et al. (2005) developed a method for analyzing time-course microarray studies that can be applied to discovering genes whose expression trajectories change over time within a single biological group, or those that follow different time trajectories among multiple groups. They estimated the expression trajectories of each gene using natural cubic splines under the null (no time-course) and alternative (time-course) hypotheses, and used a goodness-of-fit test statistic to quantify the discrepancy. The null distribution of the statistic was approximated through a bootstrap method. Gene expression levels in microarray data are often correlated in complex ways. Accurate type I error control adjusting for multiple testing requires the joint null distribution of test statistics for a large number of genes. For this purpose, permutation methods have been widely used because of their computational ease and intuitive interpretation. Results In this paper, we propose a permutation-based multiple testing procedure based on the test statistic used by Storey et al. (2005). We also propose an efficient computation algorithm. Extensive simulations are conducted to investigate the performance of the permutation-based multiple testing procedure. The application of the proposed method is illustrated using the Caenorhabditis elegans dauer developmental data. Conclusion Our method is computationally efficient and applicable for identifying genes whose expression levels are time-dependent in a single biological group and for identifying the genes for which the time-profile depends on the group in a multi-group setting.
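
    The key point, that permuting sample labels preserves the correlation between genes and therefore yields a valid joint null distribution, can be shown with a single-step maxT adjustment in the Westfall-Young style. For brevity the per-gene statistic below is a plain mean difference rather than the spline goodness-of-fit statistic of Storey et al.:

    ```python
    import numpy as np

    def maxT_adjusted_pvalues(x, y, n_perm=2000, seed=0):
        """Single-step maxT permutation adjustment for FWER control.
        x, y : genes x samples arrays for two groups.  Permuting sample
        labels keeps the between-gene correlation structure intact."""
        rng = np.random.default_rng(seed)
        data = np.hstack([x, y])
        nx = x.shape[1]
        obs = np.abs(x.mean(axis=1) - y.mean(axis=1))
        exceed = np.zeros_like(obs)
        for _ in range(n_perm):
            perm = rng.permutation(data.shape[1])
            px, py = data[:, perm[:nx]], data[:, perm[nx:]]
            max_stat = np.abs(px.mean(axis=1) - py.mean(axis=1)).max()
            exceed += max_stat >= obs
        return (exceed + 1) / (n_perm + 1)

    rng = np.random.default_rng(1)
    x = rng.normal(0, 1, (500, 8)); y = rng.normal(0, 1, (500, 8))
    y[:5] += 2.0                       # five truly changed genes
    adj_p = maxT_adjusted_pvalues(x, y)
    print(np.where(adj_p < 0.05)[0])   # indices surviving FWER control
    ```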

  1. GeneRank: Using search engine technology for the analysis of microarray experiments

    Directory of Open Access Journals (Sweden)

    Breitling Rainer

    2005-09-01

    Full Text Available Abstract Background Interpretation of simple microarray experiments is usually based on the fold-change of gene expression between a reference and a "treated" sample where the treatment can be of many types from drug exposure to genetic variation. Interpretation of the results usually combines lists of differentially expressed genes with previous knowledge about their biological function. Here we evaluate a method – based on the PageRank algorithm employed by the popular search engine Google – that tries to automate some of this procedure to generate prioritized gene lists by exploiting biological background information. Results GeneRank is an intuitive modification of PageRank that maintains many of its mathematical properties. It combines gene expression information with a network structure derived from gene annotations (gene ontologies) or expression profile correlations. Using both simulated and real data we find that the algorithm offers an improved ranking of genes compared to pure expression change rankings. Conclusion Our modification of the PageRank algorithm provides an alternative method of evaluating microarray experimental results which combines prior knowledge about the underlying network. GeneRank offers an improvement compared to assessing the importance of a gene based on its experimentally observed fold-change alone and may be used as a basis for further analytical developments.

  2. GeneRank: using search engine technology for the analysis of microarray experiments.

    Science.gov (United States)

    Morrison, Julie L; Breitling, Rainer; Higham, Desmond J; Gilbert, David R

    2005-09-21

    Interpretation of simple microarray experiments is usually based on the fold-change of gene expression between a reference and a "treated" sample where the treatment can be of many types from drug exposure to genetic variation. Interpretation of the results usually combines lists of differentially expressed genes with previous knowledge about their biological function. Here we evaluate a method--based on the PageRank algorithm employed by the popular search engine Google--that tries to automate some of this procedure to generate prioritized gene lists by exploiting biological background information. GeneRank is an intuitive modification of PageRank that maintains many of its mathematical properties. It combines gene expression information with a network structure derived from gene annotations (gene ontologies) or expression profile correlations. Using both simulated and real data we find that the algorithm offers an improved ranking of genes compared to pure expression change rankings. Our modification of the PageRank algorithm provides an alternative method of evaluating microarray experimental results which combines prior knowledge about the underlying network. GeneRank offers an improvement compared to assessing the importance of a gene based on its experimentally observed fold-change alone and may be used as a basis for further analytical developments.
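
    As published, GeneRank solves the linear system (I - d W D^-1) r = (1 - d) ex, where W is the gene-gene connection matrix, D the diagonal degree matrix, ex the normalised absolute expression changes, and d the trade-off between expression and connectivity. A small power-iteration sketch on a toy network (d = 0.5 as a neutral choice):

    ```python
    import numpy as np

    def gene_rank(W, expr_change, d=0.5, tol=1e-10, max_iter=1000):
        """GeneRank iteration: r = (1 - d) * ex + d * W @ (r / deg).
        d = 0 reproduces the pure fold-change ranking; larger d lets
        connected neighbours share rank, as in PageRank."""
        W = np.asarray(W, dtype=float)
        deg = np.maximum(W.sum(axis=1), 1.0)       # avoid divide-by-zero
        ex = np.abs(np.asarray(expr_change, dtype=float))
        ex = ex / ex.sum()
        r = ex.copy()
        for _ in range(max_iter):
            r_new = (1 - d) * ex + d * W @ (r / deg)
            if np.abs(r_new - r).max() < tol:
                break
            r = r_new
        return r

    # Toy example: gene 2 has a modest fold-change but is connected to
    # two strongly changed neighbours, so its rank is promoted.
    W = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0]])
    print(gene_rank(W, expr_change=[3.0, 2.5, 0.4, 0.5]))
    ```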

  3. Experimenting with a design experiment

    Directory of Open Access Journals (Sweden)

    Bakker, Judith

    2012-12-01

    Full Text Available The design experiment is an experimental research method that aims to help design and further develop new (policy) instruments. For the development of a set of guidelines for the facilitation of citizens’ initiatives by local governments, we are experimenting with this method. It offers good opportunities for modeling interventions by testing their instrumental validity – the usefulness for the intended practical purposes. At the same time, design experiments are also useful for evaluating the empirical validity of theoretical arguments and the further development of these arguments in the light of empirical evidence (by using e.g. the technique of pattern matching). We describe how we have applied this methodology in two cases and discuss our research approach. We encountered some unexpected difficulties, especially in the cooperation with professionals and citizens. These difficulties complicate the valid attribution of causal effects to the use of the new instrument. However, our preliminary conclusion is that design experiments are useful in our field of study.


  4. A Bayesian decision procedure for testing multiple hypotheses in DNA microarray experiments.

    Science.gov (United States)

    Gómez-Villegas, Miguel A; Salazar, Isabel; Sanz, Luis

    2014-02-01

    DNA microarray experiments require the use of multiple hypothesis testing procedures because thousands of hypotheses are simultaneously tested. We deal with this problem from a Bayesian decision theory perspective. We propose a decision criterion based on an estimation of the number of false null hypotheses (FNH), taking as an error measure the proportion of the posterior expected number of false positives with respect to the estimated number of true null hypotheses. The methodology is applied to a Gaussian model when testing bilateral hypotheses. The procedure is illustrated with both simulated and real data examples, and the results are compared to those obtained by the Bayes rule when an additive loss function is considered for each joint action and the generalized 0-1 loss function for each individual action. Our procedure significantly reduced the percentage of false negatives whereas the percentage of false positives remained at an acceptable level.
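
    A generic sketch of this kind of Bayesian decision rule (not the authors' exact criterion): given per-hypothesis posterior probabilities of the null, grow the rejection set while the posterior expected number of false positives, taken relative to the estimated number of true nulls, stays below a bound.

    ```python
    import numpy as np

    def bayes_reject(post_null, alpha=0.05):
        """post_null[i] = posterior probability that null i is true.
        Reject the smallest-posterior nulls while the posterior expected
        false positives / estimated true nulls ratio stays below alpha."""
        post_null = np.asarray(post_null, dtype=float)
        order = np.argsort(post_null)               # most significant first
        est_true_nulls = max(post_null.sum(), 1.0)  # E[# true nulls | data]
        efp = np.cumsum(post_null[order])           # E[# false positives]
        k = np.searchsorted(efp / est_true_nulls, alpha, side="right")
        reject = np.zeros(post_null.size, dtype=bool)
        reject[order[:k]] = True
        return reject

    post = np.array([0.001, 0.02, 0.9, 0.4, 0.003, 0.98])
    print(bayes_reject(post))   # rejects the three small-posterior nulls
    ```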

  5. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    Science.gov (United States)

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  6. Strategies for comparing gene expression profiles from different microarray platforms: application to a case-control experiment.

    Science.gov (United States)

    Severgnini, Marco; Bicciato, Silvio; Mangano, Eleonora; Scarlatti, Francesca; Mezzelani, Alessandra; Mattioli, Michela; Ghidoni, Riccardo; Peano, Clelia; Bonnal, Raoul; Viti, Federica; Milanesi, Luciano; De Bellis, Gianluca; Battaglia, Cristina

    2006-06-01

    Meta-analysis of microarray data is increasingly important, considering both the availability of multiple platforms using disparate technologies and the accumulation in public repositories of data sets from different laboratories. We addressed the issue of comparing gene expression profiles from two microarray platforms by devising a standardized investigative strategy. We tested this procedure by studying MDA-MB-231 cells, which undergo apoptosis on treatment with resveratrol. Gene expression profiles were obtained using high-density, short-oligonucleotide, single-color microarray platforms: GeneChip (Affymetrix) and CodeLink (Amersham). Interplatform analyses were carried out on 8414 common transcripts represented on both platforms, as identified by LocusLink ID, representing 70.8% and 88.6% of annotated GeneChip and CodeLink features, respectively. We identified 105 differentially expressed genes (DEGs) on CodeLink and 42 DEGs on GeneChip. Among them, only 9 DEGs were commonly identified by both platforms. Multiple analyses (BLAST alignment of probes with target sequences, gene ontology, literature mining, and quantitative real-time PCR) permitted us to investigate the factors contributing to the generation of platform-dependent results in single-color microarray experiments. An effective approach to cross-platform comparison involves microarrays of similar technologies, samples prepared by identical methods, and a standardized battery of bioinformatic and statistical analyses.
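
    At its core, the cross-platform strategy above reduces to matching features on a shared gene identifier and comparing DEG calls and fold-changes on the intersection. A sketch with invented mini-tables keyed by LocusLink ID:

    ```python
    import pandas as pd

    # Hypothetical per-platform result tables; rows are fabricated.
    affy = pd.DataFrame({"locuslink": [101, 102, 103, 104],
                         "log2fc": [1.8, -0.2, 2.1, 0.1],
                         "deg": [True, False, True, False]})
    codelink = pd.DataFrame({"locuslink": [101, 102, 103, 105],
                             "log2fc": [1.5, -1.1, 0.3, 0.9],
                             "deg": [True, True, False, True]})

    common = affy.merge(codelink, on="locuslink", suffixes=("_affy", "_cl"))
    both_deg = common[common.deg_affy & common.deg_cl]
    print(f"{len(common)} common transcripts, "
          f"{len(both_deg)} DEGs on both platforms")
    print("fold-change correlation:",
          round(common.log2fc_affy.corr(common.log2fc_cl), 2))
    ```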

  7. Use of a multi-thermal washer for DNA microarrays simplifies probe design and gives robust genotyping assays

    DEFF Research Database (Denmark)

    Petersen, J.; Poulsen, Lena; Petronis, S.

    2008-01-01

    DNA microarrays are generally operated at a single condition, which severely limits the freedom of designing probes for allele-specific hybridization assays. Here, we demonstrate a fluidic device for multi-stringency posthybridization washing of microarrays on microscope slides. This device is called a multi-thermal array washer (MTAW), and it has eight individually controlled heating zones, each of which corresponds to the location of a subarray on a slide. Allele-specific oligonucleotide probes for nine mutations in the beta-globin gene were spotted in eight identical subarrays at positions......

  8. Carbohydrate microarrays

    DEFF Research Database (Denmark)

    Park, Sungjin; Gildersleeve, Jeffrey C; Blixt, Klas Ola

    2012-01-01

    In the last decade, carbohydrate microarrays have been core technologies for analyzing carbohydrate-mediated recognition events in a high-throughput fashion. A number of methods have been exploited for immobilizing glycans on the solid surface in a microarray format. This microarray...... of substrate specificities of glycosyltransferases. This review covers the construction of carbohydrate microarrays, detection methods of carbohydrate microarrays and their applications in biological and biomedical research....

  9. Mann-Whitney Type Tests for Microarray Experiments: The R Package gMWT

    Directory of Open Access Journals (Sweden)

    Daniel Fischer

    2015-06-01

    Full Text Available We present the R package gMWT which is designed for the comparison of several treatments (or groups) for a large number of variables. The comparisons are made using certain probabilistic indices (PIs). The PIs computed here tell how often pairs or triples of observations coming from different groups appear in a specific order of magnitude. Classical two-sample and several-sample rank test statistics such as the Mann-Whitney-Wilcoxon, Kruskal-Wallis, or Jonckheere-Terpstra test statistics are simple functions of these PIs. New test statistics for directional alternatives are also provided. The package gMWT can be used to calculate the variable-wise PI estimates, to illustrate their multivariate distribution and mutual dependence with joint scatterplot matrices, and to construct several classical and new rank tests based on the PIs. The aim of the paper is first to briefly explain the theory that is necessary to understand the behavior of the estimated PIs and the rank tests based on them. Second, the use of the package is described and illustrated with simulated and real data examples. It is stressed that the package provides a new flexible toolbox to analyze large gene or microRNA expression data sets, collected on microarrays or by other high-throughput technologies. The testing procedures can be used in an eQTL analysis, for example, as implemented in the package GeneticTools.
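
    gMWT itself is an R package; the two-sample probabilistic index it builds on is easy to sketch in Python. The estimate below satisfies n_x * n_y * PI = U, the Mann-Whitney statistic, which is why the classical rank tests are simple functions of such PIs (triple-wise indices for ordered alternatives extend the same idea):

    ```python
    import numpy as np
    from itertools import product

    def probabilistic_index(x, y):
        """Estimate PI = P(X < Y) + 0.5 * P(X = Y) from two samples."""
        x, y = np.asarray(x), np.asarray(y)
        less = sum(a < b for a, b in product(x, y))
        ties = sum(a == b for a, b in product(x, y))
        return (less + 0.5 * ties) / (x.size * y.size)

    ctrl = [4.1, 5.0, 4.6, 4.8]
    treated = [5.2, 5.9, 4.9, 6.1]
    print(f"P(ctrl < treated) ~= {probabilistic_index(ctrl, treated):.2f}")
    ```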

  10. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasing attention due to their increasing complexity and usability. Software...... packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs...... and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic......

  11. Towards the integration, annotation and association of historical microarray experiments with RNA-seq.

    Science.gov (United States)

    Chavan, Shweta S; Bauer, Michael A; Peterson, Erich A; Heuck, Christoph J; Johann, Donald J

    2013-01-01

    Transcriptome analysis by microarrays has produced important advances in biomedicine. For instance in multiple myeloma (MM), microarray approaches led to the development of an effective disease subtyping via cluster assignment, and a 70-gene risk score. Both enabled an improved molecular understanding of MM, and have provided prognostic information for the purposes of clinical management. Many researchers are now transitioning to Next Generation Sequencing (NGS) approaches and RNA-seq in particular, due to its discovery-based nature, improved sensitivity, and dynamic range. Additionally, RNA-seq allows for the analysis of gene isoforms, splice variants, and novel gene fusions. Given the voluminous amounts of historical microarray data, there is now a need to associate and integrate microarray and RNA-seq data via advanced bioinformatic approaches. Custom software was developed following a model-view-controller (MVC) approach to integrate Affymetrix probe set IDs and gene annotation information from a variety of sources. The tool/approach employs an assortment of strategies to integrate, cross-reference, and associate microarray and RNA-seq datasets. Output from a variety of transcriptome reconstruction and quantitation tools (e.g., Cufflinks) can be directly integrated, and/or associated with Affymetrix probe set data, as well as necessary gene identifiers and/or symbols from a diversity of sources. Strategies are employed to maximize the annotation and cross-referencing process. Custom gene sets (e.g., the MM 70-gene risk score (GEP-70)) can be specified, and the tool can be directly assimilated into an RNA-seq pipeline. A novel bioinformatic approach to aid in the facilitation of both annotation and association of historic microarray data, in conjunction with richer RNA-seq data, is now assisting with the study of MM cancer biology.
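
    The essence of such an association step is a join between probe-level annotation and transcript-level quantification on a shared identifier. A sketch with invented stand-in tables; the column names follow typical Cufflinks and Affymetrix annotation conventions, but the rows are fabricated for illustration:

    ```python
    import pandas as pd

    # Stand-ins for a Cufflinks genes.fpkm_tracking table and an
    # Affymetrix probe set annotation file (data invented).
    rnaseq = pd.DataFrame({"gene_short_name": ["CCND1", "WHSC1", "TP53"],
                           "FPKM": [35.2, 112.8, 9.4]})
    probes = pd.DataFrame({"probeset_id": ["208711_s_at", "209053_s_at",
                                           "211300_s_at", "200000_at"],
                           "symbol": ["CCND1", "WHSC1", "TP53", "UNKNOWN1"]})

    # Associate each probe set with its RNA-seq quantification through
    # the shared gene symbol; keep unmatched probes for follow-up.
    merged = probes.merge(rnaseq, left_on="symbol",
                          right_on="gene_short_name", how="left")
    unmatched = merged[merged["FPKM"].isna()]
    print(f"{len(merged) - len(unmatched)} probe sets associated, "
          f"{len(unmatched)} need further annotation")
    ```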

  12. Democratic design experiments

    DEFF Research Database (Denmark)

    Ehn, Pelle; Brandt, Eva; Halse, Joachim

    2016-01-01

    Designers and design researchers are increasingly exploring societal challenges through engagements with issues that call forward new publics and new modes of democratic citizenship. Whether this is called design activism, social design, adversarial design, participatory design or something else...

  13. DNA microarrays : a molecular cloning manual

    National Research Council Canada - National Science Library

    Sambrook, Joseph; Bowtell, David

    2002-01-01

    .... DNA Microarrays provides authoritative, detailed instruction on the design, construction, and applications of microarrays, as well as comprehensive descriptions of the software tools and strategies...

  14. miRNAs in lung cancer - Studying complex fingerprints in patient's blood cells by microarray experiments

    Directory of Open Access Journals (Sweden)

    Huwer Hanno

    2009-10-01

    Full Text Available Abstract Background Deregulated miRNAs are found in cancer cells and, more recently, in blood cells of cancer patients. Due to their inherent stability, miRNAs may offer themselves for blood-based tumor diagnosis. Here we addressed the question whether there is a sufficient number of miRNAs deregulated in blood cells of cancer patients to be able to distinguish between cancer patients and controls. Methods We synthesized 866 human miRNAs and miRNA star sequences as annotated in the Sanger miRBase onto a microarray designed by febit biomed gmbh. Using the fully automated Geniom Real Time Analyzer platform, we analyzed the miRNA expression in 17 blood cell samples of patients with non-small cell lung carcinomas (NSCLC) and in 19 blood samples of healthy controls. Results Using a t-test, we detected 27 miRNAs significantly deregulated in blood cells of lung cancer patients as compared to the controls. Some of these miRNAs were validated using qRT-PCR. To estimate the value of each deregulated miRNA, we grouped all miRNAs according to their diagnostic information, as measured by Mutual Information. Using a subset of 24 miRNAs, a radial basis function Support Vector Machine allowed for discriminating between blood cell samples of tumor patients and controls with an accuracy of 95.4% [94.9%-95.9%], a specificity of 98.1% [97.3%-98.8%], and a sensitivity of 92.5% [91.8%-92.5%]. Conclusion Our findings support the idea that neoplasia may lead to a deregulation of miRNA expression in blood cells of cancer patients compared to blood cells of healthy individuals. Furthermore, we provide evidence that miRNA patterns can be used to detect human cancers from blood cells.
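
    The classification pipeline described, mutual-information feature selection followed by an RBF-kernel SVM, can be sketched with scikit-learn on simulated stand-in data (the dimensions mimic the study, but the data and resulting accuracy are synthetic). Keeping the selection step inside the cross-validation pipeline avoids optimistic bias:

    ```python
    import numpy as np
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # 36 samples (17 tumour, 19 control), 866 miRNA features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(36, 866))
    y = np.array([1] * 17 + [0] * 19)
    X[y == 1, :24] += 1.0            # 24 informative miRNAs, as in the study

    clf = make_pipeline(
        StandardScaler(),
        SelectKBest(mutual_info_classif, k=24),   # MI-based miRNA subset
        SVC(kernel="rbf"),                        # radial basis function SVM
    )
    acc = cross_val_score(clf, X, y, cv=6)
    print(f"cross-validated accuracy: {acc.mean():.2f}")
    ```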

  15. Applying Instructional Design Theories to Bioinformatics Education in Microarray Analysis and Primer Design Workshops

    Science.gov (United States)

    Shachak, Aviv; Ophir, Ron; Rubin, Eitan

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of…

  16. Computational biology of genome expression and regulation--a review of microarray bioinformatics.

    Science.gov (United States)

    Wang, Junbai

    2008-01-01

    Microarray technology is being used widely in various biomedical research areas; the corresponding microarray data analysis is an essential step toward making the best use of array technologies. Here we review two components of microarray data analysis: low-level analysis, which covers the design, quality control, and preprocessing of microarray experiments, and high-level analysis, which focuses on domain-specific microarray applications such as tumor classification, biomarker prediction, the analysis of array CGH experiments, and the reverse engineering of gene expression networks. Additionally, we review the recent development of building predictive models in genome expression and regulation studies. This review may help biologists grasp a basic knowledge of microarray bioinformatics as well as its potential impact on the future development of biomedical research fields.
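
    One representative low-level preprocessing step from the review's first component is quantile normalization, which forces all arrays to share a common intensity distribution. A compact sketch (ties are broken arbitrarily by the double argsort):

    ```python
    import numpy as np

    def quantile_normalize(x):
        """Quantile normalisation of a genes x arrays matrix: every
        column is mapped onto the mean distribution across arrays."""
        ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # per-column ranks
        mean_dist = np.sort(x, axis=0).mean(axis=1)         # reference quantiles
        return mean_dist[ranks]

    arrays = np.array([[5.0, 4.0, 3.0],
                       [2.0, 1.0, 4.0],
                       [3.0, 4.0, 6.0],
                       [4.0, 2.0, 8.0]])
    print(quantile_normalize(arrays))   # columns now share the same quantiles
    ```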

  17. Extracting Insights from Experience Designers to Enhance User Experience Design

    OpenAIRE

    Kremer, Simon; Lindemann, Udo

    2016-01-01

    User Experience (UX) summarizes how a user expects, perceives and assesses an encounter with a product. User Experience Design (UXD) aims at creating meaningful experiences. While UXD is a rather young discipline within product development and traditional processes predominate, other disciplines traditionally focus on creating experiences. We engaged with experience designers from the fields of arts, movies, sports, music and event management. By analyzing their working processes via interv...

  18. The tissue micro-array data exchange specification: a web based experience browsing imported data

    Science.gov (United States)

    Nohle, David G; Hackman, Barbara A; Ayers, Leona W

    2005-01-01

    Background The AIDS and Cancer Specimen Resource (ACSR) is an HIV/AIDS tissue bank consortium sponsored by the National Cancer Institute (NCI) Division of Cancer Treatment and Diagnosis (DCTD). The ACSR offers approved researchers HIV-infected biologic samples and uninfected control tissues, including tissue cores in micro-arrays (TMA), accompanied by de-identified clinical data. Researchers interested in the type and quality of TMA tissue cores and the associated clinical data need an efficient method for viewing available TMA materials. Because each of the tissue samples within a TMA has separate data, including a core tissue digital image and clinical data, an organized, standard approach to producing, navigating and publishing such data is necessary. The Association for Pathology Informatics (API) extensible mark-up language (XML) TMA data exchange specification (TMA DES) proposed in April 2003 provides a common format for TMA data. Exporting TMA data into the proposed format offers an opportunity to implement the API TMA DES. Using our public BrowseTMA tool, we created a web site that organizes and cross-references TMA lists, digital "virtual slide" images, TMA DES export data, linked legends and clinical details for researchers. Microsoft Excel® and Microsoft Word® are used to convert tabular clinical data and produce an XML file in the TMA DES format. The BrowseTMA tool contains Extensible Stylesheet Language Transformation (XSLT) scripts that convert XML data into Hyper-Text Mark-up Language (HTML) web pages with hyperlinks automatically added to allow rapid navigation. Results Block lists, virtual slide images, legends, clinical details and exports have been placed on the ACSR web site for 14 blocks with 1623 cores of 2.0, 1.0 and 0.6 mm sizes. Our virtual microscope can be used to view and annotate these TMA images. Researchers can readily navigate from TMA block lists to TMA legends and to clinical details for a selected tissue core. Exports for 11
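
    The XSLT step is the heart of the BrowseTMA navigation trick: a stylesheet turns machine-readable XML into a hyperlinked HTML page. A minimal sketch with lxml; the element and attribute names are simplified placeholders, not the real TMA DES schema:

    ```python
    from lxml import etree

    xslt = etree.XML(b"""\
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/tma">
        <html><body><ul>
          <xsl:for-each select="core">
            <li><a href="{@image}"><xsl:value-of select="@id"/></a></li>
          </xsl:for-each>
        </ul></body></html>
      </xsl:template>
    </xsl:stylesheet>""")

    doc = etree.XML(b"""\
    <tma>
      <core id="A1" image="cores/A1.jpg"/>
      <core id="A2" image="cores/A2.jpg"/>
    </tma>""")

    transform = etree.XSLT(xslt)
    print(str(transform(doc)))   # HTML list, one hyperlink per tissue core
    ```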

  19. Automated detection of regions of interest for tissue microarray experiments: an image texture analysis

    International Nuclear Information System (INIS)

    Karaçali, Bilge; Tözeren, Aydin

    2007-01-01

    Recent research with tissue microarrays has led to rapid progress toward quantifying the expressions of large sets of biomarkers in normal and diseased tissue. However, standard procedures for sampling tissue for molecular profiling have not yet been established. This study presents a high-throughput analysis of texture heterogeneity on breast tissue images for the purpose of identifying regions of interest in the tissue for molecular profiling via tissue microarray technology. Image texture of breast histology slides was described in terms of three parameters: the percentage of area occupied in an image block by chromatin (B), the percentage occupied by stroma-like regions (P), and a statistical heterogeneity index H commonly used in image analysis. Texture parameters were defined and computed for each of the thousands of image blocks in our dataset using both gray scale and color segmentation. The image blocks were then classified into three categories using the texture feature parameters in a novel statistical learning algorithm. These categories are as follows: image blocks specific to normal breast tissue, blocks specific to cancerous tissue, and those image blocks that are non-specific to normal and disease states. Gray scale and color segmentation techniques led to identification of the same regions in histology slides as cancer-specific. Moreover, the image blocks identified as cancer-specific belonged to those cell-crowded regions in whole section image slides that were marked by two pathologists as regions of interest for further histological studies. These results indicate the high efficiency of our automated method for identifying pathologic regions of interest on histology slides. Automation of critical region identification will help minimize the inter-rater variability among different raters (pathologists), as hundreds of tumors that are used to develop an array have typically been evaluated (graded) by different pathologists. The region of interest
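
    The three block-level descriptors can be computed in a few lines once a block has been reduced to a gray-level image. A sketch in which the thresholds are illustrative and the Shannon entropy of the intensity histogram stands in as one common choice of heterogeneity index H:

    ```python
    import numpy as np

    def block_texture(gray, chromatin_thr=0.35, stroma_thr=0.65, bins=32):
        """Texture descriptors for one image block:
          B - fraction of dark, chromatin-like pixels
          P - fraction of pale, stroma-like pixels
          H - Shannon entropy of the gray-level histogram
        `gray` is a 2-D array of intensities scaled to [0, 1]."""
        b = float((gray < chromatin_thr).mean())
        p = float((gray > stroma_thr).mean())
        hist, _ = np.histogram(gray, bins=bins, range=(0.0, 1.0))
        freq = hist[hist > 0] / gray.size
        h = float(-(freq * np.log2(freq)).sum())
        return b, p, h

    rng = np.random.default_rng(0)
    block = rng.beta(2, 2, size=(128, 128))   # stand-in for a tissue block
    print("B=%.2f  P=%.2f  H=%.2f" % block_texture(block))
    ```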

  20. OpWise: Operons aid the identification of differentially expressed genes in bacterial microarray experiments

    Directory of Open Access Journals (Sweden)

    Arkin Adam P

    2006-01-01

    Full Text Available Abstract Background Differentially expressed genes are typically identified by analyzing the variation between replicate measurements. These procedures implicitly assume that there are no systematic errors in the data even though several sources of systematic error are known. Results OpWise estimates the amount of systematic error in bacterial microarray data by assuming that genes in the same operon have matching expression patterns. OpWise then performs a Bayesian analysis of a linear model to estimate significance. In simulations, OpWise corrects for systematic error and is robust to deviations from its assumptions. In several bacterial data sets, significant amounts of systematic error are present, and replicate-based approaches overstate the confidence of the changers dramatically, while OpWise does not. Finally, OpWise can identify additional changers by assigning genes higher confidence if they are consistent with other genes in the same operon. Conclusion Although microarray data can contain large amounts of systematic error, operons provide an external standard and allow for reasonable estimates of significance. OpWise is available at http://microbesonline.org/OpWise.
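
    The operon idea can be demonstrated numerically: replicate-based variance misses gene-level systematic error, while disagreement between genes of the same operon exposes it. A simulation sketch (all parameters invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated log-ratios for 200 genes in 100 two-gene operons, with a
    # gene-level systematic error that replicates alone cannot reveal.
    n_op = 100
    operon_signal = rng.normal(0, 1.0, n_op)            # shared biology
    systematic    = rng.normal(0, 0.5, 2 * n_op)        # probe/gene bias
    reps = (np.repeat(operon_signal, 2)[:, None] + systematic[:, None]
            + rng.normal(0, 0.2, (2 * n_op, 3)))        # 3 replicates

    rep_var = reps.var(axis=1, ddof=1).mean()
    # Genes in one operon should agree; half the variance of their
    # difference estimates systematic plus replicate noise combined.
    pair_diff = reps.mean(axis=1)[0::2] - reps.mean(axis=1)[1::2]
    operon_var = pair_diff.var(ddof=1) / 2
    print(f"replicate-based variance: {rep_var:.3f}")
    print(f"operon-based variance   : {operon_var:.3f}  (includes systematic error)")
    ```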

  1. DNA Microarray Technology; TOPICAL

    International Nuclear Information System (INIS)

    WERNER-WASHBURNE, MARGARET; DAVIDSON, GEORGE S.

    2002-01-01

    Collaboration between Sandia National Laboratories and the University of New Mexico Biology Department resulted in the capability to train students in microarray techniques and the interpretation of data from microarray experiments. These studies provide for a better understanding of the role of stationary phase and the gene regulation involved in exit from stationary phase, which may eventually have important clinical implications. Importantly, this research trained numerous students and is the basis for three new Ph.D. projects

  2. Experience With Rapid Microarray-Based Diagnostic Technology and Antimicrobial Stewardship for Patients With Gram-Positive Bacteremia.

    Science.gov (United States)

    Neuner, Elizabeth A; Pallotta, Andrea M; Lam, Simon W; Stowe, David; Gordon, Steven M; Procop, Gary W; Richter, Sandra S

    2016-11-01

    OBJECTIVE To describe the impact of rapid diagnostic microarray technology and antimicrobial stewardship for patients with Gram-positive blood cultures. DESIGN Retrospective pre-intervention/post-intervention study. SETTING A 1,200-bed academic medical center. PATIENTS Inpatients with blood cultures positive for Staphylococcus aureus, Enterococcus faecalis, E. faecium, Streptococcus pneumoniae, S. pyogenes, S. agalactiae, S. anginosus, Streptococcus spp., and Listeria monocytogenes during the 6 months before and after implementation of the Verigene Gram-positive blood culture microarray (BC-GP) with an antimicrobial stewardship intervention. METHODS Before the intervention, no rapid diagnostic technology was used and no antimicrobial stewardship intervention was undertaken, except for the use of peptide nucleic acid fluorescent in situ hybridization and MRSA agar to identify staphylococcal isolates. After the intervention, all Gram-positive blood cultures underwent BC-GP microarray and the antimicrobial stewardship intervention consisting of real-time notification and pharmacist review. RESULTS In total, 513 patients with bacteremia were included in this study: 280 patients with S. aureus, 150 patients with enterococci, 82 patients with streptococci, and 1 patient with L. monocytogenes. The number of antimicrobial switches was similar in the pre-BC-GP (52%; 155 of 300) and post-BC-GP (50%; 107 of 213) periods. The time to antimicrobial switch was significantly shorter in the post-BC-GP group than in the pre-BC-GP group: 48±41 hours versus 75±46 hours, respectively (P<.001). The most common antimicrobial switch was de-escalation, and the time to de-escalation was significantly shorter in the post-BC-GP group than in the pre-BC-GP group: 53±41 hours versus 82±48 hours, respectively (P<.001). There was no difference in mortality or hospital length of stay as a result of the intervention. CONCLUSIONS The combination of a rapid microarray diagnostic test with an antimicrobial

  3. High-density rhesus macaque oligonucleotide microarray design using early-stage rhesus genome sequence information and human genome annotations

    Directory of Open Access Journals (Sweden)

    Magness Charles L

    2007-01-01

    Full Text Available Abstract Background Until recently, few genomic reagents specific for non-human primate research have been available. To address this need, we have constructed a macaque-specific high-density oligonucleotide microarray by using highly fragmented low-pass sequence contigs from the rhesus genome project together with the detailed sequence and exon structure of the human genome. Using this method, we designed oligonucleotide probes to over 17,000 distinct rhesus/human gene orthologs and increased by four-fold the number of available genes relative to our first-generation expressed sequence tag (EST)-derived array. Results We constructed a database containing 248,000 exon sequences from 23,000 human RefSeq genes and compared each human exon with its best matching sequence in the January 2005 version of the rhesus genome project list of 486,000 DNA contigs. Best matching rhesus exon sequences for each of the 23,000 human genes were then concatenated in the proper order and orientation to produce a rhesus "virtual transcriptome." Microarray probes were designed, one per gene, to the region closest to the 3' untranslated region (UTR) of each rhesus virtual transcript. Each probe was compared to a composite rhesus/human transcript database to test for cross-hybridization potential, yielding a final probe set representing 18,296 rhesus/human gene orthologs, including transcript variants, and over 17,000 distinct genes. We hybridized mRNA from rhesus brain and spleen to both the EST- and genome-derived microarrays. Besides four-fold greater gene coverage, the genome-derived array also showed greater mean signal intensities for genes present on both arrays. Genome-derived probes showed 99.4% identity when compared to 4,767 rhesus GenBank sequence tag site (STS) sequences, indicating that early-stage low-pass versions of complex genomes are of sufficient quality to yield valuable functional genomic information when combined with finished genome information from
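
    The "virtual transcriptome" construction reduces to: for each human exon, find the best-matching rhesus contig sequence, then concatenate the winners in exon order. A toy sketch using difflib similarity as a stand-in for the BLAST-style search of the real pipeline (sequences invented):

    ```python
    from difflib import SequenceMatcher

    def best_match(exon, contigs):
        """Pick the contig most similar to a human exon -- a toy stand-in
        for the alignment search used to build the virtual transcriptome."""
        return max(contigs, key=lambda c: SequenceMatcher(None, exon, c).ratio())

    def virtual_transcript(human_exons, rhesus_contigs):
        """Concatenate best-matching rhesus sequences in exon order."""
        return "".join(best_match(e, rhesus_contigs) for e in human_exons)

    exons = ["ATGGCGCTGA", "GGTACCTTCA", "TTGACGGATA"]
    contigs = ["ATGGCGCTAA", "GGTACCTACA", "TTGACGGTTA", "CCCCCCCCCC"]
    print(virtual_transcript(exons, contigs))
    ```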

  4. Real Life Experiences with Experience Design

    DEFF Research Database (Denmark)

    Dalsgård, Peter; Halskov, Kim

    2006-01-01

    technologies for knowledge dissemination and marketing, in cooperation with public institutions and businesses. We argue that collaborative formulation of core design intentions and values is a valuable instrument in guiding experience design processes, and present three cases from this project, two of which...... resulted in interactive installations. The case installations range from walk-up-and-use consoles, to immersive, responsive, environments based on bodily interaction. We compare the installations, and discuss the interrelations between the resulting interfaces and the intentions for creating...

  5. Design of experiments

    International Nuclear Information System (INIS)

    Drijard, D.

    1978-01-01

    The paper is mostly devoted to simulation problems. The part which concerns detector optimization was essentially treated during the School in a seminar about the Split-Field Magnet (SFM) detector installed at the CERN Intersecting Storage Rings (ISR). This is not given in the written notes, since very little of general use can be said about this subject unless it is very trivial. The author describes in a detailed way the tools which allow such studies to be made. The notes start with a summary of statistical terms. The main emphasis is then put on Monte Carlo methods and the generation of random variables. The last section treats the utilization of detector acceptance, which will be one of the most important parts to optimize when designing a detector. (Auth.)
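
    The generation of random variables mentioned here classically relies on inverse-transform sampling: if U is uniform on (0,1) and F is the target CDF, then F^{-1}(U) follows the target distribution. A minimal sketch for an exponential variate (for example, a free path length), using only the standard library:

    ```python
    import math
    import random

    def sample_exponential(rate, rng=random):
        """Inverse-transform sampling for an exponential distribution:
        F(x) = 1 - exp(-rate*x), so F^{-1}(u) = -ln(1-u)/rate."""
        u = rng.random()
        return -math.log(1.0 - u) / rate

    random.seed(0)
    samples = [sample_exponential(rate=2.0) for _ in range(10000)]
    print(f"sample mean {sum(samples)/len(samples):.3f} (expected 0.5)")
    ```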

  6. A random variance model for detection of differential gene expression in small microarray experiments.

    Science.gov (United States)

    Wright, George W; Simon, Richard M

    2003-12-12

    Microarray techniques provide a valuable way of characterizing the molecular nature of disease. Unfortunately, expense and limited specimen availability often lead to studies with small sample sizes. This makes accurate estimation of variability difficult, since variance estimates made on a gene-by-gene basis will have few degrees of freedom, and the assumption that all genes share equal variance is unlikely to be true. We propose a model by which the within-gene variances are drawn from an inverse gamma distribution, whose parameters are estimated across all genes. This results in a test statistic that is a minor variation of those used in standard linear models. We demonstrate that the model assumptions are valid on experimental data, and that the model has more power than standard tests to pick up large changes in expression, while not increasing the rate of false positives. This method is incorporated into BRB-ArrayTools version 3.0 (http://linus.nci.nih.gov/BRB-ArrayTools.html). ftp://linus.nci.nih.gov/pub/techreport/RVM_supplement.pdf
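
    Under an inverse-gamma prior on the within-gene variance, the usual pooled variance is replaced by a shrunken estimate and the t statistic gains extra degrees of freedom. A sketch in that spirit; the prior parameters (a, b) are treated here as assumed constants, whereas the actual method fits them across all genes:

    ```python
    import numpy as np
    from scipy import stats

    def rvm_like_ttest(x, y, a=2.0, b=1.0):
        """Two-sample t-test with a variance shrunken toward an
        inverse-gamma prior (1/sigma^2 ~ Gamma(a, scale=b)).
        x, y : genes x samples arrays."""
        n1, n2 = x.shape[1], y.shape[1]
        df = n1 + n2 - 2
        sp2 = ((n1 - 1) * x.var(axis=1, ddof=1) +
               (n2 - 1) * y.var(axis=1, ddof=1)) / df
        # Posterior-mean-style shrinkage: the prior contributes 2a extra
        # degrees of freedom and an extra 2/b to the sum of squares.
        s2 = (df * sp2 + 2.0 / b) / (df + 2.0 * a)
        t = (x.mean(axis=1) - y.mean(axis=1)) / np.sqrt(s2 * (1/n1 + 1/n2))
        p = 2 * stats.t.sf(np.abs(t), df + 2 * a)
        return t, p

    rng = np.random.default_rng(2)
    x, y = rng.normal(size=(200, 3)), rng.normal(size=(200, 3))
    t, p = rvm_like_ttest(x, y)   # stable even with only 3 + 3 samples
    ```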

  7. Interim Report on SNP analysis and forensic microarray probe design for South American hemorrhagic fever viruses, tick-borne encephalitis virus, henipaviruses, Old World Arenaviruses, filoviruses, Crimean-Congo hemorrhagic fever viruses, Rift Valley fever

    Energy Technology Data Exchange (ETDEWEB)

    Jaing, C; Gardner, S

    2012-06-05

    The goal of this project is to develop forensic genotyping assays for select agent viruses, enhancing the current capabilities of the viral bioforensics and law enforcement community. We used a multipronged approach combining bioinformatics analysis, PCR-enriched samples, microarrays and TaqMan assays to develop high-resolution and cost-effective genotyping methods for strain-level forensic discrimination of viruses. We have leveraged substantial experience and efficiency gained through year 1 on software development, SNP discovery, TaqMan signature design and phylogenetic signature mapping to scale up the development of forensics signatures in year 2. In this report, we have summarized the genome-wide SNP analysis and microarray probe design for forensic characterization of South American hemorrhagic fever viruses, tick-borne encephalitis viruses and henipaviruses, Old World arenaviruses, filoviruses, Crimean-Congo hemorrhagic fever virus, Rift Valley fever virus and Japanese encephalitis virus.

  8. Introduction to Statistically Designed Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced and finally a case study will be presented to demonstrate this methodology.
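
    The factorial and fractional factorial designs mentioned here are easy to enumerate. A sketch in coded (-1/+1) units; the half fraction keeps only the runs satisfying the defining relation C = AB, which is how a 2^(3-1) design halves the number of trials:

    ```python
    from itertools import product

    def full_factorial(factors):
        """Enumerate a full factorial design: every combination of the
        given factor levels.  A 2^k design uses two levels per factor."""
        names = list(factors)
        return [dict(zip(names, combo))
                for combo in product(*factors.values())]

    runs = full_factorial({"A": (-1, 1), "B": (-1, 1), "C": (-1, 1)})
    half_fraction = [r for r in runs if r["C"] == r["A"] * r["B"]]
    print(len(runs), "full-factorial runs;",
          len(half_fraction), "in the 2^(3-1) fraction")
    ```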

  9. Small Satellite Mechanical Design Experience

    OpenAIRE

    Meyers, Stewart

    1993-01-01

    The design approach used and the experience gained in the building of four small satellite payloads are explained. Specific recommendations are made and the lessons learned on the SAMPEX program are detailed.

  10. Design of Experiments : An Overview

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2008-01-01

    Design Of Experiments (DOE) is needed for experiments with real-life systems, and with either deterministic or random simulation models. This contribution discusses the different types of DOE for these three domains, but focusses on random simulation. DOE may have two goals: sensitivity analysis

  11. Not proper ROC curves as new tool for the analysis of differentially expressed genes in microarray experiments

    Directory of Open Access Journals (Sweden)

    Pistoia Vito

    2008-10-01

    Full Text Available Abstract Background Most microarray experiments are carried out with the purpose of identifying genes whose expression varies in relation with specific conditions or in response to environmental stimuli. In such studies, genes showing similar mean expression values between two or more groups are considered as not differentially expressed, even if hidden subclasses with different expression values may exist. In this paper we propose a new method for identifying differentially expressed genes, based on the area between the ROC curve and the rising diagonal (ABCR). ABCR represents a more general approach than the standard area under the ROC curve (AUC), because it can identify both proper (i.e., concave) and not proper ROC curves (NPRC). In particular, NPRC may correspond to those genes that tend to escape standard selection methods. Results We assessed the performance of our method using data from a publicly available database of 4026 genes, including 14 normal B cell samples (NBC) and 20 heterogeneous lymphomas (namely, 9 follicular lymphomas and 11 chronic lymphocytic leukemias). Moreover, NBC also included two sub-classes, i.e., 6 heavily stimulated and 8 slightly or not stimulated samples. We identified 1607 differentially expressed genes with an estimated False Discovery Rate of 15%. Among them, 16 corresponded to NPRC and all escaped standard selection procedures based on AUC and t statistics. Moreover, a simple inspection of the shape of such plots allowed identification of the two subclasses within one class in 13 cases (81%). Conclusion NPRC represent a new useful tool for the analysis of microarray data.
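
    One way to realize the ABCR idea is to integrate the absolute gap between the ROC curve and the diagonal, so that curve segments dipping below the diagonal still contribute. A sketch with a hidden-subclass example: half the cases are up- and half down-regulated, so AUC sits near 0.5 while ABCR stays clearly positive.

    ```python
    import numpy as np

    def roc_points(labels, scores):
        """ROC points (FPR, TPR) swept from the highest score down.
        labels: 1 = case, 0 = control."""
        order = np.argsort(-np.asarray(scores, dtype=float))
        lab = np.asarray(labels)[order]
        tpr = np.concatenate([[0.0], np.cumsum(lab) / lab.sum()])
        fpr = np.concatenate([[0.0], np.cumsum(1 - lab) / (1 - lab).sum()])
        return fpr, tpr

    def trapezoid(y, x):
        return float(((y[1:] + y[:-1]) / 2 * np.diff(x)).sum())

    def auc_and_abcr(labels, scores):
        fpr, tpr = roc_points(labels, scores)
        auc = trapezoid(tpr, fpr)
        abcr = trapezoid(np.abs(tpr - fpr), fpr)   # area vs. the diagonal
        return auc, abcr

    rng = np.random.default_rng(0)
    scores = np.r_[rng.normal(0.0, 0.3, 40),      # controls
                   rng.normal(-1.5, 0.3, 20),     # cases, down-regulated
                   rng.normal(+1.5, 0.3, 20)]     # cases, up-regulated
    labels = np.r_[np.zeros(40), np.ones(40)].astype(int)
    print("AUC = %.2f   ABCR = %.2f" % auc_and_abcr(labels, scores))
    ```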

  12. Design and analysis of experiments

    CERN Document Server

    Dean, Angela; Draguljić, Danel

    2017-01-01

    This textbook takes a strategic approach to the broad-reaching subject of experimental design by identifying the objectives behind an experiment and teaching practical considerations that govern design and implementation, concepts that serve as the basis for the analytical techniques covered. Rather than a collection of miscellaneous approaches, chapters build on the planning, running, and analyzing of simple experiments in an approach that results from decades of teaching the subject. In most experiments, the procedures can be reproduced by readers, thus giving them a broad exposure to experiments that are simple enough to be followed through their entire course. Outlines of student and published experiments appear throughout the text and as exercises at the end of the chapters. The authors develop the theory of estimable functions and analysis of variance in detail, but at a mathematical level that remains approachable. Throughout the book, statistical aspects of analysis complement practical as...

  13. Design of modern experiments (revised version)

    International Nuclear Information System (INIS)

    Park, Sung Hweon

    1984-03-01

    This book covers the design of modern experiments. It includes the concept of experimental design, key statistical theory, one-way designs, two-way layouts with and without replication, multi-way layouts and the analysis of enumerated data, partitioning, correlation and regression analysis, Latin squares, factorial designs, experimental design using orthogonal array tables I and II, incomplete block designs, response surface designs, compound experiment designs, EVOP and the method of steepest ascent or descent, and Taguchi experimental design.

  14. Design experience: CRBRP radiation shielding

    International Nuclear Information System (INIS)

    Disney, R.K.; Chan, T.C.; Gallo, F.G.; Hedgecock, L.R.; McGinnis, C.A.; Wrights, G.N.

    1978-11-01

    The Clinch River Breeder Reactor Plant (CRBRP) is being designed as a fast breeder demonstration project in the U.S. Liquid Metal Fast Breeder Reactor (LMFBR) program. Radiation shielding design of the facility consists of a comprehensive design approach to assure compliance with design and government regulatory requirements. Studies conducted during the CRBRP design process involved the aspects of radiation shielding dealing with protection of components, systems, and personnel from radiation exposure. Achievement of feasible designs, while considering the mechanical, structural, nuclear, and thermal performance of the component or system, has required judicious trade-offs in radiation shielding performance. Specific design problems which have been addressed are in-vessel radial shielding to protect permanent core support structures, flux monitor system shielding to isolate flux monitoring systems from extraneous background sources, reactor vessel support shielding to allow personnel access to the closure head during full power operation, and primary heat transport system pipe chaseway shielding to limit intermediate heat transport system sodium coolant activation. The shielding design solutions to these problems defined a need for prototypic or benchmark experiments to provide assurance of the predicted shielding performance of selected design solutions and the verification of design methodology. Design activities for CRBRP plant components and systems, which have the potential for radiation exposure of plant personnel during operation or maintenance, are controlled by a design review process related to radiation shielding. The program implements design objectives, design requirements, and cost/benefit guidelines to assure that radiation exposures will be ''as low as reasonably achievable''

  15. Versator divertor experiment: preliminary designs

    International Nuclear Information System (INIS)

    Wan, A.S.; Yang, T.F.

    1984-08-01

    The emergence of magnetic divertors as an impurity control and ash removal mechanism for future tokamak reactors brings the need for further experimental verification of divertor merits and of their ability to operate at reactor-relevant conditions, such as with auxiliary heating. This paper presents preliminary designs of a bundle and a poloidal divertor for Versator II, which can operate in conjunction with the existing 150 kW of LHRF heating or LH current drive. The bundle divertor option also features a new divertor configuration which should improve the engineering and physics results of the DITE experiment. Further optimization of both the physics and engineering designs is currently under way

  16. Tissue microarray design and construction for scientific, industrial and diagnostic use

    Directory of Open Access Journals (Sweden)

    Daniela Pilla

    2012-01-01

    Full Text Available Context: In 2013 the high-throughput technology known as Tissue Micro Array (TMA) will be fifteen years old. Its elements (design, construction and analysis) are intuitive and the core histopathology technique is unsophisticated, which may be a reason why it has eluded rigorous scientific scrutiny. The sources of error, particularly in specimen identification, and how to control for them are unreported. Formal validation of the accuracy of segmenting (also known as de-arraying) hundreds of samples and pairing them with the sample data is lacking. Aims: We wanted to address these issues in order to bring the technique to recognized standards of quality in TMA use for research, diagnostics and industrial purposes. Results: We systematically addressed the sources of error and used barcode-driven data input throughout the whole process, including matching the design with a TMA virtual image and segmenting that image back to individual cases, together with the associated data. In addition we demonstrate on mathematical grounds that a TMA design, when superimposed onto the corresponding whole slide image, validates on each and every sample the correspondence between the image and the patient's data. Conclusions: High-throughput use of the TMA technology is a safe and efficient method for research, diagnosis and industrial use if all sources of error are identified and addressed.

  17. Heterologous microarray experiments allow the identification of the early events associated with potato tuber cold sweetening

    Directory of Open Access Journals (Sweden)

    Vitulli Federico

    2008-04-01

    Full Text Available Abstract Background Since its discovery more than 100 years ago, potato (Solanum tuberosum) tuber cold-induced sweetening (CIS) has been extensively investigated. Several carbohydrate-associated genes would seem to be involved in the process. However, many uncertainties still exist, as the relative contribution of each gene to the process is often unclear, possibly as a consequence of the heterogeneity of experimental systems. Some enzymes associated with CIS, such as β-amylases and invertases, have still to be identified at the sequence level. In addition, little is known about the early events that trigger CIS and about the involvement in CIS of genes other than carbohydrate-associated genes. Many of these uncertainties could be resolved by profiling experiments, but no GeneChip is available for the potato, and the production of the potato cDNA spotted array (TIGR) has recently been discontinued. In order to obtain an overall picture of early transcriptional events associated with CIS, we investigated whether the commercially available tomato Affymetrix GeneChip could be used to identify which potato cold-responsive gene family members should be further studied in detail by real-time RT-PCR (qPCR). Results A tomato-potato Global Match File was generated for the interpretation of various aspects of the heterologous dataset, including the retrieval of best-matching potato counterparts and annotation, and the establishment of a core set of highly homologous genes. Several cold-responsive genes were identified, and their expression pattern was studied in detail by qPCR over 26 days. We detected biphasic behaviour of mRNA accumulation for carbohydrate-associated genes, and our combined GeneChip-qPCR data identified, at the sequence level, enzymatic activities such as β-amylases and invertases previously reported as being involved in CIS. The GeneChip data also unveiled important processes accompanying CIS, such as the induction of redox

  18. miRNAs in lung cancer - Studying complex fingerprints in patient's blood cells by microarray experiments

    International Nuclear Information System (INIS)

    Keller, Andreas; Leidinger, Petra; Borries, Anne; Wendschlag, Anke; Wucherpfennig, Frank; Scheffler, Matthias; Huwer, Hanno; Lenhof, Hans-Peter; Meese, Eckart

    2009-01-01

    Deregulated miRNAs are found in cancer cells and recently in blood cells of cancer patients. Due to their inherent stability, miRNAs may offer themselves for blood-based tumor diagnosis. Here we addressed the question whether there is a sufficient number of miRNAs deregulated in blood cells of cancer patients to be able to distinguish between cancer patients and controls. We synthesized 866 human miRNAs and miRNA star sequences as annotated in the Sanger miRBase onto a microarray designed by febit biomed gmbh. Using the fully automated Geniom Real Time Analyzer platform, we analyzed the miRNA expression in 17 blood cell samples of patients with non-small cell lung carcinomas (NSCLC) and in 19 blood samples of healthy controls. Using a t-test, we detected 27 miRNAs significantly deregulated in blood cells of lung cancer patients as compared to the controls. Some of these miRNAs were validated using qRT-PCR. To estimate the value of each deregulated miRNA, we grouped all miRNAs according to their diagnostic information, as measured by Mutual Information. Using a subset of 24 miRNAs, a radial basis function Support Vector Machine allowed for discriminating between blood cell samples of tumor patients and controls with an accuracy of 95.4% [94.9%-95.9%], a specificity of 98.1% [97.3%-98.8%], and a sensitivity of 92.5% [91.8%-92.5%]. Our findings support the idea that neoplasia may lead to a deregulation of miRNA expression in blood cells of cancer patients compared to blood cells of healthy individuals. Furthermore, we provide evidence that miRNA patterns can be used to detect human cancers from blood cells.
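
    The pipeline described above (mutual-information ranking of miRNAs followed by an RBF-kernel SVM) can be approximated with scikit-learn. A hedged sketch on synthetic data follows; the array sizes mirror the study, but the implanted signal, sample values, and hyperparameters are invented, and for brevity the feature selection is done outside cross-validation, whereas a rigorous replication would nest it inside each fold.

```python
# Sketch: rank miRNA features by mutual information, keep the top 24,
# classify with an RBF-kernel SVM. Synthetic stand-in data throughout.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(36, 866))          # 36 samples x 866 miRNA features
y = np.array([0] * 19 + [1] * 17)       # 19 controls, 17 NSCLC patients
X[y == 1, :30] += 1.0                   # implant a weak "deregulation" signal

mi = mutual_info_classif(X, y, random_state=0)
top24 = np.argsort(mi)[-24:]            # 24 most informative miRNAs

clf = SVC(kernel="rbf", gamma="scale")
acc = cross_val_score(clf, X[:, top24], y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```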

  19. Designing Effective Undergraduate Research Experiences

    Science.gov (United States)

    Severson, S.

    2010-12-01

    I present a model for designing student research internships that is informed by the best practices of the Center for Adaptive Optics (CfAO) Professional Development Program. The dual strands of the CfAO education program include: the preparation of early-career scientists and engineers in effective teaching; and changing the learning experiences of students (e.g., undergraduate interns) through inquiry-based "teaching laboratories." This paper will focus on the carry-over of these ideas into the design of laboratory research internships such as the CfAO Mainland internship program as well as NSF REU (Research Experiences for Undergraduates) and senior-thesis or "capstone" research programs. Key ideas in maximizing student learning outcomes and generating productive research during internships include: defining explicit content, scientific process, and attitudinal goals for the project; assessment of student prior knowledge and experience, then following up with formative assessment throughout the project; setting reasonable goals with timetables and addressing motivation; and giving students ownership of the research by implementing aspects of the inquiry process within the internship.

  20. Identifying Fishes through DNA Barcodes and Microarrays.

    Directory of Open Access Journals (Sweden)

    Marc Kochzius

    2010-09-01

    Full Text Available International fish trade reached an import value of 62.8 billion Euro in 2006, of which 44.6% is covered by the European Union. Species identification is a key problem throughout the life cycle of fishes: from eggs and larvae to adults in fisheries research and control, as well as processed fish products in consumer protection. This study aims to evaluate the applicability of the three mitochondrial genes 16S rRNA (16S), cytochrome b (cyt b), and cytochrome oxidase subunit I (COI) for the identification of 50 European marine fish species by combining techniques of "DNA barcoding" and microarrays. In a DNA barcoding approach, Neighbour-Joining (NJ) phylogenetic trees of 369 16S, 212 cyt b, and 447 COI sequences indicated that cyt b and COI are suitable for unambiguous identification, whereas 16S failed to discriminate closely related flatfish and gurnard species. In the course of probe design for DNA microarray development, each of the markers yielded a high number of potentially species-specific probes in silico, although many of them were rejected based on microarray hybridisation experiments. None of the markers provided probes to discriminate the sibling flatfish and gurnard species. However, since 16S-probes were less negatively influenced by the "position of label" effect and showed the lowest rejection rate and the highest mean signal intensity, 16S is more suitable for DNA microarray probe design than cyt b and COI. The large portion of rejected COI-probes after hybridisation experiments (>90%) renders this DNA barcoding marker rather unsuitable for this high-throughput technology. Based on these data, a DNA microarray containing 64 functional oligonucleotide probes for the identification of 30 out of the 50 fish species investigated was developed. It represents the next step towards an automated and easy-to-handle method to identify fish, ichthyoplankton, and fish products.
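
    The Neighbour-Joining step of the barcoding workflow is readily reproduced with Biopython. The sketch below assumes a pre-aligned FASTA file of 16S barcode sequences (the filename "16s_aligned.fasta" is a placeholder) and uses simple identity distances; the distance model used in the original study may differ.

```python
# Build a Neighbour-Joining tree from aligned DNA barcode sequences.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("16s_aligned.fasta", "fasta")   # pre-aligned barcodes
distances = DistanceCalculator("identity").get_distance(alignment)
tree = DistanceTreeConstructor().nj(distances)           # NJ tree from distances
Phylo.draw_ascii(tree)                                   # quick terminal view
```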

  1. The intraclass correlation coefficient applied for evaluation of data correction, labeling methods and rectal biopsy sampling in DNA microarray experiments

    NARCIS (Netherlands)

    Pellis, E.P.M.; Franssen-Hal, van N.L.W.; Burema, J.; Keijer, J.

    2003-01-01

    We show that the intraclass correlation coefficient (ICC) can be used as a relatively simple statistical measure to assess methodological and biological variation in DNA microarray analysis. The ICC is a measure that determines the reproducibility of a variable, which can easily be calculated from
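
    The record is truncated before the formula, but for a one-way random-effects layout the standard ICC(1,1) is computed from the between-subject and within-subject mean squares. A small self-contained sketch, using hypothetical duplicate measurements rather than the study's biopsy data:

```python
import numpy as np

def icc_1_1(data):
    """One-way random-effects ICC(1,1); data is subjects x replicates."""
    n, k = data.shape
    grand = data.mean()
    subj_means = data.mean(axis=1)
    msb = k * ((subj_means - grand) ** 2).sum() / (n - 1)            # between-subject MS
    msw = ((data - subj_means[:, None]) ** 2).sum() / (n * (k - 1))  # within-subject MS
    return (msb - msw) / (msb + (k - 1) * msw)

# e.g., 5 samples measured in duplicate (hypothetical log-ratios)
x = np.array([[1.2, 1.1], [0.3, 0.4], [2.0, 1.8], [0.9, 1.0], [1.5, 1.4]])
print(f"ICC = {icc_1_1(x):.3f}")   # close to 1 => highly reproducible
```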

  2. Emerging use of gene expression microarrays in plant physiology.

    Science.gov (United States)

    Wullschleger, Stan D; Difazio, Stephen P

    2003-01-01

    Microarrays have become an important technology for the global analysis of gene expression in humans, animals, plants, and microbes. Implemented in the context of a well-designed experiment, cDNA and oligonucleotide arrays can provide high-throughput, simultaneous analysis of transcript abundance for hundreds, if not thousands, of genes. However, despite widespread acceptance, the use of microarrays as a tool to better understand processes of interest to the plant physiologist is still being explored. To help illustrate current uses of microarrays in the plant sciences, several case studies that we believe demonstrate the emerging application of gene expression arrays in plant physiology were selected from among the many posters and presentations at the 2003 Plant and Animal Genome XI Conference. Based on this survey, microarrays are being used to assess gene expression in plants exposed to the experimental manipulation of air temperature, soil water content and aluminium concentration in the root zone. Analysis often includes characterizing transcript profiles for multiple post-treatment sampling periods and categorizing genes with common patterns of response using hierarchical clustering techniques. In addition, microarrays are also providing insights into developmental changes in gene expression associated with fibre and root elongation in cotton and maize, respectively. Technical and analytical limitations of microarrays are discussed and projects attempting to advance areas of microarray design and data analysis are highlighted. Finally, although much work remains, we conclude that microarrays are a valuable tool for the plant physiologist interested in the characterization and identification of individual genes and gene families with potential application in the fields of agriculture, horticulture and forestry.

  3. Emerging Use of Gene Expression Microarrays in Plant Physiology

    Directory of Open Access Journals (Sweden)

    Stephen P. Difazio

    2006-04-01

    Full Text Available Microarrays have become an important technology for the global analysis of gene expression in humans, animals, plants, and microbes. Implemented in the context of a well-designed experiment, cDNA and oligonucleotide arrays can provide high-throughput, simultaneous analysis of transcript abundance for hundreds, if not thousands, of genes. However, despite widespread acceptance, the use of microarrays as a tool to better understand processes of interest to the plant physiologist is still being explored. To help illustrate current uses of microarrays in the plant sciences, several case studies that we believe demonstrate the emerging application of gene expression arrays in plant physiology were selected from among the many posters and presentations at the 2003 Plant and Animal Genome XI Conference. Based on this survey, microarrays are being used to assess gene expression in plants exposed to the experimental manipulation of air temperature, soil water content and aluminium concentration in the root zone. Analysis often includes characterizing transcript profiles for multiple post-treatment sampling periods and categorizing genes with common patterns of response using hierarchical clustering techniques. In addition, microarrays are also providing insights into developmental changes in gene expression associated with fibre and root elongation in cotton and maize, respectively. Technical and analytical limitations of microarrays are discussed and projects attempting to advance areas of microarray design and data analysis are highlighted. Finally, although much work remains, we conclude that microarrays are a valuable tool for the plant physiologist interested in the characterization and identification of individual genes and gene families with potential application in the fields of agriculture, horticulture and forestry.

  4. Design and analysis of experiments with SAS

    CERN Document Server

    Lawson, John

    2010-01-01

    Introduction: Statistics and Data Collection; Beginnings of Statistically Planned Experiments; Definitions and Preliminaries; Purposes of Experimental Design; Types of Experimental Designs; Planning Experiments; Performing the Experiments; Use of SAS Software. Completely Randomized Designs with One Factor: Introduction; Replication and Randomization; A Historical Example; Linear Model for Completely Randomized Design (CRD); Verifying Assumptions of the Linear Model; Analysis Strategies When Assumptions Are Violated; Determining the Number of Replicates; Comparison of Treatments after the F-Test. Factorial Designs

  5. A new method for class prediction based on signed-rank algorithms applied to Affymetrix® microarray experiments

    Directory of Open Access Journals (Sweden)

    Vassal Aurélien

    2008-01-01

    Full Text Available Abstract Background The huge amount of data generated by DNA chips is a powerful basis to classify various pathologies. However, the constant evolution of microarray technology makes it difficult to mix data from different chip types for class prediction of limited sample populations. Affymetrix® technology provides both a quantitative fluorescence signal and a decision (detection call: absent or present) based on signed-rank algorithms applied to several hybridization repeats of each gene, with a per-chip normalization. We developed a new prediction method for class belonging based on the detection call only from recent Affymetrix chip types. Biological data were obtained by hybridization on U133A, U133B and U133Plus 2.0 microarrays of purified normal B cells and cells from three independent groups of multiple myeloma (MM) patients. Results After a call-based data reduction step to filter out non-class-discriminative probe sets, the gene list obtained was reduced to a predictor with correction for multiple testing by iterative deletion of probe sets that sequentially improve inter-class comparisons and their significance. The error rate of the method was determined using leave-one-out and 5-fold cross-validation. It was successfully applied to (i) determine a sex predictor with the normal donor group, classifying gender with no error in all patient groups except for male MM samples with a Y chromosome deletion, (ii) predict the immunoglobulin light and heavy chains expressed by the malignant myeloma clones of the validation group and (iii) predict sex, light and heavy chain nature for every new patient. Finally, this method was shown to be powerful when compared to the popular classification method Prediction Analysis of Microarray (PAM). Conclusion This normalization-free method is routinely used for quality control and correction of collection errors in patient reports to clinicians. It can be easily extended to multiple class prediction suitable with
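
    A rough sketch in the spirit of this normalization-free approach is shown below: work only with binary detection calls (present=1, absent=0), keep probe sets whose call frequencies differ between classes, and assign a new sample to the class whose call signature is nearest in Hamming distance, evaluated by leave-one-out cross-validation. The data, threshold, and nearest-signature rule are simplifications, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
calls = rng.integers(0, 2, size=(30, 500))   # 30 samples x 500 probe-set calls
labels = np.array([0] * 15 + [1] * 15)
calls[labels == 1, :40] = 1                  # implant class-discriminative calls

def train(calls, labels, min_gap=0.5):
    f0 = calls[labels == 0].mean(axis=0)     # 'present' frequency per class
    f1 = calls[labels == 1].mean(axis=0)
    keep = np.abs(f0 - f1) >= min_gap        # call-based data reduction step
    return keep, (f0[keep] > 0.5).astype(int), (f1[keep] > 0.5).astype(int)

def predict(sample_calls, keep, sig0, sig1):
    s = sample_calls[keep]                   # Hamming distance to each signature
    return 0 if np.sum(s != sig0) <= np.sum(s != sig1) else 1

errors = 0                                   # leave-one-out cross-validation
for i in range(len(labels)):
    mask = np.arange(len(labels)) != i
    keep, sig0, sig1 = train(calls[mask], labels[mask])
    errors += predict(calls[i], keep, sig0, sig1) != labels[i]
print(f"LOO error rate: {errors / len(labels):.2f}")
```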

  6. Cost Optimal System Identification Experiment Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    A structural system identification experiment design method is formulated in the light of decision theory, structural reliability theory and optimization theory. The experiment design is based on a preposterior analysis, well-known from classical decision theory, i.e. the decisions concerning the experiment design are not based on obtained experimental data. Instead the decisions are based on the expected experimental data assumed to be obtained from the measurements, estimated based on prior information and engineering judgement. The design method provides a system identification experiment design reflecting the cost of the experiment and the value of obtained additional information. An example concerning design of an experiment for parametric identification of a single degree of freedom structural system shows the applicability of the experiment design method.
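
    As a toy illustration of the preposterior idea (decisions made before any data are observed, weighing experiment cost against expected information), consider choosing between two sensors for estimating a single Gaussian parameter. All numbers, names, and the utility scaling below are invented; the paper's structural reliability formulation is far richer.

```python
# Pre-posterior comparison of candidate experiments for a conjugate
# Gaussian model: the posterior variance is known before measuring,
# so expected net benefit can be computed up front.
prior_var = 0.5**2                           # prior on a stiffness parameter
designs = {"cheap sensor": (0.4**2, 1.0),    # (measurement variance, cost)
           "precise sensor": (0.1**2, 3.0)}
value_per_var_reduction = 20.0               # assumed utility scaling

for name, (meas_var, cost) in designs.items():
    post_var = 1.0 / (1.0 / prior_var + 1.0 / meas_var)   # posterior precision adds
    net = value_per_var_reduction * (prior_var - post_var) - cost
    print(f"{name}: posterior var {post_var:.3f}, net benefit {net:.2f}")
```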

  7. Development, characterization and experimental validation of a cultivated sunflower (Helianthus annuus L.) gene expression oligonucleotide microarray.

    Science.gov (United States)

    Fernandez, Paula; Soria, Marcelo; Blesa, David; DiRienzo, Julio; Moschen, Sebastian; Rivarola, Maximo; Clavijo, Bernardo Jose; Gonzalez, Sergio; Peluffo, Lucila; Príncipi, Dario; Dosio, Guillermo; Aguirrezabal, Luis; García-García, Francisco; Conesa, Ana; Hopp, Esteban; Dopazo, Joaquín; Heinz, Ruth Amelia; Paniego, Norma

    2012-01-01

    Oligonucleotide-based microarrays with accurate gene coverage represent a key strategy for transcriptional studies in orphan species such as sunflower, H. annuus L., which lacks full genome sequences. The goal of this study was the development and functional annotation of a comprehensive sunflower unigene collection and the design and validation of a custom sunflower oligonucleotide-based microarray. A large scale EST (>130,000 ESTs) curation, assembly and sequence annotation was performed using Blast2GO (www.blast2go.de). The EST assembly comprises 41,013 putative transcripts (12,924 contigs and 28,089 singletons). The resulting Sunflower Unigen Resource (SUR version 1.0) was used to design an oligonucleotide-based Agilent microarray for cultivated sunflower. This microarray includes a total of 42,326 features: 1,417 Agilent controls, 74 control probes for sunflower replicated 10 times (740 controls) and 40,169 different non-control probes. Microarray performance was validated using a model experiment examining the induction of senescence by water deficit. Pre-processing and differential expression analysis of Agilent microarrays was performed using the Bioconductor limma package, with p-values calculated by eBayes. This work thus delivered a comprehensive, curated sunflower unigene collection and a custom, validated sunflower oligonucleotide-based microarray using Agilent technology. Both the curated unigene collection and the validated oligonucleotide microarray provide key resources for sunflower genome analysis, transcriptional studies, and molecular breeding for crop improvement.
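
    The study's differential expression analysis used Bioconductor's limma with eBayes moderation, which is R-based. Purely as a language-neutral illustration, a much simpler Python analogue follows: ordinary per-gene t-tests with Benjamini-Hochberg FDR control on synthetic data. It omits limma's variance moderation, so it is not a substitute for the published pipeline.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(7)
control = rng.normal(size=(3, 40169))            # 3 arrays x probes
deficit = rng.normal(size=(3, 40169))
deficit[:, :500] += 2.0                          # implant "senescence" genes

t, p = stats.ttest_ind(deficit, control, axis=0)  # per-probe two-sample t-test
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} probes significant at 5% FDR")
```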

  8. Identification of Differentially Expressed IGFBP5-Related Genes in Breast Cancer Tumor Tissues Using cDNA Microarray Experiments.

    Science.gov (United States)

    Akkiprik, Mustafa; Peker, İrem; Özmen, Tolga; Amuran, Gökçe Güllü; Güllüoğlu, Bahadır M; Kaya, Handan; Özer, Ayşe

    2015-11-10

    IGFBP5 is an important regulatory protein in breast cancer progression. We tried to identify differentially expressed genes (DEGs) between breast tumor tissues with IGFBP5 overexpression and their adjacent normal tissues. In this study, thirty-eight breast cancer and adjacent normal breast tissue samples were used to determine IGFBP5 expression by qPCR. cDNA microarrays were applied to the tumor samples with the highest IGFBP5 overexpression relative to their adjacent normal breast tissue. Microarray analysis revealed that a total of 186 genes were differentially expressed in breast cancer compared with normal breast tissues. Of the 186 genes, 169 were downregulated and 17 were upregulated in the tumor samples. KEGG pathway analyses showed that protein digestion and absorption, focal adhesion, salivary secretion, drug metabolism-cytochrome P450, and phenylalanine metabolism pathways are involved. Among these DEGs, the prominent top two genes (MMP11 and COL1A1), which potentially correlated with IGFBP5, were selected for validation using real time RT-qPCR. Only COL1A1 expression showed a consistent upregulation with IGFBP5 expression, and COL1A1 and IGFBP5 were significantly positively correlated. We concluded that the discovery of coordinately expressed genes related with IGFBP5 might contribute to the understanding of the molecular mechanism of the function of IGFBP5 in breast cancer. Further functional studies on DEGs and their association with IGFBP5 may identify novel biomarkers for clinical applications in breast cancer.
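
    The reported validation step, testing whether COL1A1 tracks IGFBP5 across samples, reduces to a correlation test. A minimal sketch with hypothetical relative-expression values (not the study's measurements):

```python
from scipy import stats

igfbp5 = [2.1, 3.4, 1.2, 4.0, 2.8, 3.9, 1.5, 2.6]   # hypothetical values
col1a1 = [1.8, 3.1, 1.0, 4.4, 2.5, 3.5, 1.9, 2.2]

r, p = stats.pearsonr(igfbp5, col1a1)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")           # positive correlation
```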

  9. STORYPLY : designing for user experiences using storycraft

    NARCIS (Netherlands)

    Atasoy, B.; Martens, J.B.O.S.; Markopoulos, P.; Martens, J.B.; Malins, J.; Coninx, K.; Liapis, A.

    2016-01-01

    The role of design shifts from designing objects towards designing for experiences. The design profession has to follow this trend but the current skill-set of designers focuses mainly on objects; their form, function, manufacturing and interaction. However, contemporary methods and tools that

  10. Mine-by experiment final design report

    International Nuclear Information System (INIS)

    Read, R.S.; Martin, C.D.

    1991-12-01

    The Underground Research Laboratory (URL) Mine-by Experiment is designed to provide information on rock mass response to excavation that will be used to assess important aspects of the design of a nuclear fuel waste disposal vault in a granitic pluton. The final experiment design is the result of a multidisciplinary approach, drawing on experience gained at other sites as well as the URL, and using both internal expertise and the external consultants. The final experiment design, including details on characterization, construction, instrumentation, and numerical modelling, is presented along with final design drawings

  11. Design principles for a large RFP experiment

    International Nuclear Information System (INIS)

    Phillpott, J.; Rostagni, G.; Di Marco, J.

    1981-01-01

    An RFP experiment (RFX) has been designed by an International Design Team composed of groups of collaborating physicists and engineers working in their home laboratories. This international collaborative project has been brought to an advanced stage of system and component design by the co-operation of three design teams under the co-ordination of a Design Manager, based at Culham Laboratory. The paper summarises the important design principles for an RFP device, based on the outcome of this collaborative design project.

  12. Tokamak Physics Experiment (TPX) design

    International Nuclear Information System (INIS)

    Schmidt, J.A.

    1995-01-01

    TPX is a national project involving a large number of US fusion laboratories, universities, and industries. The element of the TPX requirements that is a primary driver for the hardware design is the fact that TPX tokamak hardware is being designed to accommodate steady state operation if the external systems are upgraded from the 1,000-second initial operation. TPX not only incorporates new physics, but also pioneers new technologies to be used in ITER and other future reactors. TPX will be the first tokamak with fully superconducting magnetic field coils using advanced conductors, will have internal nuclear shielding, will use robotics for machine maintenance, and will remove the continuous, concentrated heat flow from the plasma with new dispersal techniques and with special materials that are actively cooled. The Conceptual Design for TPX was completed during Fiscal Year 1993. The Preliminary Design formally began at the beginning of Fiscal Year 1994. Industrial contracts have been awarded for the design, with options for fabrication, of the primary tokamak hardware. A large fraction of the design and R and D effort during FY94 was focused on the tokamak and in turn on the tokamak magnets. The reason for this emphasis is that the magnets require a large design and R and D effort, and are critical to the project schedule. The magnet development is focused on conductor development, quench protection, and manufacturing R and D. The Preliminary Design Review for the Magnets is planned for fall 1995.

  13. Design and Analysis of simulation experiments : Tutorial

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2017-01-01

    This tutorial reviews the design and analysis of simulation experiments. These experiments may have various goals: validation, prediction, sensitivity analysis, optimization (possibly robust), and risk or uncertainty analysis. These goals may be realized through metamodels. Two types of metamodels

  14. Involving Motion Graphics in Spatial Experience Design

    DEFF Research Database (Denmark)

    Steijn, Arthur

    2013-01-01

    elements such as space, tone, color, movement, time and timing. Developing this design model has two purposes. The first is as a tool for analyzing empirical examples or cases where motion graphics is used in spatial experience design. The second is as a tool that can be used in the actual design process, and therefore it should be constructed as such. Since the development of the design model has this double focus, I involve design students in design laboratories related to my practice as a teacher in visual communication design and production design. I also reflect on how an initial design...

  15. Experiment Design and Analysis Guide - Neutronics & Physics

    Energy Technology Data Exchange (ETDEWEB)

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  16. HAMMLAB 1999 experimental control room: design - design rationale - experiences

    International Nuclear Information System (INIS)

    Foerdestroemmen, N. T.; Meyer, B. D.; Saarni, R.

    1999-01-01

    A presentation of the HAMMLAB 1999 experimental control room and the accumulated experience gathered in the areas of design and design rationale, as well as user experiences. It is concluded that the HAMMLAB 1999 experimental control room is a realistic, compact and efficient control room well suited as an Advanced NPP Control Room (ml)

  17. Design of experiments in production engineering

    CERN Document Server

    2016-01-01

    This book covers design of experiments (DoE) applied in production engineering as a combination of manufacturing technology with applied management science. It presents recent research advances and applications of design of experiments in production engineering, and the chapters cover metal cutting tools, soft computing for modelling and optimization of machining, and waterjet machining of high performance ceramics, among others.

  18. Smashing UX design foundations for designing online user experiences

    CERN Document Server

    Allen, Jesmond

    2012-01-01

    The ultimate guide to UX from the world's most popular resource for web designers and developers. Smashing Magazine is the world's most popular resource for web designers and developers and with this book, the authors provide the pinnacle resource to becoming savvy with User Experience Design (UX). The authors first provide an overview of UX and chart its rise to becoming a valuable and necessary practice for narrowing the gap between Web sites, applications, and users in order to make a user's experience a happy, easy, and successful one. Examines the essential aspects of User Experience Design

  19. Elements of Design of Experiments

    Science.gov (United States)

    2007-06-01

    Slide fragments survive from this briefing (AF T&E Days, Dec 05), giving a historical motivation for designed experiments drawn from brewing: in 1906, W.S. Gosset, a Guinness chemist, needed to estimate the yeast in a culture sample and wanted to get it right, since guessing too little meant incomplete fermentation and too much meant bitter beer. The slides contrast yesterday's typical one-factor-at-a-time (OFAT) testing, which works only if the response contours align, with designed experiments, and reference a 1998 Mike Kelly example and a Spiral 1 design for the Maverick H/K.

  20. Design and analysis of experiments

    CERN Document Server

    Hinkelmann, Klaus

    This book discusses special modifications and extensions of designs that arise in certain fields of application such as genetics, bioinformatics, agriculture, medicine, manufacturing, marketing, etc. Well-known and highly-regarded contributors have written individual chapters that have been extensively reviewed by the Editor to ensure that each individual contribution relates to material found in Volumes 1 and 2 of this book series. The chapters in Volume 3 have an introductory/historical component and proceed to a more advanced technical level to discuss the latest results and future developments.

  1. Spot detection and image segmentation in DNA microarray data.

    Science.gov (United States)

    Qin, Li; Rueda, Luis; Ali, Adnan; Ngom, Alioune

    2005-01-01

    Following the invention of microarrays in 1994, the development and applications of this technology have grown exponentially. The numerous applications of microarray technology include clinical diagnosis and treatment, drug design and discovery, tumour detection, and environmental health research. One of the key issues in the experimental approaches utilising microarrays is to extract quantitative information from the spots, which represent genes in a given experiment. For this process, the initial stages are important and they influence future steps in the analysis. Identifying the spots and separating the background from the foreground is a fundamental problem in DNA microarray data analysis. In this review, we present an overview of state-of-the-art methods for microarray image segmentation. We discuss the foundations of the circle-shaped approach, adaptive shape segmentation, histogram-based methods and the recently introduced clustering-based techniques. We analytically show that clustering-based techniques are equivalent to the one-dimensional, standard k-means clustering algorithm that utilises the Euclidean distance.
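
    The equivalence noted above suggests a very compact implementation: one-dimensional k-means with k=2 on pixel intensities separates spot foreground from background. The sketch below uses synthetic intensities and a fixed iteration count; production pipelines would add spot gridding and quality checks.

```python
import numpy as np

def kmeans_1d_2class(pixels, iters=50):
    """Two-centre 1-D k-means on intensities; returns foreground mask, centres."""
    c = np.array([pixels.min(), pixels.max()], dtype=float)  # initial centres
    for _ in range(iters):
        fg = np.abs(pixels - c[1]) < np.abs(pixels - c[0])   # nearest-centre rule
        c = np.array([pixels[~fg].mean(), pixels[fg].mean()])
    return fg, c

rng = np.random.default_rng(3)
background = rng.normal(300, 40, size=900)     # dim slide background
spot = rng.normal(5000, 600, size=120)         # bright hybridized spot
pixels = np.concatenate([background, spot])

fg, centres = kmeans_1d_2class(pixels)
print(f"foreground pixels: {fg.sum()}, centres: {centres.round(0)}")
```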

  2. Democratic design experiments: between parliament and laboratory

    DEFF Research Database (Denmark)

    Binder, Thomas; Brandt, Eva; Ehn, Pelle

    2015-01-01

    For more than four decades participatory design has provided exemplars and concepts for understanding the democratic potential of design participation. Despite important impacts on design methodology, participatory design has however been stuck in a marginal position as it has wrestled with what has been performed and accomplished in participatory practices. In this article we discuss how participatory design may be reinvigorated as a design research programme for democratic design experiments in the light of the de-centring of human-centredness and the foregrounding of collaborative...

  3. Eye tracking in user experience design

    CERN Document Server

    Romano Bergstorm, Jennifer

    2014-01-01

    Eye Tracking for User Experience Design explores the many applications of eye tracking to better understand how users view and interact with technology. Ten leading experts in eye tracking discuss how they have taken advantage of this new technology to understand, design, and evaluate user experience. Real-world stories are included from these experts who have used eye tracking during the design and development of products ranging from information websites to immersive games. They also explore recent advances in the technology which tracks how users interact with mobile devices, large-screen displays and video game consoles. Methods for combining eye tracking with other research techniques for a more holistic understanding of the user experience are discussed. This is an invaluable resource to those who want to learn how eye tracking can be used to better understand and design for their users. * Includes highly relevant examples and information for those who perform user research and design interactive experi...

  4. Design for experience where technology meets design and strategy

    CERN Document Server

    Kim, Jinwoo

    2015-01-01

    Presents a strategic perspective and design methodology that guide the process of developing digital products and services that provide 'real experience' to users. Only when the material experienced runs its course to fulfilment is it then regarded as 'real experience' that is distinctively senseful, evaluated as valuable, and harmoniously related to others. Based on the theoretical background of human experience, the book focuses on these three questions: How can we understand the current dominant designs of digital products and services? What are the user experience factor

  5. Prenatal alcohol exposure alters gene expression in the rat brain: Experimental design and bioinformatic analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Alexandre A. Lussier

    2015-09-01

    Full Text Available We previously identified gene expression changes in the prefrontal cortex and hippocampus of rats prenatally exposed to alcohol under both steady-state and challenge conditions (Lussier et al., 2015, Alcohol. Clin. Exp. Res., 39, 251–261). In this study, adult female rats from three prenatal treatment groups (ad libitum-fed control, pair-fed, and ethanol-fed) were injected with physiological saline solution or complete Freund's adjuvant (CFA) to induce arthritis (adjuvant-induced arthritis, AA). The prefrontal cortex and hippocampus were collected 16 days (peak of arthritis) or 39 days (during recovery) following injection, and whole genome gene expression was assayed using Illumina's RatRef-12 expression microarray. Here, we provide additional metadata, detailed explanations of data pre-processing steps and quality control, as well as a basic framework for the bioinformatic analyses performed. The datasets from this study are publicly available in the GEO repository (accession number GSE63561).
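
    Since the datasets are public under accession GSE63561, they can be pulled programmatically. One hedged route is the third-party GEOparse package, sketched below; the package and its API are assumptions of this example, not something used in the paper.

```python
# Retrieve the public series and peek at sample annotations (assumes the
# GEOparse package is installed: pip install GEOparse).
import GEOparse

gse = GEOparse.get_GEO(geo="GSE63561", destdir="./geo_cache")
print(gse.metadata.get("title"))
for name, gsm in list(gse.gsms.items())[:3]:   # first few samples
    print(name, gsm.metadata.get("characteristics_ch1"))
```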

  6. Resourcing of Experience in Co-Design

    DEFF Research Database (Denmark)

    Ylirisku, Salu; Revsbæk, Line; Buur, Jacob

    2017-01-01

    ... knowledge to benefit its cultivation is expected to be highly valuable in contemporary multi-cultural design work. This paper approaches the study of the involvement of various stakeholders in design projects through a lens of resourcing experience. Building from G. H. Mead's pragmatist theory, we devise... and Scandinavia. By identifying ways in which experience is resourced in specific design interactions, the paper illustrates resourcing to be responsive, conceptual and habitual. The paper concludes by pinpointing strategic means that design teams may use in order to enable rich involvement and resourcing...

  7. Super Spool: An Experiment in Powerplant Design

    Science.gov (United States)

    Kesler, Ronald

    1974-01-01

    Discusses the use of rubber bands, an empty wooden thread spool, two wooden matches, a wax washer, and a small nail to conduct an experiment or demonstration in powerplant design. Detailed procedures and suggested activities are included. (CC)

  8. Integrated olfactory receptor and microarray gene expression databases

    Directory of Open Access Journals (Sweden)

    Crasto Chiquito J

    2007-06-01

    Full Text Available Abstract Background Gene expression patterns of olfactory receptors (ORs) are an important component of the signal encoding mechanism in the olfactory system, since they determine the interactions between odorant ligands and sensory neurons. We have developed the Olfactory Receptor Microarray Database (ORMD) to house OR gene expression data. ORMD is integrated with the Olfactory Receptor Database (ORDB), which is a key repository of OR gene information. Both databases aim to aid experimental research related to olfaction. Description ORMD is a Web-accessible database that provides a secure data repository for OR microarray experiments. It contains both publicly available and private data; accessing the latter requires an authenticated login. The ORMD is designed to allow users not only to deposit gene expression data but also to manage their projects/experiments. For example, contributors can choose whether to make their datasets public. For each experiment, users can download the raw data files and view and export the gene expression data. For each OR gene being probed in a microarray experiment, a hyperlink to that gene in ORDB provides access to genomic and proteomic information related to the corresponding olfactory receptor. Individual ORs archived in ORDB are also linked to ORMD, allowing users access to the related microarray gene expression data. Conclusion ORMD serves as a data repository and project management system. It facilitates the study of microarray experiments of gene expression in the olfactory system. In conjunction with ORDB, ORMD integrates gene expression data with the genomic and functional data of ORs, and is thus a useful resource for both olfactory researchers and the public.

  9. Goober: A fully integrated and user-friendly microarray data management and analysis solution for core labs and bench biologists

    Directory of Open Access Journals (Sweden)

    Luo Wen

    2009-03-01

    Full Text Available Despite the large number of software tools developed to address different areas of microarray data analysis, very few offer an all-in-one solution with little learning curve. For microarray core labs, there are even fewer software packages available to help with their routine but critical tasks, such as data quality control (QC) and inventory management. We have developed a simple-to-use web portal to allow bench biologists to analyze and query complicated microarray data and related biological pathways without prior training. Both experiment-based and gene-based analysis can be easily performed, even for the first-time user, through the intuitive multi-layer design and interactive graphic links. While being friendly to inexperienced users, most parameters in Goober can be easily adjusted via drop-down menus to allow advanced users to tailor their needs and perform more complicated analysis. Moreover, we have integrated graphic pathway analysis into the website to help users examine microarray data within the relevant biological content. Goober also contains features that cover most of the common tasks in microarray core labs, such as real-time array QC, data loading, array usage and inventory tracking. Overall, Goober is a complete microarray solution to help biologists instantly discover valuable information from a microarray experiment and enhance the quality and productivity of microarray core labs. The whole package is freely available at http://sourceforge.net/projects/goober. A demo web server is available at http://www.goober-array.org.

  10. Transforming the Enrollment Experience Using Design Thinking

    Science.gov (United States)

    Apel, Aaron; Hull, Phil; Owczarek, Scott; Singer, Wren

    2018-01-01

    In an effort to simplify the advising and registration process and provide students with a more intuitive enrollment experience, especially at orientation, the University of Wisconsin-Madison's Office of the Registrar and Office of Undergraduate Advising co-sponsored a project to transform the enrollment experience. Using design thinking has…

  11. Design -|+ Negative emotions for positive experiences

    NARCIS (Netherlands)

    Fokkinga, S.F.

    2015-01-01

    Experience-driven design considers all aspects of a product – its appearance, cultural meaning, functionality, interaction, usability, technology, and indirect consequences of use – with the aim to optimize and orchestrate all these aspects and create the best possible user experience. Since the

  12. OPTIMAL EXPERIMENT DESIGN FOR MAGNETIC RESONANCE FINGERPRINTING

    OpenAIRE

    Zhao, Bo; Haldar, Justin P.; Setsompop, Kawin; Wald, Lawrence L.

    2016-01-01

    Magnetic resonance (MR) fingerprinting is an emerging quantitative MR imaging technique that simultaneously acquires multiple tissue parameters in an efficient experiment. In this work, we present an estimation-theoretic framework to evaluate and design MR fingerprinting experiments. More specifically, we derive the Cramér-Rao bound (CRB), a lower bound on the covariance of any unbiased estimator, to characterize parameter estimation for MR fingerprinting. We then formulate an optimal experi...

  13. Design for Engaging Experience and Social Interaction

    Science.gov (United States)

    Harteveld, Casper; ten Thij, Eleonore; Copier, Marinka

    2011-01-01

    One of the goals of game designers is to design for an engaging experience and for social interaction. The question is how. We know that games can be engaging and allow for social interaction, but how do we achieve this or even improve on it? This article provides an overview of several scientific approaches that deal with this question. It…

  14. Cooperative adaptive cruise control, design and experiments

    NARCIS (Netherlands)

    Naus, G.J.L.; Vugts, R.P.A.; Ploeg, J.; Molengraft, van de M.J.G.; Steinbuch, M.

    2010-01-01

    The design of a CACC system and corresponding experiments are presented. The design targets string stable system behavior, which is assessed using a frequency-domain-based approach. Following this approach, it is shown that the available wireless information enables small inter-vehicle distances,

  15. The EADGENE Microarray Data Analysis Workshop

    DEFF Research Database (Denmark)

    de Koning, Dirk-Jan; Jaffrézic, Florence; Lund, Mogens Sandø

    2007-01-01

    Microarray analyses have become an important tool in animal genomics. While their use is becoming widespread, there is still a lot of ongoing research regarding the analysis of microarray data. In the context of a European Network of Excellence, 31 researchers representing 14 research groups from 10 countries performed and discussed the statistical analyses of real and simulated 2-colour microarray data that were distributed among participants. The real data consisted of 48 microarrays from a disease challenge experiment in dairy cattle, while the simulated data consisted of 10 microarrays... statistical weights, to omitting a large number of spots or omitting entire slides. Surprisingly, these very different approaches gave quite similar results when applied to the simulated data, although not all participating groups analysed both real and simulated data. The workshop was very successful.

  16. Optimized Experiment Design for Marine Systems Identification

    DEFF Research Database (Denmark)

    Blanke, M.; Knudsen, Morten

    1999-01-01

    Simulation of manoeuvring and design of motion controls for marine systems require non-linear mathematical models, which often have more than one hundred parameters. Model identification is hence an extremely difficult task. This paper discusses experiment design for marine systems identification and proposes a sensitivity approach to solve the practical experiment design problem. The applicability of the sensitivity approach is demonstrated on a large non-linear model of surge, sway, roll and yaw of a ship. The use of the method is illustrated for a container-ship where both model and full-scale tests
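
    The sensitivity approach can be illustrated on a toy stand-in for the large ship model: a first-order Nomoto-type yaw response r' = (K·δ − r)/T. Finite-difference sensitivities of the simulated output with respect to K and T indicate how much information a candidate rudder experiment carries about each parameter. The model, input, and parameter values here are invented for illustration only.

```python
import numpy as np

def simulate(K, T, dt=0.1, steps=600):
    """Euler simulation of yaw rate r under a square-wave rudder input."""
    r, out = 0.0, []
    for k in range(steps):
        delta = 0.2 if (k * dt) % 40 < 20 else -0.2   # rudder angle [rad]
        r += dt * (K * delta - r) / T
        out.append(r)
    return np.array(out)

K0, T0, eps = 0.2, 30.0, 1e-4
base = simulate(K0, T0)
sens_K = (simulate(K0 + eps, T0) - base) / eps        # finite-difference sensitivities
sens_T = (simulate(K0, T0 + eps) - base) / eps
print(f"||dY/dK|| = {np.linalg.norm(sens_K):.2f}, ||dY/dT|| = {np.linalg.norm(sens_T):.4f}")
```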

  17. Annotating breast cancer microarray samples using ontologies

    Science.gov (United States)

    Liu, Hongfang; Li, Xin; Yoon, Victoria; Clarke, Robert

    2008-01-01

    As the most common cancer among women, breast cancer results from the accumulation of mutations in essential genes. Recent advances in high-throughput gene expression microarray technology have inspired researchers to use the technology to assist breast cancer diagnosis, prognosis, and treatment prediction. However, the high dimensionality of microarray experiments and public access to data from many experiments have caused inconsistencies, which initiated the development of controlled terminologies and ontologies for annotating microarray experiments, such as the standard microarray Gene Expression Data (MGED) ontology (MO). In this paper, we developed BCM-CO, an ontology tailored specifically for indexing clinical annotations of breast cancer microarray samples from the NCI Thesaurus. Our research showed that the coverage of NCI Thesaurus is very limited with respect to i) terms used by researchers to describe breast cancer histology (covering 22 out of 48 histology terms); ii) breast cancer cell lines (covering one out of 12 cell lines); and iii) classes corresponding to breast cancer grading and staging. By incorporating a wider range of those terms into BCM-CO, we were able to index breast cancer microarray samples from GEO using BCM-CO and the MGED ontology, and developed a prototype system with a web interface that allows the retrieval of microarray data based on the ontology annotations. PMID:18999108

  18. Optimal experiment design for magnetic resonance fingerprinting.

    Science.gov (United States)

    Bo Zhao; Haldar, Justin P; Setsompop, Kawin; Wald, Lawrence L

    2016-08-01

    Magnetic resonance (MR) fingerprinting is an emerging quantitative MR imaging technique that simultaneously acquires multiple tissue parameters in an efficient experiment. In this work, we present an estimation-theoretic framework to evaluate and design MR fingerprinting experiments. More specifically, we derive the Cramér-Rao bound (CRB), a lower bound on the covariance of any unbiased estimator, to characterize parameter estimation for MR fingerprinting. We then formulate an optimal experiment design problem based on the CRB to choose a set of acquisition parameters (e.g., flip angles and/or repetition times) that maximizes the signal-to-noise ratio efficiency of the resulting experiment. The utility of the proposed approach is validated by numerical studies. Representative results demonstrate that the optimized experiments allow for substantial reduction in the length of an MR fingerprinting acquisition, and substantial improvement in parameter estimation performance.
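
    The CRB machinery is easy to demonstrate on a one-parameter toy problem, though the paper's MR fingerprinting signal model is far more elaborate: below, the bound on the standard deviation of a T2 estimate from noisy exponential decays s(TE) = exp(-TE/T2) is compared for a long and a short acquisition, which is exactly the kind of trade-off an optimal experiment design navigates. All values are made up.

```python
import numpy as np

def crb_t2(echo_times, t2, sigma):
    """CRB for T2 from s(TE) = exp(-TE/T2) with i.i.d. Gaussian noise."""
    ds_dt2 = (echo_times / t2**2) * np.exp(-echo_times / t2)  # signal sensitivity
    fisher = np.sum(ds_dt2**2) / sigma**2                     # Fisher information
    return 1.0 / fisher                                       # variance lower bound

t2, sigma = 80.0, 0.05                       # ms, noise std (toy values)
dense = np.linspace(10, 320, 32)             # 32 echoes
sparse = np.linspace(10, 320, 8)             # 8 echoes: shorter experiment
print(f"CRB std, 32 echoes: {np.sqrt(crb_t2(dense, t2, sigma)):.2f} ms")
print(f"CRB std,  8 echoes: {np.sqrt(crb_t2(sparse, t2, sigma)):.2f} ms")
```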

  19. The Ethics of User Experience Design

    DEFF Research Database (Denmark)

    Vistisen, Peter; Jensen, Thessa

    Design has in recent years been an increasing area in focus when developing digital interactive systems and services (Kolko 2010). Given the specific nature of material involved in designing digital media as ‘the material without qualities’ (Lowgreen & Stolterman 2007), and namely its total lack...... that the chosen point-of-view corresponds with the users, and thus ensures that the designed user experience actually is preferable for the user (Schauer & Merholz 2009). However, there has been a lack of discussions surrounding the ethical dimension of creating and maintaining an empathic point......-centered design process. Exemplifying the differences and ethical implications for the designer in the interaction with the user through the design of interactive digital systems. Finally the article discusses the need to understand design as a development of empathy for a given user or group of users by giving...

  20. The Ethics of User Experience Design

    DEFF Research Database (Denmark)

    Vistisen, Peter; Jensen, Thessa

    2013-01-01

    Design has in recent years been an increasing area in focus when developing digital interactive systems and services (Kolko 2010). Given the specific nature of material involved in designing digital media as ‘the material without qualities’ (Lowgreen & Stolterman 2007), and namely its total lack...... that the chosen point-of-view corresponds with the users, and thus ensures that the designed user experience actually is preferable for the user (Schauer & Merholz 2009). However, there has been a lack of discussions surrounding the ethical dimension of creating and maintaining an empathic point......-centered design process. Exemplifying the differences and ethical implications for the designer in the interaction with the user through the design of interactive digital systems. Finally the article discusses the need to understand design as a development of empathy for a given user or group of users by giving...

  1. Learning from experience. Feedback to design

    International Nuclear Information System (INIS)

    Hopwood, J.M.; Shalaby, B.A.; Keil, H.

    1997-01-01

    AECL has been the designer of 25 commercial scale CANDU reactors now in operation, with more under construction. AECL has taken the evolutionary approach in developing its current designs, the CANDU 6 and CANDU 9 Nuclear Power Plants. An integral part of this approach is to emphasize feedback of experience to the designers, in a continuous improvement process. AECL has implemented a formal process of gathering and responding to feedback from: NPP operation, construction and commissioning; regulatory input; and R and D results; as well as paying close attention to market input. A number of recent examples of design improvement via this feedback process are described.

  2. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    Science.gov (United States)

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issue of microarray data analysis due to the well known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting edge Bioconductor packages for researchers with no knowledge in R language. It also enables a more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  3. Environmental analytical chemistry: Design of experiments

    International Nuclear Information System (INIS)

    Sanchez Alonso, F.

    1990-01-01

    The design of experiments is needed whenever analytical research or development work is performed, whether to explain a physical phenomenon through a mathematical model or to optimize a process of any kind. It is therefore an unavoidable technique, since multidimensional approaches are more economical and reliable; a purely empirical approach is never as efficient and generally yields lower quality. 'Design of experiments' denotes a group of mathematical-statistical techniques for extracting the maximum information about the problem at hand, so that the results obtained have the maximum quality. The modelling of a physical phenomenon, the basic concepts needed to design the experiments, and the analysis of the results are studied in detail.
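
    To make the idea concrete, a minimal sketch of a classical designed experiment follows: a two-level full factorial in three factors, with main effects estimated by least squares. The factor names and response values are hypothetical.

```python
import itertools
import numpy as np

# 2^3 full factorial design: every combination of three factors at two levels
design = np.array(list(itertools.product([-1, 1], repeat=3)))
y = np.array([5.1, 6.0, 4.8, 5.9, 7.2, 8.3, 7.0, 8.1])   # hypothetical responses

X = np.column_stack([np.ones(len(design)), design])      # intercept + factors
coef, *_ = np.linalg.lstsq(X, y, rcond=None)             # least-squares effects
for name, b in zip(["mean", "A", "B", "C"], coef):
    print(f"{name}: {b:+.2f}")
```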

  4. Creating Visual Design and Meaningful Audience Experiences

    DEFF Research Database (Denmark)

    Steijn, Arthur; Ion Wille, Jakob

    2014-01-01

    The main purpose of the EU Interreg funded Classical Composition Music and Experience Design project was to rethink audience experiences and develop knowledge of applied technologies connected to classical music and live concerts. The project and its main objectives were motivated by at least three conditions, the most important being 1) the development of new technology creating new expectations in audiences attending cultural events, including classical concerts, 2) a recent decline in audiences attending classical music and 3) a will to strengthen relations between cultural institutions, creative businesses and educational institutions in the Øresund region (including the city and surroundings of Malmø and Copenhagen). Therefore the project Classical Composition Music and Experience Design focused on developing new and meaningful audience experiences where live classical music meets new digital

  5. DEM Calibration Approach: design of experiment

    Science.gov (United States)

    Boikov, A. V.; Savelev, R. V.; Payor, V. A.

    2018-05-01

    The problem of DEM model calibration is considered in the article. It is proposed to divide the model's input parameters into those that require iterative calibration and those that are recommended to be measured directly. A new method for model calibration, based on design of experiments for the iteratively calibrated parameters, is proposed. The experiment is conducted using a specially designed stand. The results are processed with machine vision algorithms. Approximating functions are obtained and the error of the implemented software and hardware complex is estimated. The prospects of the obtained results are discussed.

  6. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  7. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  8. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or
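
    For readers who want the equivalent steps outside Kepler: the core of the Affymetrix expression workflow summarized in these records (import, normalization, expression extraction) commonly reduces to a few lines of Bioconductor R. The sketch below assumes raw CEL files sit in the working directory; it is a generic stand-in, not the published Kepler workflow itself.

      library(affy)                    # Bioconductor package for Affymetrix arrays
      raw  <- ReadAffy()               # read all *.CEL files in the working directory
      eset <- rma(raw)                 # background-correct, normalize, summarize (RMA)
      write.exprs(eset, file = "expression_matrix.txt")  # probe-set expression values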

  9. Experience economy meets business model design

    DEFF Research Database (Denmark)

    Gudiksen, Sune Klok; Smed, Søren Graakjær; Poulsen, Søren Bolvig

    2012-01-01

    Through the last decade the experience economy has found solid ground and manifested itself as a parameter where businesses and organizations can differentiate themselves from competitors. The fundamental premise is the one found in Pine & Gilmore's model from 1999 over 'the progression of economic value' where...... produced, designed or staged experience that gains the most profit or creates return of investment. It becomes more obvious that other parameters in the future can be a vital part of the experience economy and one of these is business model innovation. Business model innovation is about continuous...

  10. Affective loop experiences: designing for interactional embodiment.

    Science.gov (United States)

    Höök, Kristina

    2009-12-12

    Involving our corporeal bodies in interaction can create strong affective experiences. Systems that both can be influenced by and influence users corporeally exhibit a use quality we name an affective loop experience. In an affective loop experience, (i) emotions are seen as processes, constructed in the interaction, starting from everyday bodily, cognitive or social experiences; (ii) the system responds in ways that pull the user into the interaction, touching upon end users' physical experiences; and (iii) throughout the interaction the user is an active, meaning-making individual choosing how to express themselves; the interpretation responsibility does not lie with the system. We have built several systems that attempt to create affective loop experiences with more or less successful results. For example, eMoto lets users send text messages between mobile phones, but in addition to text, the messages also have colourful and animated shapes in the background chosen through emotion-gestures with a sensor-enabled stylus pen. Affective Diary is a digital diary with which users can scribble their notes, but it also allows for bodily memorabilia to be recorded from body sensors mapping to users' movement and arousal and placed along a timeline. Users can see patterns in their bodily reactions and relate them to various events going on in their lives. The experiences of building and deploying these systems gave us insights into design requirements for addressing affective loop experiences, such as how to design for turn-taking between user and system, how to create for 'open' surfaces in the design that can carry users' own meaning-making processes, how to combine modalities to create for a 'unity' of expression, and the importance of mirroring user experience in familiar ways that touch upon their everyday social and corporeal experiences. But a more important lesson gained from deploying the systems is how emotion processes are co-constructed and experienced

  11. Participatory Design of Citizen Science Experiments

    Science.gov (United States)

    Senabre, Enric; Ferran-Ferrer, Nuria; Perelló, Josep

    2018-01-01

    This article describes and analyzes the collaborative design of a citizen science research project through co-creation. Three groups of secondary school students and a team of scientists conceived three experiments on human behavior and social capital in urban and public spaces. The study goal is to address how interdisciplinary work and attention…

  12. SYSTEMATIC DESIGNING IN ARCHITECTURAL EDUCATION: AN EXPERIENCE OF HOSPITAL DESIGN

    Directory of Open Access Journals (Sweden)

    Dicle AYDIN

    2010-07-01

    Full Text Available Architectural design is defined as a decision-making process. Design studios play an important role in experiencing this process and provide prospective architects with design competence. Instructors of architecture aim to stretch the imagination of their students, develop creative thinking, and raise students' awareness of their own abilities. Furthermore, studio executives pay attention to the delimiting elements of a design in order to build students' problem-solving competence. Each experience in the education period prepares prospective architects for the social environment and the realities of the future. The aim of the study is to examine a studio practice in architectural education. The general hospital project was carried out with 40 students and 4 project executives within the Studio-7 courses of the 2007-2008 academic year spring semester. The steps followed in the studio process were analyzed for the design problem of a "hospital". Evaluations were performed on the solution of the functional-spatial organization, solutions for the activities of the users, conformity with standards and regulations, and prosperity-aesthetic notions in interior space. The prospective architects were generally successful in designing a hospital building with complex functions. This experience raised awareness about accessing information through thinking and about giving information a new position within each concept.

  13. Development, characterization and experimental validation of a cultivated sunflower (Helianthus annuus L.) gene expression oligonucleotide microarray.

    Directory of Open Access Journals (Sweden)

    Paula Fernandez

    Full Text Available Oligonucleotide-based microarrays with accurate gene coverage represent a key strategy for transcriptional studies in orphan species such as sunflower, H. annuus L., which lacks full genome sequences. The goal of this study was the development and functional annotation of a comprehensive sunflower unigene collection and the design and validation of a custom sunflower oligonucleotide-based microarray. A large-scale EST (>130,000 ESTs) curation, assembly and sequence annotation was performed using Blast2GO (www.blast2go.de). The EST assembly comprises 41,013 putative transcripts (12,924 contigs and 28,089 singletons). The resulting Sunflower Unigen Resource (SUR) version 1.0 was used to design an oligonucleotide-based Agilent microarray for cultivated sunflower. This microarray includes a total of 42,326 features: 1,417 Agilent controls, 74 control probes for sunflower replicated 10 times (740 controls) and 40,169 different non-control probes. Microarray performance was validated using a model experiment examining the induction of senescence by water deficit. Pre-processing and differential expression analysis of the Agilent microarrays was performed using the Bioconductor limma package. Analyses based on the p-values calculated by eBayes (p<0.01) allowed the detection of 558 differentially expressed genes between water stress and control conditions; of these, ten genes were further validated by qPCR. Over-represented ontologies were identified using FatiScan in the Babelomics suite. This work generated a curated and reliable sunflower unigene collection, and a custom, validated sunflower oligonucleotide-based microarray using Agilent technology. Both the curated unigene collection and the validated oligonucleotide microarray provide key resources for sunflower genome analysis, transcriptional studies, and molecular breeding for crop improvement.
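
    For reference, the limma analysis named in this record (probe-wise linear models, eBayes moderation, a raw p<0.01 cutoff) reduces to a short Bioconductor R script; the expression matrix and group labels below are simulated stand-ins for the normalized Agilent data, not the study's objects.

      library(limma)

      ## Simulated stand-in for the normalized log2 expression data (probes x arrays)
      exprs_mat <- matrix(rnorm(4000), nrow = 1000, ncol = 4)
      groups    <- factor(c("control", "control", "stress", "stress"))
      design    <- model.matrix(~ groups)        # intercept + water-stress effect

      fit <- eBayes(lmFit(exprs_mat, design))    # probe-wise models, moderated t-statistics
      de  <- topTable(fit, coef = 2, number = Inf,
                      adjust.method = "none", p.value = 0.01)  # raw p < 0.01, as reported
      nrow(de)                                   # count of differentially expressed probes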

  14. Fibre optic microarrays.

    Science.gov (United States)

    Walt, David R

    2010-01-01

    This tutorial review describes how fibre optic microarrays can be used to create a variety of sensing and measurement systems. This review covers the basics of optical fibres and arrays, the different microarray architectures, and describes a multitude of applications. Such arrays enable multiplexed sensing for a variety of analytes including nucleic acids, vapours, and biomolecules. Polymer-coated fibre arrays can be used for measuring microscopic chemical phenomena, such as corrosion and localized release of biochemicals from cells. In addition, these microarrays can serve as a substrate for fundamental studies of single molecules and single cells. The review covers topics of interest to chemists, biologists, materials scientists, and engineers.

  15. Conceptual design of Dipole Research Experiment (DREX)

    Science.gov (United States)

    Xiao, Qingmei; Wang, Zhibin; Wang, Xiaogang; Xiao, Chijie; Yang, Xiaoyi; Zheng, Jinxing

    2017-03-01

    A new terrella-like device for laboratory simulation of inner magnetosphere plasmas, the Dipole Research Experiment (DREX), is scheduled to be built at the Harbin Institute of Technology (HIT), China, as a major state scientific research facility for space physics studies. It is designed to provide a ground experimental platform that reproduces the inner magnetosphere, simulating the processes of trapping, acceleration, and transport of energetic charged particles confined in a dipole magnetic field configuration. The hydromagnetic scaling relation between the laboratory plasma of the device and the geomagnetospheric plasma is applied so that DREX plasmas resemble geospace processes. The facility is designed to be equipped with multiple plasma sources, several kinds of special-function coils, and advanced diagnostics to serve multiple functions. The motivation and design criteria for the DREX experiments, and the means applied to generate plasma of the desired parameters in the laboratory, are also described. Supported by National Natural Science Foundation of China (Nos. 11505040, 11261140326 and 11405038), China Postdoctoral Science Foundation (Nos. 2016M591518, 2015M570283) and Project Supported by Natural Scientific Research Innovation Foundation in Harbin Institute of Technology (No. 2017008).

  16. Learning from experience: feedback to CANDU design

    International Nuclear Information System (INIS)

    Allen, P.J.; Hopwood, J.M.; Rousseau, G.P.

    1998-01-01

    AECL's main product line is based on two single-unit CANDU nuclear power plant designs, CANDU 6 and CANDU 9, each of which is based on successfully operating CANDU plants. AECL's CANDU development program is based upon evolutionary improvement. The evolutionary design approach ensures the maximum degree of operational provenness. It also allows successful features of today's plants to be retained while incorporating improvements as they develop to the appropriate level of design maturity. A key component of this evolutionary development is a formal process of gathering and responding to feedback from: NPP operation, construction and commissioning; regulatory input; equipment supplier input; R and D results; market input. The processes for gathering and implementing this experience feedback, and a number of recent examples of design improvements arising from the feedback process, are described in the paper. (author)

  17. DESIGN OF EXPERIMENTS IN TRUCK COMPANY

    Directory of Open Access Journals (Sweden)

    Bibiana Kaselyova

    2015-07-01

    Full Text Available Purpose: Design of experiments (DOE) represents a very powerful tool for process improvement, strongly supported by the six sigma methodology. This approach is mostly used by large, manufacturing-oriented companies. The presented research focuses on the use of DOE in a truck company that is medium-sized and service-oriented. Such a study has several purposes. Firstly, the detailed description of an improvement effort based on DOE can be used as a methodological framework for companies similar to the one researched. Secondly, it provides an example of a successfully implemented low-cost design of experiments practice. Moreover, the performed experiment identifies the key factors that influence the lifetime of truck tyres. Design/methodology: The research in this paper is based on an experiment conducted in a Slovakian truck company. It provides a detailed case study of the whole improvement effort, covering problem formulation, design creation and analysis, and interpretation of the results. The company wants to improve the lifetime of its truck tyres. Next to fuel consumption, the consumption and replacement of tyres represents, according to the company, one of its most costly processes. The improvement effort was made through the use of the PDCA cycle. It starts with an analysis of the current state of tyre consumption. The variability of tyre consumption by year and tyre type was investigated. The causes of tyre replacement were then identified and a screening DOE was conducted. After the screening design, a full factorial design of experiments was used to identify the main drivers of tyre deterioration and breakdowns. Based on the results of the DOE, corrective actions were proposed and implemented. Findings: Based on the performed experiment, our research describes the process of tyre use and replacement. It defines the main reasons for tyre breakdown and identifies the main drivers that influence truck tyre lifetime. Moreover, it formulates corrective actions to prolong tyre lifetime. Originality: The study represents a full
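
    To make the screening-then-full-factorial sequence concrete, here is a minimal R sketch of a replicated two-level full factorial analysis; the factor names and simulated lifetimes are illustrative assumptions, not the company's data.

      set.seed(1)
      base <- expand.grid(pressure = c(-1, 1),   # coded low/high factor levels
                          route    = c(-1, 1),
                          load     = c(-1, 1))
      doe  <- base[rep(seq_len(nrow(base)), 2), ]          # two replicates of the 2^3 runs
      doe$lifetime <- with(doe, 60 + 8 * pressure - 5 * load + 3 * pressure * load) +
                      rnorm(nrow(doe), sd = 1)             # simulated tyre lifetime

      fit <- lm(lifetime ~ pressure * route * load, data = doe)
      anova(fit)    # flags the main effects and interactions driving lifetime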

  18. HANARO cooling features: design and experience

    International Nuclear Information System (INIS)

    Park, Cheol; Chae, Hee-Taek; Han, Gee-Yang; Jun, Byung-Jin; Ahn, Guk-Hoon

    1999-01-01

    In order to achieve safe core cooling during normal operation and upset conditions, HANARO adopted an upward forced-convection cooling system with a dual containment arrangement, instead of the forced downward flow system popularly used in the majority of forced-convection research reactors. This kind of upward flow system was selected by comparing the relative merits of upward and downward flow systems from various points of view, such as safety, performance, and maintenance. However, several operational matters which were not regarded as serious at the design stage came to light during operation. This paper presents the design and operational experience with the unique cooling features of HANARO. (author)

  19. Simulation of integrated beam experiment designs

    International Nuclear Information System (INIS)

    Grote, D.P.; Sharp, W.M.

    2004-01-01

    Simulations of Integrated Beam Experiment (IBX) class accelerator designs have been carried out. These simulations are an important tool for validating such designs. Issues such as envelope mismatch and emittance growth can be examined in a self-consistent manner, including the details of injection, accelerator transitions, long-term transport, and longitudinal compression. The simulations are three-dimensional and time-dependent, and begin at the source. They continue up through the end of the acceleration region, at which point the data are passed on to a separate simulation of the drift compression. Results are presented.

  20. DNA Microarray Technology

    Science.gov (United States)


  1. CMM Interim Check Design of Experiments (U)

    Energy Technology Data Exchange (ETDEWEB)

    Montano, Joshua Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-07-29

    Coordinate Measuring Machines (CMMs) are widely used in industry, throughout the Nuclear Weapons Complex and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length and include a weekly interim check to reduce risk. The CMM interim check makes use of Renishaw’s Machine Checking Gauge, an off-the-shelf product that simulates a large sphere within a CMM’s measurement volume and allows for error estimation. As verification of the interim check process, a design of experiments investigation was proposed to test two key factors (location and inspector). The results from the two-factor factorial experiment showed that location influenced results more than the inspector or the interaction between them.
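
    A minimal R sketch of such a two-factor factorial check, with simulated gauge errors; the factor levels and values are hypothetical, and with a single observation per cell the additive model pools the interaction into the error term.

      set.seed(2)
      cmm <- expand.grid(location  = factor(c("front", "centre", "rear")),
                         inspector = factor(c("A", "B", "C")))
      cmm$error_um <- c(0.5, 0.0, -0.5)[as.integer(cmm$location)] +
                      rnorm(nrow(cmm), sd = 0.1)           # simulated sphere-error readings

      anova(lm(error_um ~ location + inspector, data = cmm))  # additive model, 1 rep/cell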

  2. IsoGeneGUI : Multiple approaches for dose-response analysis of microarray data using R

    NARCIS (Netherlands)

    Otava, Martin; Sengupta, Rudradev; Shkedy, Ziv; Lin, Dan; Pramana, Setia; Verbeke, Tobias; Haldermans, Philippe; Hothorn, Ludwig A.; Gerhard, Daniel; Kuiper, Rebecca M.; Klinglmueller, Florian; Kasim, Adetayo

    2017-01-01

    The analysis of transcriptomic experiments with ordered covariates, such as dose-response data, has become a central topic in bioinformatics, in particular in omics studies. Consequently, multiple R packages on CRAN and Bioconductor are designed to analyse microarray data from various perspectives

  3. Participatory design of citizen science experiments

    OpenAIRE

    Senabre, Enric; Ferran Ferrer, Núria; Perelló, Josep, 1974-

    2018-01-01

    This article describes and analyzes the collaborative design of a citizen science research project through cocreation. Three groups of secondary school students and a team of scientists conceived three experiments on human behavior and social capital in urban and public spaces. The study goal is to address how interdisciplinary work and attention to social concerns and needs, as well as the collective construction of research questions, can be integrated into scientific research. The 95 stude...

  4. Design calculations for NIF convergent ablator experiments

    Directory of Open Access Journals (Sweden)

    Olson R.E.

    2013-11-01

    Full Text Available The NIF convergent ablation tuning effort is underway. In the early experiments, we have discovered that the design code simulations over-predict the capsule implosion velocity and shock flash ρr, but under-predict the hohlraum x-ray flux measurements. The apparent inconsistency between the x-ray flux and radiography data implies that there are important unexplained aspects of the hohlraum and/or capsule behavior.

  5. Evolvable designs of experiments applications for circuits

    CERN Document Server

    Iordache, Octavian

    2009-01-01

    Adopting a groundbreaking approach, the highly regarded author shows how to design methods for planning increasingly complex experiments. He begins with a brief introduction to standard quality methods and the technology in standard electric circuits. The book then gives numerous examples of how to apply the proposed methodology in a series of real-life case studies. Although these case studies are taken from the printed circuit board industry, the methods are equally applicable to other fields of engineering.

  6. Small sodium valve design and operating experience

    International Nuclear Information System (INIS)

    McGough, C.B.

    1974-01-01

    The United States Liquid Metal Fast Breeder Reactor program (LMFBR) includes an extensive program devoted to the development of small sodium valves. This program is now focused on the development and production of valves for the Fast Flux Test Facility (FFTF) now under construction near Richland, Washington. Other AEC support facilities, such as various test loops located at the Liquid Metal Engineering Center (LMEC), Los Angeles, California, and at the Hanford Engineering Development Laboratory (HEDL), Richland, Washington, also have significant requirements for small sodium valves, and valves similar in design to the FFTF valves are being supplied to these AEC laboratories for use in their critical test installations. A principal motivation for these valve programs, beyond the immediate need to provide high-reliability valves for FFTF and the support facilities, is the necessity to develop small valve technology for the Clinch River Breeder Reactor Plant (CRBRP). FFTF small sodium valve design and development experience will be directly applied to the CRBRP program. Various test programs have been, and are being, conducted to verify the performance and integrity of the FFTF valves, and to uncover any potential problems so that they can be corrected before the valves are placed in service in FFTF. The principal small sodium valve designs being utilized in current U.S. programs, the test and operational experience obtained to date on them, problems uncovered, and future development and testing efforts being planned are reviewed. The standards and requirements to which the valves are being designed and fabricated, the valve designs in current use, valve operators, test and operating experience, and future valve development plans are summarized. (U.S.)

  7. Design of a water electrolysis flight experiment

    Science.gov (United States)

    Lee, M. Gene; Grigger, David J.; Thompson, C. Dean; Cusick, Robert J.

    1993-01-01

    Supply of oxygen (O2) and hydrogen (H2) by electrolyzing water in space will play an important role in meeting the National Aeronautics and Space Administration's (NASA's) needs and goals for future space missions. Both O2 and H2 are envisioned to be used in a variety of processes including crew life support, spacecraft propulsion, extravehicular activity, electrical power generation/storage, as well as in scientific experiments and manufacturing processes. The Electrolysis Performance Improvement Concept Study (EPICS) flight experiment described herein is sponsored by NASA Headquarters as a part of the In-Space Technology Experiment Program (IN-STEP). The objective of the EPICS is to further contribute to the improvement of SFE technology, specifically by demonstrating and validating the SFE electromechanical process in microgravity as well as investigating performance improvements projected to be possible in a microgravity environment. This paper defines the experiment objective and presents the results of the preliminary design of the EPICS. The experiment will include testing three subscale self-contained SFE units: one containing baseline components, and two units having variations in key component materials. Tests will be conducted at varying current and thermal conditions.

  8. Burnout detector design for heat transfer experiments

    International Nuclear Information System (INIS)

    Dias, H.F.

    1992-01-01

    This paper describes the design of a burnout detector for heat transfer experiments, applied during tests for the optimization of fuel elements for PWR reactors. The burnout detector prevents the destruction of the fuel rods during experiments at the Centro de Desenvolvimento da Tecnologia Nuclear. The detector evaluates the temperature changes over the fuel rods in the area where the burnout phenomenon could be anticipated. As soon as the phenomenon appears, the system power supply is turned off. During the experiments, Thermal Circuit No. 1 was composed of nine fuel rods fed in parallel by the same power supply. Fine copper wires were attached at the centre and at the ends of each fuel rod to form two Wheatstone bridge arms. The detector was applied across the bridge diagonal, which must be balanced; a burnout excursion can then be detected as a small but fast increase of the signal over the detector. Large-scale experiments were carried out to compare the performance of the resistance bridge against a thermocouple attached through the fuel rod wall. These experiments showed the advantages of the former method over the latter, because the bridge evaluates the whole fuel rod, while the thermocouple evaluates only the area where it is attached. (author)
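
    For orientation, the textbook Wheatstone-bridge relation behind this detection scheme, with generic arm labels R_1..R_4 and supply voltage V_s assumed here rather than taken from the paper, is

      V_{\text{out}} = V_s \left( \frac{R_2}{R_1 + R_2} - \frac{R_4}{R_3 + R_4} \right), \qquad R_1 R_4 = R_2 R_3 \;\; \text{(balance)}

    so a small, fast resistance rise in one arm, as from local overheating at incipient burnout, appears directly as a small, fast rise in V_out.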

  9. The POLARBEAR Experiment: Design and Characterization

    Science.gov (United States)

    Kermish, Zigmund David

    We present the design and characterization of the POLARBEAR experiment. POLARBEAR is a millimeter-wave polarimeter that will measure the Cosmic Microwave Background (CMB) polarization. It was designed to have both the sensitivity and angular resolution to detect the expected B-mode polarization due to gravitational lensing at small angular scales while still enabling a search for the degree-scale B-mode polarization caused by inflationary gravitational waves. The instrument utilizes the Huan Tran Telescope (HTT), a 2.5-meter primary mirror telescope, coupled to a unique focal plane of 1,274 antenna-coupled transition-edge sensor (TES) detectors to achieve unprecedented sensitivity from angular scales of the experiment's 4 arcminute beam to several degrees. This dissertation focuses on the design, integration and characterization of the cryogenic receiver for the POLARBEAR instrument. The receiver cools the ˜20 cm focal plane to 0.25 Kelvin, with detector readout provided by a digital frequency-multiplexed SQUID system. The POLARBEAR receiver has been successfully deployed on the HTT for an engineering run in the Eastern Sierras of California and is currently deployed on Cerro Toco in the Atacama Desert of Chile. We present results from lab tests done to characterize the instrument, from the engineering run, and preliminary results from Chile.

  10. Advanced Reactor Fuels Irradiation Experiment Design Objectives

    International Nuclear Information System (INIS)

    Chichester, Heather Jean MacLean; Hayes, Steven Lowe; Dempsey, Douglas; Harp, Jason Michael

    2016-01-01

    This report summarizes the objectives of the current irradiation testing activities being undertaken by the Advanced Fuels Campaign relative to supporting the development and demonstration of innovative design features for metallic fuels in order to realize reliable performance to ultra-high burnups. The AFC-3 and AFC-4 test series are nearing completion; the experiments in this test series that have been completed or are in progress are reviewed and the objectives and test matrices for the final experiments in these two series are defined. The objectives, testing strategy, and test parameters associated with a future AFC test series, AFC-5, are documented. Finally, the future intersections and/or synergies of the AFC irradiation testing program with those of the TREAT transient testing program, emerging needs of proposed Versatile Test Reactor concepts, and the Joint Fuel Cycle Study program’s Integrated Recycle Test are discussed.

  11. Advanced Reactor Fuels Irradiation Experiment Design Objectives

    Energy Technology Data Exchange (ETDEWEB)

    Chichester, Heather Jean MacLean [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hayes, Steven Lowe [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dempsey, Douglas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Harp, Jason Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This report summarizes the objectives of the current irradiation testing activities being undertaken by the Advanced Fuels Campaign relative to supporting the development and demonstration of innovative design features for metallic fuels in order to realize reliable performance to ultra-high burnups. The AFC-3 and AFC-4 test series are nearing completion; the experiments in this test series that have been completed or are in progress are reviewed and the objectives and test matrices for the final experiments in these two series are defined. The objectives, testing strategy, and test parameters associated with a future AFC test series, AFC-5, are documented. Finally, the future intersections and/or synergies of the AFC irradiation testing program with those of the TREAT transient testing program, emerging needs of proposed Versatile Test Reactor concepts, and the Joint Fuel Cycle Study program’s Integrated Recycle Test are discussed.

  12. The design of macromolecular crystallography diffraction experiments

    International Nuclear Information System (INIS)

    Evans, Gwyndaf; Axford, Danny; Owen, Robin L.

    2011-01-01

    Thoughts about the decisions made in designing macromolecular X-ray crystallography experiments at synchrotron beamlines are presented. The measurement of X-ray diffraction data from macromolecular crystals for the purpose of structure determination is the convergence of two processes: the preparation of diffraction-quality crystal samples on the one hand and the construction and optimization of an X-ray beamline and end station on the other. Like sample preparation, a macromolecular crystallography beamline is geared to obtaining the best possible diffraction measurements from crystals provided by the synchrotron user. This paper describes the thoughts behind an experiment that fully exploits both the sample and the beamline and how these map into everyday decisions that users can and should make when visiting a beamline with their most precious crystals

  13. Screening for copy-number alterations and loss of heterozygosity in chronic lymphocytic leukemia--a comparative study of four differently designed, high resolution microarray platforms

    DEFF Research Database (Denmark)

    Gunnarsson, R.; Staaf, J.; Jansson, M.

    2008-01-01

    Screening for gene copy-number alterations (CNAs) has improved by applying genome-wide microarrays, where SNP arrays also allow analysis of loss of heterozygosity (LOH). We here analyzed 10 chronic lymphocytic leukemia (CLL) samples using four different high-resolution platforms: BAC arrays (32K)...

  14. Vision Guided Intelligent Robot Design And Experiments

    Science.gov (United States)

    Slutzky, G. D.; Hall, E. L.

    1988-02-01

    The concept of an intelligent robot is an important topic combining sensors, manipulators, and artificial intelligence to design a useful machine. Vision systems, tactile sensors, proximity switches and other sensors provide the elements necessary for simple game playing as well as industrial applications. These sensors permit adaptation to a changing environment. The AI techniques permit advanced forms of decision making, adaptive responses, and learning, while the manipulator provides the ability to perform various tasks. Computer languages such as LISP and OPS5 have been utilized to achieve expert systems approaches in solving real world problems. The purpose of this paper is to describe several examples of visually guided intelligent robots, including both stationary and mobile robots. Demonstrations will be presented of a system for constructing and solving a popular peg game, a robot lawn mower, and a box stacking robot. The experience gained from these and other systems provides insight into what may be realistically expected from the next generation of intelligent machines.

  15. Plasma focus system: Design, construction and experiments

    International Nuclear Information System (INIS)

    Alacakir, A.; Akguen, Y.; Boeluekdemir, A. S.

    2007-01-01

    The aim of this work is to construct a compact experimental system for fusion research. The design, construction and experiments of a 3 kJ Mather-type plasma focus machine are described. This machine was established for neutron yield and fast neutron radiography via the D-D reaction, D + D → ³He (0.82 MeV) + n (2.45 MeV). The geometry of the plasma focus machine, together with the high-voltage drive and the vacuum system setup, is presented. A yield of 10⁸ neutrons per pulse and a peak current of 200 kA were obtained over many shots. A scintillator screen for fast neutron imaging, sensitive to 2.45 MeV neutrons, was also manufactured in our labs. Structural neutron shielding computations for safety have also been completed.

  16. Design of Experiments for Food Engineering

    DEFF Research Database (Denmark)

    Pedersen, Søren Juhl; Geoffrey Vining, G.

    This work looks at the application of Design of Experiments (DoE) to Food Engineering (FE) problems in relation to quality. The field of Quality Engineering (QE) is a natural partnering field for FE due to the extensive developments that QE has made in using DoE for quality improvement, especially...... in manufacturing industries. In the thesis the concepts concerning food quality are addressed, and in addition how QE proposes to define quality. There is seen a merger in how QE's definition of quality has been translated for food. At the same time within FE a divergence has been proposed in the literature...... that the fundamental principles of DoE have as much importance and relevance as ever for both the food industry and FE research....

  17. Designing solar thermal experiments based on simulation

    International Nuclear Information System (INIS)

    Huleihil, Mahmoud; Mazor, Gedalya

    2013-01-01

    In this study, three different models describing the temperature distribution inside a cylindrical solid body subjected to high solar irradiation were examined: beginning with the simplest approach, a lumped single-dimension (time) system; progressing through the two-dimensional distributed system approach (time and vertical direction); and ending with the three-dimensional distributed system approach with azimuthal symmetry (time, vertical direction, and radial direction). The three models were introduced and solved analytically and numerically, and the importance of the models and their solutions was addressed. Simulations based on these models might be considered a powerful tool in designing experiments, as they make it possible to estimate the different effects of the parameters involved.
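
    As a sketch of the simplest of the three models, the standard lumped-capacitance energy balance (assumed textbook form, not quoted from the paper; m mass, c specific heat, \alpha absorptivity, q_s incident solar flux on receiving area A_s, h convection coefficient over surface area A_c) reads

      m c \frac{dT}{dt} = \alpha q_s A_s - h A_c (T - T_\infty)

    whose solution relaxes exponentially toward the steady temperature T_\infty + \alpha q_s A_s / (h A_c); the two- and three-dimensional models refine this by resolving the vertical and radial temperature profiles.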

  18. Everyday life; Lived Experiences and Designed Learning

    DEFF Research Database (Denmark)

    Vestbo, Michelle; Helms, Niels Henrik; Dræbel, Tania Aase

    Everyday life; Lived Experiences and Designed Learning: Students' knowledge cultures and epistemic trajectories in a range of professional bachelor educations. Helms, N.H., Vestbo, M., Steenfeldt, V.O., Dræbel, T.A., Hansen, T.A.E., Storm, H., and Schmidt, L.S.K. (University College Zealand)...... In this panel the use of different methodological approaches to answer questions about students' knowledge cultures and epistemic trajectories is discussed. The context is qualitative empirical educational studies in a range of professional bachelor educations: Nursing, Social Education and Nutrition and Health...... Inspired by a sociological phenomenological approach, the study uses participant observations, interviews and a workshop to explore the life-worlds of daily living of students who train to become professionals of social education or nutrition and health education. The study...... of participating in study life.

  19. Modern control room design experience and speculation

    International Nuclear Information System (INIS)

    Smith, J.E.

    1993-01-01

    Can operators trained to use conventional control panels readily adapt to CRT-based control rooms? Does automation make the design of good man-machine interfaces more or less difficult? In a conventional, hard-wired control room, is the operator's peripheral vision always an asset, and how can one do better in a CRT-based control room? Are expert-system-assisted man-machine interfaces a boon or a bust? This paper explores these questions in the light of actual experience with advanced power plant control environments. It discusses how automation has in fact simplified the problem of ensuring that the operator has at all times a clear understanding of the plant state. The author contends that conventional hard-wired control rooms are very poor at providing the operator with a good overview of the plant status, particularly under startup or upset conditions, and that CRT-based control rooms offer an opportunity for improvement. Experience with some early attempts at this is discussed, together with some interesting proposals from other authors. Finally, the paper discusses the experience to date with expert-system-assisted man-machine interfaces. Although promising for the future, progress has been slow. The amount of knowledge research required is often formidable and consequently costly. Often, when an adequate knowledge base is finally acquired, it turns out to be better to use it to increase the level of automation and thus simplify the operator's task. The risks are not any greater, and automation offers more consistent operation. It is also important to distinguish carefully between expert-system-assisted display selection and expert-system operator guidance. The first is intended to help the operator in his quest for information; the second attempts to guide the operator's actions. The good and bad points of each of these approaches are discussed.

  20. Interim Service ISDN Satellite (ISIS) hardware experiment design for advanced ISDN satellite design and experiments

    Science.gov (United States)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Design for Advanced Satellite Designs describes the design of the ISDN Satellite Terminal Adapter (ISTA) capable of translating ISDN protocol traffic into time division multiple access (TDMA) signals for use by a communications satellite. The ISTA connects the Type 1 Network Termination (NT1) via the U-interface on the line termination side of the CPE to the V.35 interface for satellite uplink. The same ISTA converts in the opposite direction the V.35 to U-interface data with a simple switch setting.

  1. DNA Microarray Technologies: A Novel Approach to Geonomic Research

    Energy Technology Data Exchange (ETDEWEB)

    Hinman, R.; Thrall, B.; Wong, K.

    2002-01-01

    A cDNA microarray allows biologists to examine the expression of thousands of genes simultaneously. Researchers may analyze the complete transcriptional program of an organism in response to specific physiological or developmental conditions. By design, a cDNA microarray is an experiment with many variables and few controls. One question that inevitably arises when working with a cDNA microarray is data reproducibility: how easy is it to confirm mRNA expression patterns? In this paper, a case study involving the treatment of a murine macrophage RAW 264.7 cell line with tumor necrosis factor alpha (TNF) was used to obtain a rough estimate of data reproducibility. Two trials were examined and a list of genes displaying either a >2-fold or >4-fold increase in gene expression was compiled. Variations in signal mean ratios between the two slides were observed. We can assume that errors in reproducibility may be compensated for by stronger induction levels of similar genes. Steps taken to obtain results included serum starvation of cells before treatment, tests of mRNA quality/consistency, and data normalization.
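
    A minimal R sketch of the two-slide fold-change screen described above, with simulated ratios standing in for the real data; requiring induction on both slides is precisely the reproducibility check at issue.

      set.seed(4)
      ratio <- matrix(2^rnorm(2000, sd = 0.8), nrow = 1000, ncol = 2,
                      dimnames = list(paste0("gene", 1:1000), c("trial1", "trial2")))

      up2 <- rowSums(ratio > 2) == 2        # > 2-fold induced on both slides
      up4 <- rowSums(ratio > 4) == 2        # > 4-fold induced on both slides
      c(twofold = sum(up2), fourfold = sum(up4))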

  2. Finite Element Analysis and Design of Experiments in Engineering Design

    OpenAIRE

    Eriksson, Martin

    1999-01-01

    Projects with the objective of introducing Finite Element Analysis (FEA) into the early phases of the design process have previously been carried out at the Department of Machine Design, see e.g. the Doctoral thesis by Burman [13]. These works clearly highlight the usefulness of introducing design analysis early in the design process. According to Bjärnemo and Burman [10] the most significant advantage of applying design analysis early in the design process was the shift from verification to ...

  3. Designing Technology for Active Spectator Experiences at Sporting Events

    DEFF Research Database (Denmark)

    Veerasawmy, Rune; Ludvigsen, Martin

    2010-01-01

    This paper explores the active spectator experience at sporting events, by presenting and reflecting upon a design experiment carried out at a number of football events. The initial hypothesis of the design process, leading to the design experiment, has been that the spectator experience...... is not merely an experience of receiving and consuming entertainment. It is also heavily reliant on the active participation of the spectator in creating the atmosphere of the entire event. The BannerBattle experiment provides interactive technology in sport arenas with a form of interaction based on existing...

  4. Design of a new therapy for patients with chronic kidney disease: use of microarrays for selective hemoadsorption of uremic wastes and toxins to improve homeostasis

    Directory of Open Access Journals (Sweden)

    Shahidi Bonjar MR

    2015-01-01

    Full Text Available Mohammad Rashid Shahidi Bonjar,1 Leyla Shahidi Bonjar2 1School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; 2Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran Abstract: The hypothesis proposed here would provide near to optimum homeostasis for patients with chronic kidney disease (CKD) without the need for hemodialysis. This strategy has not been described previously in the scientific literature. It involves a targeted therapy that may prevent progression of the disease and help to improve the well-being of CKD patients. It proposes a nanotechnological device, ie, a microarray-oriented homeostasis provider (MOHP), to improve homeostasis in CKD patients. MOHP would be an auxiliary kidney aid, and would improve the filtration functions that impaired kidneys cannot perform on their own. MOHP is composed of two main computer-oriented components, ie, a quantitative microarray detector (QMD) and a homeostasis-oriented microarray column (HOMC). QMD detects, and HOMC selectively removes, defined quantities of uremic wastes, toxins and any other metabolites it is programmed for. The QMD and HOMC would accomplish this with the help of a peristaltic blood pump that would circulate blood aseptically in an extracorporeal closed circuit. During the passage of blood through the QMD, this microarray detector would quantitatively monitor all of the blood compounds that accumulate in the blood of a patient with impaired glomerular filtration, including small-sized, middle-sized and large-sized molecules. The electronic information collected by QMD would be transmitted electronically to the HOMC, which would adjust the molecules to the concentrations it is electronically programmed for and/or receives from QMD. This process of monitoring and removal of waste continues until the programmed homeostasis criteria are reached. Like a conventional kidney machine, MOHP can be used in hospitals and

  5. Integrative missing value estimation for microarray data.

    Science.gov (United States)

    Hu, Jianjun; Li, Haifeng; Waterman, Michael S; Zhou, Xianghong Jasmine

    2006-10-12

    Missing value estimation is an important preprocessing step in microarray analysis. Although several methods have been developed to solve this problem, their performance is unsatisfactory for datasets with high rates of missing data, high measurement noise, or limited numbers of samples. In fact, more than 80% of the time-series datasets in the Stanford Microarray Database contain less than eight samples. We present the integrative Missing Value Estimation method (iMISS) by incorporating information from multiple reference microarray datasets to improve missing value estimation. For each gene with missing data, we derive a consistent neighbor-gene list by taking reference data sets into consideration. To determine whether the given reference data sets are sufficiently informative for integration, we use a submatrix imputation approach. Our experiments showed that iMISS can significantly and consistently improve the accuracy of the state-of-the-art Local Least Square (LLS) imputation algorithm by up to 15% in our benchmark tests. We demonstrated that the order-statistics-based integrative imputation algorithms can achieve significant improvements over state-of-the-art missing value estimation approaches such as LLS, and are especially good for imputing microarray datasets with a limited number of samples, high rates of missing data, or very noisy measurements. With the rapid accumulation of microarray datasets, the performance of our approach can be further improved by incorporating larger and more appropriate reference datasets.
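
    iMISS itself integrates multiple reference datasets and, to our knowledge, is not packaged on CRAN or Bioconductor; the conventional single-dataset, neighbour-based imputation that such methods are benchmarked against can be run with the Bioconductor impute package, sketched here on simulated data.

      library(impute)                     # Bioconductor KNN imputation

      set.seed(5)
      x <- matrix(rnorm(500), nrow = 100, ncol = 5)   # genes x samples
      x[sample(length(x), 50)] <- NA                  # ~10% missing entries

      filled <- impute.knn(x, k = 10)$data            # average over the k nearest neighbour genes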

  6. Integrative missing value estimation for microarray data

    Directory of Open Access Journals (Sweden)

    Zhou Xianghong

    2006-10-01

    Full Text Available Abstract Background Missing value estimation is an important preprocessing step in microarray analysis. Although several methods have been developed to solve this problem, their performance is unsatisfactory for datasets with high rates of missing data, high measurement noise, or limited numbers of samples. In fact, more than 80% of the time-series datasets in the Stanford Microarray Database contain less than eight samples. Results We present the integrative Missing Value Estimation method (iMISS) by incorporating information from multiple reference microarray datasets to improve missing value estimation. For each gene with missing data, we derive a consistent neighbor-gene list by taking reference data sets into consideration. To determine whether the given reference data sets are sufficiently informative for integration, we use a submatrix imputation approach. Our experiments showed that iMISS can significantly and consistently improve the accuracy of the state-of-the-art Local Least Square (LLS) imputation algorithm by up to 15% in our benchmark tests. Conclusion We demonstrated that the order-statistics-based integrative imputation algorithms can achieve significant improvements over state-of-the-art missing value estimation approaches such as LLS, and are especially good for imputing microarray datasets with a limited number of samples, high rates of missing data, or very noisy measurements. With the rapid accumulation of microarray datasets, the performance of our approach can be further improved by incorporating larger and more appropriate reference datasets.

  7. 3D Biomaterial Microarrays for Regenerative Medicine

    DEFF Research Database (Denmark)

    Gaharwar, Akhilesh K.; Arpanaei, Ayyoob; Andresen, Thomas Lars

    2015-01-01

    Three-dimensional (3D) biomaterial microarrays hold enormous promise for regenerative medicine because of their ability to accelerate the design and fabrication of biomimetic materials. Such tissue-like biomaterials can provide an appropriate microenvironment for stimulating and controlling stem cell differentiation into tissue-specific lineages. The use of 3D biomaterial microarrays can, if optimized correctly, result in a more than 1000-fold reduction in biomaterials and cells consumption when engineering optimal materials combinations, which makes these miniaturized systems very attractive for tissue engineering and drug screening applications.

  8. An experiment in multidisciplinary digital design

    NARCIS (Netherlands)

    Tuncer, B.; De Ruiter, P.; Mulders, S.

    2008-01-01

    The design and realization of complex buildings requires multidisciplinary design collaboration from early on in the design process. The intensive use of digital design environments in this process demands new knowledge and skills from the involved players including integrating and managing digital

  9. "Harshlighting" small blemishes on microarrays

    Directory of Open Access Journals (Sweden)

    Wittkowski Knut M

    2005-03-01

    Full Text Available Abstract Background Microscopists are familiar with many blemishes that fluorescence images can have due to dust and debris, glass flaws, uneven distribution of fluids or surface coatings, etc. Microarray scans show similar artefacts, which affect the analysis, particularly when one tries to detect subtle changes. However, most blemishes are hard to find by the unaided eye, particularly in high-density oligonucleotide arrays (HDONAs. Results We present a method that harnesses the statistical power provided by having several HDONAs available, which are obtained under similar conditions except for the experimental factor. This method "harshlights" blemishes and renders them evident. We find empirically that about 25% of our chips are blemished, and we analyze the impact of masking them on screening for differentially expressed genes. Conclusion Experiments attempting to assess subtle expression changes should be carefully screened for blemishes on the chips. The proposed method provides investigators with a novel robust approach to improve the sensitivity of microarray analyses. By utilizing topological information to identify and mask blemishes prior to model based analyses, the method prevents artefacts from confounding the process of background correction, normalization, and summarization.
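
    A toy R illustration of the underlying idea: with several chips hybridized under similar conditions, pixels deviating strongly from the cross-chip median image are flagged and masked. This is greatly simplified; the published method is implemented in the Bioconductor Harshlight package, which uses more refined statistics.

      set.seed(6)
      chips <- array(rnorm(20 * 20 * 5, mean = 8), dim = c(20, 20, 5))  # 5 replicate chips
      chips[5:8, 5:8, 3] <- chips[5:8, 5:8, 3] + 4     # simulated blemish on chip 3

      med   <- apply(chips, c(1, 2), median)           # pixel-wise median image
      resid <- sweep(chips, c(1, 2), med)              # per-chip deviation from the median
      mask  <- abs(resid) > 4 * mad(resid)             # crude outlier threshold
      chips[mask] <- NA                                # mask blemished pixels before analysis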

  10. Jupiter energetic particle experiment ESAD proton sensor design

    International Nuclear Information System (INIS)

    Gruhn, C.R.; Higbie, P.R.

    1977-12-01

    A proton sensor design for the Jupiter Energetic Particle Experiment is described. The sensor design uses avalanche multiplication in order to lower the effective energy threshold. A complete signal-to-noise analysis is given for this design

  11. Architectural design of experience based factory model for software ...

    African Journals Online (AJOL)

    architectural design. Automation features are incorporated in the design in which workflow system and intelligent agents are integrated, and the facilitation of cloud environment is empowered to further support the automation. Keywords: architectural design; knowledge management; experience factory; workflow;

  12. Extending Immunological Profiling in the Gilthead Sea Bream, Sparus aurata, by Enriched cDNA Library Analysis, Microarray Design and Initial Studies upon the Inflammatory Response to PAMPs

    Directory of Open Access Journals (Sweden)

    Sebastian Boltaña

    2017-02-01

    Full Text Available This study describes the development and validation of an enriched oligonucleotide-microarray platform for Sparus aurata (SAQ) to provide a platform for transcriptomic studies in this species. A transcriptome database was constructed by assembly of gilthead sea bream sequences derived from public repositories of mRNA together with reads from a large collection of expressed sequence tags (EST) from two extensive targeted cDNA libraries characterizing mRNA transcripts regulated by both bacterial and viral challenge. The developed microarray was further validated by analysing monocyte/macrophage activation profiles after challenge with two Gram-negative bacterial pathogen-associated molecular patterns (PAMPs): lipopolysaccharide (LPS) and peptidoglycan (PGN). Of the approximately 10,000 EST sequenced, we obtained a total of 6837 EST longer than 100 nt, with 3778 and 3059 EST obtained from the bacterial-primed and from the viral-primed cDNA libraries, respectively. Functional classification of contigs from the bacterial- and viral-primed cDNA libraries by Gene Ontology (GO) showed that the top five represented categories were equally represented in the two libraries: metabolism (approximately 24% of the total number of contigs), carrier proteins/membrane transport (approximately 15%), effectors/modulators and cell communication (approximately 11%), nucleoside, nucleotide and nucleic acid metabolism (approximately 7.5%) and intracellular transducers/signal transduction (approximately 5%). Transcriptome analyses using this enriched oligonucleotide platform identified differential shifts in the response to PGN and LPS in macrophage-like cells, highlighting responsive gene-cassettes tightly related to PAMP host recognition. As observed in other fish species, PGN is a powerful activator of the inflammatory response in S. aurata macrophage-like cells. We have developed and validated an oligonucleotide microarray (SAQ) that provides a platform enriched for the study

  13. Extending Immunological Profiling in the Gilthead Sea Bream, Sparus aurata, by Enriched cDNA Library Analysis, Microarray Design and Initial Studies upon the Inflammatory Response to PAMPs.

    Science.gov (United States)

    Boltaña, Sebastian; Castellana, Barbara; Goetz, Giles; Tort, Lluis; Teles, Mariana; Mulero, Victor; Novoa, Beatriz; Figueras, Antonio; Goetz, Frederick W; Gallardo-Escarate, Cristian; Planas, Josep V; Mackenzie, Simon

    2017-02-03

    This study describes the development and validation of an enriched oligonucleotide-microarray platform for Sparus aurata (SAQ) to provide a platform for transcriptomic studies in this species. A transcriptome database was constructed by assembly of gilthead sea bream sequences derived from public repositories of mRNA together with reads from a large collection of expressed sequence tags (EST) from two extensive targeted cDNA libraries characterizing mRNA transcripts regulated by both bacterial and viral challenge. The developed microarray was further validated by analysing monocyte/macrophage activation profiles after challenge with two Gram-negative bacterial pathogen-associated molecular patterns (PAMPs; lipopolysaccharide (LPS) and peptidoglycan (PGN)). Of the approximately 10,000 EST sequenced, we obtained a total of 6837 EST longer than 100 nt, with 3778 and 3059 EST obtained from the bacterial-primed and from the viral-primed cDNA libraries, respectively. Functional classification of contigs from the bacterial- and viral-primed cDNA libraries by Gene Ontology (GO) showed that the top five represented categories were equally represented in the two libraries: metabolism (approximately 24% of the total number of contigs), carrier proteins/membrane transport (approximately 15%), effectors/modulators and cell communication (approximately 11%), nucleoside, nucleotide and nucleic acid metabolism (approximately 7.5%) and intracellular transducers/signal transduction (approximately 5%). Transcriptome analyses using this enriched oligonucleotide platform identified differential shifts in the response to PGN and LPS in macrophage-like cells, highlighting responsive gene-cassettes tightly related to PAMP host recognition. As observed in other fish species, PGN is a powerful activator of the inflammatory response in S. aurata macrophage-like cells. We have developed and validated an oligonucleotide microarray (SAQ) that provides a platform enriched for the study of gene

  14. Designing interactive technology for crowd experiences - beyond sanitization

    DEFF Research Database (Denmark)

    Veerasawmy, Rune

    2014-01-01

    This dissertation concerns the topic of designing interactive technology for crowd experiences. It takes its outset in the experience-oriented design approach within interaction design, exploring the research question: how can we conceptually understand and design interactive technology for crowd...... experiences? Through theoretical studies of sociological crowd theory and pragmatist perspectives on experience, combined with design experiments at sporting events, this dissertation establishes a conceptual understanding of crowd experience. The outcome of this work is furthermore synthesized...... in a conceptual model of social experiences that presents crowd experiences as a distinct type of social experience. This is different from what has previously been explored within experience-oriented design. This dissertation is composed of four research papers framed by an overview that summarizes...

  15. Design of a new therapy for patients with chronic kidney disease: use of microarrays for selective hemoadsorption of uremic wastes and toxins to improve homeostasis.

    Science.gov (United States)

    Shahidi Bonjar, Mohammad Rashid; Shahidi Bonjar, Leyla

    2015-01-01

    The hypothesis proposed here would provide near to optimum homeostasis for patients with chronic kidney disease (CKD) without the need for hemodialysis. This strategy has not been described previously in the scientific literature. It involves a targeted therapy that may prevent progression of the disease and help to improve the well-being of CKD patients. It proposes a nanotechnological device, ie, a microarray-oriented homeostasis provider (MOHP), to improve homeostasis in CKD patients. MOHP would be an auxiliary kidney aid, and would improve the filtration functions that impaired kidneys cannot perform on their own. MOHP is composed of two main computer-oriented components, ie, a quantitative microarray detector (QMD) and a homeostasis-oriented microarray column (HOMC). The QMD detects, and the HOMC selectively removes, defined quantities of uremic wastes, toxins and any other metabolites for which it is programmed. The QMD and HOMC would accomplish this with the help of a peristaltic blood pump that would circulate blood aseptically in an extracorporeal closed circuit. During the passage of blood through the QMD, this microarray detector would quantitatively monitor all of the blood compounds that accumulate in the blood of a patient with impaired glomerular filtration, including small-sized, middle-sized and large-sized molecules. The information collected by the QMD would be electronically transmitted to the HOMC, which would adjust the molecules to the concentrations for which it is electronically programmed and/or which it receives from the QMD. This process of monitoring and removal of waste continues until the programmed homeostasis criteria are reached. Like a conventional kidney machine, MOHP could be used in hospitals and homes under the supervision of a trained technician. The main advantages of this treatment would include improved homeostasis, a reduced likelihood of side effects and of the morbidity resulting from CKD, slower progression of kidney impairment, prevention of

  16. Design optimization of condenser microphone: a design of experiment perspective.

    Science.gov (United States)

    Tan, Chee Wee; Miao, Jianmin

    2009-06-01

    A well-designed condenser microphone backplate is very important in the attainment of good frequency response characteristics--high sensitivity and wide bandwidth with flat response--and low mechanical-thermal noise. To study the design optimization of the backplate, a 2^6 factorial design with a single replicate, which consists of six backplate parameters and four responses, has been undertaken on a comprehensive condenser microphone model developed by Zuckerwar. Through the elimination of insignificant parameters via normal probability plots of the effect estimates, the projection of an unreplicated factorial design into a replicated one can be performed to carry out an analysis of variance on the factorial design. The air gap and slot have significant effects on the sensitivity, mechanical-thermal noise, and bandwidth, while the slot/hole location interaction has a major influence over the latter two responses. An organized and systematic approach to designing the backplate is summarized.
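    The screening step described above can be illustrated with a short computation. The following Python sketch (illustrative only; the response model and all numbers are invented, not taken from the study) estimates main effects from an unreplicated 2^6 factorial design and ranks them for a normal probability plot, where inactive effects fall on a straight line and active ones stand out:

        # Effect screening for an unreplicated 2^6 factorial design (illustrative sketch).
        import itertools
        import numpy as np

        rng = np.random.default_rng(0)

        # Full 2^6 design: 64 runs, six factors coded -1/+1.
        X = np.array(list(itertools.product([-1, 1], repeat=6)), dtype=float)

        # Hypothetical response: factors F1 and F2 active, plus an F1:F2 interaction.
        y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 1.5 * X[:, 0] * X[:, 1] + rng.normal(0, 1, len(X))

        # Main-effect estimate for each factor: mean(y at +1) - mean(y at -1).
        effects = {f"F{i + 1}": y[X[:, i] == 1].mean() - y[X[:, i] == -1].mean()
                   for i in range(6)}

        # Ranked effects; plotting these against normal quantiles gives the normal
        # probability plot used to drop insignificant factors before projecting the
        # design into a replicated one for ANOVA.
        for name, effect in sorted(effects.items(), key=lambda kv: abs(kv[1])):
            print(f"{name}: {effect:+.2f}")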

  17. Safety Research Experiment Facility project. Conceptual design report. Volume IX. Experiment handling

    International Nuclear Information System (INIS)

    1975-01-01

    Information on the SAREF Reactor experiment handling system is presented concerning functions and design requirements, design description, operation, casualty events and recovery procedures, and maintenance

  18. An Undergraduate Experiment in Alarm System Design.

    Science.gov (United States)

    Martini, R. A.; And Others

    1988-01-01

    Describes an experiment involving data acquisition by a computer, digital signal transmission from the computer to a digital logic circuit and signal interpretation by this circuit. The system is being used at the Illinois Institute of Technology. Discusses the fundamental concepts involved. Demonstrates the alarm experiment as it is used in…

  19. Design Experiments and the Generation of Theory

    DEFF Research Database (Denmark)

    Nortvig, Anne-Mette

    A design-based research (DBR) methodology and a participatory design approach were used in a case in physiotherapy education from the beginning of the implementation of e-learning, in order to create learning designs for the new teaching context and generate theory in relation to this. ICT-based learning designs were created...... in collaboration with teachers and students, and the empirical data stem from this work. The research question discussed in this paper concerns the challenges that can arise in a DBR project when the interventions and design cycles do not evolve as planned. The paper therefore argues that it can......

  20. LOFT fuel design and operating experience

    International Nuclear Information System (INIS)

    Russell, M.L.

    1979-01-01

    The objective of the LOFT fuel design and fabrication effort was to provide a pressurized water reactor core that has (1) materials and geometric features to ensure that heat transfer, hydraulic, mechanical, chemical, metallurgical and nuclear behaviors are typical of large pressurized water reactors (PWR) during the loss-of-coolant accident (LOCA) sequence and (2) test instrumentation for measurement of core conditions. The LOFT core is unique because it is designed for exposure to several LOCAs without loss of function. This paper summarizes the design effort and extent to which the design objectives have been achieved

  1. Design aspects of low activation fusion ignition experiments

    International Nuclear Information System (INIS)

    Cheng, E.T.; Creedon, R.L.; Hopkins, G.R.; Trester, P.W.; Wong, C.P.C.; Schultz, K.R.

    1986-01-01

    Preliminary design studies have been done exploring (1) materials selection, (2) shutdown biological dose rates, (3) mechanical design and (4) thermal design of a fusion ignition experiment made of low activation materials. From the results of these preliminary design studies it appears that an ignition experiment could be built of low activation materials, and that this design would allow hands-on access for maintenance

  2. An Architectural Experience for Interface Design

    Science.gov (United States)

    Gong, Susan P.

    2016-01-01

    The problem of human-computer interface design was brought to the foreground with the emergence of the personal computer, the increasing complexity of electronic systems, and the need to accommodate the human operator in these systems. With each new technological generation discovering the interface design problems of its own technologies, initial…

  3. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  4. Experiences in Preserving Design Bases & Knowledge Management

    International Nuclear Information System (INIS)

    Koshy, Thomas

    2013-01-01

    Purpose: • To build a repository of information on design and development in a retrievable manner in order to: • Establish the licensing/design bases of the plant; • Identify known vulnerabilities and how they are to be addressed; • Prevent undoing the lessons learned; • Facilitate advancement without repeating the undesirable incidents of the past

  5. Storytelling tools in support of user experience design

    NARCIS (Netherlands)

    Peng, Qiong

    2017-01-01

    Storytelling has been proposed as an intuitive way to support communication in user experience design. With story-based thinking, designers can gain a better understanding of the potential user experience, developing and discussing design ideas within an (imagined) context. This proposal introduces

  6. Experiment design for identification of structured linear systems

    NARCIS (Netherlands)

    Potters, M.G.

    2016-01-01

    Experiment Design for system identification involves the design of an optimal input signal with the purpose of accurately estimating unknown parameters in a system. Specifically, in the Least-Costly Experiment Design (LCED) framework, the optimal input signal results from an optimisation problem in

  7. Design experience on seismically isolated buildings

    International Nuclear Information System (INIS)

    Giuliani, G.C.

    1991-01-01

    This paper describes the practical problems associated with the structural design of seismically isolated buildings now under construction in Ancona, Italy. These structures are the first seismically isolated buildings in Italy. The Ancona region is in zone 2 of the Italian Seismic Code. It has a design acceleration of 0.07 g which corresponds to a ground surface acceleration of 0.25 g. The last significant earthquake was recorded on June 14, 1972, having a single shock-type wave with a peak acceleration of 0.53 g. Taking into account the aforesaid earthquake, the structural design of these new buildings was performed according to an acceleration spectrum which was different from the zone 2 seismic code and which provided protection for stronger ground motions. To minimize the cost of the structure, the buildings used ribbed plate decks, thus reducing the amount of material and the mass of the structures to be isolated. The design requirements, dynamic analysis performed, structural design, and practical engineering employed are reported in this paper. A comparison between the costs of a conventionally designed and a base-isolated structure is also reported. It shows a net savings of 7% for the base-isolated structure. The tests undertaken for certifying the mechanical properties of the isolators for both static and dynamic loads are also described, as is the full-scale dynamic test which is scheduled for next year (1990) for one of the completed buildings. (orig.)

  8. Design experience on seismically isolated buildings

    International Nuclear Information System (INIS)

    Giuliani, G.C.

    1989-01-01

    This paper describes the practical problems associated with the structural design of a group of seismically isolated buildings now under construction in Ancona, Italy. These structures are the first seismically isolated buildings in Italy. Taking into account previous earthquakes, the structural design of these new buildings was performed according to an acceleration spectrum which was different from its Zone 2 seismic code and which provided protection for stronger ground motions. To minimize the cost of the structure, the buildings used ribbed plate decks, thus reducing the amount of material and the mass of the structures to be isolated. The design requirements, dynamic analysis performed, structural design, and practical engineering employed are reported in this paper. A comparison between the costs of a conventionally designed and a base-isolated structure is also reported. The tests undertaken for certifying the mechanical properties of the isolators for both static and dynamic loads are also described, as is the full-scale dynamic test which is scheduled for next year (1990) for one of the completed buildings. Lessons learned in this design effort are potentially applicable to seismic base isolation for nuclear power plants

  9. Use of safety experience feedback to design new nuclear units

    International Nuclear Information System (INIS)

    Lange, D.; Crochon, J.P.

    1985-06-01

    For the designer, safety-related experience feedback can take place in three fields: operating experience feedback (incident analysis), 'study' experience feedback (improvement of justification and evolution of safety considerations), and fabrication experience feedback. Some examples are presented for each field [fr

  10. Sensitivity and fidelity of DNA microarray improved with integration of Amplified Differential Gene Expression (ADGE

    Directory of Open Access Journals (Sweden)

    Ile Kristina E

    2003-07-01

    Full Text Available Abstract Background The ADGE technique is a method designed to magnify the ratios of gene expression before detection. It improves the detection sensitivity to small changes in gene expression and requires a small amount of starting material. However, the throughput of ADGE is low. We integrated ADGE with DNA microarray (ADGE microarray) and compared it with regular microarray. Results When ADGE was integrated with DNA microarray, a quantitative relationship of a power function between detected and input ratios was found. Because of ratio magnification, ADGE microarray was better able to detect small changes in gene expression in a drug resistant model cell line system. The PCR amplification of templates and efficient labeling reduced the requirement of starting material to as little as 125 ng of total RNA for one slide hybridization and enhanced the signal intensity. Integration of ratio magnification, template amplification and efficient labeling in ADGE microarray reduced artifacts in microarray data and improved detection fidelity. The results of ADGE microarray were less variable and more reproducible than those of regular microarray. A gene expression profile generated with ADGE microarray characterized the drug resistant phenotype, particularly with reference to glutathione, proliferation and kinase pathways. Conclusion ADGE microarray magnified the ratios of differential gene expression following a power function, improved the detection sensitivity and fidelity and reduced the requirement for starting material while maintaining high throughput. ADGE microarray generated a more informative expression pattern than regular microarray.
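    The power-function relationship between detected and input ratios reported above is straightforward to recover by fitting in log-log space, where a power law becomes linear. A minimal Python sketch (the ratio values below are invented placeholders, not data from the study):

        # Fit detected = a * input**b in log-log space.
        import numpy as np

        input_ratio = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0])
        detected_ratio = np.array([0.10, 0.35, 1.00, 2.90, 8.50, 24.0])

        # log(detected) = log(a) + b * log(input)
        b, log_a = np.polyfit(np.log(input_ratio), np.log(detected_ratio), 1)
        print(f"detected ~ {np.exp(log_a):.2f} * input^{b:.2f}")
        # An exponent b > 1 corresponds to the ratio magnification that lets
        # ADGE microarray resolve small changes in gene expression.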

  11. Impact of LMFBR operating experience on PFBR design

    International Nuclear Information System (INIS)

    Bhoje, S.B.; Chetal, S.C.; Chellapandi, P.; Govindarajan, S.; Lee, S.M.; Kameswara Rao, A.S.L.; Prabhakar, R.; Raghupathy, S.; Sodhi, B.S.; Sundaramoorthy, T.R.; Vaidyanathan, G.

    2000-01-01

    PFBR is a 500 MWe, sodium cooled, pool type, fast breeder reactor currently under detailed design. It is essential to reduce the capital cost of PFBR in order to make it competitive with thermal reactors. Operating experience of LMFBRs provides a vital input towards simplification of the design, improving its reliability, enhancing safety and achieving overall cost reduction. This paper includes a summary of LMFBR operating experience and details the design features of PFBR as influenced by operating experience of LMFBRs. (author)

  12. LOFT instrumented fuel design and operating experience

    International Nuclear Information System (INIS)

    Russell, M.L.

    1979-01-01

    A summary description of the Loss-of-Fluid Test (LOFT) system instrumented core construction details and operating experience through reactor startup and loss-of-coolant experiment (LOCE) operations performed to date are discussed. The discussion includes details of the test instrumentation attachment to the fuel assembly, the structural response of the fuel modules to the forces generated by a double-ended break of a pressurized water reactor (PWR) coolant pipe at the inlet to the reactor vessel, the durability of the LOFT fuel and test instrumentation, and the plans for incorporation of improved fuel assembly test instrumentation features in the LOFT core

  13. Integrating conceptualizations of experience into the interaction design process

    DEFF Research Database (Denmark)

    Dalsgaard, Peter

    2010-01-01

    From a design perspective, the increasing awareness of experiential aspects of interactive systems prompts the question of how conceptualizations of experience can inform and potentially be integrated into the interaction design process. This paper presents one approach to integrating theoretical...

  14. The Design, Experience and Practice of Networked Learning

    DEFF Research Database (Denmark)

    The Design, Experience and Practice of Networked Learning will prove indispensable reading for researchers, teachers, consultants, and instructional designers in higher and continuing education; for those involved in staff and educational development, and for those studying postgraduate qualifications...

  15. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the

  16. Exploring the use of internal and external controls for assessing microarray technical performance

    Directory of Open Access Journals (Sweden)

    Game Laurence

    2010-12-01

    experiments. The observed consistency amongst the information carried by internal and external controls and whole-array quality measures offers promise for rationally-designed control standards for routine performance monitoring of multiplexed measurement platforms.

  17. Hypersonic drone vehicle design: A multidisciplinary experience

    Science.gov (United States)

    1988-01-01

    UCLA's Advanced Aeronautic Design group focussed their efforts on design problems of an unmanned hypersonic vehicle. It is felt that a scaled hypersonic drone is necessary to bridge the gap between present theory on hypersonics and the future reality of the National Aerospace Plane (NASP) for two reasons: (1) to fulfill a need for experimental data in the hypersonic regime, and (2) to provide a testbed for the scramjet engine which is to be the primary mode of propulsion for the NASP. The group concentrated on three areas of great concern to NASP design: propulsion, thermal management, and flight systems. Problem solving in these areas was directed toward design of the drone with the idea that the same design techniques could be applied to the NASP. A 70 deg swept double-delta wing configuration, developed in the 70's at NASA Langley, was chosen as the aerodynamic and geometric model for the drone. This vehicle would be air launched from a B-1 at Mach 0.8 and 48,000 feet, rocket boosted by two internal engines to Mach 10 and 100,000 feet, and allowed to cruise under power of the scramjet engine until burnout. It would then return to base for an unpowered landing. Preliminary energy calculations based on flight requirements give the drone a gross launch weight of 134,000 pounds and an overall length of 85 feet.

  18. Designing the user experience of game development tools

    CERN Document Server

    Lightbown, David

    2015-01-01

    The Big Green Button; My Story; Who Should Read this Book?; Companion Website and Twitter Account; Before we Begin; Welcome to Designing the User Experience of Game Development Tools; What Will We Learn in This Chapter?; What Is This Book About?; Defining User Experience; The Value of Improving the User Experience of Our Tools; Parallels Between User Experience and Game Design; How Do People Benefit From an Improved User Experience?; Finding the Right Balance; Wrapping Up; The User-Centered Design Process; What Will We

  19. Design and analysis of Monte Carlo experiments

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Gentle, J.E.; Haerdle, W.; Mori, Y.

    2012-01-01

    By definition, computer simulation or Monte Carlo models are not solved by mathematical analysis (such as differential calculus), but are used for numerical experimentation. The goal of these experiments is to answer questions about the real world; i.e., the experimenters may use their models to
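    As a concrete illustration of designing such a numerical experiment, the following Python sketch (a minimal example of our own, not taken from the chapter; the two-stage exponential model is a stand-in) sizes the number of replications from a pilot run so that a 95% confidence interval on the estimated mean reaches a target half-width:

        # Designing a Monte Carlo experiment: choose the replication count n from
        # a pilot estimate of the variance (illustrative stand-in model).
        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_once():
            # Stand-in stochastic model: total time of two exponential stages.
            return rng.exponential(2.0) + rng.exponential(3.0)

        pilot = np.array([simulate_once() for _ in range(50)])
        target_halfwidth = 0.2  # desired 95% CI half-width
        n = int(np.ceil((1.96 * pilot.std(ddof=1) / target_halfwidth) ** 2))

        runs = np.array([simulate_once() for _ in range(n)])
        halfwidth = 1.96 * runs.std(ddof=1) / np.sqrt(n)
        print(f"n = {n}, mean = {runs.mean():.2f} +/- {halfwidth:.2f}")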

  20. Optical beam deflection sensor: design and experiments.

    Science.gov (United States)

    Sakamoto, João M S; Marques, Renan B; Kitano, Cláudio; Rodrigues, Nicolau A S; Riva, Rudimar

    2017-10-01

    In this work, we present a double-pass optical beam deflection sensor and its optical design method. To accomplish that, a mathematical model was proposed and computational simulations were performed in order to obtain the sensor's characteristic curves and to analyze its behavior as a function of design parameters. The mathematical model was validated by comparison with the characteristic curves acquired experimentally. The sensor was employed to detect acoustic pulses generated by a pulsed laser on a sample surface, in order to show its potential for monitoring applications handling high energy input such as laser welding or laser ablation.

  1. Simulation for (sustainable) building design: Czech experiences

    NARCIS (Netherlands)

    Bartak, M.; Drkal, F.; Hensen, J.L.M.; Lain, M.; Schwarzer, J.; Sourek, B.

    2001-01-01

    This paper attempts to outline the current state-of-the-art in the Czech Republic regarding the use of integrated building performance simulation as a design tool. Integrated performance simulation for reducing the environmental impact of buildings is illustrated by means of three recent HVAC

  2. Interdisciplinary parametric design : The XXL experience

    NARCIS (Netherlands)

    Turrin, M.; Sariyildiz, I.S.; Paul, J.C.

    2015-01-01

    Focusing on large span structures for sport buildings, the paper tackles the role of parametric modelling and performance simulations, to enhance the integration between architectural and engineering design. The general approach contrasts post-engineering processes. In post-engineering, technical

  3. Hypersonic drone design: A multidisciplinary experience

    Science.gov (United States)

    1988-01-01

    Efforts were focused on design problems of an unmanned hypersonic vehicle. It is felt that a scaled hypersonic drone is necessary to bridge the gap between present theory on hypersonics and the future reality of the National Aerospace Plane (NASP) for two reasons: to fulfill a need for experimental data in the hypersonic regime, and to provide a testbed for the scramjet engine which is to be the primary mode of propulsion for the NASP. Three areas of great concern to NASP design were examined: propulsion, thermal management, and flight systems. Problem solving in these areas was directed towards design of the drone with the idea that the same design techniques could be applied to the NASP. A seventy degree swept double delta wing configuration, developed in the 70's at NASA Langley, was chosen as the aerodynamic and geometric model for the drone. This vehicle would be air-launched from a B-1 at Mach 0.8 and 48,000 feet, rocket boosted by two internal engines to Mach 10 and 100,000 feet, and allowed to cruise under power of the scramjet engine until burnout. It would then return to base for an unpowered landing. Preliminary energy calculations based upon the flight requirements give the drone a gross launch weight of 134,000 lb. and an overall length of 85 feet.

  4. Carbon Pricing: Design, Experiences and Issues

    DEFF Research Database (Denmark)

    Carbon Pricing reflects upon and further develops the ongoing and worthwhile global debate into how to design carbon pricing, and how to utilize the financial proceeds in the best possible way for society. The world has recently witnessed a significant downward adjustment in fossil fuel prices...

  5. Relaying experiences for care home design

    DEFF Research Database (Denmark)

    Raudaskoski, Pirkko Liisa

    2014-01-01

    stakeholders (researchers, family members, etc.) could put forward their ideas and wishes about the facilities of a soon-to-be-built care home for people with brain injury. In other words, the seminar was part of a wider diagnostic endeavor that was to be started in a specially designed building. The future...

  6. Evolutionary experience design – the case of Otopia

    DEFF Research Database (Denmark)

    Hansen, Kenneth

    experiences with the case of “Otopia”. “Otopia” is a large-scale new media experiment, which combines the areas of computer games, sports and performance into a spectator-oriented concept; it premiered in a dome tent at the Roskilde Festival in Denmark in the summer of 2005. This paper presents and discusses......The design of experiences is a complicated challenge. It might not even be possible to design such a “thing”, but only to design for it. If this is the case, an evolutionary approach could seem appropriate. This paper introduces such an approach to the design of new public-oriented...... used as a means of specifying the basic immaterial design form. This discussion leads to the suggestion of a rule-based evolutionary model for the design of situations as a practical option for designers of new spectator-oriented experiences in the future. The project of Otopia was supported...

  7. LOFT fuel design and operating experience

    International Nuclear Information System (INIS)

    Russell, M.L.

    1978-01-01

    The purpose of the LOFT fuel is to provide a pressurized water reactor core that has (1) test instrumentation for measurement of core conditions and (2) materials and geometric features to ensure heat transfer, hydraulic, mechanical, chemical, metallurgical and nuclear behaviors are typical of large pressurized water reactors (LPWRS) during the loss-of-coolant accident (LOCA) sequence. The LOFT core is unique because it is designed for exposure to several LOCAs without loss of function

  8. Designing learning experiences together with children

    OpenAIRE

    Leinonen, Jonna; Venninen, Tuulikki

    2012-01-01

    Children’s participation in early childhood education contexts has attracted considerable attention in recent years. Participation means involving and enabling children to take part in decision-making processes about their everyday life. Educators are supporters and enablers of participatory practices. The process of planning activities is an important part of an educator’s profession in early childhood education, and it can be viewed as a designing learning process. But not only as adults designi...

  9. Mining meiosis and gametogenesis with DNA microarrays.

    Science.gov (United States)

    Schlecht, Ulrich; Primig, Michael

    2003-04-01

    Gametogenesis is a key developmental process that involves complex transcriptional regulation of numerous genes including many that are conserved between unicellular eukaryotes and mammals. Recent expression-profiling experiments using microarrays have provided insight into the co-ordinated transcription of several hundred genes during mitotic growth and meiotic development in budding and fission yeast. Furthermore, microarray-based studies have identified numerous loci that are regulated during the cell cycle or expressed in a germ-cell specific manner in eukaryotic model systems like Caenorhabditis elegans, Mus musculus as well as Homo sapiens. The unprecedented amount of information produced by post-genome biology has spawned novel approaches to organizing biological knowledge using currently available information technology. This review outlines experiments that contribute to an emerging comprehensive picture of the molecular machinery governing sexual reproduction in eukaryotes.

  10. Student designed experiments to learn fluids

    Science.gov (United States)

    Stern, Catalina

    2013-11-01

    Lasers and high speed cameras are a wonderful tool to visualize the very complex behavior of fluids, and to help students grasp concepts like turbulence, surface tension and vorticity. In this work we present experiments done by physics students in their senior year at the School of Science of the National University of Mexico as a final project in the continuum mechanics course. Every semester, the students make an oral presentation of their work, and videos and images are kept on the web page "Pasión por los Fluidos". I acknowledge support from the Physics Department of Facultad de Ciencias, Universidad Nacional Autónoma de México.

  11. Experiment to measure vacuum birefringence: Conceptual design

    Science.gov (United States)

    Mueller, Guido; Tanner, David; Doebrich, Babette; Poeld, Jan; Lindner, Axel; Willke, Benno

    2016-03-01

    Vacuum birefringence is another lingering challenge which will soon become accessible to experimental verification. The effect was first calculated by Euler and Heisenberg in 1936 and is these days described as a one-loop correction to the differential index of refraction between light which is polarized parallel and perpendicular to an external magnetic field. Our plan is to realize (and slightly modify) an idea which was originally published by Hall, Ye, and Ma using advanced LIGO and LISA technology and the infrastructure of the ALPS light-shining-through-walls experiment following the ALPS IIc science run. This work is supported by the Deutsche Forschungsgemeinschaft and the Heising-Simons Foundation.

  12. Experiences with voice to design ceramics

    DEFF Research Database (Denmark)

    Hansen, Flemming Tvede; Jensen, Kristoffer

    2014-01-01

    This article presents SoundShaping, a system to create ceramics from the human voice, and thus shows how digital technology opens new possibilities in ceramic craft. The article is about how the experiential knowledge that the craftsman gains in a direct physical and tactile interaction with a responding material can be transformed and utilised in the use of digital technologies. SoundShaping is based on a generic audio feature extraction system and principal component analysis to ensure that the pertinent information in the voice is used. Moreover, the 3D shape is created using simple geometric rules.... The shape is output to a 3D printer to make ceramic results. The system demonstrates the close connection between digital technology and craft practice. Several experiments and reflections demonstrate the validity of this work....

  13. Experiences with Voice to Design Ceramics

    DEFF Research Database (Denmark)

    Hansen, Flemming Tvede; Jensen, Kristoffer

    2013-01-01

    This article presents SoundShaping, a system to create ceramics from the human voice, and thus shows how digital technology opens new possibilities in ceramic craft. The article is about how the experiential knowledge that the craftsman gains in a direct physical and tactile interaction with a responding material can be transformed and utilized in the use of digital technologies. SoundShaping is based on a generic audio feature extraction system and principal component analysis to ensure that the pertinent information in the voice is used. Moreover, the 3D shape is created using simple geometric rules.... The shape is output to a 3D printer to make ceramic results. The system demonstrates the close connection between digital technology and craft practice. Several experiments and reflections demonstrate the validity of this work....

  14. Small sodium valve design and operating experience

    International Nuclear Information System (INIS)

    Abramson, R.; Elie, X.; Vercasson, M.; Nedelec, J.

    1974-01-01

    Conventionally, valves for sodium pipes smaller than 125 mm in diameter are called "small sodium valves". However, this limit should rather be considered as the lower limit of "large sodium valves". In fact, both the largest sizes of small valves and the smallest of large valves can be found in the range of 125-300 mm in diameter. Thus what is said about small valves also applies, for a few valve types, above the 125 mm limit. Sodium valves are described here in a general manner, with no manufacturing details except when necessary for understanding valve behavior. Operating experience is pointed out wherever possible. Finally, some information is given about ongoing or proposed development plans. (U.S.)

  15. The application of DNA microarrays in gene expression analysis.

    Science.gov (United States)

    van Hal, N L; Vorst, O; van Houwelingen, A M; Kok, E J; Peijnenburg, A; Aharoni, A; van Tunen, A J; Keijer, J

    2000-03-31

    DNA microarray technology is a new and powerful technology that will substantially increase the speed of molecular biological research. This paper gives a survey of DNA microarray technology and its use in gene expression studies. The technical aspects and their potential improvements are discussed. These comprise array manufacturing and design, array hybridisation, scanning, and data handling. Furthermore, it is discussed how DNA microarrays can be applied in the fields of food safety, functionality and health, and of gene discovery and pathway engineering in plants.

  16. A Model for Designing Adaptive Laboratory Evolution Experiments

    DEFF Research Database (Denmark)

    LaCroix, Ryan A.; Palsson, Bernhard O.; Feist, Adam M.

    2017-01-01

    in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized fashion and can design experiments to generate greater fitness in an accelerated time frame, thereby pushing the limits of what...

  17. Central Equatorial Pacific Experiment (CEPEX). Design document

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    The Earth's climate has varied significantly in the past, yet climate records reveal that in the tropics, sea surface temperatures seem to have been remarkably stable, varying by less than a few degrees Celsius over geologic time. Today, the large warm pool of the western Pacific shows similar characteristics. Its surface temperature always exceeds 27°C, but never 31°C. Heightened interest in this observation has been stimulated by questions of global climate change and the exploration of stabilizing climate feedback processes. Efforts to understand the observed weak sensitivity of tropical sea surface temperatures to climate forcing have led to a number of competing ideas about the nature of this apparent thermostat. Although there remains disagreement on the processes that regulate tropical sea surface temperature, most agree that further progress in resolving these differences requires comprehensive field observations of three-dimensional water vapor concentrations, solar and infrared radiative fluxes, surface fluxes of heat and water vapor, and cloud microphysical properties. This document describes the Central Equatorial Pacific Experiment (CEPEX) plan to collect such observations over the central equatorial Pacific Ocean during March of 1993.

  18. Ontology-based, Tissue MicroArray oriented, image centered tissue bank

    Directory of Open Access Journals (Sweden)

    Viti Federica

    2008-04-01

    Full Text Available Abstract Background The Tissue MicroArray technique is becoming increasingly important in pathology for the validation of experimental data from transcriptomic analysis. This approach produces many images which need to be properly managed, if possible with an infrastructure able to support tissue sharing between institutes. Moreover, the available frameworks oriented to Tissue MicroArray provide good storage for clinical patient, sample treatment and block construction information, but their utility is limited by the lack of data integration with biomolecular information. Results In this work we propose a Tissue MicroArray web-oriented system to support researchers in managing bio-samples and, through the use of ontologies, to enable tissue sharing aimed at the design of Tissue MicroArray experiments and the evaluation of results. Indeed, our system provides ontological descriptions both for pre-analysis tissue images and for post-process analysis image results, which is crucial for information exchange. Moreover, by working with well-defined terms it is possible to query web resources for literature articles to integrate both pathology and bioinformatics data. Conclusions Using this system, users associate an ontology-based description to each image uploaded into the database and also integrate results with the ontological description of biosequences identified in every tissue. Moreover, it is possible to integrate the ontological description provided by the user with a fully compliant Gene Ontology definition, enabling statistical studies about correlations between the analyzed pathology and the most commonly related biological processes.

  19. Mapping the Journey: Visualising Collaborative Experiences for Sustainable Design Education

    Science.gov (United States)

    McMahon, Muireann; Bhamra, Tracy

    2017-01-01

    The paradigm of design is changing. Designers now need to be equipped with the skills and knowledge that will enable them to participate in the global move towards a sustainable future. The challenges arise as Design for Sustainability deals with very complex and often contradictory issues. Collaborative learning experiences recognise that these…

  20. The "Tutorless" Design Studio: A Radical Experiment in Blended Learning

    Science.gov (United States)

    Hill, Glen Andrew

    2017-01-01

    This paper describes a pedagogical experiment in which a suite of novel blended learning strategies was used to replace the traditional role of design tutors in a first year architectural design studio. The pedagogical objectives, blended learning strategies and outcomes of the course are detailed. While the quality of the student design work…

  1. Enhancing user experience design with an integrated storytelling method

    NARCIS (Netherlands)

    Peng, Qiong; Matterns, Jean Bernard; Marcus, A.

    2016-01-01

    Storytelling has been known as a service design method and been used broadly not only in service design but also in the context of user experience design. However, practitioners cannot yet fully appreciate the benefits of storytelling, and often confuse storytelling with storyboarding and scenarios.

  2. Evaluating design alternatives using conjoint experiments in virtual reality

    NARCIS (Netherlands)

    Dijkstra, J.; Leeuwen, van J.P.; Timmermans, H.J.P.

    2003-01-01

    In this paper the authors describe the design of an experiment based on conjoint measurement that explores the possibility of using the Internet to evaluate design alternatives. These design alternatives are presented as panoramic views, and preferences are measured by asking subjects which

  3. User Experience Design (UX Design) in a Website Development : Website redesign

    OpenAIRE

    Orlova, Mariia

    2016-01-01

    The purpose of the study was to implement an approach of user experience for a website design. Mostly, I concentrated on revealing and understanding the concepts of UX design, which include usability, visual design and human factors affecting the user experience. Another aim of the study was to investigate people’s behaviour related to web design. The thesis is based on a project. The project was to redesign an existing web design for a company called Positive Communications. They provide differe...

  4. Divertor design for the Tokamak Physics Experiment

    International Nuclear Information System (INIS)

    Hill, D.N.; Braams, B.

    1994-05-01

    In this paper we discuss the present divertor design for the planned TPX tokamak, which will explore the physics and technology of steady-state (1000 s pulses) heat and particle removal in high confinement (2--4x L-mode), high beta (β_N ≥ 3) divertor plasmas sustained by non-inductive current drive. The TPX device will operate in the double-null divertor configuration, with actively cooled graphite targets forming a deep (0.5 m) slot at the outer strike point. The peak heat flux on the highly tilted (74 degrees from normal), re-entrant targets (tilted to recycle ions back toward the separatrix) will be in the range of 4--6 MW/m^2 with 18 MW of neutral beams and RF heating power. The combination of active pumping and gas puffing (deuterium plus impurities), along with higher heating power (45 MW maximum), will allow testing of radiative divertor concepts at ITER-like power densities

  5. An algorithm for finding biologically significant features in microarray data based on a priori manifold learning.

    Directory of Open Access Journals (Sweden)

    Zena M Hira

    Full Text Available Microarray databases are a large source of genetic data, which, upon proper analysis, could enhance our understanding of biology and medicine. Many microarray experiments have been designed to investigate the genetic mechanisms of cancer, and analytical approaches have been applied in order to classify different types of cancer or distinguish between cancerous and non-cancerous tissue. However, microarrays are high-dimensional datasets with high levels of noise, and this causes problems when using machine learning methods. A popular approach to this problem is to search for a set of features that will simplify the structure and to some degree remove the noise from the data. The most widely used approach to feature extraction is principal component analysis (PCA), which assumes a multivariate Gaussian model of the data. More recently, non-linear methods have been investigated. Among these, manifold learning algorithms, for example Isomap, aim to project the data from a higher-dimensional space onto a lower-dimensional one. We have proposed a priori manifold learning for finding a manifold in which a representative set of microarray data is fused with relevant data taken from the KEGG pathway database. Once the manifold has been constructed, the raw microarray data is projected onto it and clustering and classification can take place. In contrast to earlier fusion-based methods, the prior knowledge from the KEGG databases is not used in, and does not bias, the classification process--it merely acts as an aid to find the best space in which to search the data. In our experiments we have found that using our new manifold method gives better classification results than using either PCA or conventional Isomap.
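    The contrast drawn above between linear projection (PCA) and manifold learning (Isomap) can be sketched in a few lines of Python. This is a generic illustration on synthetic data standing in for an expression matrix; the paper's KEGG-based fusion step, which shapes the manifold before projection, is omitted here:

        # Compare PCA and Isomap as feature extractors before classification
        # (synthetic stand-in for a high-dimensional, noisy microarray matrix).
        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.manifold import Isomap
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        X, y = make_classification(n_samples=100, n_features=2000,
                                   n_informative=20, random_state=0)

        for name, reducer in [("PCA", PCA(n_components=10)),
                              ("Isomap", Isomap(n_components=10))]:
            Z = reducer.fit_transform(X)  # project onto a 10-dimensional space
            acc = cross_val_score(KNeighborsClassifier(), Z, y, cv=5).mean()
            print(f"{name}: mean CV accuracy = {acc:.2f}")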

  6. Evaluation of toxicity of the mycotoxin citrinin using yeast ORF DNA microarray and Oligo DNA microarray

    Directory of Open Access Journals (Sweden)

    Nobumasa Hitoshi

    2007-04-01

    Full Text Available Abstract Background Mycotoxins are fungal secondary metabolites commonly present in feed and food, and are widely regarded as hazardous contaminants. Citrinin, one of the very well known mycotoxins, was first isolated from Penicillium citrinum; it is produced by more than 10 kinds of fungi and is possibly spread all over the world. However, the information on the action mechanism of the toxin is limited. Thus, we investigated the citrinin-induced genomic response in order to evaluate its toxicity. Results Citrinin inhibited growth of yeast cells at concentrations higher than 100 ppm. We monitored the citrinin-induced mRNA expression profiles in yeast using the ORF DNA microarray and the Oligo DNA microarray, and the expression profiles were compared with those of other stress-inducing agents. Results obtained from both microarray experiments clustered together, but were different from those of the mycotoxin patulin. The oxidative stress response genes – AADs, FLR1, OYE3, GRE2, and MET17 – were significantly induced. In the functional categories, expression of genes involved in "metabolism", "cell rescue, defense and virulence", and "energy" was significantly activated. In the category of "metabolism", genes involved in the glutathione synthesis pathway were activated, and in the category of "cell rescue, defense and virulence", the ABC transporter genes were induced. To alleviate the induced stress, these cells might pump out the citrinin after modification with glutathione. In contrast, the citrinin treatment did not induce the genes involved in DNA repair. Conclusion Results from both microarray studies suggest that citrinin treatment induced oxidative stress in yeast cells. The genotoxicity was less severe than that of patulin, suggesting that citrinin is less toxic than patulin. The reproducibility of the expression profiles was much better with the Oligo DNA microarray. However, the Oligo DNA microarray did not completely overcome cross

  7. Updating design information questionnaire (DIQ) experiences

    International Nuclear Information System (INIS)

    Palafox-Garcia, P.

    2001-01-01

    Full text: 1. Introduction - Once a State has signed the Non-Proliferation Treaty (NPT) with the International Atomic Energy Agency, the State has to declare to the IAEA the facilities where it handles Nuclear Material. Each facility will have its own Safeguards Agreement, and these are called Subsidiary Arrangements. In order to have good control and accountability of this material, each facility is named a Material Balance Area (MBA). Based on the Subsidiary Arrangements, each MBA has to fill out a proper IAEA format named the DIQ in order to get the Facility Attachment. The DIQ format varies depending on the kind of facility. 2. Facility - In the NNRI, we have two MBA's, and our experience in filling out the DIQ formats has been that it takes quite some time to get the proper Facility Attachment. First you have to have the proper format; then you fill it out properly with all its respective annexes, and once it is reviewed and approved by the people involved, it is signed and sent to the IAEA; this first step took six months. Once the format was reviewed by the IAEA, they sent it back to the facility asking for comments in order to clarify it properly; this took three months. The facility updated the comments and sent it back; this took another three months. With this format the IAEA prepared the Facility Attachment of the MBA and sent it to the facility for its approval or comments; this took five months. The facility reviewed it and sent it back with some comments or doubts after three months. The IAEA clarified the comments and doubts and sent the approved Facility Attachment to the facility four months later. So, in order to get the proper Facility Attachments for each of our MBA's, it has taken at least 24 months (two years). 3. Current situation - At present, nuclear activities have diminished, and consequently so have nuclear material movements, because the Fuel Fabrication Pilot Plant (FFPP) we have was stopped for financial reasons

  8. Geiger mode avalanche photodiodes for microarray systems

    Science.gov (United States)

    Phelan, Don; Jackson, Carl; Redfern, R. Michael; Morrison, Alan P.; Mathewson, Alan

    2002-06-01

    New Geiger Mode Avalanche Photodiodes (GM-APD) have been designed and characterized specifically for use in microarray systems. Critical parameters such as excess reverse bias voltage, hold-off time and optimum operating temperature have been experimentally determined for these photon-counting devices. The photon detection probability, dark count rate and afterpulsing probability have been measured under different operating conditions. An active-quench circuit (AQC) is presented for operating these GM-APDs. This circuit is relatively simple, robust and has such benefits as reducing average power dissipation and afterpulsing. Arrays of these GM-APDs have already been designed and, together with AQCs, open up the possibility of having a solid-state microarray detector that enables parallel analysis on a single chip. Another advantage of these GM-APDs over current technology is their low-voltage CMOS compatibility, which could allow for the fabrication of an AQC on the same device. Small-area detectors have already been employed in the time-resolved detection of fluorescence from labeled proteins. It is envisaged that operating these new GM-APDs with this active-quench circuit will have numerous applications for the detection of fluorescence in microarray systems.

  9. Plasmonically amplified fluorescence bioassay with microarray format

    Science.gov (United States)

    Gogalic, S.; Hageneder, S.; Ctortecka, C.; Bauch, M.; Khan, I.; Preininger, Claudia; Sauer, U.; Dostalek, J.

    2015-05-01

    Plasmonic amplification of fluorescence signal in bioassays with microarray detection format is reported. A crossed relief diffraction grating was designed to couple an excitation laser beam to surface plasmons at the wavelength overlapping with the absorption and emission bands of fluorophore Dy647 that was used as a label. The surface of periodically corrugated sensor chip was coated with surface plasmon-supporting gold layer and a thin SU8 polymer film carrying epoxy groups. These groups were employed for the covalent immobilization of capture antibodies at arrays of spots. The plasmonic amplification of fluorescence signal on the developed microarray chip was tested by using interleukin 8 sandwich immunoassay. The readout was performed ex situ after drying the chip by using a commercial scanner with high numerical aperture collecting lens. Obtained results reveal the enhancement of fluorescence signal by a factor of 5 when compared to a regular glass chip.

  10. Development and validation of a flax (Linum usitatissimum L.) gene expression oligo microarray.

    Science.gov (United States)

    Fenart, Stéphane; Ndong, Yves-Placide Assoumou; Duarte, Jorge; Rivière, Nathalie; Wilmer, Jeroen; van Wuytswinkel, Olivier; Lucau, Anca; Cariou, Emmanuelle; Neutelings, Godfrey; Gutierrez, Laurent; Chabbert, Brigitte; Guillot, Xavier; Tavernier, Reynald; Hawkins, Simon; Thomasset, Brigitte

    2010-10-21

    Flax (Linum usitatissimum L.) has been cultivated for around 9,000 years and is therefore one of the oldest cultivated species. Today, flax is still grown for its oil (oil-flax or linseed cultivars) and its cellulose-rich fibres (fibre-flax cultivars) used for high-value linen garments and composite materials. Despite the wide industrial use of flax-derived products and our current understanding of the regulation of both wood fibre production and oil biosynthesis, more information must be acquired in both domains. Recent advances in genomics are now providing opportunities to improve our fundamental knowledge of these complex processes. In this paper we report the development and validation of a high-density oligo microarray platform dedicated to gene expression analyses in flax. Nine different RNA samples obtained from flax inner- and outer-stems, seeds, leaves and roots were used to generate a collection of 1,066,481 ESTs by massive parallel pyrosequencing. Sequences were assembled into 59,626 unigenes, and 48,021 sequences were selected for oligo design and high-density microarray (Nimblegen 385K) fabrication with eight non-overlapping 25-mer oligos per unigene. 18 independent experiments were used to evaluate the hybridization quality, precision, specificity and accuracy, and all results confirmed the high technical quality of our microarray platform. Cross-validation of microarray data was carried out using quantitative qRT-PCR. Nine target genes were selected on the basis of microarray results and reflected the whole range of fold change (both up-regulated and down-regulated genes in different samples). A statistically significant positive correlation was obtained comparing expression levels for each target gene across all biological replicates in both qRT-PCR and microarray results. Further experiments illustrated the capacity of our arrays to detect differential gene expression in a variety of flax tissues as well as between two contrasted flax varieties
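    The cross-validation step described above reduces, in essence, to computing a correlation between matched fold-change measurements. A minimal Python sketch (the log2 fold-change values below are invented placeholders, not data from the study):

        # Cross-validate microarray fold changes against qRT-PCR for selected genes.
        import numpy as np
        from scipy.stats import pearsonr

        microarray_log2fc = np.array([-2.1, -0.8, 0.3, 1.2, 2.5, 3.8, -1.5, 0.9, 4.2])
        qrtpcr_log2fc = np.array([-2.4, -0.5, 0.1, 1.5, 2.2, 4.1, -1.8, 1.1, 4.6])

        r, p = pearsonr(microarray_log2fc, qrtpcr_log2fc)
        # A statistically significant positive r across replicates supports
        # the technical validity of the microarray platform.
        print(f"Pearson r = {r:.2f} (p = {p:.3g})")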

  11. Development and validation of a flax (Linum usitatissimum L. gene expression oligo microarray

    Directory of Open Access Journals (Sweden)

    Gutierrez Laurent

    2010-10-01

    Full Text Available Abstract Background Flax (Linum usitatissimum L.) has been cultivated for around 9,000 years and is therefore one of the oldest cultivated species. Today, flax is still grown for its oil (oil-flax or linseed cultivars) and its cellulose-rich fibres (fibre-flax cultivars) used for high-value linen garments and composite materials. Despite the wide industrial use of flax-derived products and our current understanding of the regulation of both wood fibre production and oil biosynthesis, more information must be acquired in both domains. Recent advances in genomics are now providing opportunities to improve our fundamental knowledge of these complex processes. In this paper we report the development and validation of a high-density oligo microarray platform dedicated to gene expression analyses in flax. Results Nine different RNA samples obtained from flax inner- and outer-stems, seeds, leaves and roots were used to generate a collection of 1,066,481 ESTs by massive parallel pyrosequencing. Sequences were assembled into 59,626 unigenes, and 48,021 sequences were selected for oligo design and high-density microarray (Nimblegen 385K) fabrication with eight non-overlapping 25-mer oligos per unigene. 18 independent experiments were used to evaluate the hybridization quality, precision, specificity and accuracy, and all results confirmed the high technical quality of our microarray platform. Cross-validation of microarray data was carried out using quantitative qRT-PCR. Nine target genes were selected on the basis of microarray results and reflected the whole range of fold change (both up-regulated and down-regulated genes in different samples). A statistically significant positive correlation was obtained comparing expression levels for each target gene across all biological replicates in both qRT-PCR and microarray results. Further experiments illustrated the capacity of our arrays to detect differential gene expression in a variety of flax tissues as well

  12. Application of design of experiment on electrophoretic deposition of ...

    Indian Academy of Sciences (India)

    Unknown

    Keywords. Coating; electrophoretic deposition; glass-ceramic; design of experiment. 1. Introduction ... other chemicals used were of laboratory reagent grade. ... changes from 7.0 to 9.5 that adversely affects the deposition efficiency and ...

  13. Optimal Design of Shock Tube Experiments for Parameter Inference

    KAUST Repository

    Bisetti, Fabrizio; Knio, Omar

    2014-01-01

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation

  14. Microarrays for the evaluation of cell-biomaterial surface interactions

    Science.gov (United States)

    Thissen, H.; Johnson, G.; McFarland, G.; Verbiest, B. C. H.; Gengenbach, T.; Voelcker, N. H.

    2007-01-01

    The evaluation of cell-material surface interactions is important for the design of novel biomaterials which are used in a variety of biomedical applications. While traditional in vitro test methods have routinely used samples of relatively large size, microarrays representing different biomaterials offer many advantages, including high throughput and reduced sample handling. Here, we describe the simultaneous cell-based testing of matrices of polymeric biomaterials, arrayed on glass slides with a low cell-attachment background coating. Arrays were constructed using a microarray robot at 6-fold redundancy with solid pins having a diameter of 375 μm. Printed solutions contained at least one monomer, an initiator and a bifunctional crosslinker. After subsequent UV polymerisation, the arrays were washed and characterised by X-ray photoelectron spectroscopy. Cell culture experiments were carried out over 24 hours using HeLa cells. After labelling with CellTracker® Green for the final hour of incubation and subsequent fixation, the arrays were scanned. In addition, individual spots were also viewed by fluorescence microscopy. The evaluation of cell-surface interactions in high-throughput assays as demonstrated here is a key enabling technology for the effective development of future biomaterials.

  15. Food Enterprise Web Design Based on User Experience

    OpenAIRE

    Fei Wang

    2015-01-01

    Excellent food enterprise web design conveys a strong visual effect through user experience. This study took food enterprise managers and customers as the main user groups in evaluating web page creation. Web page design should focus not only on function and work efficiency; most important is the user experience during web page interaction.

  16. Recycled memories : can flashbacks be triggered through experience design?

    OpenAIRE

    Fridriksson, Fridrik Steinn

    2013-01-01

    This paper examines the phenomenon of flashbacks, often called the Proust phenomenon, through the lens of experience design. The research question is: Can flashbacks be triggered through experience design? It would then be possible to call such flashback memories recycled memories. To answer the question, former studies were reviewed, mainly from the standpoint of cognitive psychology. The thesis discusses how different senses produce flashbacks and how they can be used as triggers. The difference be...

  17. Adaptive Lighting Design – Staged Experiences of Light

    DEFF Research Database (Denmark)

    Søndergaard, Karin; Petersen, Kjell Yngve

    2015-01-01

    Adaptive Lighting Design – Staged Experiences of Light. The two installations, White Cube and White Box, enable experience-based studies as a form of perceptual activity, wherein lighting conditions are examined in a dialectical exchange between the system and the people participating. Adaptive lighting is involved in the negotiations of how the lighting design unfolds. Each installation stages a specified place, where participants perform their own experiences of being and moving in dynamically changing lighting settings. Through investigative actions participants test the ways that the lighting compositions influence their ability to orient themselves within the geography of the space and how the balances in light colours and luminous intensities affect their experience of directionality, distances, and scales. In short, the experience of being present in the space as well as one's experience...

  18. Motivating students to perform an experiment in technological design contexts

    NARCIS (Netherlands)

    Logman, P.S.W.M.; Kaper, W.H.; Ellermeijer, A.L.; Lindell, A.; Kähkönen, A.-L.; Viiri, J.

    2012-01-01

    In a teaching-learning sequence on the subject of energy, we have used technological design contexts to motivate students, offering only context-based reasons to perform experiments on energy. We use these experiments to have the students reinvent practical laws of energy conservation

  19. ASIC design used in high energy physics experiments

    International Nuclear Information System (INIS)

    Zhang Hongyu; Lin Tao; Wu Ling; Zhao jingwei; Gu Shudi

    1997-01-01

    The author introduces a PC-based ASIC (Application Specific Integrated Circuit) design environment. Some design tools used in this environment are also introduced. An ASIC chip for use in high energy physics experiments, a weighted mean timer, is currently under development.

  20. On Design Experiment Teaching in Engineering Quality Cultivation

    Science.gov (United States)

    Chen, Xiao

    2008-01-01

    A design experiment is one designed and conducted by students independently and is an important method to cultivate students' comprehensive quality. According to the development and requirements of experimental teaching, this article carries out a study and analysis on the purpose, significance, denotation, connotation and…

  1. FFTF in-containment cell liner design and installation experience

    International Nuclear Information System (INIS)

    Umek, A.M.; Swenson, L.D.

    1980-01-01

    Design features and liner construction techniques are discussed. Cell leak-rate tests and the methods used to locate and repair leaks are described. A brief analysis of the overall experience at FFTF is provided, with recommendations for future plant designs

  2. Design of spatial experiments: Model fitting and prediction

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.V.

    1996-03-01

    The main objective of the paper is to describe and develop model-oriented methods and algorithms for the design of spatial experiments. Unlike many other publications in this area, the approach proposed here is essentially based on the ideas of convex design theory.

  3. Building a Framework for Engineering Design Experiences in High School

    Science.gov (United States)

    Denson, Cameron D.; Lammi, Matthew

    2014-01-01

    In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…

  4. Application of design of experiments and artificial neural networks ...

    African Journals Online (AJOL)

    This paper discusses the use of distance-based optimal designs in the design of experiments (DOE) and artificial neural networks (ANN) in optimizing the stacking sequence for a simply supported laminated composite plate under uniformly distributed load (UDL) for minimizing the deflections and stresses. A number of finite ...

  5. Thinking about "Design Thinking": A Study of Teacher Experiences

    Science.gov (United States)

    Retna, Kala S.

    2016-01-01

    Schools are continuously looking for new ways of enhancing student learning to equip students with skills that would enable them to cope with twenty-first century demands. One promising approach focuses on design thinking. This study examines teacher's perceptions, experiences and challenges faced in adopting design thinking. There is a lack of…

  6. Statistically designed experiments to screen chemical mixtures for possible interactions

    NARCIS (Netherlands)

    Groten, J.P.; Tajima, O.; Feron, V.J.; Schoen, E.D.

    1998-01-01

    For the accurate analysis of possible interactive effects of chemicals in a defined mixture, statistical designs are necessary to develop clear and manageable experiments. For instance, factorial designs have been successfully used to detect two-factor interactions. Particularly useful for this

  7. Incorporating operational experience and design changes in availability forecasts

    International Nuclear Information System (INIS)

    Norman, D.

    1988-01-01

    Reliability or availability forecasts which are based solely on past operating experience will be precise if the sample is large enough, and unbiased if nothing in the future design, environment, operating region or anything else changes. Unfortunately, life is never like that. This paper considers the methodology and philosophy of modifying forecasts based on past experience to take account also of changes in design, construction methods, operating philosophy, environments, operator training and so on, between the plants which provided the operating experience and the plant for which the forecast is being made. This emphasises the importance of collecting, assessing, and learning from past data and of a thorough knowledge of future designs, and procurement, operation, and maintenance policies. The difference between targets and central estimates is also discussed. The paper concludes that improvements in future availability can be made by learning from past experience, but that certain conditions must be fulfilled in order to do so. (author)

  8. The tissue microarray OWL schema: An open-source tool for sharing tissue microarray data

    Directory of Open Access Journals (Sweden)

    Hyunseok P Kang

    2010-01-01

    Full Text Available Background: Tissue microarrays (TMAs) are enormously useful tools for translational research, but incompatibilities in database systems between various researchers and institutions prevent the efficient sharing of data that could help realize their full potential. Resource Description Framework (RDF) provides a flexible method to represent knowledge in triples, which take the form Subject-Predicate-Object. All data resources are described using Uniform Resource Identifiers (URIs), which are global in scope. We present an OWL (Web Ontology Language) schema that expands upon the TMA data exchange specification to address this issue and assist in data sharing and integration. Methods: A minimal OWL schema was designed containing only concepts specific to TMA experiments. More general data elements were incorporated from predefined ontologies such as the NCI thesaurus. URIs were assigned using the Linked Data format. Results: We present examples of files utilizing the schema and conversion of XML data (similar to the TMA DES) to OWL. Conclusion: By utilizing predefined ontologies and globally unique identifiers, this OWL schema provides a solution to the limitations of XML, which represents concepts defined in a localized setting. This will help increase the utilization of tissue resources, facilitating collaborative translational research efforts.
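
    To make the triple representation concrete, the sketch below builds a tiny TMA-style RDF graph in Python with the rdflib library; the namespace URI and property names are invented for illustration and are not taken from the published schema:

        from rdflib import Graph, Namespace, Literal
        from rdflib.namespace import RDF

        # Hypothetical namespace standing in for the published TMA OWL schema.
        TMA = Namespace("http://example.org/tma#")

        g = Graph()
        g.bind("tma", TMA)

        core = TMA["core_0001"]                      # a tissue core on an array
        g.add((core, RDF.type, TMA.TissueCore))      # Subject-Predicate-Object triple
        g.add((core, TMA.anatomicSite, Literal("prostate")))
        g.add((core, TMA.diagnosis, Literal("adenocarcinoma")))

        print(g.serialize(format="turtle"))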

  9. Plant-pathogen interactions: what microarray tells about it?

    Science.gov (United States)

    Lodha, T D; Basak, J

    2012-01-01

    Plant defense responses are mediated by elementary regulatory proteins that affect expression of thousands of genes. Over the last decade, microarray technology has played a key role in deciphering the underlying networks of gene regulation in plants that lead to a wide variety of defence responses. Microarray is an important tool to quantify and profile the expression of thousands of genes simultaneously, with two main aims: (1) gene discovery and (2) global expression profiling. Several microarray technologies are currently in use; most include a glass slide platform with spotted cDNA or oligonucleotides. To date, microarray technology has been used in the identification of regulatory genes and end-point defence genes, and to understand the signal transduction processes underlying disease resistance and its intimate links to other physiological pathways. Microarray technology can be used for in-depth, simultaneous profiling of host/pathogen genes as the disease progresses from infection to resistance/susceptibility at different developmental stages of the host, which can be done in different environments, for a clearer understanding of the processes involved. A thorough knowledge of plant disease resistance, using a successful combination of microarray and other high-throughput techniques as well as biochemical, genetic, and cell biological experiments, is needed for practical application to secure and stabilize the yield of many crop plants. This review starts with a brief introduction to microarray technology, followed by the basics of plant-pathogen interaction, the use of DNA microarrays over the last decade to unravel the mysteries of plant-pathogen interaction, and ends with the future prospects of this technology.

  10. Experience with quality assurance in fuel design and manufacturing

    International Nuclear Information System (INIS)

    Holzer, R.; Nilson, F.

    1984-01-01

    The Quality Assurance/Quality Control activities for nuclear fuel design and manufacturing described here are coordinated under a common "Quality Assurance System for Fuel Assemblies and Associated Core Components", which regulates the QA functions of the development, design and manufacturing of fuel assemblies independent of the organizational assignment of the contributing technical groups. Some essential characteristics of the system are shown, using examples from design control, procurement, manufacturing and qualification of special processes. The experience has been very good; it allowed a flexible and well-controlled implementation of design and manufacturing innovations and contributed to the overall good fuel behavior. (orig.)

  11. Designing With Empathy: Humanizing Narratives for Inspired Healthcare Experiences.

    Science.gov (United States)

    Carmel-Gilfilen, Candy; Portillo, Margaret

    2016-01-01

    Designers can and should play a critical role in shaping a holistic healthcare experience by creating empathetic design solutions that foster a culture of care for patients, families, and staff. Using narrative inquiry as a design tool, this case study shares strategies for promoting empathy. Designing for patient-centered care infuses empathy into the creative process. Narrative inquiry offers a methodology to think about and create empathetic design that enhances awareness, responsiveness, and accountability. This article shares discoveries from a studio on empathetic design within an outpatient cancer care center. The studio engaged students in narrative techniques throughout the design process by incorporating aural, visual, and written storytelling. Benchmarking, observations, and interviews were merged with data drawn from scholarly evidence-based design literature reviews. Using an empathy-focused design process not only motivated students to be more engaged in the project but facilitated the generation of fresh and original ideas. Design solutions were innovative and impactful in supporting the whole person. Similarities as well as differences defined empathetic cancer care across projects and embodied concepts of design empowerment, design for the whole person, and design for healing. By becoming more conscious of empathy, those who create healthcare environments can better connect holistically to the user to take an experiential approach to design. Explicitly developing a mind-set that raises empathy to the forefront of the design process offers a breakthrough in design thinking that bridges the gap between what might be defined as "good design" and patient-centered care. © The Author(s) 2015.

  12. Book review. Design for Care: Innovating Healthcare Experience

    Directory of Open Access Journals (Sweden)

    Manuela Aguirre Ulloa

    2014-12-01

    Full Text Available Adapted from a review of the same book published by The Design Observer Group on April 4th, 2014. You can access the original publication online at http://designobserver.com/feature/design-for-care/38382/ Peter Jones's recently published book represents a timely and comprehensive view of the value design brings to healthcare innovation. The book uses an empathic user story that conveys emotions and life to a structure that embraces the different meanings of Design for Care: spanning from caring at the personal level to large-scale caring systems. The author has a main objective for each of its three main target audiences: designers, companies and healthcare teams. Firstly, it allows designers to understand healthcare in a holistic and patient-centered way, breaking down specialized silos, and shows how to design better care experiences across care continuums. Secondly, for companies serving the healthcare sector, the book presents how to humanize information technology (IT) and services and meet the needs of health seekers. Finally, the book aims to inform healthcare teams (clinical practitioners and administrators) of the value design brings in research, co-creation and implementation of user and organizational experiences. It also proposes that healthcare teams learn and adopt design and systems thinking techniques so their innovation processes can be more participatory, holistic and user-centered.

  13. Structural Design Feasibility Study for the Global Climate Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Lewin,K.F.; Nagy, J.

    2008-12-01

    NEON, Inc. is proposing to establish a Global Change Experiment (GCE) Facility to increase our understanding of how ecological systems differ in their vulnerability to changes in climate and other relevant global change drivers, as well as provide the mechanistic basis for forecasting ecological change in the future. The experimental design was initially envisioned to consist of two complementary components: (A) a multi-factor experiment manipulating CO2, temperature and water availability and (B) a water balance experiment. As the design analysis and cost estimates progressed, it became clear that (1) the technical difficulties of obtaining tight temperature control and maintaining elevated atmospheric carbon dioxide levels within an enclosure were greater than had been expected and (2) the envisioned study would not fit into the expected budget envelope if this was done in a partially or completely enclosed structure. After discussions between NEON management, the GCE science team, and Keith Lewin, NEON, Inc. requested Keith Lewin to expand the scope of this design study to include open-field exposure systems. In order to develop the GCE design to the point where it can be presented within a proposal for funding, a feasibility study of climate manipulation structures must be conducted to determine design approaches and rough cost estimates, and to identify advantages and disadvantages of these approaches, including the associated experimental artifacts. NEON, Inc. requested this design study in order to develop concepts for the climate manipulation structures to support the NEON Global Climate Experiment. This study summarizes the design concepts considered for constructing and operating the GCE Facility and their associated construction, maintenance and operations costs. Comparisons and comments about experimental artifacts, construction challenges and operational uncertainties are provided to assist in selecting the final facility design. The overall goal

  14. Design and the question of contemporary aesthetic experiences

    DEFF Research Database (Denmark)

    Folkmann, Mads Nygaard; Jensen, Hans-Christian

    2017-01-01

    The article raises the question of the historical relativism of aesthetic experiences and argues that aesthetic experiences have changed according to new conditions in the contemporary age of globalization, mediatization and consumer culture. In this context, design gains attention as a primary case for aesthetic evaluation as design objects are, more than ever, framed and staged to be experienced aesthetically. Based on this starting point, the article argues that an understanding of contemporary aesthetic experiences requires a meeting of cultural theory and philosophical approaches. On the one hand, cultural theory is required to understand the changed conditions of the production, circulation and consumption of aesthetic meaning in cultural forms of art and design. On the other, philosophical aesthetics gives access to understanding the mechanisms of aesthetic judgments and how...

  15. Optimal Design of Shock Tube Experiments for Parameter Inference

    KAUST Repository

    Bisetti, Fabrizio

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial-based surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP-based approach is used to estimate the expected information gain in the proposed experiments, and to select the best experimental set-ups yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and extending experimental design methodology to the cases where the control parameters are noisy.
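
    The expected information gain behind such designs is often estimated by nested Monte Carlo over a cheap surrogate. The sketch below is a minimal, self-contained illustration of that estimator for a toy linear model with Gaussian noise; the model, prior and noise level are invented for illustration and stand in for the polynomial chaos surrogate used in the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        sigma = 0.1          # assumed observation noise
        N, M = 1000, 1000    # outer and inner Monte Carlo sample sizes

        def simulate(theta, x):
            """Toy surrogate: observable is linear in the unknown parameter."""
            return theta * x

        def expected_information_gain(x):
            # EIG(x) = E_{theta,y}[ log p(y|theta,x) - log p(y|x) ]
            thetas = rng.normal(1.0, 0.5, size=N)            # prior samples
            ys = simulate(thetas, x) + rng.normal(0, sigma, size=N)
            inner = rng.normal(1.0, 0.5, size=M)             # fresh prior samples
            eig = 0.0
            for th, y in zip(thetas, ys):
                log_lik = -0.5 * ((y - simulate(th, x)) / sigma) ** 2
                # Marginal likelihood approximated by averaging over the prior;
                # the shared Gaussian normalization constant cancels in the difference.
                lik_marg = np.exp(-0.5 * ((y - simulate(inner, x)) / sigma) ** 2).mean()
                eig += log_lik - np.log(lik_marg)
            return eig / N

        # Pick the control value (e.g. a temperature setting) with the largest gain.
        for x in (0.5, 1.0, 2.0):
            print(x, expected_information_gain(x))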

  16. Polyadenylation state microarray (PASTA) analysis.

    Science.gov (United States)

    Beilharz, Traude H; Preiss, Thomas

    2011-01-01

    Nearly all eukaryotic mRNAs terminate in a poly(A) tail that serves important roles in mRNA utilization. In the cytoplasm, the poly(A) tail promotes both mRNA stability and translation, and these functions are frequently regulated through changes in tail length. To identify the scope of poly(A) tail length control in a transcriptome, we developed the polyadenylation state microarray (PASTA) method. It involves the purification of mRNA based on poly(A) tail length using thermal elution from poly(U) sepharose, followed by microarray analysis of the resulting fractions. In this chapter we detail our PASTA approach and describe some methods for bulk and mRNA-specific poly(A) tail length measurements of use to monitor the procedure and independently verify the microarray data.

  17. Design and operating experiences with 50MW steam generator

    International Nuclear Information System (INIS)

    Kawara, M.; Yamaki, H.; Kanamori, A.; Tanaka, K.; Takahashi, T.

    1975-01-01

    The main purpose of the 50 MW steam generator is to gain experience in the manufacture and operation of a large-scale steam generator, including the necessary research and development work that can be reflected in the design and fabrication of 'Monju' (Japan's 300 MWe prototype LMFBR). The detailed design of the 50 MW steam generator was begun in March 1972, and a demonstration of 72 hours of continuous full-power operation was achieved in June 1974. It has been operated successfully since then, and its performance has been evaluated through various kinds of tests. In this paper, the following items are mainly discussed: system design, thermal and hydraulic design, structure and fabrication, and some experiences from testing operation, including cleaning and sodium flushing of equipment, the sodium level control system, the behavior of the hydrogen detection system and a general outlook of the performance. (author)

  18. Photon Detection System Designs for the Deep Underground Neutrino Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Whittington, Denver [Indiana U.

    2015-11-19

    The Deep Underground Neutrino Experiment (DUNE) will be a premier facility for exploring long-standing questions about the boundaries of the standard model. Acting in concert with the liquid argon time projection chambers underpinning the far detector design, the DUNE photon detection system will capture ultraviolet scintillation light in order to provide valuable timing information for event reconstruction. To maximize the active area while maintaining a small photocathode coverage, the experiment will utilize a design based on plastic light guides coated with a wavelength-shifting compound, along with silicon photomultipliers, to collect and record scintillation light from liquid argon. This report presents recent preliminary performance measurements of this baseline design and several alternative designs which promise significant improvements in sensitivity to low-energy interactions.

  19. Design and operating experiences with 50MW steam generator

    Energy Technology Data Exchange (ETDEWEB)

    Kawara, M; Yamaki, H; Kanamori, A; Tanaka, K; Takahashi, T

    1975-07-01

    The main purpose of the 50 MW steam generator is to gain experience in the manufacture and operation of a large-scale steam generator, including the necessary research and development work that can be reflected in the design and fabrication of 'Monju' (Japan's 300 MWe prototype LMFBR). The detailed design of the 50 MW steam generator was begun in March 1972, and a demonstration of 72 hours of continuous full-power operation was achieved in June 1974. It has been operated successfully since then, and its performance has been evaluated through various kinds of tests. In this paper, the following items are mainly discussed: system design, thermal and hydraulic design, structure and fabrication, and some experiences from testing operation, including cleaning and sodium flushing of equipment, the sodium level control system, the behavior of the hydrogen detection system and a general outlook of the performance. (author)

  20. Designing Meaningful Game Experiences for Rehabilitation and Sustainable Mobility Settings

    Directory of Open Access Journals (Sweden)

    Silvia Gabrielli

    2014-03-01

    Full Text Available This paper presents the approach followed in two ongoing research projects aimed at designing meaningful game-based experiences to support home rehabilitation, eco-sustainable mobility goals and, more generally, better daily lifestyles. We first introduce the need for designing meaningful game-based experiences that are well connected to the relevant non-game settings and can be customized by/for users; then we show examples of how this approach can be realized in the rehabilitation and sustainable mobility contexts.

  1. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)
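
    As a rough illustration of the difference between the two model classes, the sketch below contrasts a plain Langmuir isotherm with a competitive variant in which nonspecific targets occupy the same probe sites; the binding constants and concentrations are invented for illustration and do not come from the paper:

        import numpy as np

        def langmuir(c_s, K_s):
            """Fraction of probe sites bound by specific target, no competition."""
            return K_s * c_s / (1.0 + K_s * c_s)

        def competitive(c_s, K_s, c_ns, K_ns):
            """Competitive Langmuir: nonspecific background competes for sites."""
            return K_s * c_s / (1.0 + K_s * c_s + K_ns * c_ns)

        c_s = np.logspace(-12, -6, 7)       # specific target concentration (M)
        print(langmuir(c_s, K_s=1e9))
        print(competitive(c_s, K_s=1e9, c_ns=1e-6, K_ns=1e7))
        # With a strong nonspecific background the specific signal is suppressed,
        # which is one way sequence-specific probe properties enter the signal.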

  2. Design of a microwave calorimeter for the microwave tokamak experiment

    International Nuclear Information System (INIS)

    Marinak, M.

    1988-01-01

    The initial design of a microwave calorimeter for the Microwave Tokamak Experiment is presented. The design is optimized to measure the refraction and absorption of millimeter rf microwaves as they traverse the toroidal plasma of the Alcator C tokamak. Techniques utilized can be adapted for use in measuring high intensity pulsed output from a microwave device in an environment of ultra high vacuum, intense fields of ionizing and non-ionizing radiation and intense magnetic fields. 16 refs

  3. Designing Interactive Storytelling: A Virtual Environment for Personal Experience Narratives

    OpenAIRE

    Ladeira, Ilda; Marsden, Gary; Green, Lesley

    2011-01-01

    We describe an ongoing collaboration with the District Six Museum, in Cape Town, aimed at designing a storytelling prototype for preserving personal experience narratives. We detail the design of an interactive virtual environment (VE) which was inspired by a three month ethnography of real-life oral storytelling. The VE places the user as an audience member in a virtual group listening to two storytelling agents capable of two forms of i...

  4. Advanced plant design recommendations from Cook Nuclear Plant experience

    International Nuclear Information System (INIS)

    Zimmerman, W.L.

    1993-01-01

    A project in the American Electric Power Service Corporation to review operating and maintenance experience at Cook Nuclear Plant to identify recommendations for advanced nuclear plant design is described. Recommendations so gathered in the areas of plant fluid systems, instrument and control, testing and surveillance provisions, plant layout of equipment, provisions to enhance effective maintenance, ventilation systems, radiological protection, and construction, are presented accordingly. An example for a design review checklist for effective plant operations and maintenance is suggested

  5. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independently distributed...
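
    A minimal sketch of the contrast between one-factor-at-a-time analysis and DOE plus regression: the code below builds a full 2^3 factorial design, simulates a response from a hypothetical model, and recovers main effects and an interaction by least squares. The response function and coefficients are invented for illustration:

        import itertools
        import numpy as np

        rng = np.random.default_rng(1)

        # Full 2^3 factorial design in coded units (-1 / +1).
        design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

        # Hypothetical simulation response: two main effects plus an interaction.
        y = (5 + 2*design[:, 0] - 3*design[:, 1]
             + 1.5*design[:, 0]*design[:, 1]
             + rng.normal(0, 0.2, size=len(design)))

        # Regression matrix: intercept, three main effects, and the x1*x2 interaction.
        X = np.column_stack([np.ones(len(design)), design,
                             design[:, 0] * design[:, 1]])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(beta)  # estimates close to [5, 2, -3, 0, 1.5]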

  6. Applying operating experience to design the CANDU 3 process

    International Nuclear Information System (INIS)

    Harris, D.S.; Hinchley, E.M.; Pauksens, J.; Snell, V.; Yu, S.K.W.

    1991-01-01

    The CANDU 3 is an advanced, smaller (450 MWe), standardized version of the CANDU now being designed for service later in the decade and beyond. The design of this evolutionary nuclear power plant has been carefully planned and organized to gain maximum benefits from new technologies and from world experience to date in designing, building, commissioning and operating nuclear power stations. The good performance record of existing CANDU reactors makes consideration of operating experience from these plants a particularly vital component of the design process. Since the completion of the first four CANDU 6 stations in the early 1980s, and with the continuing evolution of the multi-unit CANDU station designs since then, AECL CANDU has devised several processes to ensure that such feedback is made available to designers. An important step was made in 1986 when a task force was set up to review and process ideas arising from the commissioning and early operation of the CANDU 6 reactors which were, by that time, operating successfully in Argentina and Korea, as well as the Canadian provinces of Quebec and New Brunswick. The task force issued a comprehensive report which, although aimed at the design of an improved CANDU 6 station, was made available to the CANDU 3 team. By that time also, the Institute of Nuclear Power Operations (INPO) in the U.S., of which AECL is a Supplier Participant member, was starting to publish Good Practices and Guidelines related to the review and the use of operating experiences. In addition, details of significant events were being made available via the INPO SEE-IN (Significant Event Evaluation and Information Network) Program, and subsequently the CANNET network of the CANDU Owners' Group (COG). Systematic review was thus possible by designers of operations reports, significant event reports, and related documents in a continuing program of design improvement. Another method of incorporating operations feedback is to involve experienced utility

  7. Applying operating experience to design the CANDU 3 process

    Energy Technology Data Exchange (ETDEWEB)

    Harris, D S; Hinchley, E M; Pauksens, J; Snell, V; Yu, S K.W. [AECL-CANDU, Ontario (Canada)

    1991-04-01

    The CANDU 3 is an advanced, smaller (450 MWe), standardized version of the CANDU now being designed for service later in the decade and beyond. The design of this evolutionary nuclear power plant has been carefully planned and organized to gain maximum benefits from new technologies and from world experience to date in designing, building, commissioning and operating nuclear power stations. The good performance record of existing CANDU reactors makes consideration of operating experience from these plants a particularly vital component of the design process. Since the completion of the first four CANDU 6 stations in the early 1980s, and with the continuing evolution of the multi-unit CANDU station designs since then, AECL CANDU has devised several processes to ensure that such feedback is made available to designers. An important step was made in 1986 when a task force was set up to review and process ideas arising from the commissioning and early operation of the CANDU 6 reactors which were, by that time, operating successfully in Argentina and Korea, as well as the Canadian provinces of Quebec and New Brunswick. The task force issued a comprehensive report which, although aimed at the design of an improved CANDU 6 station, was made available to the CANDU 3 team. By that time also, the Institute of Nuclear Power Operations (INPO) in the U.S., of which AECL is a Supplier Participant member, was starting to publish Good Practices and Guidelines related to the review and the use of operating experiences. In addition, details of significant events were being made available via the INPO SEE-IN (Significant Event Evaluation and Information Network) Program, and subsequently the CANNET network of the CANDU Owners' Group (COG). Systematic review was thus possible by designers of operations reports, significant event reports, and related documents in a continuing program of design improvement. Another method of incorporating operations feedback is to involve experienced utility

  8. Experience in Design and Learning Approaches – Enhancing the Framework for Experience

    OpenAIRE

    Merja L.M. Bauters

    2017-01-01

    In design and learning studies, an increasing amount of attention has been paid to experience. Many design approaches relate experience to embodiment and phenomenology. The growth in the number of applications that use the Internet of Things (IoT) has shifted human interactions from mobile devices and computers to tangible, material things. In education, the pressure to learn and update skills and knowledge, especially in work environments, has underlined the challenge of understanding how wo...

  9. The buffer/container experiment design and construction report

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, N.A.; Wan, A.W.L.; Roach, P.J

    1998-03-01

    The Buffer/Container Experiment was a full-scale in situ experiment, installed at a depth of 240 m in granitic rock at AECL's Underground Research Laboratory (URL). The experiment was designed to examine the performance of a compacted sand-bentonite buffer material under the influences of elevated temperature and in situ moisture conditions. Buffer material was compacted in situ into a 5-m-deep, 1.24-m-diameter borehole drilled into the floor of an excavation. A 2.3-m long heater, representative of a nuclear fuel waste container, was placed within the buffer, and instrumentation was installed to monitor changes in buffer moisture conditions, temperature and stress. The experiment was sealed at the top of the borehole and restrained against vertical displacement. Instrumentation in the rock monitored pore pressures, temperatures and rock displacement. The heater was operated at a constant power of 1200 W, which provided a heater skin temperature of approximately 85 degrees C. Experiment construction and installation required two years, followed by two and a half years of heater operation and two years of monitoring the rock conditions during cooling. The construction phase of the experiment included the design, construction and testing of a segmental heater and controller, geological and hydrogeological characterization of the rock, excavation of the experiment room, drilling of the emplacement borehole using high pressure water, mixing and in situ compaction of buffer material, installation of instrumentation in the rock, buffer and on the heater, and the construction of concrete curb and steel vertical restraint system at the top of emplacement borehole. Upon completion of the experiment, decommissioning sampling equipment was designed and constructed and sampling methods were developed which allowed approximately 2000 samples of buffer material to be taken over a 12-day period. Quality assurance procedures were developed for all aspects of experiment

  10. The buffer/container experiment design and construction report

    International Nuclear Information System (INIS)

    Chandler, N.A.; Wan, A.W.L.; Roach, P.J.

    1998-03-01

    The Buffer/Container Experiment was a full-scale in situ experiment, installed at a depth of 240 m in granitic rock at AECL's Underground Research Laboratory (URL). The experiment was designed to examine the performance of a compacted sand-bentonite buffer material under the influences of elevated temperature and in situ moisture conditions. Buffer material was compacted in situ into a 5-m-deep, 1.24-m-diameter borehole drilled into the floor of an excavation. A 2.3-m long heater, representative of a nuclear fuel waste container, was placed within the buffer, and instrumentation was installed to monitor changes in buffer moisture conditions, temperature and stress. The experiment was sealed at the top of the borehole and restrained against vertical displacement. Instrumentation in the rock monitored pore pressures, temperatures and rock displacement. The heater was operated at a constant power of 1200 W, which provided a heater skin temperature of approximately 85 degrees C. Experiment construction and installation required two years, followed by two and a half years of heater operation and two years of monitoring the rock conditions during cooling. The construction phase of the experiment included the design, construction and testing of a segmental heater and controller, geological and hydrogeological characterization of the rock, excavation of the experiment room, drilling of the emplacement borehole using high pressure water, mixing and in situ compaction of buffer material, installation of instrumentation in the rock, buffer and on the heater, and the construction of concrete curb and steel vertical restraint system at the top of emplacement borehole. Upon completion of the experiment, decommissioning sampling equipment was designed and constructed and sampling methods were developed which allowed approximately 2000 samples of buffer material to be taken over a 12-day period. Quality assurance procedures were developed for all aspects of experiment construction

  11. Bayesian optimal experimental design for the Shock-tube experiment

    International Nuclear Information System (INIS)

    Terejanu, G; Bryant, C M; Miki, K

    2013-01-01

    The sequential optimal experimental design formulated as an information-theoretic sensitivity analysis is applied to the ignition delay problem using real experimental data. The optimal design is obtained by maximizing the statistical dependence between the model parameters and observables, which is quantified in this study using mutual information. This is naturally posed in the Bayesian framework. The study shows that by monitoring the information gain after each measurement update, one can design a stopping criterion for the experimental process which gives a minimal set of experiments to efficiently learn the Arrhenius parameters.

  12. The engineering design of the Tokamak Physics Experiment

    International Nuclear Information System (INIS)

    Schmidt, J.A.

    1994-01-01

    A mission and supporting physics objectives have been developed which establish an important role for the Tokamak Physics Experiment (TPX) in developing the physics basis for a future fusion reactor. The design of TPX includes advanced physics features, such as shaping and profile control, along with the capability of operating for very long pulses. The development of the superconducting magnets, actively cooled internal hardware, and remote maintenance will be an important technology contribution to future fusion projects, such as ITER. The Conceptual Design and Management Systems for TPX have been developed and reviewed, and the project is beginning Preliminary Design. If adequately funded, the construction project should be completed in the year 2000.

  13. Design of experiments for test of fuel element reliability

    International Nuclear Information System (INIS)

    Boehmert, J.; Juettner, C.; Linek, J.

    1989-01-01

    Changes of fuel element design and modifications of the operational conditions have to be tested in experiments and pilot projects for nuclear safety. Experimental design is a useful statistical method for minimizing the costs and risks of this procedure. The main problem of our work was to investigate the connection between the failure rate of fuel elements, sample size, confidence interval, and error probability. Using the statistical model of the binomial distribution, appropriate relations were derived and discussed. A stepwise procedure based on a modified sequential analysis according to Wald was developed as an introduction strategy for modifications of the fuel element design and of the operational conditions. (author)
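
    The binomial relations referred to here connect failure rate, sample size and confidence in a simple way. As a minimal sketch (not the authors' derivation), the code below computes the classical zero-failure test size: the smallest number of fuel elements that must operate without failure to demonstrate, at confidence C, that the failure rate does not exceed p:

        import math

        def zero_failure_sample_size(p_max, confidence):
            """Smallest n with (1 - p_max)**n <= 1 - confidence."""
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_max))

        # Demonstrate a failure rate below 1% with 95% confidence.
        n = zero_failure_sample_size(p_max=0.01, confidence=0.95)
        print(n)  # about 299 elements tested without a single failure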

  14. Design of Experiment Using Simulation of a Discrete Dynamical System

    Directory of Open Access Journals (Sweden)

    Mašek Jan

    2016-12-01

    Full Text Available The topic of the presented paper is a promising approach to achieving an optimal Design of Experiment (DoE), i.e. spreading of points within a design domain, using a simulation of a discrete dynamical system of interacting particles within an n-dimensional design space. The system of mutually repelling particles represents a physical analogy of the Audze-Eglājs (AE) optimization criterion and its periodical modification (PAE), respectively. The paper compares the performance of two approaches to implementation: a single-thread process using the Java language environment and a massively parallel solution employing the nVidia CUDA platform.
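
    The Audze-Eglājs criterion treats design points as repelling particles and sums a potential energy over all point pairs; lower energy means more uniform spreading. A minimal sketch of evaluating the criterion (not the authors' parallel implementation) for a random design:

        import numpy as np
        from scipy.spatial.distance import pdist

        def audze_eglajs(points):
            """AE potential energy: sum of 1/L_ij**2 over all point pairs."""
            d = pdist(points)            # pairwise Euclidean distances L_ij
            return np.sum(1.0 / d**2)

        rng = np.random.default_rng(2)
        random_design = rng.random((20, 3))          # 20 points in a 3-D unit cube
        print(audze_eglajs(random_design))
        # An optimizer (or the particle simulation described above) would move
        # points to reduce this energy, yielding a more uniform space filling.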

  15. Optimizing an experimental design for an electromagnetic experiment

    Science.gov (United States)

    Roux, Estelle; Garcia, Xavier

    2013-04-01

    Most geophysical studies focus on data acquisition and analysis, but another aspect which is gaining importance is the discussion on the acquisition of suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we get about the target and reducing the cost of the experiment, considering a wide range of constraints (logistical, financial, experimental …). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of one given design via the objective function) and stochastic optimization methods like genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs that fit several objective functions simultaneously. One main advantage of this kind of technique to design an experiment is that it does not require the acquisition of any data and can thus be easily conducted before any geophysical survey. Our new experimental design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (northern Spain CO2 sequestration test site). We show that a small number of well distributed observations have the potential to resolve the target. This simple test also points out the importance of a well chosen objective function. Finally, in the context of CO2 sequestration that motivates this study, we might be interested in maximizing the information we get about the reservoir layer. In that case, we show how the combination of two different objective functions considerably improves its resolution.

  16. Sustainability in Design Engineering Education; Experiences in Northern Europe

    NARCIS (Netherlands)

    Dewulf, K.; Wever, R.; Boks, C.; Bakker, C.; D'hulster, F.

    2009-01-01

    In recent years, the implementation of sustainability into the curricula of engineering has become increasingly important. This paper focuses on the experiences of integrating sustainability in Design Engineering education in the academic bachelor programs at Delft University of Technology in The

  17. Statistical aspects of quantitative real-time PCR experiment design

    Czech Academy of Sciences Publication Activity Database

    Kitchen, R.R.; Kubista, Mikael; Tichopád, Aleš

    2010-01-01

    Roč. 50, č. 4 (2010), s. 231-236 ISSN 1046-2023 R&D Projects: GA AV ČR IAA500520809 Institutional research plan: CEZ:AV0Z50520701 Keywords : Real-time PCR * Experiment design * Nested analysis of variance Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 4.527, year: 2010

  18. Educational Website Design Process: Changes in TPACK Competencies and Experiences

    Science.gov (United States)

    Önal, Nezih; Alemdag, Ecenaz

    2018-01-01

    The number of technological pedagogical and content knowledge (TPACK) studies has been increasing day by day; however, only a limited number of studies have provided both quantitative and qualitative findings based on teachers' learning-by-design experiences. This study aimed to reveal the changes in pre-service teachers' TPACK competencies in the…

  19. An experiment designed to verify the general theory of relativity

    International Nuclear Information System (INIS)

    Surdin, Maurice

    1960-01-01

    The paper presents the project for an experiment which uses the effect of gravitation on maser-type clocks placed on the ground at two different heights and which is designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960 [fr]

  20. Design of experiment approach for the process optimization of ...

    African Journals Online (AJOL)

    Mulberry is considered a food-medicine herb, with specific nutritional and medicinal values. In this study, response surface methodology (RSM) was employed to optimize the ultrasonic-assisted extraction of total polysaccharide from mulberry using a Box-Behnken design (BBD). Based on single-factor experiments, a three ...
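
    For readers unfamiliar with the design, a three-factor Box-Behnken design combines the ±1 levels of each pair of factors while the remaining factor is held at its centre level, plus replicated centre points. A minimal sketch of constructing the design matrix in coded units (illustrative; these are not the settings used in the study):

        import itertools
        import numpy as np

        def box_behnken(n_factors=3, n_center=3):
            rows = []
            for i, j in itertools.combinations(range(n_factors), 2):
                for a, b in itertools.product([-1, 1], repeat=2):
                    row = [0] * n_factors     # remaining factors at centre level
                    row[i], row[j] = a, b
                    rows.append(row)
            rows.extend([[0] * n_factors] * n_center)   # centre-point replicates
            return np.array(rows)

        design = box_behnken()
        print(design.shape)   # (15, 3): 12 edge-midpoint runs + 3 centre points
        print(design)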

  1. The Design and Evaluation of Teaching Experiments in Computer Science.

    Science.gov (United States)

    Forcheri, Paola; Molfino, Maria Teresa

    1992-01-01

    Describes a relational model that was developed to provide a framework for the design and evaluation of teaching experiments for the introduction of computer science in secondary schools in Italy. Teacher training is discussed, instructional materials are considered, and use of the model for the evaluation process is described. (eight references)…

  2. From Content to Context: Videogames as Designed Experience

    Science.gov (United States)

    Squire, Kurt

    2006-01-01

    Interactive immersive entertainment, or videogame playing, has emerged as a major entertainment and educational medium. As research and development initiatives proliferate, educational researchers might benefit by developing more grounded theories about them. This article argues for framing game play as a "designed experience." Players'…

  3. Aircraft wind tunnel characterisation using modern design of experiments

    CSIR Research Space (South Africa)

    Dias, JF

    2013-04-01

    Full Text Available included a structural analysis, reinforcement and instrumentation of the model. As part of the MDOE technique, the experiment design of the test program had to be considered and analysed prior to the testing itself. The tests were then executed and the data...

  4. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  5. A system for designing and simulating particle physics experiments

    International Nuclear Information System (INIS)

    Zelazny, R.; Strzalkowski, P.

    1987-01-01

    In view of the rapid development of experimental facilities and their costs, the systematic design and preparation of particle physics experiments have become crucial. A software system is proposed as an aid for the experimental designer, mainly for experimental geometry analysis and experimental simulation. The following model is adopted: the description of an experiment is formulated in a language (here called XL) and put by its processor into a data base. The language is based on the entity-relationship-attribute approach. The information contained in the data base can be reported and analysed by an analyser (called XA) and modifications can be made at any time. In particular, Monte Carlo methods can be used in experiment simulation for both the physical phenomena in the experimental set-up and detection analysis. The general idea of the system is based on the design concept of ISDOS project information systems. The characteristics of the simulation module are similar to those of the CERN Geant system, but some extensions are proposed. The system could be treated as a component of a larger, integrated software environment for the design of particle physics experiments, their monitoring and data processing. (orig.)

  6. Designing Curricular Experiences that Promote Young Adolescents' Cognitive Growth

    Science.gov (United States)

    Brown, Dave F.; Canniff, Mary

    2007-01-01

    One of the most challenging daily experiences of teaching young adolescents is helping them transition from Piaget's concrete to the formal operational stage of cognitive development during the middle school years. Students who have reached formal operations can design and test hypotheses, engage in deductive reasoning, use flexible thinking,…

  7. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    Science.gov (United States)

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server side for the storage of microarray datasets collected from various resources. The client side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.
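
    One common meta-analysis step such a system might plug in is the combination of per-gene p-values across independent microarray experiments. A minimal sketch using Fisher's method via SciPy (illustrative; MAMA's actual algorithms are not specified here):

        import numpy as np
        from scipy.stats import combine_pvalues

        # Hypothetical p-values for one gene from three independent experiments.
        pvals = np.array([0.04, 0.11, 0.005])

        # Fisher's method: -2 * sum(log p_i) follows a chi-squared distribution
        # with 2k degrees of freedom under the null hypothesis.
        stat, p_combined = combine_pvalues(pvals, method="fisher")
        print(f"chi2 = {stat:.2f}, combined p = {p_combined:.4f}")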

  8. Construction of a 21-Component Layered Mixture Experiment Design

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Jones, Bradley

    2004-01-01

    This paper describes the solution to a unique and challenging mixture experiment design problem involving: (1) 19 and 21 components for two different parts of the design, (2) many single-component and multi-component constraints, (3) augmentation of existing data, (4) a layered design developed in stages, and (5) a no-candidate-point optimal design approach. The problem involved studying the liquidus temperature of spinel crystals as a function of nuclear waste glass composition. The statistical objective was to develop an experimental design by augmenting existing glasses with new nonradioactive and radioactive glasses chosen to cover the designated nonradioactive and radioactive experimental regions. The existing 144 glasses were expressed as 19-component nonradioactive compositions and then augmented with 40 new nonradioactive glasses. These included 8 glasses on the outer layer of the region, 27 glasses on an inner layer, 2 replicate glasses at the centroid, and one replicate each of three existing glasses. Then, the 144 + 40 = 184 glasses were expressed as 21-component radioactive compositions and augmented with 5 radioactive glasses. A D-optimal design algorithm was used to select the new outer layer, inner layer, and radioactive glasses. Several statistical software packages can generate D-optimal experimental designs, but nearly all require a set of candidate points (e.g., vertices) from which to select design points. The large number of components (19 or 21) and many constraints made it impossible to generate the huge number of vertices and other typical candidate points. JMP® was used to select design points without candidate points. JMP uses a coordinate-exchange algorithm modified for mixture experiments, which is discussed in the paper.
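
    The D-optimal selection underlying such designs scores a candidate set of points by the determinant of the information matrix X'X of the model. A minimal sketch for a 3-component Scheffé linear mixture model, comparing a vertices-plus-centroid design against random blends (illustrative; the paper's 19- and 21-component problem and its coordinate-exchange algorithm are far larger):

        import numpy as np

        def d_criterion(X):
            """log-determinant of the information matrix; larger is better."""
            return np.linalg.slogdet(X.T @ X)[1]

        # Scheffé linear model for a mixture: columns are component proportions.
        vertices_plus_centroid = np.array([
            [1, 0, 0], [0, 1, 0], [0, 0, 1],
            [1/3, 1/3, 1/3],
        ], dtype=float)

        rng = np.random.default_rng(3)
        random_blends = rng.dirichlet([1, 1, 1], size=4)   # 4 random mixtures

        print(d_criterion(vertices_plus_centroid))  # typically the larger value
        print(d_criterion(random_blends))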

  9. A strategic map for high-impact virtual experience design

    Science.gov (United States)

    Faste, Haakon; Bergamasco, Massimo

    2009-02-01

    We have employed methodologies of human-centered design to inspire and guide the engineering of a definitive low-cost aesthetic multimodal experience intended to stimulate cultural growth. Using a combination of design research, trend analysis and the programming of immersive virtual 3D worlds, over 250 innovative concepts have been brainstormed, prototyped, evaluated and refined. These concepts have been used to create a strategic map for the development of high-impact virtual art experiences, the most promising of which have been incorporated into a multimodal environment programmed in the online interactive 3D platform XVR. A group of test users have evaluated the experience as it has evolved, using a multimodal interface with stereo vision, 3D audio and haptic feedback. This paper discusses the process, content, results, and impact on our engineering laboratory that this research has produced.

  10. Statistical aspects of quantitative real-time PCR experiment design.

    Science.gov (United States)

    Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales

    2010-04-01

    Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect; the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
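
    The prospective power calculation described here can be approximated from pilot estimates of the variance components. The sketch below uses a simple normal approximation for a two-group comparison with nested biological and technical replicates; the variance values are invented for illustration and the formula is a generic nested-design approximation, not the exact powerNest implementation:

        import numpy as np
        from scipy.stats import norm

        def prospective_power(effect, var_bio, var_tech, n_bio, n_tech, alpha=0.05):
            """Normal-approximation power for a two-group comparison where each
            group has n_bio biological replicates, each measured n_tech times."""
            var_mean = (var_bio + var_tech / n_tech) / n_bio   # variance of a group mean
            se = np.sqrt(2.0 * var_mean)                       # SE of the difference
            z_crit = norm.ppf(1.0 - alpha / 2.0)
            return norm.cdf(abs(effect) / se - z_crit)

        # Pilot estimates (hypothetical): biological and technical variances of Cq values.
        print(prospective_power(effect=1.0, var_bio=0.4, var_tech=0.1,
                                n_bio=6, n_tech=2))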

  11. Design choices and issues in fixed-target B experiments

    International Nuclear Information System (INIS)

    Camilleri, L.

    1993-01-01

    The main priority of any experiment on B physics in the years to come will be an endeavour to observe CP violation in the B sector. Such measurements imply the following requirements of the experiment. Trigger: a muon trigger will be sensitive to J/ψ reactions and muon tags; an electron trigger will double the number of lepton events; in order to include kaon tags and self-tagging reactions, the experiment must not rely entirely on lepton triggers. Secondary vertex triggers and hadron pT triggers should be included in order to have the maximum flexibility. Detector: vertex detector; particle identification; good momentum resolution; electromagnetic and hadronic calorimeters; muon detector. In addition the following issues have to be addressed: Collider or fixed-target mode? If fixed target, extracted beam or internal target? If internal target, gas jet or wire target? If a gas jet, hydrogen or a heavy gas? Beam pipe design. Silicon microvertex design and radiation damage. K0S decay path. Particle identification. Momentum resolution. Order of detectors. No single method stands out as the "obvious one". An extracted beam yields better vertex resolution and an internal target easier triggering. A flexible and diverse triggering scheme is of prime importance in order to be sensitive to as many reactions as possible; the experiment should not be limited to lepton triggers only. Proposed experiments (P867, HERA-B) at existing machines will be invaluable for testing new devices and strategies for the LHC and SSC experiments

  12. Microarrays in brain research: the good, the bad and the ugly.

    Science.gov (United States)

    Mirnics, K

    2001-06-01

    Making sense of microarray data is a complex process, in which the interpretation of findings will depend on the overall experimental design and judgement of the investigator performing the analysis. As a result, differences in tissue harvesting, microarray types, sample labelling and data analysis procedures make post hoc sharing of microarray data a great challenge. To ensure rapid and meaningful data exchange, we need to create some order out of the existing chaos. In these ground-breaking microarray standardization and data sharing efforts, NIH agencies should take a leading role.

  13. Using interactive model simulations in co-design: An experiment in urban design

    NARCIS (Netherlands)

    Steen, M.G.D.; Arendsen, J.; Cremers, A.H.M.; Vries, A. de; Jong, J.M.G. de; Koning, N.M. de

    2013-01-01

    This paper presents an experiment in which people performed a co-design task in urban design, using a multi-user touch table application with or without interactive model simulations. We hypothesised that using the interactive model simulations would improve communication and co-operation between

  14. Explorations in Teaching Sustainable Design: A Studio Experience in Interior Design/Architecture

    Science.gov (United States)

    Gurel, Meltem O.

    2010-01-01

    This article argues that a design studio can be a dynamic medium to explore the creative potential of the complexity of sustainability from its technological to social ends. The study seeks to determine the impact of an interior design/architecture studio experience that was initiated to teach diverse meanings of sustainability and to engage the…

  15. Experience in Design and Learning Approaches – Enhancing the Framework for Experience

    Directory of Open Access Journals (Sweden)

    Merja L.M. Bauters

    2017-06-01

    In design and learning studies, an increasing amount of attention has been paid to experience. Many design approaches relate experience to embodiment and phenomenology. The growth in the number of applications that use the Internet of Things (IoT) has shifted human interactions from mobile devices and computers to tangible, material things. In education, the pressure to learn and update skills and knowledge, especially in work environments, has underlined the challenge of understanding how workers learn from reflection while working. These directions have been fuelled by research findings in the neurosciences, embodied cognition, the extended phenomenological–cognitive system and the role of emotions in decision-making and meaning making. The perspective on experience in different disciplines varies, and the aim is often to categorise experience. These approaches provide a worthwhile view of the importance of experience in learning and design, such as the recent emphasis on conceptual and epistemological knowledge creation. In pragmatism, experience plays a considerable role in research, art, communication and reflection. Therefore, I rely on Peirce's communicative theory of signs and Dewey's philosophy of experience to examine how experience is connected to reflection and therefore how it is necessarily tangible.

  16. Enhancing the Therapy Experience Using Principles of Video Game Design.

    Science.gov (United States)

    Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison

    2016-02-01

    This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.

  17. Optimal experiment design for identification of grey-box models

    DEFF Research Database (Denmark)

    Sadegh, Payman; Melgaard, Henrik; Madsen, Henrik

    1994-01-01

    Optimal experiment design is investigated for stochastic dynamic systems where the prior partial information about the system is given as a probability distribution function in the system parameters. The concept of information is related to entropy reduction in the system through Lindley's measure of average information, and the relationship between the choice of information-related criteria and some estimators (MAP and MLE) is established. A continuous-time physical model of the heat dynamics of a building is considered, and the results show that performing an optimal experiment corresponding to a MAP estimation results in a considerable reduction of the experimental length. Besides, it is established that the physical knowledge of the system enables us to design experiments with the goal of maximizing information about the physical parameters of interest.

  18. Current Knowledge on Microarray Technology - An Overview

    African Journals Online (AJOL)

    Erah

    This paper reviews basics and updates of each microarray technology and serves to ... through protein microarrays. Protein microarrays, also known as protein chips, are nothing but grids that ... conditioned media, patient sera, plasma and urine. Clontech ... based antibody arrays) is similar to membrane-based antibody ...

  19. Diagnostic and analytical applications of protein microarrays

    DEFF Research Database (Denmark)

    Dufva, Hans Martin; Christensen, C.B.V.

    2005-01-01

    DNA microarrays have changed the field of biomedical sciences over the past 10 years. For several reasons, antibody and other protein microarrays have not developed at the same rate. However, protein and antibody arrays have emerged as a powerful tool to complement DNA microarrays during the post...

  20. Safeguards by Design - Experiences from New Nuclear Installation

    International Nuclear Information System (INIS)

    Okko, O.; Honkamaa, T.; Kuusi, A.; Rautjaervi, J.

    2010-01-01

    The experiences obtained from the current construction projects at Olkiluoto clearly point out the need to introduce the safeguards requirements into the facility design process at an early stage. The early Design Information is completed, in principle, before the construction. However, during the design of containment, surveillance systems, and non-destructive assay equipment and their cabling, the design requirements for safeguards systems were not available either for the new reactor unit or for the disposal plant with a geological repository. Typically, the official Design Information documents are not available early enough for efficient integration of safeguards systems into new facilities. In the case of the Olkiluoto projects, this was due to understandable reasons: at the new reactor unit the design acceptance by the ordering company and by the nuclear safety authorities was a long process, ongoing simultaneously with parts of the construction; and at the geological repository the national legislation assigns the repository the status of a nuclear facility only after the initial construction and research phase, when the long-term safety of the disposal concept is demonstrated. Similar factors are likely to delay the completion of the official Design Information documents in any new reactor project until the construction is well underway, making efficient integration of safeguards systems impossible. Therefore, the proliferation resistance of new nuclear installations should be addressed in the design phase, before the official Design Information documents are finished. This approach was demonstrated with the enlargement of the Olkiluoto spent fuel storage building. For this approach to work, strong national contribution is needed to facilitate early communication and exchange of information between the IAEA and the other stakeholders, to enable the design of facilities that can be efficiently safeguarded. With the renaissance of nuclear ...

  1. A comprehensive comparison of random forests and support vector machines for microarray-based cancer classification

    Directory of Open Access Journals (Sweden)

    Wang Lily

    2008-07-01

    Abstract Background Cancer diagnosis and clinical outcome prediction are among the most important emerging applications of gene expression microarray technology, with several molecular signatures on their way toward clinical deployment. Use of the most accurate classification algorithms available for microarray gene expression data is a critical ingredient in order to develop the best possible molecular signatures for patient care. As suggested by a large body of literature to date, support vector machines can be considered "best of class" algorithms for classification of such data. Recent work, however, suggests that random forest classifiers may outperform support vector machines in this domain. Results In the present paper we identify methodological biases of prior work comparing random forests and support vector machines and conduct a new rigorous evaluation of the two algorithms that corrects these limitations. Our experiments use 22 diagnostic and prognostic datasets and show that support vector machines outperform random forests, often by a large margin. Our data also underline the importance of sound research design in benchmarking and comparison of bioinformatics algorithms. Conclusion We found that both on average and in the majority of microarray datasets, random forests are outperformed by support vector machines, both in the settings when no gene selection is performed and when several popular gene selection methods are used.
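
    The evaluation protocol described here is straightforward to reproduce in outline. Below is a hedged sketch using scikit-learn on synthetic stand-in data (the paper's 22 diagnostic and prognostic datasets are not bundled); note that any gene selection must be nested inside the cross-validation folds to avoid the selection bias the authors criticise.

      # Cross-validated comparison of a linear SVM and a random forest on
      # high-dimensional, small-sample data mimicking expression matrices.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=100, n_features=2000,
                                 n_informative=50, random_state=0)

      for name, clf in [("SVM", SVC(kernel="linear", C=1.0)),
                        ("RF", RandomForestClassifier(n_estimators=500, random_state=0))]:
          scores = cross_val_score(clf, X, y, cv=5)
          print(f"{name}: mean accuracy {scores.mean():.3f}")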

  2. Design of a reacceleration experiment using the Choppertron

    International Nuclear Information System (INIS)

    Fiorentini, G.M.; Wang, C.; Houck, T.L.

    1993-01-01

    The Microwave Source Facility at the Lawrence Livermore National Laboratory is commencing a series of experiments involving reacceleration of a modulated beam alternating with extraction of energy in the form of X-band microwaves. The Choppertron, a high-power microwave generator, is used to modulate a 5-MV, 1-kA induction accelerator beam. The modulated beam is then passed through a series of traveling-wave output structures separated by induction cells. In this paper we report on computer simulations used in the design of these experiments. Simulations include analysis of beam transport, modulation, power extraction and transverse instabilities.

  3. SSSFD manipulator engineering using statistical experiment design techniques

    Science.gov (United States)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6-degree-of-freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost-efficient tools were combined to perform a study of robot manipulator design parameters. These tools are graphical computer simulations and Taguchi Design of Experiments methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.
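
    The core of such a study is running the simulation over a designed array of parameter settings and reading off main effects. A minimal sketch, assuming a two-level full factorial over three hypothetical kinematic parameters and a placeholder simulate() standing in for the graphics-based robot simulation:

      # Main effects of coded (-1/+1) design parameters on a dexterity score.
      import itertools
      import numpy as np

      def simulate(link1, link2, offset):
          # Stand-in for one robot-simulation run; returns a performance score.
          return 5.0 + 1.2 * link1 - 0.8 * link2 + 0.3 * link1 * link2 + 0.1 * offset

      runs = np.array(list(itertools.product([-1, 1], repeat=3)))
      y = np.array([simulate(*run) for run in runs])

      for j, name in enumerate(["link1", "link2", "offset"]):
          effect = y[runs[:, j] == 1].mean() - y[runs[:, j] == -1].mean()
          print(f"{name}: main effect {effect:+.2f}")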

  4. Design of a materials testing experiment for the INTOR

    International Nuclear Information System (INIS)

    Vogel, M.A.; Opperman, E.K.

    1981-01-01

    The United States, Japan, USSR and the European Community are jointly participating in the design of an International Tokamak Reactor called INTOR. In support of the US contribution to the INTOR design, the features of an experiment for bulk neutron irradiation damage studies were developed. It is anticipated that materials testing will be an important part of the programmatic mission of INTOR, and consequently the requirements for materials testing in INTOR must be identified early in the reactor design to ensure compatibility. The design features of the experiment, called a Channel Test, are given in this paper. The major components of the channel test are the water-cooled heat sink (channel module) and the specimen capsule. The temperature within each of the 153 specimen capsules is predetermined by engineering the thermal barrier between the specimen capsule and heat sink. Individual capsules can be independently accessed and are designed to operate at a predetermined temperature within the range of 50 to 700°C. The total irradiation volume within a single channel test is 45 liters. Features of the channel test that result in experimental versatility and simplified remote access and handling are discussed.

  5. Calling biomarkers in milk using a protein microarray on your smartphone

    NARCIS (Netherlands)

    Ludwig, S.K.J.; Tokarski, Christian; Lang, Stefan N.; Ginkel, Van L.A.; Zhu, Hongying; Ozcan, Aydogan; Nielen, M.W.F.

    2015-01-01

    Here we present the concept of a protein microarray-based fluorescence immunoassay for multiple biomarker detection in milk extracts by an ordinary smartphone. A multiplex immunoassay was designed on a microarray chip, having built-in positive and negative quality controls. After the immunoassay

  6. CANDU 9 Design improvements based on experience feedback

    International Nuclear Information System (INIS)

    Yu, S. K. W.; Bonechi, M.; Snell, V. G.

    2000-01-01

    An evolutionary approach utilizing advanced technologies has been implemented for the enhancements introduced in the CANDU 9 Nuclear Power Plant (NPP) design. The design of these systems and associated equipment has also benefited from experience feedback from operating CANDU stations and from including advanced products from CANDU engineering and research programs. This paper highlights the design features that contribute to the safety improvements of the CANDU 9 design, summarizes the analysis results which demonstrate the improved performance, and also emphasizes design features which reduce operation and maintenance (O and M) costs. The safety design features highlighted include the increased use of passive devices and heat sinks to achieve extensive system simplification; this also improves reliability and reduces maintenance workloads. System features that contribute to improved operability are also described. The CANDU 9 Control Center provides plant staff with enhanced operating, maintenance and diagnostics features which significantly improve operability, testing and maintainability, due to the integration of human factors engineering with a systematic design process. (author)

  7. Scaling studies and conceptual experiment designs for NGNP CFD assessment

    Energy Technology Data Exchange (ETDEWEB)

    D. M. McEligot; G. E. McCreery

    2004-11-01

    The objective of this report is to document scaling studies and conceptual designs for flow and heat transfer experiments intended to assess CFD codes and their turbulence models proposed for application to prismatic NGNP concepts. The general approach of the project is to develop new benchmark experiments for assessment in parallel with CFD and coupled CFD/systems code calculations for the same geometry. Two aspects of the complex flow in an NGNP are being addressed: (1) flow and thermal mixing in the lower plenum ("hot streaking" issue) and (2) turbulence and resulting temperature distributions in reactor cooling channels ("hot channel" issue). Current prismatic NGNP concepts are being examined to identify their proposed flow conditions and geometries over the range from normal operation to decay heat removal in a pressurized cooldown. Approximate analyses have been applied to determine key non-dimensional parameters and their magnitudes over this operating range. For normal operation, the flow in the coolant channels can be considered to be dominantly turbulent forced convection with slight transverse property variation. In a pressurized cooldown (LOFA) simulation, the flow quickly becomes laminar with some possible buoyancy influences. The flow in the lower plenum can locally be considered to be a situation of multiple hot jets into a confined crossflow -- with obstructions. Flow is expected to be turbulent with momentum-dominated turbulent jets entering; buoyancy influences are estimated to be negligible in normal full-power operation. Experiments are needed for the combined features of the lower plenum flows. Missing from the typical jet experiments available are interactions with nearby circular posts and with vertical posts in the vicinity of vertical walls - with near-stagnant surroundings at one extreme and significant crossflow at the other. Two types of heat transfer experiments are being considered. One addresses the "hot channel" problem, if necessary ...

  8. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    Science.gov (United States)

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. In Experiment 1, 27 representative color samples were designed, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can obtain the optimal solution for color design of a counseling room.
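
    The optimisation step in Experiment 2 amounts to fitting a second-order response surface over the three CIELAB attributes and locating its maximum. A minimal sketch with illustrative coefficients (not the paper's fitted values):

      # Quadratic response surface y = b0 + b.x + x'Bx over deviations
      # x = (dL*, da*, db*) from the centre point (75, 0, -60).
      import numpy as np
      from scipy.optimize import minimize

      b0 = 8.0
      b = np.array([0.02, -0.01, 0.03])
      B = np.array([[-0.004,  0.000,  0.000],
                    [ 0.000, -0.003,  0.001],
                    [ 0.000,  0.001, -0.005]])   # negative definite: a maximum exists

      def satisfaction(x):
          return b0 + b @ x + x @ B @ x

      res = minimize(lambda x: -satisfaction(x), x0=np.zeros(3))
      print("optimal (L*, a*, b*):", np.array([75.0, 0.0, -60.0]) + res.x)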

  9. Adaptive Lighting Design – Staged Experiences of Light

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin

    Adaptive Lighting Design – Staged Experiences of Light. The two installations, White Cube and White Box, enable experience-based studies as a form of perceptual activity, wherein lighting conditions are examined in a dialectical exchange between the system and the people participating. Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people's needs and desires. Software can be seen to bear a communicative aesthetic, where the relation of user situations and the design intentions are controlled in ways that meaningfully adapt. In the two installations, two different aspects are at play. In White Cube, the light colours are balanced. In White Box, the light follows the movements of the people in the space. In situations with several people occupying the same space, social relations become ...

  10. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Robust design methodology aims at reducing the variability in product performance in the presence of noise factors. Experiments involving simultaneous optimization of more than one quality characteristic are known as multiresponse experiments, which are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters during a particular operation of a rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of type Nominal-the-best, Smaller-the-better and Fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using an L9 orthogonal array.
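
    Each of the three response types named above has its own Taguchi signal-to-noise statistic, computed per orthogonal-array run from replicated observations. A sketch of the standard formulas (the data values are illustrative):

      # Taguchi S/N ratios; larger values are better in every case.
      import numpy as np

      def sn_nominal_the_best(y):
          y = np.asarray(y, dtype=float)
          return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

      def sn_smaller_the_better(y):
          y = np.asarray(y, dtype=float)
          return -10 * np.log10(np.mean(y ** 2))

      def sn_fraction_defective(p):
          # p is the fraction defective observed for one run, 0 < p < 1.
          return -10 * np.log10(p / (1 - p))

      print(sn_nominal_the_best([10.1, 9.9, 10.0]))     # e.g. a shaft diameter
      print(sn_smaller_the_better([0.12, 0.08, 0.10]))  # e.g. runout
      print(sn_fraction_defective(0.02))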

  11. Exploring The Art of Urban Design as Sensorial Experience

    DEFF Research Database (Denmark)

    Smith, Shelley

    2015-01-01

    Contemporary urbanity is characterised by factors such as a large scale, hypermobility and momentary temporality. In terms of the physical spaces that these factors generate and the presence of users in them, this translates to spaces that challenge the ability to perceive and participate due to their size, speed and flux. Relating to aesthetics as perception, the art of urban design in contemporary urbanity then seems to lie in accessing the ability to experience in a setting that makes it difficult to do so. This paper will explore the potential for aesthetic (sensorial) experience as the art of urban design, and look for that potential in the constituent elements and subsequent spaces of contemporary urbanity from the perspective of both the user and the practitioner.

  12. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.

    2003-01-01

    GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with a specification of the data. The server performs normalization ...

  13. Development of DNA Microarrays for Metabolic Pathway and Bioprocess Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gregory Stephanopoulos

    2004-07-31

    Transcriptional profiling experiments utilizing DNA microarrays to study the intracellular accumulation of PHB in Synechocystis has proved difficult in large part because strains that show significant differences in PHB which would justify global analysis of gene expression have not been isolated.

  14. How Good is Your User Experience? Measuring and Designing Interactions

    Directory of Open Access Journals (Sweden)

    Wildner Raimund

    2015-11-01

    Form and function are important dimensions of consumer choice, but there is more in our increasingly digital world. It is not only products per se that need to be designed but the whole interaction between consumers and brands. The whole UX, or user experience, is more important than ever before. Digitalism nowadays is everywhere, and even mundane products are becoming more digital (e.g., ovens), while others evolve that are purely digital (e.g., PayPal).

  15. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    Science.gov (United States)

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    ... design. We highlight the substantial risk of planning the sample size for confirmatory trials when information is very uninformative and stipulate the advantages of adaptive statistical information designs for planning exploratory trials. Practical experiences and strategies, as lessons learned from more recent adaptive design proposals, will be discussed to pinpoint the improved utilities of adaptive design clinical trials and their potential to increase the chance of a successful drug development. Published 2012. This article is a US Government work and is in the public domain in the USA.

  16. DNA microarray data and contextual analysis of correlation graphs

    Directory of Open Access Journals (Sweden)

    Hingamp Pascal

    2003-04-01

    Abstract Background DNA microarrays are used to produce large sets of expression measurements from which specific biological information is sought. Their analysis requires efficient and reliable algorithms for dimensional reduction, classification and annotation. Results We study networks of co-expressed genes obtained from DNA microarray experiments. The mathematical concept of curvature on graphs is used to group genes or samples into clusters to which relevant gene or sample annotations are automatically assigned. Application to publicly available yeast and human lymphoma data demonstrates the reliability of the method in spite of its simplicity, especially with respect to the small number of parameters involved. Conclusions We provide a method for automatically determining relevant gene clusters among the many genes monitored with microarrays. The automatic annotations and the graphical interface improve the readability of the data. A C++ implementation, called Trixy, is available from http://tagc.univ-mrs.fr/bioinformatics/trixy.html.
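
    The starting object of such an analysis is the correlation graph itself: genes as nodes, with edges wherever co-expression is strong. A minimal sketch of that construction (the curvature-based grouping used by Trixy is not reproduced here; connected components stand in as the crudest possible clustering):

      # Build a gene co-expression graph and list its connected components.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      expr = rng.normal(size=(50, 12))      # 50 genes x 12 microarray samples
      corr = np.corrcoef(expr)              # gene-by-gene Pearson correlations

      threshold = 0.8
      G = nx.Graph()
      G.add_nodes_from(range(expr.shape[0]))
      for i in range(corr.shape[0]):
          for j in range(i + 1, corr.shape[1]):
              if abs(corr[i, j]) >= threshold:
                  G.add_edge(i, j, weight=float(corr[i, j]))

      components = list(nx.connected_components(G))
      print(f"{len(components)} components at |r| >= {threshold}")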

  17. The active phasing experiment: Part II. Design and developments

    Science.gov (United States)

    Gonte, F.; Yaitskova, N.; Derie, F.; Araujo, C.; Brast, R.; Delabre, B.; Dierickx, P.; Dupuy, C.; Frank, C.; Guisard, S.; Karban, R.; Noethe, L.; Sedghi, B.; Surdej, I.; Wilhelm, R.; Reyes, M.; Esposito, S.; Langlois, M.

    2006-06-01

    The purpose of the Active Phasing Experiment, designed under the lead of ESO, is to validate wavefront control concepts for ELT class telescopes. This instrument includes an Active Segmented Mirror, located in a pupil image. It will be mounted at a Nasmyth focus of one of the Unit Telescopes of the ESO VLT. APE contains four different types of phasing sensors, which are developed by Istituto Nazionale di Astrofisica in Arcetri, Instituto Astrofisica Canarias, Laboratoire d'Astrophysique de Marseille and ESO. These phasing sensors can be compared simultaneously under identical optical and environmental conditions. All sensors receive telecentric F/15 beams with identical optical quality and intensity. Each phasing sensor can measure segmentation errors of the active segmented mirror and correct them in closed loop. The phasing process is supervised by an Internal Metrology system developed by FOGALE Nanotech and capable of measuring piston steps with an accuracy of a few nanometers. The Active Phasing Experiment is equipped with a turbulence generator to simulate atmospheric seeing between 0.45 and 0.85 arcsec in the laboratory. In addition, the Active Phasing Experiment is designed to control simultaneously with the phasing corrections the guiding and the active optics of one of the VLT Unit Telescopes. This activity is supported by the European Community (Framework Programme 6, ELT Design Study, contract No 011863).

  18. A Fisheye Viewer for microarray-based gene expression data.

    Science.gov (United States)

    Wu, Min; Thao, Cheng; Mu, Xiangming; Munson, Ethan V

    2006-10-13

    Microarray has been widely used to measure the relative amounts of every mRNA transcript from the genome in a single scan. Biologists have been accustomed to reading their experimental data directly from tables. However, microarray data are quite large and are stored in a series of files in a machine-readable format, so direct reading of the full data set is not feasible. The challenge is to design a user interface that allows biologists to usefully view large tables of raw microarray-based gene expression data. This paper presents one such interface--an electronic table (E-table) that uses fisheye distortion technology. The Fisheye Viewer for microarray-based gene expression data has been successfully developed to view MIAME data stored in the MAGE-ML format. The viewer can be downloaded from the project web site http://polaris.imt.uwm.edu:7777/fisheye/. The fisheye viewer was implemented in Java so that it could run on multiple platforms. We implemented the E-table by adapting JTable, a default table implementation in the Java Swing user interface library. Fisheye views use variable magnification to balance magnification for easy viewing and compression for maximizing the amount of data on the screen. This Fisheye Viewer is a lightweight but useful tool for biologists to quickly overview the raw microarray-based gene expression data in an E-table.

  19. A fisheye viewer for microarray-based gene expression data

    Directory of Open Access Journals (Sweden)

    Munson Ethan V

    2006-10-01

    Abstract Background Microarray has been widely used to measure the relative amounts of every mRNA transcript from the genome in a single scan. Biologists have been accustomed to reading their experimental data directly from tables. However, microarray data are quite large and are stored in a series of files in a machine-readable format, so direct reading of the full data set is not feasible. The challenge is to design a user interface that allows biologists to usefully view large tables of raw microarray-based gene expression data. This paper presents one such interface – an electronic table (E-table) that uses fisheye distortion technology. Results The Fisheye Viewer for microarray-based gene expression data has been successfully developed to view MIAME data stored in the MAGE-ML format. The viewer can be downloaded from the project web site http://polaris.imt.uwm.edu:7777/fisheye/. The fisheye viewer was implemented in Java so that it could run on multiple platforms. We implemented the E-table by adapting JTable, a default table implementation in the Java Swing user interface library. Fisheye views use variable magnification to balance magnification for easy viewing and compression for maximizing the amount of data on the screen. Conclusion This Fisheye Viewer is a lightweight but useful tool for biologists to quickly overview the raw microarray-based gene expression data in an E-table.
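
    The distortion behind such a viewer is conceptually simple: a degree-of-interest function peaks at the focused row and decays with distance, and row heights follow it. A minimal sketch in the spirit of Furnas-style fisheye views (not the Java/JTable implementation described above; the constants are arbitrary):

      # Map each table row's distance from the focus to a display height.
      def degree_of_interest(row, focus, api=1.0, weight=0.2):
          # Higher near the focus; decays linearly with row distance.
          return api - weight * abs(row - focus)

      def row_height(row, focus, max_px=24, min_px=4):
          doi = max(degree_of_interest(row, focus), 0.0)
          return min_px + (max_px - min_px) * doi

      # Rows near the focused row keep full height; distant rows compress.
      for r in range(5, 16):
          print(r, row_height(r, focus=10))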

  20. Probe Selection for DNA Microarrays using OligoWiz

    DEFF Research Database (Denmark)

    Wernersson, Rasmus; Juncker, Agnieszka; Nielsen, Henrik Bjørn

    2007-01-01

    Nucleotide abundance measurements using DNA microarray technology are possible only if appropriate probes complementary to the target nucleotides can be identified. Here we present a protocol for selecting DNA probes for microarrays using the OligoWiz application. OligoWiz is a client-server application that offers a detailed graphical interface and real-time user interaction on the client side, and massive computer power and a large collection of species databases (400, summer 2007) on the server side. Probes are selected according to five weighted scores: cross-hybridization, deltaT(m), folding ... computer skills and can be executed from any Internet-connected computer. The probe selection procedure for a standard microarray design targeting all yeast transcripts can be completed in 1 h.
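
    The scoring scheme reduces to a weighted combination of per-criterion scores for each candidate probe. A minimal sketch, assuming all scores are normalised to [0, 1]; the weights are placeholders, and the last two criterion names (position, low-complexity) are an assumption drawn from OligoWiz documentation rather than from the truncated text above:

      # Combine per-criterion probe scores into one ranking score.
      WEIGHTS = {"cross_hyb": 0.30, "delta_tm": 0.25, "folding": 0.20,
                 "position": 0.15, "low_complexity": 0.10}

      def combined_score(scores):
          # scores: dict mapping criterion name -> value in [0, 1].
          return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

      candidate = {"cross_hyb": 0.90, "delta_tm": 0.80, "folding": 0.70,
                   "position": 0.95, "low_complexity": 1.00}
      print(f"combined probe score: {combined_score(candidate):.3f}")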

  1. Design of a cavity heat pipe receiver experiment

    Science.gov (United States)

    Schneider, Michael G.; Brege, Mark H.; Greenlee, William J.

    1992-01-01

    A cavity heat pipe experiment has been designed to test the critical issues involved with incorporating thermal energy storage canisters into a heat pipe. The experiment is a replication of the operation of a heat receiver for a Brayton solar dynamic power cycle. The heat receiver is composed of a cylindrical receptor surface and an annular heat pipe with thermal energy storage canisters and gaseous working fluid heat exchanger tubes surrounding it. Hardware for the cavity heat pipe experiment will consist of a sector of the heat pipe, complete with gas tube and thermal energy storage canisters. Thermal cycling tests will be performed on the heat pipe sector to simulate the normal energy charge/discharge cycle of the receiver in a spacecraft application.

  2. A Parallel Software Pipeline for DMET Microarray Genotyping Data Analysis

    Directory of Open Access Journals (Sweden)

    Giuseppe Agapito

    2018-06-01

    Personalized medicine is an aspect of P4 medicine (predictive, preventive, personalized and participatory) based precisely on the customization of all medical characters of each subject. In personalized medicine, the development of medical treatments and drugs is tailored to the individual characteristics and needs of each subject, according to the study of diseases at different scales, from genotype to phenotype. To make the goal of personalized medicine concrete, it is necessary to employ high-throughput methodologies such as Next Generation Sequencing (NGS), Genome-Wide Association Studies (GWAS), Mass Spectrometry or Microarrays, which are able to investigate a single disease from a broader perspective. A side effect of high-throughput methodologies is the massive amount of data produced for each single experiment, which poses several challenges (e.g., high execution time and required memory) to bioinformatics software. Thus a main requirement of modern bioinformatics software is the use of good software engineering methods and efficient programming techniques able to face those challenges, including the use of parallel programming and of efficient and compact data structures. This paper presents the design and experimentation of a comprehensive software pipeline, named microPipe, for the preprocessing, annotation and analysis of microarray-based Single Nucleotide Polymorphism (SNP) genotyping data. A use case in pharmacogenomics is presented. The main advantages of using microPipe are: the reduction of errors that may happen when trying to make data compatible among different tools; the possibility to analyze huge datasets in parallel; and the easy annotation and integration of data. microPipe is available under a Creative Commons license, and is freely downloadable for academic and not-for-profit institutions.
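
    The parallel stage of such a pipeline has a simple shape: independent per-sample or per-batch preprocessing fanned out across worker processes, with annotation and analysis downstream. A generic sketch of that shape (plain Python multiprocessing, not microPipe's actual API; the batch format is invented for illustration):

      # Fan per-sample genotype batches out to a pool of worker processes.
      from multiprocessing import Pool

      def preprocess(batch):
          # Stand-in per-batch work: count usable SNP calls per sample.
          sample, calls = batch
          return sample, sum(1 for call in calls if call != "NoCall")

      batches = [("s1", ["AA", "AB", "NoCall"]),
                 ("s2", ["BB", "AB", "AA"])]

      if __name__ == "__main__":
          with Pool(processes=2) as pool:
              for sample, n_ok in pool.map(preprocess, batches):
                  print(sample, n_ok)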

  3. Label and Label-Free Detection Techniques for Protein Microarrays

    Directory of Open Access Journals (Sweden)

    Amir Syahir

    2015-04-01

    Protein microarray technology has gone through numerous innovative developments in recent decades. In this review, we focus on the development of protein detection methods embedded in the technology. Early microarrays utilized useful chromophores and versatile biochemical techniques dominated by high-throughput illumination. Recently, the realization of label-free techniques has been greatly advanced by the combination of knowledge in material sciences, computational design and nanofabrication. These rapidly advancing techniques aim to provide data without the intervention of label molecules. Here, we present a brief overview of this remarkable innovation from the perspectives of label and label-free techniques in transducing nano-biological events.

  4. Calling Biomarkers in Milk Using a Protein Microarray on Your Smartphone

    Science.gov (United States)

    Ludwig, Susann K. J.; Tokarski, Christian; Lang, Stefan N.; van Ginkel, Leendert A.; Zhu, Hongying; Ozcan, Aydogan; Nielen, Michel W. F.

    2015-01-01

    Here we present the concept of a protein microarray-based fluorescence immunoassay for multiple biomarker detection in milk extracts by an ordinary smartphone. A multiplex immunoassay was designed on a microarray chip, having built-in positive and negative quality controls. After the immunoassay procedure, the 48 microspots were labelled with Quantum Dots (QD) depending on the protein biomarker levels in the sample. QD-fluorescence was subsequently detected by the smartphone camera under UV light excitation from LEDs embedded in a simple 3D-printed opto-mechanical smartphone attachment. The somewhat aberrant images obtained under such conditions were corrected by newly developed Android-based software on the same smartphone, and protein biomarker profiles were calculated. The indirect detection of recombinant bovine somatotropin (rbST) in milk extracts based on the altered biomarker profile of anti-rbST antibodies was selected as a real-life challenge. RbST-treated and untreated cows clearly showed reproducible treatment-dependent biomarker profiles in milk, in excellent agreement with results from a flow cytometer reference method. In a pilot experiment, anti-rbST antibody detection was multiplexed with the detection of another rbST-dependent biomarker, insulin-like growth factor 1 (IGF-1). Milk extract IGF-1 levels were found to be increased after rbST treatment and correlated with the results obtained from the reference method. These data clearly demonstrate the potential of the portable protein microarray concept towards simultaneous detection of multiple biomarkers. We envisage broad application of this 'protein microarray on a smartphone' concept for on-site testing, e.g., in food safety, environment and health monitoring. PMID:26308444

  5. Calling Biomarkers in Milk Using a Protein Microarray on Your Smartphone.

    Directory of Open Access Journals (Sweden)

    Susann K J Ludwig

    Here we present the concept of a protein microarray-based fluorescence immunoassay for multiple biomarker detection in milk extracts by an ordinary smartphone. A multiplex immunoassay was designed on a microarray chip, having built-in positive and negative quality controls. After the immunoassay procedure, the 48 microspots were labelled with Quantum Dots (QD) depending on the protein biomarker levels in the sample. QD-fluorescence was subsequently detected by the smartphone camera under UV light excitation from LEDs embedded in a simple 3D-printed opto-mechanical smartphone attachment. The somewhat aberrant images obtained under such conditions were corrected by newly developed Android-based software on the same smartphone, and protein biomarker profiles were calculated. The indirect detection of recombinant bovine somatotropin (rbST) in milk extracts based on the altered biomarker profile of anti-rbST antibodies was selected as a real-life challenge. RbST-treated and untreated cows clearly showed reproducible treatment-dependent biomarker profiles in milk, in excellent agreement with results from a flow cytometer reference method. In a pilot experiment, anti-rbST antibody detection was multiplexed with the detection of another rbST-dependent biomarker, insulin-like growth factor 1 (IGF-1). Milk extract IGF-1 levels were found to be increased after rbST treatment and correlated with the results obtained from the reference method. These data clearly demonstrate the potential of the portable protein microarray concept towards simultaneous detection of multiple biomarkers. We envisage broad application of this 'protein microarray on a smartphone' concept for on-site testing, e.g., in food safety, environment and health monitoring.

  6. An industrial approach to design compelling VR and AR experience

    Science.gov (United States)

    Richir, Simon; Fuchs, Philippe; Lourdeaux, Domitile; Buche, Cédric; Querrec, Ronan

    2013-03-01

    The convergence of technologies currently observed in the fields of VR, AR, robotics and consumer electronics reinforces the trend of new applications appearing every day. But when transferring knowledge acquired from research to businesses, research laboratories are often at a loss because of a lack of knowledge of the design and integration processes involved in creating an industrial-scale product. In fact, the innovation approaches that take a good idea from the laboratory to a successful industrial product are often little known to researchers. The objective of this paper is to present the results of the work of several research teams that have finalized a working method for researchers and manufacturers that allows them to design virtual or augmented reality systems and enable their users to enjoy "a compelling VR experience". That approach, called "the I2I method", presents 11 phases from "Establishing technological and competitive intelligence and industrial property" to "Improvements" through the "Definition of the Behavioral Interface, Virtual Environment and Behavioral Software Assistance". As a result of the experience gained by various research teams, this design approach benefits from contributions from current VR and AR research. Our objective is to validate and continuously move such multidisciplinary design team methods forward.

  7. Microarray-Based Identification of Transcription Factor Target Genes

    NARCIS (Netherlands)

    Gorte, M.; Horstman, A.; Page, R.B.; Heidstra, R.; Stromberg, A.; Boutilier, K.A.

    2011-01-01

    Microarray analysis is widely used to identify transcriptional changes associated with genetic perturbation or signaling events. Here we describe its application in the identification of plant transcription factor target genes with emphasis on the design of suitable DNA constructs for controlling TF

  8. HTGR nuclear heat source component design and experience

    International Nuclear Information System (INIS)

    Peinado, C.O.; Wunderlich, R.G.; Simon, W.A.

    1982-05-01

    The high-temperature gas-cooled reactor (HTGR) nuclear heat source components have been under design and development since the mid-1950s. Two power plants have been designed, constructed, and operated: the Peach Bottom Atomic Power Station and the Fort St. Vrain Nuclear Generating Station. Recently, development has focused on the primary system components for a 2240-MW(t) steam cycle HTGR capable of generating about 900 MW(e) of electric power or, alternatively, producing high-grade steam and cogenerating electric power. These components include the steam generators, core auxiliary heat exchangers, primary and auxiliary circulators, reactor internals, and thermal barrier system. A discussion of the design and operating experience of these components is included.

  9. Heat experiment design to estimate temperature dependent thermal properties

    International Nuclear Information System (INIS)

    Romanovski, M

    2008-01-01

    Experimental conditions are studied to optimize transient experiments for estimating temperature-dependent thermal conductivity and volumetric heat capacity. A mathematical model of a specimen is the one-dimensional heat equation with boundary conditions of the second kind. Thermal properties are assumed to vary nonlinearly with temperature. Experimental conditions refer to the thermal loading scheme, sampling times and sensor location. A numerical model of experimental configurations is studied to elicit the optimal conditions. The numerical solution of the design problem is formulated as a regularization scheme with stabilizer minimization and no regularization parameter. An explicit design criterion is used to reveal the optimal sensor location, heating duration and flux magnitude. The results obtained indicate that even the strongly nonlinear experimental design problem admits aggregation of its solution and has a strictly defined optimal measurement scheme. An additional region of temperature measurements with allowable identification error is also revealed.

  10. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

    Abstract Background Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large amounts of microarray data are now available in several databases, and new experiments are constantly being performed. In spite of this fact, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results To address these problems we have developed an automated microarray data analysis (AMDA) software package, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps, performing a full data analysis, including image analysis, quality controls, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally a LaTeX document is dynamically generated depending on the performed analysis steps. The generated report contains comments and analysis results as well as references to several files for deeper investigation. Conclusion AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded in the Services/Bioinformatics section of the Genopolis website http://www.genopolis.it/

  11. PERLE. Powerful energy recovery linac for experiments. Conceptual design report

    Science.gov (United States)

    Angal-Kalinin, D.; Arduini, G.; Auchmann, B.; Bernauer, J.; Bogacz, A.; Bordry, F.; Bousson, S.; Bracco, C.; Brüning, O.; Calaga, R.; Cassou, K.; Chetvertkova, V.; Cormier, E.; Daly, E.; Douglas, D.; Dupraz, K.; Goddard, B.; Henry, J.; Hutton, A.; Jensen, E.; Kaabi, W.; Klein, M.; Kostka, P.; Lasheras, N.; Levichev, E.; Marhauser, F.; Martens, A.; Milanese, A.; Militsyn, B.; Peinaud, Y.; Pellegrini, D.; Pietralla, N.; Pupkov, Y.; Rimmer, R.; Schirm, K.; Schulte, D.; Smith, S.; Stocchi, A.; Valloni, A.; Welsch, C.; Willering, G.; Wollmann, D.; Zimmermann, F.; Zomer, F.

    2018-06-01

    A conceptual design is presented of a novel energy-recovering linac (ERL) facility for the development and application of the energy recovery technique to linear electron accelerators in the multi-turn, large current and large energy regime. The main characteristics of the powerful energy recovery linac experiment facility (PERLE) are derived from the design of the Large Hadron electron Collider, an electron beam upgrade under study for the LHC, for which it would be the key demonstrator. PERLE is thus projected as a facility to investigate efficient, high current (HC) (>10 mA) ERL operation with three re-circulation passages through newly designed SCRF cavities, at 801.58 MHz frequency, and following deceleration over another three re-circulations. In its fully equipped configuration, PERLE provides an electron beam of approximately 1 GeV energy. A physics programme possibly associated with PERLE is sketched, consisting of high precision elastic electron–proton scattering experiments, as well as photo-nuclear reactions of unprecedented intensities with up to 30 MeV photon beam energy as may be obtained using Fabry–Perot cavities. The facility has further applications as a general technology test bed that can investigate and validate novel superconducting magnets (beam induced quench tests) and superconducting RF structures (structure tests with HC beams, beam loading and transients). Besides a chapter on operation aspects, the report contains detailed considerations on the choices for the SCRF structure, optics and lattice design, solutions for arc magnets, source and injector and on further essential components. A suitable configuration derived from the design concept presented here may next be moved forward to a technical design and possibly be built by an international collaboration which is being established.

  12. Comparing transformation methods for DNA microarray data

    Directory of Open Access Journals (Sweden)

    Zwinderman Aeilko H

    2004-06-01

    Abstract Background When DNA microarray data are used for gene clustering, genotype/phenotype correlation studies, or tissue classification, the signal intensities are usually transformed and normalized in several steps in order to improve comparability and signal/noise ratio. These steps may include subtraction of an estimated background signal, subtracting the reference signal, smoothing (to account for nonlinear measurement effects), and more. Different authors use different approaches, and it is generally not clear to users which method they should prefer. Results We used the ratio between biological variance and measurement variance (which is an F-like statistic) as a quality measure for transformation methods, and we demonstrate a method for maximizing that variance ratio on real data. We explore a number of transformation issues, including Box-Cox transformation, baseline shift, partial subtraction of the log-reference signal and smoothing. It appears that the optimal choice of parameters for the transformation methods depends on the data. Further, the behavior of the variance ratio, under the null hypothesis of zero biological variance, appears to depend on the choice of parameters. Conclusions The use of replicates in microarray experiments is important. Adjustment for the null-hypothesis behavior of the variance ratio is critical to the selection of transformation method.
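
    The quality measure itself is easy to state: the variance of the per-sample means (biological signal) divided by the average within-sample variance across replicate measurements (noise). A minimal sketch, evaluating it for a raw and a log2-transformed toy data set:

      # F-like variance ratio: biological variance / measurement variance.
      import numpy as np

      def variance_ratio(values, groups):
          # values: 1-D intensities; groups: biological-sample label per value.
          values = np.asarray(values, dtype=float)
          groups = np.asarray(groups)
          labels = np.unique(groups)
          between = np.array([values[groups == g].mean() for g in labels]).var(ddof=1)
          within = np.mean([values[groups == g].var(ddof=1) for g in labels])
          return between / within

      raw = np.array([100.0, 120.0, 3000.0, 3400.0, 800.0, 900.0])
      labels = np.array([0, 0, 1, 1, 2, 2])   # three samples, duplicate spots
      print("raw :", variance_ratio(raw, labels))
      print("log2:", variance_ratio(np.log2(raw), labels))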

  13. An Affymetrix Microarray Design for Microbial Genotyping

    Science.gov (United States)

    2009-10-01

    ... samples that do not lend themselves to the culture methods of classical microbiology. The DNA microarray is a technology that allows ... In the area of microbial genotyping there are multiple platforms that can identify one or a few microbial targets in a single assay iteration. For most ...

  14. Creating meaningful learning experiences: Understanding students' perspectives of engineering design

    Science.gov (United States)

    Aleong, Richard James Chung Mun

    ... relevance, and transfer. With this framework of student learning, engineering educators can enhance learning experiences by engaging all three levels of students' understanding. The curriculum studies orientation applied the three holistic elements of curriculum---subject matter, society, and the individual---to conceptualize design considerations for engineering curriculum and teaching practice. This research supports the characterization of students' learning experiences to help educators and students optimize their teaching and learning of design education.

  15. Designing for Motivation, Engagement and Wellbeing in Digital Experience

    Directory of Open Access Journals (Sweden)

    Dorian Peters

    2018-05-01

    Research in psychology has shown that both motivation and wellbeing are contingent on the satisfaction of certain psychological needs. Yet, despite a long-standing pursuit in human-computer interaction (HCI) of design strategies that foster sustained engagement, behavior change and wellbeing, the basic psychological needs shown to mediate these outcomes are rarely taken into account. This is possibly due to the lack of a clear model to explain these needs in the context of HCI. Herein we introduce such a model: Motivation, Engagement and Thriving in User Experience (METUX). The model provides a framework grounded in psychological research that can allow HCI researchers and practitioners to form actionable insights with respect to how technology designs support or undermine basic psychological needs, thereby increasing motivation and engagement, and ultimately, improving user wellbeing. We propose that in order to address wellbeing, psychological needs must be considered within five different spheres of analysis: at the point of technology adoption, during interaction with the interface, as a result of engagement with technology-specific tasks, as part of the technology-supported behavior, and as part of an individual's life overall. These five spheres of experience sit within a sixth, society, which encompasses both direct and collateral effects of technology use as well as non-user experiences. We build this model based on existing evidence for basic psychological need satisfaction, including evidence within the context of the workplace, computer games, and health. We extend and hone these ideas to provide practical advice for designers along with real-world examples of how to apply the model to design practice.

  16. Interim report on updated microarray probes for the LLNL Burkholderia pseudomallei SNP array

    Energy Technology Data Exchange (ETDEWEB)

    Gardner, S; Jaing, C

    2012-03-27

    The overall goal of this project is to forensically characterize 100 unknown Burkholderia isolates in the US-Australia collaboration. We will identify genome-wide single nucleotide polymorphisms (SNPs) from B. pseudomallei and near-neighbor species including B. mallei, B. thailandensis and B. oklahomensis. We will design microarray probes to detect these SNP markers and analyze 100 Burkholderia genomic DNAs extracted from environmental, clinical and near-neighbor isolates from Australian collaborators on the Burkholderia SNP microarray. We will analyze the microarray genotyping results to characterize the genetic diversity of these new isolates and triage the samples for whole genome sequencing. In this interim report, we describe the SNP analysis and the microarray probe design for the Burkholderia SNP microarray.

  17. Designing Experiments to Discriminate Families of Logic Models.

    Science.gov (United States)

    Videla, Santiago; Konokotina, Irina; Alexopoulos, Leonidas G; Saez-Rodriguez, Julio; Schaub, Torsten; Siegel, Anne; Guziolowski, Carito

    2015-01-01

    Logic models of signaling pathways are a promising way of building effective in silico functional models of a cell, in particular of signaling pathways. The automated learning of Boolean logic models describing signaling pathways can be achieved by training to phosphoproteomics data, which is particularly useful if it is measured upon different combinations of perturbations in a high-throughput fashion. However, in practice, the number and type of allowed perturbations are not exhaustive. Moreover, experimental data are unavoidably subjected to noise. As a result, the learning process results in a family of feasible logical networks rather than in a single model. This family is composed of logic models implementing different internal wirings for the system, and therefore the predictions of experiments from this family may present a significant level of variability, and hence uncertainty. In this paper, we introduce a method based on Answer Set Programming to propose an optimal experimental design that aims to narrow down the variability (in terms of input-output behaviors) within families of logical models learned from experimental data. We study how the fitness with respect to the data can be improved after an optimal selection of signaling perturbations and how we learn optimal logic models with a minimal number of experiments. The methods are applied to signaling pathways in human liver cells and phosphoproteomics experimental data. Using 25% of the experiments, we obtained logical models with fitness scores (mean square error) within 15% of those obtained using all experiments, illustrating the impact that our approach can have on the design of experiments for efficient model calibration.
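
    The discrimination idea can be seen on a toy example: two candidate Boolean wirings that agree on most perturbations but disagree on one, which is exactly the experiment worth performing. A minimal sketch (the pathway, species names and wirings are invented; the paper's Answer Set Programming machinery is not reproduced):

      # Two rival Boolean logic models of a toy pathway; find inputs that
      # separate them.
      def model_a(egf, tnfa):
          return {"erk": egf, "nfkb": tnfa}

      def model_b(egf, tnfa):
          # Alternative wiring: crosstalk from TNFa into ERK.
          return {"erk": egf or tnfa, "nfkb": tnfa}

      for egf in (0, 1):
          for tnfa in (0, 1):
              a, b = model_a(egf, tnfa), model_b(egf, tnfa)
              tag = "  <-- discriminating experiment" if a != b else ""
              print(f"EGF={egf} TNFa={tnfa}: A={a} B={b}{tag}")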

  18. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    Science.gov (United States)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then becomes finding the best parameter setting for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. In verification runs using the tuned parameter settings, preliminary optimal solutions for multiple instances were found efficiently.
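
    A minimal Python sketch of the 2-level (2^k) full factorial branch of such a study, assuming a hypothetical run_ga() stub in place of a real GA run on a benchmark instance; effect estimates come from an ordinary least-squares fit of a main-effects-plus-interactions model.

        # 2^3 full factorial over three GA parameters; run_ga() is an invented
        # stand-in for a real GA returning total weighted tardiness.
        import itertools
        import numpy as np

        rng = np.random.default_rng(0)

        def run_ga(pop_size, mut_rate, cx_rate):
            """Stub for one GA run on a benchmark instance (synthetic response)."""
            return (1000 - 2 * pop_size + 4000 * (mut_rate - 0.05) ** 2
                    - 100 * cx_rate + rng.normal(0, 5))

        levels = {"pop_size": (50, 200), "mut_rate": (0.01, 0.10), "cx_rate": (0.6, 0.9)}
        names = list(levels)

        rows, y = [], []
        for coded in itertools.product([-1, 1], repeat=len(names)):
            actual = [levels[n][(c + 1) // 2] for n, c in zip(names, coded)]
            rows.append(coded)
            y.append(run_ga(*actual))

        # Regression model: y ~ b0 + main effects + two-way interactions.
        X = []
        for coded in rows:
            inter = [a * b for a, b in itertools.combinations(coded, 2)]
            X.append([1, *coded, *inter])
        beta, *_ = np.linalg.lstsq(np.array(X, float), np.array(y), rcond=None)
        labels = ["b0", *names, "ps*mr", "ps*cx", "mr*cx"]
        print("effect estimates:", dict(zip(labels, np.round(beta, 2))))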

  19. Trends in integrated circuit design for particle physics experiments

    International Nuclear Information System (INIS)

    Atkin, E V

    2017-01-01

    Integrated circuits are one of the key complex units available to designers of multichannel detector setups. A number of factors make Application Specific Integrated Circuits (ASICs) valuable for Particle Physics and Astrophysics experiments, the most important being integration scale, low power dissipation and radiation tolerance. In order to make future experiments at the intensity, cosmic and energy frontiers possible, ASICs should provide a new level of functionality under a new set of constraints and trade-offs, such as low-noise high-dynamic-range amplification and pulse shaping, high-speed waveform sampling, low-power digitization, fast digital data processing, serialization and data transmission. All integrated circuits necessary for physical instrumentation should be radiation tolerant at a previously unreached level of total ionizing dose (hundreds of Mrad) and should allow minute, almost 3D assemblies. The paper is based on an analysis of literary sources and presents an overview of the state of the art and trends in present-day chip design, drawing in part on our own ASIC lab experience. It outlines the next stage of using micro- and nanoelectronics in physical instrumentation. (paper)

  20. PARAMETRIC MODELING, CREATIVITY, AND DESIGN: TWO EXPERIENCES WITH ARCHITECTURE STUDENTS

    Directory of Open Access Journals (Sweden)

    Wilson Florio

    2012-02-01

    Full Text Available The aim of this article is to reflect on the use of parametric modeling in two didactic experiences. The first experiment involved resources of the Paracloud program and its relation with the Rhinoceros program, which resulted in the production of physical models with the aid of laser cutting. In the second experiment, the students produced algorithms in Grasshopper, resulting in families of structures and coverings. The objects of study are both the physical models and the digital algorithms resulting from this experimentation. For the analysis and synthesis of the results, we adopted four important assumptions: 1. the value of attitudes and the work environment; 2. the importance of experimentation and improvisation; 3. understanding of the design process as a situated act and as an ill-defined problem; 4. the inclusion of creative and critical thought in the disciplines. The results allow us to affirm that parametric modeling stimulates creativity by allowing the combination of different parameters, which results in unexpected discoveries. Keywords: Teaching-Learning, Parametric Modeling, Laser Cutter, Grasshopper, Design Process, Creativity.

  1. A flexible whole-genome microarray for transcriptomics in three-spine stickleback (Gasterosteus aculeatus)

    Directory of Open Access Journals (Sweden)

    Primmer Craig R

    2009-09-01

    Full Text Available Abstract Background The use of microarray technology for describing changes in mRNA expression to address ecological and evolutionary questions is becoming increasingly popular. Since three-spine stickleback are an important ecological and evolutionary model species as well as an emerging model for eco-toxicology, the ability to have a functional and flexible microarray platform for transcriptome studies will greatly enhance the research potential in these areas. Results We designed 43,392 unique oligonucleotide probes representing 19,274 genes (93% of the estimated total gene number) and tested the hybridization performance of both DNA and RNA from different populations to determine the efficacy of probe design for transcriptome analysis using the Agilent array platform. The majority of probes were functional as evidenced by the DNA hybridization success, and 30,946 probes (14,615 genes) had a signal that was significantly above background for RNA isolated from liver tissue. Genes identified as being expressed in liver tissue were grouped into functional categories for each of the three Gene Ontology groups: biological process, molecular function, and cellular component. As expected, the highest proportions of functional categories belonged to those associated with metabolic functions: metabolic process, binding, catabolism, and organelles. Conclusion The probe and microarray design presented here provides an important step facilitating transcriptomics research for this important research organism by providing a set of over 43,000 probes whose hybridization success and specificity to liver expression has been demonstrated. Probes can easily be added or removed from the current design to tailor the array to specific experiments, and additional flexibility lies in the ability to perform either one-color or two-color hybridizations.
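
    The paper's exact statistical criterion for "significantly above background" is not reproduced here; the sketch below illustrates one simple rule of this kind, scoring synthetic probe intensities against an upper percentile of a negative-control distribution.

        # Minimal sketch of calling probes "expressed above background",
        # assuming background is estimated from negative-control intensities.
        import numpy as np

        rng = np.random.default_rng(1)
        neg_ctrl = rng.lognormal(mean=4.0, sigma=0.3, size=500)   # background probes
        probes = rng.lognormal(mean=4.5, sigma=0.8, size=43392)   # array probes

        # A probe is scored as expressed if it exceeds the 99th percentile of
        # the negative-control distribution (one common, simple rule).
        cutoff = np.percentile(neg_ctrl, 99)
        expressed = probes > cutoff
        print(f"cutoff={cutoff:.1f}; {expressed.sum()} of {probes.size} probes above background")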

  2. Teachers as Designers: Social Design Experiments as Vehicles for Developing Antideficit English Education

    Science.gov (United States)

    Fowler-Amato, Michelle; Warrington, Amber

    2017-01-01

    In this article, we explore data from two studies that demonstrate how inviting teachers to take on the role of codesigners of interventions in social design experiments created opportunities for them to consider their own positionality and privilege as well as negotiate deficit and antideficit discourses underlying and shaping English-language…

  3. CELSS experiment model and design concept of gas recycle system

    Science.gov (United States)

    Nitta, K.; Oguchi, M.; Kanda, S.

    1986-01-01

    In order to prolong the duration of manned missions around the Earth and to extend the region of human existence beyond the Earth to other planets, for example to a lunar base or a manned Mars flight mission, the controlled ecological life support system (CELSS) becomes an essential factor of the future technology to be developed through utilization of the space station. Preliminary system engineering and integration efforts regarding CELSS have been carried out by the Japanese CELSS concept study group to clarify the feasibility of hardware development for space station experiments and to define the time-phased mission sets after FY 1992. The results of these studies are briefly summarized, and the design and utilization methods of a gas recycle system for CELSS experiments are discussed.

  4. The HRMT27 (Rodtarg) Experiment: Design, Operation and First Results

    CERN Document Server

    Torregrosa Martin, Claudio Leopoldo; Calviani, Marco; Butcher, Mark; Horvath, David; Fornasiere, Elvis; Gentini, Luca

    2016-01-01

    The HRMT27 (Rodtarg) experiment used the HiRadMat facility at CERN to impact intense 440 GeV proton beams onto thin rods - 8 mm diameter, 140 mm length - made of high-density materials such as Ir, W, Ta and Mo, among others. The purpose of the experiment was to reduce uncertainties on the CERN antiproton target material response and assess the material selection for its future redesign. The experiment was designed to recreate the extreme conditions reached in the target, estimated as an increase of temperature above 2000 ºC in less than 0.5 µs and a subsequent compressive-to-tensile pressure wave of several GPa. This document includes a detailed summary of the experimental setup and online recorded data. Results suggest that all the irradiated materials except tantalum suffered internal damage under conditions 5-7 times lower than those reached in the AD-Target, while tantalum targets clearly showed the best dynamic response, remaining un-cracked during the experiment. Foreseen post irradiation examinations will complete ...

  5. Development of a novel multiplex DNA microarray for Fusarium graminearum and analysis of azole fungicide responses

    Directory of Open Access Journals (Sweden)

    Deising Holger B

    2011-01-01

    Full Text Available Abstract Background The toxigenic fungal plant pathogen Fusarium graminearum compromises wheat production worldwide. Azole fungicides play a prominent role in controlling this pathogen. Sequencing of its genome stimulated the development of high-throughput technologies to study mechanisms of coping with fungicide stress and adaptation to fungicides at a previously unprecedented precision. DNA microarrays have been used to analyze genome-wide gene expression patterns and have uncovered complex transcriptional responses. A recently developed one-color multiplex array format allows flexible, effective, and parallel examination of eight RNA samples. Results We took advantage of the 8 × 15 k Agilent format to design, evaluate, and apply a novel microarray covering the whole F. graminearum genome to analyze transcriptional responses to azole fungicide treatment. Comparative statistical analysis of expression profiles uncovered 1058 genes that were significantly differentially expressed after azole treatment. Quantitative RT-PCR analysis for 31 selected genes indicated high concordance with results from the microarray hybridization. Among the 596 genes with significantly increased transcript levels, analyses using GeneOntology and FunCat annotations detected the ergosterol-biosynthesis pathway genes as the most significantly responding category, confirming the mode of action of azole fungicides. Cyp51A, which is one of the three F. graminearum paralogs of Cyp51 encoding the target of azoles, was the most consistently differentially expressed gene of the entire study. A molecular phylogeny analyzing the relationships of the three CYP51 proteins in the context of 38 fungal genomes belonging to the Pezizomycotina indicated that CYP51C (FGSG_11024) groups with a new clade of CYP51 proteins. The transcriptional profiles of genes encoding ABC transporters and transcription factors suggested that several are involved in mechanisms alleviating the impact of the fungicide

  6. Signage by Design: A Design-Thinking Approach to Library User Experience

    Directory of Open Access Journals (Sweden)

    Edward Luca

    2016-01-01

    Full Text Available Signage is a powerful visual tool for communication and a crucial component of the library user experience. Signage can welcome, guide, instruct, and delight users, helping them navigate the complex information world of any library. In practice, however, signage can be problematic, revealing tensions between various stakeholders and contributing to visual noise through information overload; this often leads to signage blindness, library anxiety, and confusion. This article explores how libraries can use a design-thinking approach to improve the user experience in physical library spaces, particularly with respect to signage, based on our experience at the UTS Library, a university library in Australia that serves the University of Technology Sydney (UTS). We found that a design-thinking approach that uses the processes of empathy, problem definition, solution ideation, prototyping, and testing can help libraries make significant and meaningful changes that can be adopted at relatively low cost.

  7. Gadolinia experience and design for PWR fuel cycles

    International Nuclear Information System (INIS)

    Stephenson, L. C.

    2000-01-01

    The purpose of this paper is to describe Siemens Power Corporation's (SPC) current experience with the burnable absorber gadolinia in PWR fuel assemblies, including optimized features of SPC's PWR gadolinia designs and comparisons with other burnable absorbers. Siemens is the world leader in PWR gadolinia experience: more than 5,900 Siemens PWR gadolinia-bearing fuel assemblies have been irradiated. The use of gadolinia-bearing fuel provides significant flexibility in fuel cycle designs, allows for low radial leakage fuel management and extended operating cycles, and reduces BOC (beginning-of-cycle) soluble boron concentrations. The optimized use of an integral burnable neutron absorber is a design feature which provides improved economic performance for PWR fuel assemblies. This paper includes a comparison between three different types of integral burnable absorbers: gadolinia, zirconium diboride and erbia. Fuel cycle design studies performed by Siemens have shown that the enrichment requirements for 18-24 month fuel cycles utilizing gadolinia or zirconium diboride integral fuel burnable absorbers can be approximately the same. Although a typical gadolinia residual penalty for a cycle design of this length is as low as 0.02-0.03 wt% U-235, the design flexibility of gadolinia allows for very aggressive low-leakage core loading plans, which reduces the enrichment requirements for gadolinia-bearing fuel. SPC has optimized its use of gadolinia in PWR fuel cycles. Typically, low (2-4) weight percent Gd2O3 is used for beginning-to-middle-of-cycle reactivity holddown as well as soluble boron concentration holddown at BOC. Higher concentrations of Gd2O3, such as 6 and 8 wt%, are used to control power peaking in assemblies later in the cycle. SPC has developed core strategies that maximize the use of lower gadolinia concentrations, which significantly reduces the gadolinia residual reactivity penalty. This optimization includes minimizing the number of rods with

  8. Design improvements, construction and operating experience with BWRs in Japan

    International Nuclear Information System (INIS)

    Uchigasaki, G.; Yokomi, M.; Sasaki, M.; Aoki, R.; Hashimoto, H.

    1983-01-01

    (1) The first domestically made 1100-MW(e) BWR in Japan commenced commercial operation in April 1982. The unit is the leading one of the subsequent three in the Fukushima Daini nuclear power station owned by the Tokyo Electric Power Company Inc. (Tepco). Based on the accumulated construction and operation experience of 500-MW(e) and 800-MW(e) class BWRs, improvements in various aspects were introduced during both the design and construction stages: in core and fuel design with advanced gadolinia distribution, in reactor feedwater treatment technology for crud reduction, in a radwaste island, and in control and instrumentation to address the lessons learned through the Three Mile Island assessment, etc. (2) Based on extensive operating experience with BWRs, an improved BWR core, which has easier operability and a higher load factor than the conventional core, has been developed. The characteristic of the improved core is an 'axially two-zoned uranium enrichment distribution': the enrichment of the upper part of the fuel is slightly higher than that of the lower part. With the improved core it became possible to optimize axial power flattening and core reactivity control separately, by axial enrichment distribution and burnable poison content respectively. The improved fuel was loaded into operating BWRs and its performance was successfully proven in service. (3) To shorten annual outage time, reduce radiation exposure, save manpower, and achieve high reliability and safety of inspection operations, remote automatic service and inspection equipment was developed in Japan. This paper presents the concept, distinctive features, and actual operating experience of the automatic refuelling machine, the control-rod drive (CRD) remote-handling machine, the improved main steam line isolation plug, and the automated ultrasonic inspection system with a computerized data processing unit, which have been developed by Hitachi, Ltd. with excellent results. (author)

  9. Science, technology and mission design for LATOR experiment

    Science.gov (United States)

    Turyshev, Slava G.; Shao, Michael; Nordtvedt, Kenneth L.

    2017-11-01

    The Laser Astrometric Test of Relativity (LATOR) is a Michelson-Morley-type experiment designed to test Einstein's general theory of relativity in the most intense gravitational environment available in the solar system - the close proximity to the Sun. By using independent time series of highly accurate measurements of the Shapiro time delay (laser ranging accurate to 1 cm) and interferometric astrometry (accurate to 0.1 picoradian), LATOR will measure the gravitational deflection of light by solar gravity with an accuracy of 1 part in a billion, a factor of 30,000 better than currently available. LATOR will perform a series of highly accurate tests of gravitation and cosmology in its search for cosmological remnants of a scalar field in the solar system. We present the science, technology and mission design for the LATOR mission.

  10. New designs of LMJ targets for early ignition experiments

    International Nuclear Information System (INIS)

    Clerouin, C; Bonnefille, M; Dattolo, E; Fremerye, P; Galmiche, D; Gauthier, P; Giorla, J; Laffite, S; Liberatore, S; Loiseau, P; Malinie, G; Masse, L; Poggi, F; Seytor, P

    2008-01-01

    The LMJ experimental plans include the attempt of ignition and burn of an ICF capsule with 40 laser quads delivering up to 1.4 MJ and 380 TW. New targets needing reduced laser energy, with only a small decrease in robustness, have therefore been designed for this purpose. A first strategy is to use scaled-down cylindrical hohlraums and capsules, taking advantage of our better understanding of the problem based on theoretical modelling, simulations and experiments. Another strategy is to work specifically on the coupling efficiency parameter, i.e. the ratio of the energy absorbed by the capsule to the laser energy, which, together with parametric instabilities, is a crucial drawback of indirect drive. An alternative design is proposed, made up of the nominal 60-quad capsule, named A1040, in a rugby-shaped hohlraum. Robustness evaluations of these different targets are in progress

  11. New designs of LMJ targets for early ignition experiments

    Energy Technology Data Exchange (ETDEWEB)

    Clerouin, C; Bonnefille, M; Dattolo, E; Fremerye, P; Galmiche, D; Gauthier, P; Giorla, J; Laffite, S; Liberatore, S; Loiseau, P; Malinie, G; Masse, L; Poggi, F; Seytor, P [Commissariat a l' Energie Atomique, DAM-Ile de France, BP 12 91680 Bruyeres-le-Chatel (France)], E-mail: catherine.cherfils@cea.fr

    2008-05-15

    The LMJ experimental plans include the attempt of ignition and burn of an ICF capsule with 40 laser quads delivering up to 1.4 MJ and 380 TW. New targets needing reduced laser energy, with only a small decrease in robustness, have therefore been designed for this purpose. A first strategy is to use scaled-down cylindrical hohlraums and capsules, taking advantage of our better understanding of the problem based on theoretical modelling, simulations and experiments. Another strategy is to work specifically on the coupling efficiency parameter, i.e. the ratio of the energy absorbed by the capsule to the laser energy, which, together with parametric instabilities, is a crucial drawback of indirect drive. An alternative design is proposed, made up of the nominal 60-quad capsule, named A1040, in a rugby-shaped hohlraum. Robustness evaluations of these different targets are in progress.

  12. Calibration Device Designed for proof ring used in SCC Experiment

    Science.gov (United States)

    Hu, X. Y.; Kang, Z. Y.; Yu, Y. L.

    2017-11-01

    In this paper, a calibration device for the proof rings used in SCC (Stress Corrosion Cracking) experiments was designed. A compact loading device was developed to replace a traditional force standard machine or a long screw nut. The deformation of the proof ring was measured by a CCD (Charge-Coupled Device) during calibration, instead of by a digital caliper or a dial gauge. The calibration device was verified in the laboratory: the precision of force loading is ±0.1% and the precision of deformation measurement is ±0.002 mm.

  13. How a huge HEP experiment is designed course

    CERN Multimedia

    CERN. Geneva HR-FAS

    2007-01-01

    More than twenty years after the idea of building the LHC machine was discussed in a workshop in Lausanne in 1984 for the first time, it is instructive to look back on the historical process which has led the community to where we are today with four huge detectors being commissioned and eagerly awaiting first beam collisions in 2008. The main design principles, detector features and performance characteristics of the ATLAS and CMS detectors will be briefly covered in these two lectures with, as an interlude, a wonderful DVD from ATLAS outreach depicting how particles interact and are detected in the various components of the experiments.

  14. Fisher information in the design of computer simulation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Stehlík, Milan; Mueller, Werner G [Department of Applied Statistics, Johannes-Kepler-University Linz Freistaedter Strasse 315, A-4040 Linz (Austria)], E-mail: Milan.Stehlik@jku.at, E-mail: Werner.Mueller@jku.at

    2008-11-01

    The concept of Fisher information is conveniently used as a basis for designing efficient experiments. However, if the output stems from computer simulations they are often approximated as realizations of correlated random fields. Consequently, the conditions under which Fisher information may be suitable must be restated. In the paper we intend to give some simple but illuminating examples for these cases. 'Random phenomena have increasing importance in Engineering and Physics, therefore theoretical results are strongly needed. But there is a gap between the probability theory used by mathematicians and practitioners. Two very different languages have been generated in this way...' (Paul Kree, Paris 1995)

  15. Fisher information in the design of computer simulation experiments

    International Nuclear Information System (INIS)

    Stehlík, Milan; Mueller, Werner G

    2008-01-01

    The concept of Fisher information is conveniently used as a basis for designing efficient experiments. However, if the output stems from computer simulations they are often approximated as realizations of correlated random fields. Consequently, the conditions under which Fisher information may be suitable must be restated. In the paper we intend to give some simple but illuminating examples for these cases. 'Random phenomena have increasing importance in Engineering and Physics, therefore theoretical results are strongly needed. But there is a gap between the probability theory used by mathematicians and practitioners. Two very different languages have been generated in this way...' (Paul Kree, Paris 1995)
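
    As a toy illustration of the point made in these two records (not an example from the paper), the following sketch computes the Fisher information for the correlation range of an exponential covariance model and shows how it varies with the spacing of a two-point design, i.e. why the conditions for information-based design must be restated for correlated fields.

        # Fisher information I(theta) = 0.5 * tr(C^-1 dC C^-1 dC) for a
        # zero-mean Gaussian field with C_ij = exp(-|x_i - x_j| / theta).
        import numpy as np

        def fisher_info(points, theta):
            d = np.abs(np.subtract.outer(points, points))
            C = np.exp(-d / theta)
            dC = C * d / theta**2            # elementwise derivative wrt theta
            M = np.linalg.inv(C) @ dC
            return 0.5 * np.trace(M @ M)

        theta = 1.0
        for spacing in (0.1, 0.5, 1.0, 2.0, 5.0):
            pts = np.array([0.0, spacing])
            print(f"spacing={spacing:>3}: I(theta)={fisher_info(pts, theta):.4f}")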

  16. Cooling tower drift: experiment design for comprehensive case study

    International Nuclear Information System (INIS)

    Laulainen, N.S.

    1978-01-01

    A drift experiment program to develop a data base which can be used for validation of drift deposition models has been formulated. The first field effort is designed for a suitable mechanical-draft cooling tower, to be selected after site visits have been conducted. The discussion here demonstrates the importance of characterizing the droplet size spectrum emitted from the tower and of accurately accounting for droplet evaporation, because the downwind droplet deposition patterns and near-surface airborne concentrations are extremely sensitive to these parameters.

  17. Design and experiment of a new solar air heating collector

    International Nuclear Information System (INIS)

    Shams, S.M.N.; Mc Keever, M.; Mc Cormack, S.; Norton, B.

    2016-01-01

    This paper presents the design and experimental evaluation of a CTAH (Concentrating Transpired Air Heating) system. A newly designed solar air heating collector, comprising an inverted perforated absorber and an asymmetric compound parabolic concentrator, was applied to increase the intensity of solar radiation incident on the perforated absorber. An extensive literature review was carried out to find the vital factors for improving the optical and thermal efficiency of solar air heating systems. A stationary optical concentrator was designed and tested experimentally. Experimental thermal efficiency remained high at higher air flow rates. The average thermal efficiency was approximately 55%-65% with average radiation above 400 W/m² for flow rates in the range of 0.03 kg/s/m² to 0.09 kg/s/m². Experimental results at air flow rates of 0.03 kg/s/m² and 0.09 kg/s/m² showed temperature rises of 38 °C and 19.6 °C respectively at a solar radiation intensity of 1000 W/m². A comparative performance study shows the thermal performance of the CTAH. Because the absorber of the CTAH faces downward, radiation losses are avoided, and the perforated absorber with a tertiary concentrator reduces thermal losses from the system. - Highlights: • A literature review was carried out to improve SAH system performance. • The optimisation factors were optical efficiency, heat loss, weight and cost. • The concentrator was designed to concentrate radiation for 6-7 h. • The highest efficiency of the CTAH can be 73%. • It can operate at an efficiency of 60% for a temperature rise of 70 °C.

  18. High-Throughput Quantification of SH2 Domain-Phosphopeptide Interactions with Cellulose-Peptide Conjugate Microarrays.

    Science.gov (United States)

    Engelmann, Brett W

    2017-01-01

    The Src Homology 2 (SH2) domain family primarily recognizes phosphorylated tyrosine (pY) containing peptide motifs. The relative affinity preferences among competing SH2 domains for phosphopeptide ligands define "specificity space" and underpin many functional pY-mediated interactions within signaling networks. The degree of promiscuity exhibited and the dynamic range of affinities supported by individual domains or phosphopeptides are best resolved by a carefully executed and controlled quantitative high-throughput experiment. Here, I describe the fabrication and application of a cellulose-peptide conjugate microarray (CPCMA) platform for the quantitative analysis of SH2 domain specificity space. Included herein are instructions for optimal experimental design, with special attention paid to common sources of systematic error, phosphopeptide SPOT synthesis, microarray fabrication, analyte titrations, data capture, and analysis.
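
    A sketch of the analysis step for one analyte titration, assuming a simple 1:1 saturation binding model fitted to synthetic data; the actual CPCMA processing pipeline is more involved, and all values below are invented.

        # Estimate a dissociation constant Kd from a titration series,
        # assuming F = Fmax * [SH2] / (Kd + [SH2]).
        import numpy as np
        from scipy.optimize import curve_fit

        def binding(conc, fmax, kd):
            return fmax * conc / (kd + conc)

        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])  # uM SH2 domain
        rng = np.random.default_rng(2)
        signal = binding(conc, fmax=1000.0, kd=0.5) + rng.normal(0, 20, conc.size)

        (fmax_hat, kd_hat), _ = curve_fit(binding, conc, signal, p0=(signal.max(), 1.0))
        print(f"fitted Fmax={fmax_hat:.0f}, Kd={kd_hat:.2f} uM")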

  19. Deciphering cellular morphology and biocompatibility using polymer microarrays

    International Nuclear Information System (INIS)

    Pernagallo, Salvatore; Unciti-Broceta, Asier; Díaz-Mochón, Juan José; Bradley, Mark

    2008-01-01

    A quantitative and qualitative analysis of cellular adhesion, morphology and viability is essential in understanding and designing biomaterials such as those involved in implant surfaces or as tissue-engineering scaffolds. As a means to simultaneously perform these studies in a high-throughput (HT) manner, we report a normalized protocol which allows the rapid analysis of a large number of potential cell binding substrates using polymer microarrays and high-content fluorescence microscopy. The method was successfully applied to the discovery of optimal polymer substrates from a 214-member polyurethane library with mouse fibroblast cells (L929), as well as simultaneous evaluation of cell viability and cellular morphology. Analysis demonstrated high biocompatibility of the binding polymers and permitted the identification of several different cellular morphologies, showing that specific polymer interactions may provoke changes in cell shape. In addition, SAR studies showed a clear correspondence between cellular adhesion and polymer structure. The approach can be utilized to perform multiple experiments (up to 1024 single experiments per slide) in a highly reproducible manner, leading to the generation of vast amounts of data in a short time period (48-72 h) while reducing dramatically the quantities of polymers, reagents and cells used

  20. MicroArray Facility: a laboratory information management system with extended support for Nylon based technologies

    Directory of Open Access Journals (Sweden)

    Beaudoing Emmanuel

    2006-09-01

    Full Text Available Abstract Background High throughput gene expression profiling (GEP) is becoming a routine technique in life science laboratories. With experimental designs that repeatedly span thousands of genes and hundreds of samples, relying on a dedicated database infrastructure is no longer an option. GEP technology is a fast moving target, with new approaches constantly broadening the field's diversity. This technology heterogeneity, compounded by the informatics complexity of GEP databases, means that software developments have so far focused on mainstream techniques, leaving less typical yet established techniques, such as Nylon microarrays, at best partially supported. Results MAF (MicroArray Facility) is the laboratory database system we have developed for managing the design, production and hybridization of spotted microarrays. Although it can support the widely used glass microarrays and oligo-chips, MAF was designed with the specific idiosyncrasies of Nylon-based microarrays in mind. Notably, single channel radioactive probes, microarray stripping and reuse, vector control hybridizations and spike-in controls are all natively supported by the software suite. MicroArray Facility is MIAME-supportive and dynamically provides feedback on missing annotations to help users estimate effective MIAME compliance. Genomic data such as clone identifiers and gene symbols are also directly annotated by MAF software using standard public resources. The MAGE-ML data format is implemented for full data export. Journalized database operations (audit tracking), data anonymization, material traceability and user/project level confidentiality policies are also managed by MAF. Conclusion MicroArray Facility is a complete data management system for microarray producers and end-users. Particular care has been devoted to adequately modelling Nylon-based microarrays. The MAF system, developed and implemented in both private and academic environments, has proved a robust solution for

  1. Experience of upgrading existing Russian designed nuclear plants

    International Nuclear Information System (INIS)

    Yanev, P.I.; Facer, R.I.

    1993-01-01

    From the review of experience in upgrading existing Russian-designed nuclear plants, both of WWER and RBMK type, the following conclusions are drawn. The countries operating Russian-designed plants need to adopt a pragmatic approach in which all changes must be demonstrated to improve the safety of the plant, and safety must be demonstrably improving. Care must be taken to avoid the pitfalls of excessive regulatory demands which are not satisfied, and the development of an attitude of disregarding requirements on the basis that they are not enforced. For the lending countries and organizations, it is necessary to ensure that assistance is given to the operating organizations so that the most effective use of funds can be achieved. Experience in the West is that over-regulation and excessive expenditure do not necessarily lead to improved safety; they can lead to significant waste of resources. The use of western technology is recommended, but only where it is necessary and where it provides the greatest benefit.

  2. Design and optimization of reverse-transcription quantitative PCR experiments.

    Science.gov (United States)

    Tichopad, Ales; Kitchen, Rob; Riedmaier, Irmgard; Becker, Christiane; Ståhlberg, Anders; Kubista, Mikael

    2009-10-01

    Quantitative PCR (qPCR) is a valuable technique for accurately and reliably profiling and quantifying gene expression. Typically, samples obtained from the organism of study have to be processed via several preparative steps before qPCR. We estimated the errors of sample withdrawal and extraction, reverse transcription (RT), and qPCR that are introduced into measurements of mRNA concentrations. We performed hierarchically arranged experiments with 3 animals, 3 samples, 3 RT reactions, and 3 qPCRs and quantified the expression of several genes in solid tissue, blood, cell culture, and single cells. A nested ANOVA design was used to model the experiments, and relative and absolute errors were calculated with this model for each processing level in the hierarchical design. We found that intersubject differences became easily confounded by sample heterogeneity for single cells and solid tissue. In cell cultures and blood, the noise from the RT and qPCR steps contributed substantially to the overall error because the sampling noise was less pronounced. We recommend the use of sample replicates preferentially to any other replicates when working with solid tissue, cell cultures, and single cells, and we recommend the use of RT replicates when working with blood. We show how an optimal sampling plan can be calculated for a limited budget.
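
    The following sketch mimics the paper's balanced 3 × 3 × 3 × 3 hierarchy with a simulated additive log-scale noise model and recovers the per-level variance components by the method of moments; the standard deviations and baseline value are invented for illustration.

        # Fully nested noise model for RT-qPCR-like data: animal > sample >
        # RT > qPCR. Variance components are estimated from nested means
        # using the expected mean squares of a balanced design.
        import numpy as np

        rng = np.random.default_rng(3)
        sd = {"animal": 0.50, "sample": 0.30, "rt": 0.20, "qpcr": 0.10}
        n = 3  # 3 animals x 3 samples x 3 RTs x 3 qPCR replicates

        data = np.zeros((n, n, n, n))
        for a in range(n):
            ea = rng.normal(0, sd["animal"])
            for s in range(n):
                es = rng.normal(0, sd["sample"])
                for r in range(n):
                    er = rng.normal(0, sd["rt"])
                    data[a, s, r, :] = 20 + ea + es + er + rng.normal(0, sd["qpcr"], n)

        # Method-of-moments estimates (non-negativity enforced with max):
        v_qpcr = data.var(axis=3, ddof=1).mean()
        v_rt = max(data.mean(3).var(axis=2, ddof=1).mean() - v_qpcr / n, 0)
        v_sample = max(data.mean((2, 3)).var(axis=1, ddof=1).mean()
                       - v_rt / n - v_qpcr / n**2, 0)
        print(f"estimated SDs: qPCR={v_qpcr**0.5:.2f}, RT={v_rt**0.5:.2f}, "
              f"sample={v_sample**0.5:.2f}")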

  3. Target designs for energetics experiments on the National Ignition Facility

    International Nuclear Information System (INIS)

    Meezan, N B; Glenzer, S H; Suter, L J

    2008-01-01

    The goal of the first hohlraum energetics experiments on the National Ignition Facility (NIF) [G. H. Miller et al, Optical Eng. 43, 2841 (2004)] is to select the hohlraum design for the first ignition experiments. Sub-scale hohlraums heated by 96 of the 192 laser beams on the NIF are used to emulate the laser-plasma interaction behavior of ignition hohlraums. These 'plasma emulator' targets are 70% scale versions of the 1.05 MJ, 300 eV ignition hohlraum and have the same energy density as the full-scale ignition designs. Radiation-hydrodynamics simulations show that the sub-scale target is a good emulator of plasma conditions inside the ignition hohlraum, reproducing the density n_e within 10% and the temperature T_e within 15% along a laser beam path. Linear backscatter gain analysis shows the backscatter risk to be comparable to that of the ignition target. A successful energetics campaign will allow the National Ignition Campaign to focus its efforts on optimizing ignition hohlraums with efficient laser coupling.

  4. Broad spectrum microarray for fingerprint-based bacterial species identification

    Directory of Open Access Journals (Sweden)

    Frey Jürg E

    2010-02-01

    Full Text Available Abstract Background Microarrays are powerful tools for DNA-based molecular diagnostics and identification of pathogens. Most target a limited range of organisms and are based on only one or a very few genes for specific identification. Such microarrays are limited to organisms for which specific probes are available, and often have difficulty discriminating closely related taxa. We have developed an alternative broad-spectrum microarray that employs hybridisation fingerprints generated by high-density anonymous markers distributed over the entire genome for identification based on comparison to a reference database. Results A high-density microarray carrying 95,000 unique 13-mer probes was designed. Optimized methods were developed to deliver reproducible hybridisation patterns that enabled confident discrimination of bacteria at the species, subspecies, and strain levels. High correlation coefficients were achieved between replicates. A sub-selection of 12,071 probes, determined by ANOVA and class prediction analysis, enabled the discrimination of all samples in our panel. Mismatch probe hybridisation was observed but was found to have no effect on the discriminatory capacity of our system. Conclusions These results indicate the potential of our genome chip for reliable identification of a wide range of bacterial taxa at the subspecies level without laborious prior sequencing and probe design. With its high resolution capacity, our proof-of-principle chip demonstrates great potential as a tool for molecular diagnostics of broad taxonomic groups.
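
    A minimal sketch of fingerprint-based matching, assuming the hybridisation fingerprints can be treated as numeric vectors compared by Pearson correlation against a reference database; the strain names and vectors below are synthetic.

        # Identify a query array by correlating its fingerprint against a
        # reference database (vector length matches the 12,071 selected probes).
        import numpy as np

        rng = np.random.default_rng(4)
        n_probes = 12071
        reference = {f"strain_{i}": rng.normal(size=n_probes) for i in range(5)}

        # A query: a noisy replicate of strain_2's fingerprint.
        query = reference["strain_2"] + rng.normal(0, 0.5, n_probes)

        def pearson(a, b):
            return np.corrcoef(a, b)[0, 1]

        scores = {name: pearson(query, fp) for name, fp in reference.items()}
        best = max(scores, key=scores.get)
        print("best match:", best, f"(r={scores[best]:.3f})")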

  5. Design and modeling of precision solid liner experiments on Pegasus

    International Nuclear Information System (INIS)

    Bowers, R.L.; Brownell, J.H.; Lee, H.; McLenithan, K.D.; Scannapieco, A.J.; Shanahan, W.R.

    1998-01-01

    Pulsed power driven solid liners may be used for a variety of physics experiments involving materials at high stresses. These include shock formation and propagation, material strain-rate effects, material melt, instability growth, and ejecta from shocked surfaces. We describe the design and performance of a cylindrical solid liner that can attain velocities in the several mm/μs regime and that can be used to drive high-stress experiments. An approximate theoretical analysis of solid liner implosions is used to establish the basic parameters (mass, materials, and initial radius) of the driver. We then present one-dimensional and two-dimensional simulations of magnetically driven liner implosions which include resistive heating and elastic-plastic behavior. The two-dimensional models are used to study the effects of electrode glide planes on the liner's performance, to examine sources of perturbations of the liner, and to assess possible effects of instability growth during the implosion. Finally, simulations are compared with experimental data to show that the solid liner performed as predicted computationally. Experimental data indicate that the liner imploded from an initial radius of 2.4 cm to a target radius of 1.5 cm, and that it was concentric and cylindrical to better than the experimental resolution (60 μm) at the target. The results demonstrate that a precision solid liner can be produced for high-stress, pulsed power applications experiments. Copyright 1998 American Institute of Physics.

  6. Design and implementation of new design of numerical experiments for non linear models

    International Nuclear Information System (INIS)

    Gazut, St.

    2007-03-01

    This thesis addresses the problem of the construction of surrogate models in numerical simulation. Whenever numerical experiments are costly, the simulation model is complex and difficult to use. It is important then to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation, where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and re-sampling techniques. The surrogate models are constructed using neural networks, and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control over-fitting and extends the concept of leverage to surrogate models that are non-linear in their parameters. The thesis describes an iterative method called LDR (Learner Disagreement from experiment Re-sampling), based on active learning using several surrogate models constructed on bootstrap samples. The method consists in adding new experiments where the predictors constructed from bootstrap samples disagree most. We compare the LDR method with other methods of experimental design such as D-optimal selection. (author)
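
    A compact sketch of the LDR idea on a one-dimensional toy problem: a committee of surrogates fitted to bootstrap resamples is queried, and the next experiment is placed where the committee disagrees most. Ridge polynomial surrogates stand in here for the thesis's neural networks, and the "expensive simulator" is a stub.

        import numpy as np

        rng = np.random.default_rng(5)

        def simulator(x):
            """Stand-in for a costly numerical experiment."""
            return np.sin(3 * x) + 0.5 * x

        X = rng.uniform(-2, 2, 8)              # initial experiments
        y = simulator(X)
        candidates = np.linspace(-2, 2, 201)   # possible new experiments

        def fit_predict(Xb, yb, xq, degree=5, lam=1e-3):
            """Ridge polynomial surrogate: fit on (Xb, yb), predict at xq."""
            A = np.vander(Xb, degree + 1)
            w = np.linalg.solve(A.T @ A + lam * np.eye(degree + 1), A.T @ yb)
            return np.vander(xq, degree + 1) @ w

        # Bootstrap committee; disagreement = std of member predictions.
        preds = []
        for _ in range(30):
            idx = rng.integers(0, len(X), len(X))
            preds.append(fit_predict(X[idx], y[idx], candidates))
        disagree = np.std(preds, axis=0)
        x_next = candidates[np.argmax(disagree)]
        print(f"next experiment at x={x_next:.2f} (max committee std={disagree.max():.3f})")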

  7. Mission and design of the Fusion Ignition Research Experiment (FIRE)

    International Nuclear Information System (INIS)

    Meade, D.M.; Jardin, S.C.; Schmidt, J.

    2001-01-01

    Experiments are needed to test and extend present understanding of confinement, macroscopic stability, alpha-driven instabilities, and particle/power exhaust in plasmas dominated by alpha heating. A key issue is to what extent pressure profile evolution driven by strong alpha heating will act to self-organize advanced configurations with large bootstrap current fractions and internal transport barriers. A design study of a Fusion Ignition Research Experiment (FIRE) is underway to assess near-term opportunities for advancing the scientific understanding of self-heated fusion plasmas. The emphasis is on understanding the behavior of fusion plasmas dominated by alpha heating (Q≥5) that are sustained for durations comparable to the characteristic plasma time scales (≥20 τ_E and ∼τ_skin, where τ_skin is the time for the plasma current profile to redistribute at fixed current). The programmatic mission of FIRE is to attain, explore, understand and optimize alpha-dominated plasmas to provide knowledge for the design of attractive magnetic fusion energy systems. The programmatic strategy is to access the alpha-heating-dominated regime with confidence using the present advanced tokamak data base (e.g., ELMy H-mode, ≤0.75 Greenwald density) while maintaining the flexibility for accessing and exploring other advanced tokamak modes (e.g., reversed shear, pellet-enhanced performance) at lower magnetic fields and fusion power for longer durations in later stages of the experimental program. A major goal is to develop a design concept that could meet these physics objectives with a construction cost in the range of $1B. (author)

  8. Operating experience and design criteria of sodium valves

    International Nuclear Information System (INIS)

    Markford, D.

    1974-01-01

    The information presented refers to sodium valve development for KNK and SNR-300 as well as for sodium test facilities on the INTERATOM site at Bensberg. Well in advance of KNK-I, a number of sodium test facilities had been operated containing small and medium size valves of different designs and manufacturers. The more stringent requirements for long-term safe and reliable operation in KNK-I prompted a development program for the main primary and secondary circuit sodium valves. Operational experience gave rise to modification of the stem seal arrangement mainly, so KNK-II (the fast core for the KNK reactor) will be run with modified sodium valves. Main pipe diameters in SNR-300 are in the range of 600 mm. Valve designs with rising shafts would require excessive space in the primary circuit cavities; therefore efforts have been directed towards the introduction of different types of valves. Owing to the requirements of after-heat removal, a valve type with control capability had to be chosen, and a special design of butterfly valve was selected for the primary and secondary circuits of SNR-300. The development and tests performed with this type of valve are described. In the field of small sodium valves, tests with a 50 mm diameter freeze-seal valve are reported, and the current status of the bellows-seal valves to be inserted into SNR-300 is discussed. (U.S.)

  9. arrayCGHbase: an analysis platform for comparative genomic hybridization microarrays

    Directory of Open Access Journals (Sweden)

    Moreau Yves

    2005-05-01

    Full Text Available Abstract Background The availability of the human genome sequence as well as the large number of physically accessible oligonucleotides, cDNA, and BAC clones across the entire genome has triggered and accelerated the use of several platforms for analysis of DNA copy number changes, among them microarray comparative genomic hybridization (arrayCGH). One of the challenges inherent to this new technology is the management and analysis of the large numbers of data points generated in each individual experiment. Results We have developed arrayCGHbase, a comprehensive analysis platform for arrayCGH experiments consisting of a MIAME (Minimal Information About a Microarray Experiment) supportive database using MySQL, underlying a data mining web tool, to store, analyze, interpret, compare, and visualize arrayCGH results in a uniform and user-friendly format. Owing to its flexible design, arrayCGHbase is compatible with all existing and forthcoming arrayCGH platforms. Data can be exported in a multitude of formats, including BED files to map copy number information on the genome using the Ensembl or UCSC genome browser. Conclusion ArrayCGHbase is a web-based and platform-independent arrayCGH data analysis tool that allows users to access the analysis suite through the internet or a local intranet after installation on a private server. ArrayCGHbase is available at http://medgen.ugent.be/arrayCGHbase/.
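
    As an illustration of the BED export mentioned above, the sketch below writes copy-number segments in standard BED format; the record layout and segment values are invented for the example and do not reflect arrayCGHbase's actual schema.

        # Write arrayCGH-style copy-number calls as a BED track.
        segments = [
            ("chr1", 1_200_000, 3_400_000, "gain", 1),
            ("chr7", 55_000_000, 55_300_000, "amplification", 2),
            ("chr13", 48_000_000, 49_500_000, "loss", -1),
        ]

        with open("cnv_calls.bed", "w") as fh:
            fh.write('track name="arrayCGH" description="copy number calls"\n')
            for chrom, start, end, label, score in segments:
                # BED coordinates are 0-based, half-open; the name column
                # carries the call and the score column its sign.
                fh.write(f"{chrom}\t{start}\t{end}\t{label}\t{score}\n")
        print("wrote cnv_calls.bed")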

  10. UX, XD & UXD. User Experience, Experience Design og User Experience Design. 8 paradokser - og 8 forsøg på (op)løsninger. Mod fælles forståelser og definitioner

    DEFF Research Database (Denmark)

    Jensen, Jens F.

    experience, experience design, and user experience design. These concepts are related and in some contexts closely interwoven, yet they also have separate meanings. In the context of this publication, we will speak both of user experience, experience design, and user experience design as a combined field and of the

  11. Design of experiments (DoE) in pharmaceutical development.

    Science.gov (United States)

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-06-01

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product, by adopting Deming's profound knowledge approach, comprising systems thinking, understanding of variation, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One-Factor-At-a-Time (OFAT) studies rather than by implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation. As such, DoE is presented in detail, since it represents the first choice for rational pharmaceutical development.
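
    A toy sketch of a design space in the sense described above: a fitted response-surface model links two coded CPPs to a CQA, and the factor region meeting the specification is flagged; all coefficients and limits below are invented.

        # Map the region of two coded process parameters where a CQA meets
        # its specification, using a hypothetical fitted response surface.
        import numpy as np

        def cqa(x1, x2):
            """Invented fitted model: CQA = 80 + 6*x1 - 4*x2 - 3*x1*x2 - 2*x1^2."""
            return 80 + 6 * x1 - 4 * x2 - 3 * x1 * x2 - 2 * x1**2

        g1, g2 = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
        ok = cqa(g1, g2) >= 82  # specification: CQA must be at least 82

        print(f"{ok.mean():.0%} of the explored factor region meets the specification")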

  12. Washing scaling of GeneChip microarray expression

    Directory of Open Access Journals (Sweden)

    Krohn Knut

    2010-05-01

    Full Text Available Abstract Background Post-hybridization washing is an essential part of microarray experiments. Both the quality of the experimental washing protocol and adequate consideration of washing in intensity calibration ultimately affect the quality of the expression estimates extracted from the microarray intensities. Results We conducted experiments on GeneChip microarrays with altered protocols for washing, scanning and staining to study the probe-level intensity changes as a function of the number of washing cycles. For calibration and analysis of the intensity data we make use of the 'hook' method, which allows intensity contributions due to non-specific and specific hybridization of perfect match (PM) and mismatch (MM) probes to be disentangled in a sequence-specific manner. On average, washing according to the standard protocol removes about 90% of the non-specific background, and removes about 30-50% of the specific targets from the MM probes but less than 10% from the PM probes. Analysis of the washing kinetics shows that the signal-to-noise ratio doubles roughly every ten stringent washing cycles. Washing can be characterized by time-dependent rate constants which reflect the heterogeneous character of target binding to microarray probes. We propose an empirical washing function which estimates the survival of probe-bound targets. It depends on the intensity contribution due to specific and non-specific hybridization per probe, which can be estimated for each probe using existing methods. The washing function allows probe intensities to be calibrated for the effect of washing. On a relative scale, proper calibration for washing markedly increases expression measures, especially in the limit of small and large values. Conclusions Washing is among the factors which potentially distort expression measures. The proposed first-order correction method allows direct implementation in existing calibration algorithms for microarray data. We provide an experimental
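
    The empirical washing function proposed in the paper is probe- and hybridization-mode dependent; the sketch below fits only the simplest exponential survival model to synthetic intensity-versus-cycles data, to illustrate the idea of characterizing washing by a rate constant.

        # Fit an exponential survival fraction to probe intensity as a
        # function of the number of washing cycles (synthetic data).
        import numpy as np
        from scipy.optimize import curve_fit

        def survival(n_cycles, frac0, k):
            """Fraction of initially bound targets surviving n washing cycles."""
            return frac0 * np.exp(-k * n_cycles)

        cycles = np.array([0, 2, 4, 8, 12, 16, 20])
        rng = np.random.default_rng(6)
        intensity = survival(cycles, 1.0, 0.12) + rng.normal(0, 0.02, cycles.size)

        (frac0, k), _ = curve_fit(survival, cycles, intensity, p0=(1.0, 0.1))
        print(f"fitted rate constant k={k:.3f} per cycle "
              f"(half-life ~{np.log(2)/k:.1f} cycles)")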

  13. Factorial microarray analysis of zebra mussel (Dreissena polymorpha: Dreissenidae, Bivalvia) adhesion

    Directory of Open Access Journals (Sweden)

    Faisal Mohamed

    2010-05-01

    Full Text Available Abstract Background The zebra mussel (Dreissena polymorpha) has been well known for its expertise in attaching to substances under the water. Studies in past decades on this underwater adhesion focused on the adhesive protein isolated from the byssogenesis apparatus of the zebra mussel. However, the mechanisms of the initiation, maintenance, and determination of the attachment process remain largely unknown. Results In this study, we used a zebra mussel cDNA microarray previously developed in our lab and a factorial analysis to identify the genes that were involved in response to changes in four factors: temperature (Factor A), current velocity (Factor B), dissolved oxygen (Factor C), and byssogenesis status (Factor D). Twenty probes in the microarray were found to be modified by one of the factors. The transcription products of four selected genes, DPFP-BG20_A01, EGP-BG97/192_B06, EGP-BG13_G05, and NH-BG17_C09, were unique to the zebra mussel foot based on the results of quantitative reverse transcription PCR (qRT-PCR). The expression profiles of these four genes under attachment and non-attachment were also confirmed by qRT-PCR, and the results are consistent with those from the microarray assay. In situ hybridization with the RNA probes of two identified genes, DPFP-BG20_A01 and EGP-BG97/192_B06, indicated that both were expressed by a type of exocrine gland cell located in the middle part of the zebra mussel foot. Conclusions The results of this study suggested that changes in D. polymorpha byssogenesis status and the environmental factors can dramatically affect the expression profiles of genes unique to the foot. The factorial design and analysis of the microarray experiment proved a reliable method to identify the influence of multiple factors on the expression profiles of the probesets in the microarray; it thereby provides a powerful tool to reveal the mechanism of zebra mussel underwater attachment.

  14. Factorial microarray analysis of zebra mussel (Dreissena polymorpha: Dreissenidae, Bivalvia) adhesion.

    Science.gov (United States)

    Xu, Wei; Faisal, Mohamed

    2010-05-28

    The zebra mussel (Dreissena polymorpha) has been well known for its expertise in attaching to substances under the water. Studies in past decades on this underwater adhesion focused on the adhesive protein isolated from the byssogenesis apparatus of the zebra mussel. However, the mechanisms of the initiation, maintenance, and determination of the attachment process remain largely unknown. In this study, we used a zebra mussel cDNA microarray previously developed in our lab and a factorial analysis to identify the genes that were involved in response to changes in four factors: temperature (Factor A), current velocity (Factor B), dissolved oxygen (Factor C), and byssogenesis status (Factor D). Twenty probes in the microarray were found to be modified by one of the factors. The transcription products of four selected genes, DPFP-BG20_A01, EGP-BG97/192_B06, EGP-BG13_G05, and NH-BG17_C09, were unique to the zebra mussel foot based on the results of quantitative reverse transcription PCR (qRT-PCR). The expression profiles of these four genes under attachment and non-attachment were also confirmed by qRT-PCR, and the results are consistent with those from the microarray assay. In situ hybridization with the RNA probes of two identified genes, DPFP-BG20_A01 and EGP-BG97/192_B06, indicated that both were expressed by a type of exocrine gland cell located in the middle part of the zebra mussel foot. The results of this study suggested that changes in D. polymorpha byssogenesis status and the environmental factors can dramatically affect the expression profiles of genes unique to the foot. The factorial design and analysis of the microarray experiment proved a reliable method to identify the influence of multiple factors on the expression profiles of the probesets in the microarray; it thereby provides a powerful tool to reveal the mechanism of zebra mussel underwater attachment.
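
    A minimal sketch of the factorial logic for a single probe: with a full two-level design in four factors, each main effect is estimated as the difference between the mean responses at the factor's high and low levels. The data below are simulated, with a real effect only for factor D (byssogenesis status).

        # Estimate main effects of four two-level factors on log expression
        # from a full 2^4 design (16 runs, synthetic data).
        import itertools
        import numpy as np

        rng = np.random.default_rng(7)
        design = np.array(list(itertools.product([-1, 1], repeat=4)))  # 16 runs
        true_effects = np.array([0.0, 0.0, 0.0, 1.2])                  # only D matters
        log_expr = 8 + design @ (true_effects / 2) + rng.normal(0, 0.2, 16)

        # Orthogonal design: each main effect = mean(high) - mean(low).
        for name, col in zip("ABCD", design.T):
            effect = log_expr[col == 1].mean() - log_expr[col == -1].mean()
            print(f"factor {name}: effect = {effect:+.2f}")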

  15. Design of experiments for microencapsulation applications: A review.

    Science.gov (United States)

    Paulo, Filipa; Santos, Lúcia

    2017-08-01

    Microencapsulation techniques have been intensively explored by many research sectors, such as the pharmaceutical and food industries. Microencapsulation makes it possible to protect the active ingredient from the external environment, mask undesired flavours, and achieve controlled release of compounds, among other benefits. The purpose of this review is to provide a background on design of experiments in the context of microencapsulation research. Optimization processes are required for accurate research in these fields and, therefore, for the right implementation of micro-sized techniques at industrial scale. This article critically reviews the use of response surface methodologies in pharmaceutical and food microencapsulation research areas. A survey of optimization procedures reported in the literature in the last few years is also presented.

  16. Experiments simulation and design to set traffic lights operation rules

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez Garcia, J.A.

    2016-07-01

    This paper uses design of experiments to minimize the travel time of motor vehicles on one of the most important avenues of the city of Celaya in Guanajuato, Mexico, by means of optimal synchronization of the existing traffic lights. In the optimization process three factors are considered: the traffic lights' cycle times; the synchrony scheme, defined as stepped, parallel, or actual; and the speed limit, each with three evaluation levels. The response variables considered are: motor vehicles' travel time, fuel consumption, and greenhouse gas (CO2) emissions. The different experiments are performed using a simulation model developed in the PTV-VISSIM software, which represents the vehicle traffic system. The results obtained for the different proposed scenarios make it possible to find the proper levels at which the vehicle traffic system must be operated in order to improve mobility, reduce pollution rates, and decrease the fuel consumption of the motor vehicles that use the avenue. (Author)
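
    A toy enumeration of the 3 × 3 × 3 scenario grid described above, with a stand-in travel-time function in place of the PTV-VISSIM simulation runs; all levels and the cost model are invented for illustration.

        # Enumerate all 27 scenarios and pick the one minimizing travel time.
        import itertools

        cycle_times = [60, 90, 120]            # s, hypothetical levels
        synchrony = ["stepped", "parallel", "actual"]
        speed_limits = [40, 50, 60]            # km/h, hypothetical levels

        def travel_time(cycle, sync, speed):
            """Stub for one simulated scenario; replace with a simulator call."""
            penalty = {"stepped": 0, "parallel": 30, "actual": 55}[sync]
            return 600 + abs(cycle - 90) + penalty - 3 * (speed - 40)

        best = min(itertools.product(cycle_times, synchrony, speed_limits),
                   key=lambda s: travel_time(*s))
        print("best scenario (cycle, synchrony, speed):", best)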

  17. Design and operational experience with a portable tritium cleanup system

    International Nuclear Information System (INIS)

    Maienschein, J.L.; Wilson, S.W.; Garcia, F.

    1991-06-01

    We built a portable tritium cleanup system to scavenge tritium from contaminated gases in any tritium-containing system in the LLNL Tritium Facility. The cleanup system uses standard catalytic oxidation of tritium to water, followed by water removal with a molecular sieve dryer. The cleanup unit, complete with instrumentation, is contained in a portable cart that is rolled into place and connected to the apparatus to be cleaned. The cleanup system is effective, low-tech, simple, and reliable. The nominal flow rate of the system is 30 liters/minute, and the decontamination factor is > 1000. In this paper we show design information on our portable cleanup system and discuss our operational experience with it over the past several years

  18. Integrating Biological Perspectives: A Quantum Leap for Microarray Expression Analysis

    Science.gov (United States)

    Wanke, Dierk; Kilian, Joachim; Bloss, Ulrich; Mangelsen, Elke; Supper, Jochen; Harter, Klaus; Berendzen, Kenneth W.

    2009-02-01

    Biologists and bioinformatics scientists cope with the analysis of transcript abundance and the extraction of meaningful information from microarray expression data. By exploiting biological information accessible in public databases, we try to extend our current knowledge of the plant model organism Arabidopsis thaliana. Here, we give two examples of increasing the quality of information gained from large-scale expression experiments by integrating microarray-unrelated biological information: First, we utilize Arabidopsis microarray data to demonstrate that expression profiles are usually conserved between orthologous genes of different organisms. In an initial step of the analysis, orthology has to be inferred unambiguously, which then allows comparison of expression profiles between orthologs. We make use of the publicly available microarray expression data of Arabidopsis and barley, Hordeum vulgare. We found a generally positive correlation in expression trajectories between true orthologs, although the two organisms are only distantly related on an evolutionary time scale. Second, extracting clusters of co-regulated genes implies similarities in transcriptional regulation via similar cis-regulatory elements (CREs). The reverse approach, in which clusters of co-regulated genes are sought by investigating CREs, has generally not been successful. Nonetheless, in some cases the presence of CREs in a defined position or orientation, or in particular CRE combinations, is positively correlated with co-regulated gene clusters. Here, we make use of genes involved in the phenylpropanoid biosynthetic pathway to give one positive example of this approach.
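
    The first analysis amounts to correlating expression trajectories of ortholog pairs across matched conditions. A toy Python sketch follows (profiles simulated; real work would use the actual Arabidopsis and barley measurements and an unambiguous ortholog mapping):

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(2)

        # Simulated expression trajectories over the same 8 conditions for five
        # Arabidopsis genes and their putative barley orthologs.
        arabidopsis = rng.normal(size=(5, 8))
        barley = arabidopsis + rng.normal(scale=0.4, size=(5, 8))  # conserved

        for i, (a, b) in enumerate(zip(arabidopsis, barley)):
            rho, p = spearmanr(a, b)
            print(f"ortholog pair {i}: rho = {rho:.2f}, p = {p:.3f}")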

  19. Kernel Based Nonlinear Dimensionality Reduction and Classification for Genomic Microarray

    Directory of Open Access Journals (Sweden)

    Lan Shu

    2008-07-01

    Genomic microarrays are powerful research tools in bioinformatics and modern medical research because they enable massively parallel assays and the simultaneous monitoring of the expression of thousands of genes in biological samples. However, a simple microarray experiment often produces very high-dimensional data, and this vast amount of information challenges researchers to extract the important features and reduce the dimensionality. In this paper, a kernel-based nonlinear dimensionality reduction method built on locally linear embedding (LLE) is proposed, and a fuzzy K-nearest neighbours algorithm that denoises datasets is introduced as a replacement for the classical LLE KNN algorithm. In addition, a kernel-based support vector machine (SVM) is used to classify genomic microarray data sets. We demonstrate the application of these techniques to two published DNA microarray data sets. The experimental results confirm the superiority and high success rates of the presented method.
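
    A minimal scikit-learn sketch of the pipeline on simulated data is shown below; plain LLE and an RBF-kernel SVM stand in for the paper's kernel-based LLE with fuzzy K-nearest-neighbour denoising, and every parameter value is an illustrative assumption:

        from sklearn.datasets import make_classification
        from sklearn.manifold import LocallyLinearEmbedding
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        # Stand-in for a microarray matrix: 120 samples x 500 "genes", 2 classes.
        X, y = make_classification(n_samples=120, n_features=500,
                                   n_informative=20, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # Nonlinear dimensionality reduction, then kernel SVM classification.
        model = make_pipeline(
            LocallyLinearEmbedding(n_neighbors=12, n_components=10),
            SVC(kernel="rbf", C=1.0),
        )
        model.fit(X_tr, y_tr)
        print("test accuracy:", model.score(X_te, y_te))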

  20. Microarray expression profiling of human dental pulp from single subject.

    Science.gov (United States)

    Tete, Stefano; Mastrangelo, Filiberto; Scioletti, Anna Paola; Tranasi, Michelangelo; Raicu, Florina; Paolantonio, Michele; Stuppia, Liborio; Vinci, Raffaele; Gherlone, Enrico; Ciampoli, Cristian; Sberna, Maria Teresa; Conti, Pio

    2008-01-01

    Microarray analysis is a recently developed technique for the simultaneous analysis of the expression patterns of thousands of genes. The aim of this research was to evaluate the expression profile of healthy human dental pulp in order to identify genes that are activated and encode proteins involved in the physiological processes of human dental pulp. We report data obtained by analyzing expression profiles of human tooth pulp from single subjects, using an approach based on the amplification of total RNA. Experiments were performed on a high-density array able to analyse, in duplicate, about 21,000 oligonucleotide sequences of about 70 bases, using total RNA amplified from the pulp of a single tooth. The data were analyzed using the S.A.M. (Significance Analysis of Microarrays) system, and genes were grouped according to their molecular functions and biological processes by the Onto-Express software. The microarray analysis revealed 362 genes with specific pulp expression. Genes showing significantly high expression were classified into those involved in tooth development, proto-oncogenes, collagen genes, DNases, metallopeptidases, and growth factors. We report a microarray analysis carried out by extracting total RNA from specimens of healthy human dental pulp tissue. This approach represents a powerful tool in the study of normal and pathological human pulp, as it minimizes the genetic variability introduced by pooling samples from different individuals.

  1. PV-Diesel Hybrid SCADA Experiment Network Design

    Science.gov (United States)

    Kalu, Alex; Durand, S.; Emrich, Carol; Ventre, G.; Wilson, W.; Acosta, R.

    1999-01-01

    The essential features of an experimental network for renewable power system satellite-based supervisory control and data acquisition (SCADA) are communication links, controllers, diagnostic equipment, and a hybrid power system. The components required to implement the network are two satellite ground stations, two satellite modems, two 486 PCs, two telephone receivers, two telephone modems, two analog telephone lines, one digital telephone line, a hybrid power system equipped with controllers, and a satellite spacecraft. In the technology verification experiment (TVE) conducted by Savannah State University and the Florida Solar Energy Center, the renewable energy hybrid system is the Apex-1000 Mini-Hybrid, which is equipped with the NGC3188 for user interface and remote control and the NGC2010 for monitoring and basic control tasks. This power system is connected to a satellite modem via a smart RS232 interface. Commands are sent to the power system control unit through a control PC, designated PC1, which is connected to a satellite modem through RS232. A second PC, designated PC2, the diagnostic PC, is connected to both satellite modems via separate analog telephone lines to check the modems' health. PC2 is also connected to PC1 via a telephone line. Because a second ground station for the ACTS was unavailable, one ground station serves both the sending and receiving functions in this experiment. The signal is sent from the control PC to the hybrid system at a frequency f1, different from f2, the frequency of the signal from the hybrid system to the control PC. f1 and f2 are sufficiently separated to avoid interference.

  2. Designing an experiment to measure cellular interaction forces

    Science.gov (United States)

    McAlinden, Niall; Glass, David G.; Millington, Owain R.; Wright, Amanda J.

    2013-09-01

    Optical trapping is a powerful tool in Life Science research and is becoming commonplace in many microscopy laboratories and facilities. The force applied by the laser beam on the trapped object can be accurately determined, allowing any external forces acting on the trapped object to be deduced. We aim to design a series of experiments that use an optical trap to measure and quantify the interaction force between immune cells. In order to cause minimum perturbation to the sample, we plan to trap T cells directly and remove the need to introduce exogenous beads to the sample. This poses a series of challenges and raises questions that need to be answered in order to design a set of effective end-point experiments. A typical cell is large compared to the beads normally trapped and is highly non-uniform: can we reliably trap such objects and prevent them from rolling and re-orientating? In this paper we show how a spatial light modulator can produce a triple-spot trap, as opposed to a single-spot trap, giving complete control over the object's orientation and preventing it from rolling due, for example, to Brownian motion. To use an optical trap as a force transducer to measure an external force, you must first have a reliably calibrated system. The optical trapping force is typically measured either by applying the theory of equipartition to the observed Brownian motion of the trapped object or by using an escape force method, e.g. the viscous drag force method. In this paper we examine the relationship between force and displacement, and measure the maximum displacement from the equilibrium position before an object falls out of the trap, hence determining the conditions under which the different calibration methods should be applied.
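
    For a harmonic trap, the equipartition calibration mentioned above reduces to k = kB*T / <x^2>, where <x^2> is the position variance of the trapped object. A minimal simulated check follows (stiffness, temperature, and trace length are invented for the example):

        import numpy as np

        kB = 1.380649e-23      # Boltzmann constant, J/K
        T = 298.0              # K (assumed)
        k_true = 1e-5          # N/m, "unknown" stiffness used to simulate data

        # Simulated position trace (m) of a trapped object in a harmonic potential.
        rng = np.random.default_rng(3)
        x = rng.normal(scale=np.sqrt(kB * T / k_true), size=200_000)

        # Equipartition: (1/2) k <x^2> = (1/2) kB T  =>  k = kB T / var(x)
        k_est = kB * T / np.var(x)
        print(f"estimated stiffness: {k_est:.3e} N/m (true: {k_true:.0e} N/m)")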

  3. Design, Construction, Alignment, and Calibration of a Compact Velocimetry Experiment

    International Nuclear Information System (INIS)

    Morris I Kaufman; Robert M Malone; Brent C Frogget; David L Esquibel; Vincent T Romero; Gregory A Lare; Bart Briggs; Adam J Iverson; Daniel K Frayer; Douglas DeVore; Brian Cata

    2007-01-01

    A velocimetry experiment has been designed to measure shock properties for small cylindrical metal targets (8-mm-diameter by 2-mm thick). A target is accelerated by high explosives, caught, and retrieved for later inspection. The target is expected to move at a velocity of 0.1 to 3 km/sec. The complete experiment canister is approximately 105 mm in diameter and 380 mm long. Optical velocimetry diagnostics include the Velocity Interferometer System for Any Reflector (VISAR) and Photon Doppler Velocimetry (PDV). The packaging of the velocity diagnostics is not allowed to interfere with the catchment or an X-ray imaging diagnostic. A single optical relay, using commercial lenses, collects Doppler-shifted light for both VISAR and PDV. The use of fiber optics allows measurement of point velocities on the target surface during accelerations occurring over 15 mm of travel. The VISAR operates at 532 nm and has separate illumination fibers requiring alignment. The PDV diagnostic operates at 1550 nm, but is aligned and focused at 670 nm. The VISAR and PDV diagnostics are complementary measurements and they image spots in close proximity on the target surface. Because the optical relay uses commercial glass, the axial positions of the optical fibers for PDV and VISAR are offset to compensate for chromatic aberrations. The optomechanical design requires careful attention to fiber management, mechanical assembly and disassembly, positioning of the foam catchment, and X-ray diagnostic field-of-view. Calibration and alignment data are archived at each stage of the assembly sequence

  4. Design of thermodynamic experiments and analyses of thermodynamic relationships

    International Nuclear Information System (INIS)

    Oezer Arnas, A.

    2009-01-01

    In the teaching of thermodynamics, certain textbooks are followed internationally, whatever language they are written in. However, although some do a very good job, most are not correct and precise, and furthermore none discuss the need for and importance of designing thermodynamic experiments, even though experimentation in engineering is considered to be the backbone of analysis; experimental work is not pursued much these days, whereas numerical studies are very predominant. Here, some thermodynamic experiments are discussed, along with the physical interpretation of phenomena through simple mathematics, that are straightforward and meaningful and can be performed by any undergraduate or graduate student. Another important topic for discussion is the fact that the thermodynamic state principle demands uniqueness of results. It has been found in the literature that this fact is not well understood by those who attempt to apply it loosely and end up with questionable results. Thermodynamics is the fundamental science that clarifies all these issues if well understood, applied, and interpreted. This paper attempts to clarify these situations and offer alternative methods of analysis. (author)

  5. Exploiting fluorescence for multiplex immunoassays on protein microarrays

    International Nuclear Information System (INIS)

    Herbáth, Melinda; Balogh, Andrea; Matkó, János; Papp, Krisztián; Prechl, József

    2014-01-01

    Protein microarray technology is becoming the method of choice for identifying protein interaction partners, detecting specific proteins, carbohydrates, and lipids, and characterizing protein interactions and serum antibodies in a massively parallel manner. The availability of the well-established instrumentation of DNA arrays and the development of new fluorescent detection instruments have promoted the spread of this technique. Fluorescent detection has the advantages of high sensitivity, specificity, simplicity, and the wide dynamic range required by most measurements. Fluorescence, through specifically designed probes and an increasing variety of detection modes, offers an excellent tool for such microarray platforms. Measuring, for example, the levels of antibodies, their isotypes, and/or their antigen specificity simultaneously can offer more complex and comprehensive information about the biological phenomenon under investigation, especially considering that hundreds of samples can be measured in a single assay. Not only body fluids but also cell lysates, extracted cellular components, and intact living cells can be analyzed on protein arrays to monitor functional responses to the samples printed on the surface. As a rapidly evolving area, protein microarray technology offers a great wealth of information and a new depth of knowledge. These are the features that endow protein arrays with wide applicability and robust sample-analyzing capability. On the whole, protein arrays are emerging as new tools not just in proteomics but also in glycomics and lipidomics, and they are also important for immunological research. In this review we attempt to summarize the technical aspects of planar fluorescent microarray technology along with a description of its main immunological applications. (topical review)

  6. OPTIMUM DESIGN OF EXPERIMENTS FOR ACCELERATED RELIABILITY TESTING

    Directory of Open Access Journals (Sweden)

    Sebastian Marian ZAHARIA

    2014-05-01

    This paper presents a case study that demonstrates how design of experiments (DOE) information can be used to design better accelerated reliability tests. The case study compares and optimizes the main accelerated reliability test plans (3 Level Best Standard Plan, 3 Level Best Compromise Plan, 3 Level Best Equal Expected Number Failing Plan, and 3 Level 4:2:1 Allocation Plan). Before starting an accelerated reliability test, it is advisable to have a plan that helps in accurately estimating reliability at operating conditions while minimizing test time and costs. A test plan should be used to decide on the appropriate stress levels (for each stress type) and the number of test units to allocate to the different stress levels (for each combination of the different stress types' levels). The case study used the ALTA 7 software, which provides a complete analysis of data from accelerated reliability tests
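
    As a rough illustration of how a 4:2:1 allocation plan distributes test units across three stress levels, the Python sketch below also computes Arrhenius acceleration factors; the unit count, temperatures, and activation energy are invented for the example (the paper itself relies on ALTA 7 for the full analysis):

        import math

        n_total = 140
        weights = [4, 2, 1]                  # low, middle, high stress
        alloc = [n_total * w // sum(weights) for w in weights]

        Ea = 0.7                             # eV, assumed activation energy
        kB = 8.617e-5                        # Boltzmann constant, eV/K
        T_use = 323.0                        # K, use-condition temperature
        for T_stress, n in zip([353.0, 373.0, 393.0], alloc):
            af = math.exp(Ea / kB * (1 / T_use - 1 / T_stress))
            print(f"T = {T_stress:.0f} K: {n} units, "
                  f"acceleration factor = {af:.1f}")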

  7. Entombment Using Cementitious Materials: Design Considerations and International Experience

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, Roger Ray

    2002-08-01

    Cementitious materials have physical and chemical properties that are well suited for the requirements of radioactive waste management. Namely, the materials have low permeability and durability that is consistent with the time frame required for short-lived radionuclides to decay. Furthermore, cementitious materials can provide a long-term chemical environment that substantially reduces the mobility of some long-lived radionuclides of concern for decommissioning (e.g., C-14, Ni-63, Ni-59). Because of these properties, cementitious materials are common in low-level radioactive waste disposal facilities throughout the world and are an attractive option for entombment of nuclear facilities. This paper describes design considerations for cementitious barriers in the context of performance over time frames of a few hundreds of years (directed toward short-lived radionuclides) and time frames of thousands of years (directed towards longer-lived radionuclides). The emphasis is on providing an overview of concepts for entombment that take advantage of the properties of cementitious materials and experience from the design of low-level radioactive waste disposal facilities. A few examples of the previous use of cementitious materials for entombment of decommissioned nuclear facilities and proposals for the use in future decommissioning of nuclear reactors in a few countries are also included to provide global perspective.

  8. Entombment Using Cementitious Materials: Design Considerations and International Experience

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.R.

    2002-05-15

    Cementitious materials have physical and chemical properties that are well suited for the requirements of radioactive waste management. Namely, the materials have low permeability and durability that is consistent with the time frame required for short-lived radionuclides to decay. Furthermore, cementitious materials can provide a long-term chemical environment that substantially reduces the mobility of some long-lived radionuclides of concern for decommissioning (e.g., C-14, Ni-63, Ni-59). Because of these properties, cementitious materials are common in low-level radioactive waste disposal facilities throughout the world and are an attractive option for entombment of nuclear facilities. This paper describes design considerations for cementitious barriers in the context of performance over time frames of a few hundreds of years (directed toward short-lived radionuclides) and time frames of thousands of years (directed towards longer-lived radionuclides). The emphasis is on providing an overview of concepts for entombment that take advantage of the properties of cementitious materials and experience from the design of low-level radioactive waste disposal facilities. A few examples of the previous use of cementitious materials for entombment of decommissioned nuclear facilities and proposals for the use in future decommissioning of nuclear reactors in a few countries are also included to provide global perspective.

  9. Entombment Using Cementitious Materials: Design Considerations and International Experience

    International Nuclear Information System (INIS)

    Seitz, R.R.

    2002-01-01

    Cementitious materials have physical and chemical properties that are well suited for the requirements of radioactive waste management. Namely, the materials have low permeability and durability that is consistent with the time frame required for short-lived radionuclides to decay. Furthermore, cementitious materials can provide a long-term chemical environment that substantially reduces the mobility of some long-lived radionuclides of concern for decommissioning (e.g., C-14, Ni-63, Ni-59). Because of these properties, cementitious materials are common in low-level radioactive waste disposal facilities throughout the world and are an attractive option for entombment of nuclear facilities. This paper describes design considerations for cementitious barriers in the context of performance over time frames of a few hundreds of years (directed toward short-lived radionuclides) and time frames of thousands of years (directed towards longer-lived radionuclides). The emphasis is on providing an overview of concepts for entombment that take advantage of the properties of cementitious materials and experience from the design of low-level radioactive waste disposal facilities. A few examples of the previous use of cementitious materials for entombment of decommissioned nuclear facilities and proposals for the use in future decommissioning of nuclear reactors in a few countries are also included to provide global perspective

  10. Design experiences for medical irradiation field at the musashi reactor

    International Nuclear Information System (INIS)

    Aizawa, Otohiko

    1994-01-01

    The design of the medical irradiation field at the Musashi reactor was carried out from 1974 to 1975, about 20 years ago. Various numerical analyses have been carried out recently, and it is remarkable to find that performance close to the optimum for a 100 kW reactor was obtained. The reason for this is that the design was divided into a stationary part and a moving part, and the structure of the moving part was determined through repeated trial and error and experiments. In this paper, a comparison of the later analyses with the experimental data is reported, together with the change in absorbed dose during medical irradiation accompanying changes in the neutron energy spectra. As characteristics of the medical irradiation field at the Musashi reactor, the neutron energy spectra, the absorbed dose, and the mean medical irradiation time are shown. As problems in boron neutron capture therapy, the neutron fluence required for the therapy, the approach to background dose, and the problem of determining the irradiation time are discussed. The features of the epithermal neutron beam are explained. (K.I.)

  11. Detail design of the beam source for the SPIDER experiment

    International Nuclear Information System (INIS)

    Marcuzzi, D.; Agostinetti, P.; Dalla Palma, M.; Degli Agostini, F.; Pavei, M.; Rizzolo, A.; Tollin, M.; Trevisan, L.

    2010-01-01

    The ITER Neutral Beam Test Facility (PRIMA, Padova Research on Injector Megavolt Accelerated) is planned to be built at Consorzio RFX (Padova, Italy). PRIMA includes two experimental devices: a full-size plasma source with low-voltage extraction called SPIDER (Source for Production of Ion of Deuterium Extracted from RF plasma) and a full-size neutral beam injector at full beam power called MITICA (Megavolt ITER Injector Concept Advancement). SPIDER is the first experimental device to be built and operated, aiming at testing the extraction of a negative ion beam (made of H⁻ and, at a later stage, D⁻ ions) from an ITER-size ion source. The main requirements of this experiment are an H⁻/D⁻ current of approximately 70 A/50 A and an energy of 100 keV. This paper presents an overview of the SPIDER beam source design, with a particular focus on the main design choices, aiming at the best compromise between physics, optics, thermo-mechanical, cooling, assembly, and electrical requirements.

  12. Design, development and operating experience with wet steam turbines

    International Nuclear Information System (INIS)

    Bolter, J.R.

    1989-01-01

    The paper first describes the special characteristics of wet steam units. It then goes on to discuss the principal features of the units manufactured by the author's company, the considerations on which the designs were based, and the development work carried out to validate them. Some of the design features such as the separator/reheater units and the arrangements for water extraction in the high pressure turbine are unconventional. An important characteristic of all nuclear plant is the combination of high capital cost and low fuel cost, and the consequent emphasis placed on high availability. The paper describes some service problems experienced with wet steam plant and how these were overcome with minimum loss of generation. The paper also describes a number of the developments for future wet steam plant which have evolved from these experiences, and from research and development programmes aimed at increasing the efficiency and reliability of both conventional and wet steam units. Blading, rotor construction and separator/reheater units are considered. (author)

  13. National Spherical Torus Experiment (NSTX) Torus Design, Fabrication and Assembly

    International Nuclear Information System (INIS)

    Neumeyer, C.; Barnes, G.; Chrzanowski, J.H.; Heitzenroeder, P.

    1999-01-01

    The National Spherical Torus Experiment (NSTX) is a low-aspect-ratio spherical torus (ST) located at the Princeton Plasma Physics Laboratory (PPPL). Fabrication, assembly, and initial power tests were completed in February of 1999. The majority of the design and construction efforts were concentrated on the torus system components. The torus system includes the centerstack assembly, the external poloidal and toroidal field coil systems, the vacuum vessel, the torus support structure, and the plasma facing components (PFCs). NSTX's low aspect ratio required that the centerstack be made with the smallest radius possible. This, and the need to bake NSTX's carbon-carbon composite plasma facing components at 350 degrees C, were major drivers in the design of NSTX. The Centerstack Assembly consists of the inner legs of the Toroidal Field (TF) windings, the Ohmic Heating (OH) solenoid and its associated tension cylinder, three inner Poloidal Field (PF) coils, thermal insulation, diagnostics, and an Inconel casing which forms the inner wall of the vacuum vessel boundary. It took approximately nine months to complete the assembly of the Centerstack. The tight radial clearances and the extreme length of the major components added complexity to the assembly of the Centerstack components. The vacuum vessel was constructed of 304 stainless steel and required approximately seven months to complete and deliver to the Test Cell. Key issues in the construction of the vacuum vessel were controlling dimensional stability after welding and controlling the permeability of the welds. A great deal of time and effort was devoted to defining the correct weld process and material selection to meet the design requirements. The PFCs will be baked out at 350 degrees C while the vessel is maintained at 150 degrees C. This required care in designing the supports so they can accommodate the high electromagnetic loads resulting from plasma disruptions and the resulting relative thermal expansions

  14. Cross-platform comparison of microarray data using order restricted inference

    Science.gov (United States)

    Klinglmueller, Florian; Tuechler, Thomas; Posch, Martin

    2013-01-01

    Motivation Titration experiments measuring the gene expression of two different tissues, along with total RNA mixtures of the pure samples, are frequently used for quality evaluation of microarray technologies. Such a design implies that the true mRNA expression of each gene is either constant or follows a monotonic trend across the mixtures, lending itself to the use of order-restricted inference procedures. Exploiting only the postulated monotonicity of titration designs, we propose three statistical analysis methods for the validation of high-throughput genetic data and the corresponding preprocessing techniques. Results Our methods allow for inference of accuracy, repeatability, and cross-platform agreement, with minimal assumptions required regarding the underlying data-generating process. Therefore, they are readily applicable to all sorts of genetic high-throughput data, independent of the degree of preprocessing. An application to the EMERALD dataset demonstrates how our methods provide a rich spectrum of easily interpretable quality metrics and allow the comparison of different microarray technologies and normalization methods. The results are on par with previous work, but provide additional new insights that cast doubt on the utility of popular preprocessing techniques, specifically concerning the EMERALD project's dataset. Availability All datasets are available on EBI's ArrayExpress web site (http://www.ebi.ac.uk/microarray-as/ae/) under accession numbers E-TABM-536, E-TABM-554 and E-TABM-555. Source code implemented in C and R is available at: http://statistics.msi.meduniwien.ac.at/float/cross_platform/. Methods for testing and variance decomposition have been made available in the R-package orQA, which can be downloaded and installed from CRAN http://cran.r-project.org. PMID:21317143
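
    The core idea, fitting each gene's titration response under a monotonicity constraint, can be sketched with isotonic regression. In the Python example below, the mixture fractions and expression values are invented, and the paper's actual test statistics are more elaborate than this residual check:

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        # One gene measured across five titration mixtures (fraction of tissue B);
        # under the design its true expression must be monotonic in the fraction.
        mix = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
        y = np.array([2.1, 2.4, 2.2, 3.0, 3.4])        # simulated, noisy readout

        iso = IsotonicRegression(increasing=True)
        fit = iso.fit_transform(mix, y)

        # Residual sum of squares under the order restriction: a large value
        # flags disagreement with the postulated monotonicity.
        rss = float(np.sum((y - fit) ** 2))
        print("monotone fit:", fit.round(2), "| RSS:", round(rss, 3))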

  15. Exploring matrix factorization techniques for significant genes identification of Alzheimer’s disease microarray gene expression data

    Directory of Open Access Journals (Sweden)

    Hu Xiaohua

    2011-07-01

    Abstract Background The wide use of high-throughput DNA microarray technology provides an increasingly detailed view of the human transcriptome, covering hundreds to thousands of genes. Although biomedical researchers typically design microarray experiments to explore specific biological contexts, the relationships between genes are hard to identify because the data are complex, noisy, and high-dimensional, and analyses are often hindered by low statistical power. The main challenge is to extract valuable biological information from the colossal amount of data to gain insight into biological processes and the mechanisms of human disease. Overcoming this challenge requires mathematical and computational methods that are versatile enough to capture the underlying biological features and simple enough to be applied efficiently to large datasets. Methods Unsupervised machine learning approaches provide new and efficient analyses of gene expression profiles. In our study, two unsupervised knowledge-based matrix factorization methods, independent component analysis (ICA) and nonnegative matrix factorization (NMF), are integrated to identify significant genes and related pathways in a microarray gene expression dataset of Alzheimer's disease. The advantage of these two approaches is that they can be performed as biclustering methods, by which genes and conditions are clustered simultaneously. Furthermore, they can group genes into different categories for identifying related diagnostic pathways and regulatory networks. The difference between the two methods is that ICA assumes statistical independence of the expression modes, while NMF needs positivity constraints to generate localized gene expression profiles. Results In our work, we applied the FastICA and non-smooth NMF methods to DNA microarray gene expression data of Alzheimer's disease, respectively. The results show that both methods can clearly classify severe AD samples from control samples, and
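
    A minimal sketch of the two factorizations on a stand-in expression matrix, using scikit-learn (data simulated; the number of components is an arbitrary assumption, and standard NMF replaces the paper's non-smooth NMF variant):

        import numpy as np
        from sklearn.decomposition import FastICA, NMF

        rng = np.random.default_rng(4)
        X = rng.gamma(shape=2.0, scale=1.0, size=(60, 200))  # samples x genes

        # ICA: decompose into statistically independent expression modes.
        ica = FastICA(n_components=5, random_state=0)
        modes = ica.fit_transform(X)              # samples x modes

        # NMF: positivity constraints yield localized, parts-based profiles.
        nmf = NMF(n_components=5, init="nndsvd", random_state=0, max_iter=500)
        W = nmf.fit_transform(X)                  # samples x metagenes
        H = nmf.components_                       # metagenes x genes

        # Genes loading most heavily on a metagene are candidate signature genes.
        top = np.argsort(H[0])[::-1][:10]
        print("top genes for metagene 0:", top)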

  16. Advanced microarray technologies for clinical diagnostics

    NARCIS (Netherlands)

    Pierik, Anke

    2011-01-01

    DNA microarrays are becoming increasingly important in the field of clinical diagnostics. These microarrays, also called DNA chips, are small solid substrates, typically having a maximum surface area of a few cm2, onto which many spots are arrayed in a pre-determined pattern. Each of these spots contains

  17. Carbohydrate Microarrays in Plant Science

    DEFF Research Database (Denmark)

    Fangel, Jonatan Ulrik; Pedersen, H.L.; Vidal-Melgosa, S.

    2012-01-01

    Almost all plant cells are surrounded by glycan-rich cell walls, which form much of the plant body and collectively are the largest source of biomass on earth. Plants use polysaccharides for support, defense, signaling, cell adhesion, and as energy storage, and many plant glycans are also important...... industrially and nutritionally. Understanding the biological roles of plant glycans and the effective exploitation of their useful properties requires a detailed understanding of their structures, occurrence, and molecular interactions. Microarray technology has revolutionized the massively high...... for plant research and can be used to map glycan populations across large numbers of samples to screen antibodies, carbohydrate binding proteins, and carbohydrate binding modules and to investigate enzyme activities....

  18. Interim Service ISDN Satellite (ISIS) hardware experiment development for advanced ISDN satellite designs and experiments

    Science.gov (United States)

    Pepin, Gerard R.

    1992-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Hardware Experiment Development for Advanced Satellite Designs describes the development of the ISDN Satellite Terminal Adapter (ISTA), which is capable of translating ISDN protocol traffic into Time Division Multiple Access (TDMA) signals for use by a communications satellite. The ISTA connects the Type 1 Network Termination (NT1) via the U-interface on the line-termination side of the CPE to the RS-499 interface for satellite uplink. In the opposite direction, the same ISTA converts RS-499 data back to U-interface data with a simple switch setting.

  19. Translating microarray data for diagnostic testing in childhood leukaemia

    International Nuclear Information System (INIS)

    Hoffmann, Katrin; Firth, Martin J; Beesley, Alex H; Klerk, Nicholas H de; Kees, Ursula R

    2006-01-01

    and with microarray experiments being performed by a different research team

  20. The Role of Fiction in Experiments within Design, Art & Architecture

    DEFF Research Database (Denmark)

    Knutz, Eva; Markussen, Thomas; Christensen, Poul Rind

    2013-01-01

    This paper offers a typology for understanding design fiction as a new approach in design research. The typology allows design researchers to explain design fictions according to 5 criteria: (1) “What if scenarios” as the basic construal principle of design fiction; (2) the manifestation of criti...

  1. The Role of Fiction in Experiments within Design, Art & Architecture

    DEFF Research Database (Denmark)

    Knutz, Eva; Markussen, Thomas; Christensen, Poul Rind

    2014-01-01

    This paper offers a typology for understanding design fiction as a new approach in design research. The typology allows design researchers to explain design fictions according to 5 criteria.The typology is premised on the idea that fiction may integrate with reality in many different ways in design...

  2. Interlopers 3D: experiences designing a stereoscopic game

    Science.gov (United States)

    Weaver, James; Holliman, Nicolas S.

    2014-03-01

    Background In recent years 3D-enabled televisions, VR headsets, and computer displays have become more readily available in the home. This presents an opportunity for game designers to explore new stereoscopic game mechanics and techniques that have previously been unavailable in monocular gaming. Aims To investigate the visual cues that are present in binocular and monocular vision, identifying which are relevant when gaming using a stereoscopic display, and to implement a game whose mechanics are so reliant on binocular cues that the game becomes impossible, or at least very difficult, to play in non-stereoscopic mode. Method A stereoscopic 3D game was developed whose objective was to shoot down advancing enemies (the Interlopers) before they reached their destination. Scoring highly required players to make accurate depth judgments and target the closest enemies first. A group of twenty participants played both a basic and an advanced version of the game in both monoscopic 2D and stereoscopic 3D. Results The results show that in both the basic and the advanced game, participants achieved higher scores when playing in stereoscopic 3D. The advanced game showed that disrupting the depth-from-motion cue made the game more difficult in monoscopic 2D. The results also show a certain amount of learning taking place, meaning that players were able to score higher and finish the game faster over the course of the experiment. Conclusions Although the game was not impossible to play in monoscopic 2D, the participants' results show that it put them at a significant disadvantage compared with playing in stereoscopic 3D.

  3. DNA microarray-based PCR ribotyping of Clostridium difficile.

    Science.gov (United States)

    Schneeberg, Alexander; Ehricht, Ralf; Slickers, Peter; Baier, Vico; Neubauer, Heinrich; Zimmermann, Stefan; Rabold, Denise; Lübke-Becker, Antina; Seyboldt, Christian

    2015-02-01

    This study presents a DNA microarray-based assay for fast and simple PCR ribotyping of Clostridium difficile strains. Hybridization probes were designed to query the modularly structured intergenic spacer region (ISR), which is also the template for conventional PCR ribotyping and for PCR ribotyping with subsequent capillary gel electrophoresis (seq-PCR ribotyping). The probes were derived from sequences available in GenBank as well as from theoretical ISR module combinations. A database of reference hybridization patterns was set up from a collection of 142 well-characterized C. difficile isolates representing 48 seq-PCR ribotypes. The reference hybridization patterns, calculated as arithmetic means, were compared using a similarity matrix analysis. The 48 investigated seq-PCR ribotypes revealed 27 array profiles that were clearly distinguishable. The most frequent human-pathogenic ribotypes 001, 014/020, 027, and 078/126 were discriminated by the microarray. C. difficile strains related to 078/126 (033, 045/FLI01, 078, 126, 126/FLI01, 413, 413/FLI01, 598, 620, 652, and 660) and 014/020 (014, 020, and 449) showed similar hybridization patterns, confirming their previously reported genetic relatedness. A panel of 50 C. difficile field isolates was tested by seq-PCR ribotyping and the DNA microarray-based assay in parallel. Taking into account that the current version of the microarray does not discriminate some closely related seq-PCR ribotypes, all isolates were typed correctly. Moreover, seq-PCR ribotypes without reference profiles available in the database (ribotype 009 and 5 new types) were correctly recognized as new ribotypes, confirming the performance and expansion potential of the microarray. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
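
    The pattern-matching step can be caricatured in a few lines of Python. The binary probe patterns and scoring below are invented placeholders (the published assay compares arithmetic-mean reference patterns via a similarity matrix rather than this toy Jaccard score):

        import numpy as np

        # Hypothetical binary hybridization patterns (1 = probe positive) for
        # three reference ribotypes and one field isolate.
        references = {
            "RT001": np.array([1, 1, 0, 0, 1, 0, 1, 0]),
            "RT027": np.array([1, 0, 1, 1, 0, 0, 1, 1]),
            "RT078": np.array([0, 0, 1, 1, 0, 1, 1, 1]),
        }
        isolate = np.array([0, 0, 1, 1, 0, 1, 1, 0])

        def jaccard(a, b):
            """Shared positive probes divided by probes positive in either."""
            inter = np.sum((a == 1) & (b == 1))
            union = np.sum((a == 1) | (b == 1))
            return inter / union

        scores = {rt: jaccard(isolate, p) for rt, p in references.items()}
        best = max(scores, key=scores.get)
        print(scores, "-> closest reference ribotype:", best)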

  4. Dhruva: Main design features, operational experience and utilization

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, S.K. [Reactor Group, BARC, Trombay, Mumbai 400085 (India)]. E-mail: swarajagarwal2000@yahoo.com; Karhadkar, C.G. [Reactor Group, BARC, Trombay, Mumbai 400085 (India); Zope, A.K. [Reactor Group, BARC, Trombay, Mumbai 400085 (India); Singh, Kanchhi [Reactor Group, BARC, Trombay, Mumbai 400085 (India)

    2006-04-15

    Dhruva is a product of technological initiatives taken during the mid-seventies, when the need was felt for another research reactor with a high neutron flux to meet the growing demands of research and development in the frontier areas of science and engineering. In addition, the production of radioisotopes of high specific activity and the diverse requirements of a broad-based user community had to be synthesized into a viable system that could be engineered within the limited means available in the country. This high-neutron-flux reactor was designed, constructed, and commissioned entirely by Indian scientists and engineers, and it reflects the country's resolve to achieve self-reliance in nuclear reactor technology. Dhruva is a 100 MW (thermal) research reactor with metallic natural uranium as fuel and heavy water as moderator, coolant, and reflector, giving a maximum thermal neutron flux of 1.8 × 10^14 n/cm^2/s. Since its first criticality on 8 August 1985, a number of experimental facilities have been added, which have proven highly attractive to university and industrial researchers for their scientific merits in various fields. One of the major areas of utilization has been neutron beam research using several neutron spectrometers, all of which were built in-house. A guide tube facility comprising two neutron guides, together with another experimental set-up with a multi-instrument beam line, has further enhanced the utilization of this national facility by academic institutions in the country. Production of radioisotopes of high specific activity and in increased quantity has fulfilled growing demands for many applications. The write-up provides an overview of the reactor covering its design, layout, safety features, utilization, and operating experience, along with a description of some of the specific experimental facilities.

  5. Interactive Effects of Environmental Experience and Innovative Cognitive Style on Student Creativity in Product Design

    Science.gov (United States)

    Lu, Chia-Chen

    2017-01-01

    Environmental experience can enhance the ideas of design students. Thus, this type of experience may interfere with the influence of design students' cognitive style on creativity. The aim of this study was to examine the influence of environmental experience on the relationship between innovative cognitive style and industrial design students'…

  6. Physics Design of the National Compact Stellarator Experiment

    International Nuclear Information System (INIS)

    Neilson, G.H.; Zarnstorff, M.C.; Lyon, J.F.

    2002-01-01

    Compact quasi-axisymmetric stellarators offer the possibility of combining the steady-state low-recirculating power, external control, and disruption resilience of previous stellarators with the low-aspect ratio, high beta-limit, and good confinement of advanced tokamaks. Quasi-axisymmetric equilibria have been developed for the proposed National Compact Stellarator Experiment (NCSX) with average aspect ratio approximately 4.4 and average elongation approximately 1.8. Even with bootstrap-current consistent profiles, they are passively stable to the ballooning, kink, vertical, Mercier, and neoclassical-tearing modes for β > 4%, without the need for external feedback or conducting walls. The bootstrap current generates only 1/4 of the magnetic rotational transform at β = 4% (the rest is from the coils). Transport simulations show adequate fast-ion confinement and thermal neoclassical transport similar to equivalent tokamaks. Modular coils have been designed which reproduce the physics properties, provide good flux surfaces, and allow flexible variation of the plasma shape to control the predicted MHD stability and transport properties

  7. Problematic Smartphone Use: Investigating Contemporary Experiences Using a Convergent Design

    Directory of Open Access Journals (Sweden)

    Daria J. Kuss

    2018-01-01

    Internet-enabled smartphones are increasingly ubiquitous in the Western world. Research suggests a number of problems can result from mobile phone overuse, including dependence, dangerous and prohibited use. For over a decade, this has been measured by the Problematic Mobile Phone Use Questionnaire (PMPU-Q). Given the rapid developments in mobile technologies, changes of use patterns and possible problematic and addictive use, the aim of the present study was to investigate and validate an updated contemporary version of the PMPU-Q (PMPU-Q-R). A mixed methods convergent design was employed, including a psychometric survey (N = 512) alongside qualitative focus groups (N = 21), to elicit experiences and perceptions of problematic smartphone use. The results suggest the PMPU-Q-R factor structure can be updated to include smartphone dependence, dangerous driving, and antisocial smartphone use factors. Theories of problematic mobile phone use require consideration of the ubiquity and indispensability of smartphones in the present day and age, particularly regarding use whilst driving and in social interactions.

  8. Problematic Smartphone Use: Investigating Contemporary Experiences Using a Convergent Design.

    Science.gov (United States)

    Kuss, Daria J; Harkin, Lydia; Kanjo, Eiman; Billieux, Joel

    2018-01-16

    Internet-enabled smartphones are increasingly ubiquitous in the Western world. Research suggests a number of problems can result from mobile phone overuse, including dependence, dangerous and prohibited use. For over a decade, this has been measured by the Problematic Mobile Phone Use Questionnaire (PMPU-Q). Given the rapid developments in mobile technologies, changes of use patterns and possible problematic and addictive use, the aim of the present study was to investigate and validate an updated contemporary version of the PMPU-Q (PMPU-Q-R). A mixed methods convergent design was employed, including a psychometric survey ( N = 512) alongside qualitative focus groups ( N = 21), to elicit experiences and perceptions of problematic smartphone use. The results suggest the PMPU-Q-R factor structure can be updated to include smartphone dependence, dangerous driving, and antisocial smartphone use factors. Theories of problematic mobile phone use require consideration of the ubiquity and indispensability of smartphones in the present day and age, particularly regarding use whilst driving and in social interactions.

  9. Problematic Smartphone Use: Investigating Contemporary Experiences Using a Convergent Design

    Science.gov (United States)

    Harkin, Lydia

    2018-01-01

    Internet-enabled smartphones are increasingly ubiquitous in the Western world. Research suggests a number of problems can result from mobile phone overuse, including dependence, dangerous and prohibited use. For over a decade, this has been measured by the Problematic Mobile Phone Use Questionnaire (PMPU-Q). Given the rapid developments in mobile technologies, changes of use patterns and possible problematic and addictive use, the aim of the present study was to investigate and validate an updated contemporary version of the PMPU-Q (PMPU-Q-R). A mixed methods convergent design was employed, including a psychometric survey (N = 512) alongside qualitative focus groups (N = 21), to elicit experiences and perceptions of problematic smartphone use. The results suggest the PMPU-Q-R factor structure can be updated to include smartphone dependence, dangerous driving, and antisocial smartphone use factors. Theories of problematic mobile phone use require consideration of the ubiquity and indispensability of smartphones in the present day and age, particularly regarding use whilst driving and in social interactions. PMID:29337883

  10. Irradiation Experiment Conceptual Design Parameters for NBSR Fuel Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Brown, N. R. [Brookhaven National Lab. (BNL), Upton, NY (United States). Nuclear Science and Technology Dept.; Baek, J. S [Brookhaven National Lab. (BNL), Upton, NY (United States). Nuclear Science and Technology Dept.; Hanson, A. L. [Brookhaven National Lab. (BNL), Upton, NY (United States). Nuclear Science and Technology Dept.; Cuadra, A. [Brookhaven National Lab. (BNL), Upton, NY (United States). Nuclear Science and Technology Dept.; Cheng, L. Y. [Brookhaven National Lab. (BNL), Upton, NY (United States). Nuclear Science and Technology Dept.; Diamond, D. J. [Brookhaven National Lab. (BNL), Upton, NY (United States). Nuclear Science and Technology Dept.

    2014-04-30

    It has been proposed to convert the National Institute of Standards and Technology (NIST) research reactor, known as the NBSR, from high-enriched uranium (HEU) fuel to low-enriched uranium (LEU) fuel. The motivation for converting the NBSR to LEU fuel is to reduce the risk of proliferation of special nuclear material. This report is a compilation of relevant information from recent studies related to the proposed conversion using a metal alloy of LEU with 10 w/o molybdenum. The objective is to inform the design of the mini-plate and full-size-plate irradiation experiments that are being planned. This report provides the relevant dimensions of the fuel elements and the following parameters at steady state: average and maximum fission rate density and fission density, the fuel temperature distribution for the plate with the maximum local temperature, and two-dimensional heat flux profiles of fuel plates with high power densities. The latter profiles are given for plates in both the inner and outer core zones and for cores with both fresh and depleted shim arms (reactivity control devices). A summary of the methodology used to obtain these results is presented. Fuel element tolerance assumptions and hot channel factors used in the safety analysis are also given.

  11. MAGMA: analysis of two-channel microarrays made easy.

    Science.gov (United States)

    Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph

    2007-07-01

    The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.
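
    As a flavor of the kind of processing such a tool automates, the sketch below shows the classic two-channel M/A computation with global median normalization; MAGMA itself generates R scripts, so this Python version on simulated intensities is only an illustrative stand-in:

        import numpy as np

        rng = np.random.default_rng(5)
        # Simulated two-channel spot intensities (Cy5 = red, Cy3 = green).
        red = rng.lognormal(mean=7.0, sigma=1.0, size=1000)
        green = red * rng.lognormal(mean=0.1, sigma=0.3, size=1000)  # dye bias

        # Two-channel quantities: M = log2 ratio, A = mean log2 intensity.
        M = np.log2(red) - np.log2(green)
        A = 0.5 * (np.log2(red) + np.log2(green))

        # Global median normalization: center M so the typical spot shows no change.
        M_norm = M - np.median(M)
        print("median M before/after:",
              round(float(np.median(M)), 3), round(float(np.median(M_norm)), 3))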

  12. Analytical software design : introduction and industrial experience report

    NARCIS (Netherlands)

    Osaiweran, A.A.H.; Boosten, M.; Mousavi, M.R.

    2010-01-01

    Analytical Software Design (ASD) is a design approach that combines formal and empirical methods for developing mathematically verified software systems. Unlike conventional design methods, the design phase is extended with more formal techniques, so that flaws are detected earlier, thereby reducing

  13. Mind the gap : probing exertion experience with experiential design landscapes

    NARCIS (Netherlands)

    Ren, X.; Lu, Y.; Brombacher, A.C.; Bogers, S.J.A.

    2016-01-01

    In this paper, we report our study on applying Experiential Design Landscapes as the basis of design process to support the design of exertion games. We approach this question by setting up an 8-day interaction design module with 7 groups of students. The methods of our module were developed based

  14. A novel synthetic peptide microarray assay detects Chlamydia species-specific antibodies in animal and human sera.

    Science.gov (United States)

    Sachse, Konrad; Rahman, Kh Shamsur; Schnee, Christiane; Müller, Elke; Peisker, Madlen; Schumacher, Thomas; Schubert, Evelyn; Ruettger, Anke; Kaltenboeck, Bernhard; Ehricht, Ralf

    2018-03-16

    Serological analysis of Chlamydia (C.) spp. infections is still mainly based on micro-immunofluorescence and ELISA. To overcome the limitations of conventional serology, we have designed a novel microarray carrying 52 synthetic peptides representing B-cell epitopes from immunodominant proteins of all 11 chlamydial species. The new assay was validated using monospecific mouse hyperimmune sera. Subsequently, serum samples from cattle, sheep, and humans with a known history of chlamydial infection were examined. For instance, the specific humoral response of sheep to treatment with a C. abortus vaccine was visualized against a background of C. pecorum carriership. In samples from humans, dual infection with C. trachomatis and C. pneumoniae could be demonstrated. The experiments revealed that the peptide microarray assay was capable of simultaneously identifying specific antibodies to each Chlamydia species. The assay represents an open-platform test that can be complemented through future advances in Chlamydia proteome research. The concept of the highly parallel multi-antigen microarray proven in this study has the potential to enhance our understanding of antibody responses by defining not only a single quantitative response, but also the pattern of this response. The added value of using peptide antigens lies in unprecedented serodiagnostic specificity.

  15. Hierarchical information representation and efficient classification of gene expression microarray data

    OpenAIRE

    Bosio, Mattia

    2014-01-01

    In the field of computational biology, microarrays are used to measure the activity of thousands of genes at once and create a global picture of cellular function. Microarrays allow scientists to analyze the expression of many genes in a single experiment quickly and efficiently. Even if microarrays are a consolidated research technology nowadays and the trends in high-throughput data analysis are shifting towards new technologies like Next Generation Sequencing (NGS), an optimum method for sample...

  16. MARS: Microarray analysis, retrieval, and storage system

    Directory of Open Access Journals (Sweden)

    Scheideler Marcel

    2005-04-01

    Abstract Background Microarray analysis has become a widely used technique for the study of gene-expression patterns on a genomic scale. As more and more laboratories adopt microarray technology, there is a need for powerful and easy-to-use microarray databases facilitating array fabrication, labeling, hybridization, and data analysis. The wealth of data generated by this high-throughput approach renders adequate database and analysis tools crucial for the pursuit of insights into the transcriptomic behavior of cells. Results MARS (Microarray Analysis and Retrieval System) provides a comprehensive MIAME-supportive suite for storing, retrieving, and analyzing multi-color microarray data. The system comprises a laboratory information management system (LIMS), quality control management, as well as a sophisticated user management system. MARS is fully integrated into an analytical pipeline of microarray image analysis, normalization, gene expression clustering, and mapping of gene expression data onto biological pathways. The incorporation of ontologies and the use of MAGE-ML enable the export of studies stored in MARS to public repositories and other databases accepting these documents. Conclusion We have developed an integrated system tailored to serve the specific needs of microarray-based research projects using a unique fusion of Web-based and standalone applications connected to the latest J2EE application server technology. The presented system is freely available for academic and non-profit institutions. More information can be found at http://genome.tugraz.at.

  17. Simulation of microarray data with realistic characteristics

    Directory of Open Access Journals (Sweden)

    Lehmussola Antti

    2006-07-01

    Abstract Background Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed. Results We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, the application of biological and measurement-technology-specific error models, and finally the simulation of microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data have realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples. Conclusion The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier, and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide-based single-channel microarrays. All this makes the model a valuable tool, for example, in the validation of data analysis algorithms.
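
    A stripped-down, purely illustrative version of such a layered simulation (all parameters invented) might look like this in Python:

        import numpy as np

        rng = np.random.default_rng(6)
        n_genes, n_arrays = 1000, 6

        # Step 1: biological ground truth -- most genes unchanged, some regulated.
        truth = np.zeros(n_genes)
        truth[:50] = rng.normal(loc=2.0, scale=0.5, size=50)   # up-regulated set

        # Step 2: biological and measurement error models (per-array noise).
        signal = truth[:, None] + rng.normal(scale=0.3, size=(n_genes, n_arrays))

        # Step 3: slide manufacturing/hybridization artifacts, e.g. spot dropouts.
        dropout = rng.random(size=signal.shape) < 0.01
        observed = np.where(dropout, np.nan, 2.0 ** (10 + signal))  # intensities

        print("simulated intensity matrix:", observed.shape,
              "| missing spots:", int(np.isnan(observed).sum()))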

  18. Implementation of mutual information and bayes theorem for classification microarray data

    Science.gov (United States)

    Dwifebri Purbolaksono, Mahendra; Widiastuti, Kurnia C.; Syahrul Mubarok, Mohamad; Adiwijaya; Aminy Ma’ruf, Firda

    2018-03-01

    Microarray technology is able to read gene structure, and analysis of the resulting data is essential for deciding which attributes are more important than others. Microarray data can provide cancer-related information for diagnosis from a person's genes. Preparing microarray data, however, is a substantial problem and takes a long time, because the data contain a large number of insignificant and irrelevant attributes. A method is therefore needed to reduce the dimensionality of microarray data without eliminating important information in each attribute. This research uses mutual information for dimensionality reduction, and the classification system is built with a machine learning approach, specifically Bayes' theorem, which relies on statistics and probability. Combining the two methods yields a powerful approach to microarray data classification. The experimental results show that the system classifies microarray data well, with the highest F1-scores of 91.06% for the Bayesian network and 88.85% for naïve Bayes.
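    The paper's exact preprocessing and classifier settings are not given in the abstract; the sketch below shows the general two-step recipe it describes, using scikit-learn on synthetic stand-in data: mutual information ranks the attributes and keeps a small subset, then a naive Bayes classifier (one application of Bayes' theorem) is scored with F1.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.metrics import f1_score
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    # Synthetic stand-in for microarray data: many features, few informative ones
    X, y = make_classification(n_samples=120, n_features=2000, n_informative=30,
                               n_redundant=0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Step 1: mutual information ranks genes; keep the top k to reduce dimension
    selector = SelectKBest(mutual_info_classif, k=50).fit(X_tr, y_tr)
    X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

    # Step 2: a Bayes-theorem-based classifier on the reduced data
    clf = GaussianNB().fit(X_tr_sel, y_tr)
    print(f"F1-score: {f1_score(y_te, clf.predict(X_te_sel)):.4f}")
    ```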

  19. Radioactive cDNA microarray in neuropsychiatry

    International Nuclear Information System (INIS)

    Choe, Jae Gol; Shin, Kyung Ho; Lee, Min Soo; Kim, Meyoung Kon

    2003-01-01

    Microarray technology allows the simultaneous analysis of gene expression patterns of thousands of genes, in a systematic fashion, under a similar set of experimental conditions, thus making the data highly comparable. In some cases arrays are used simply as a primary screen leading to downstream molecular characterization of individual gene candidates. In other cases, the goal of expression profiling is to begin to identify complex regulatory networks underlying developmental processes and disease states. Microarrays were originally used with cell lines or other simple model systems. More recently, microarrays have been used in the analysis of more complex biological tissues including neural systems and the brain. The application of cDNA arrays in neuropsychiatry has lagged behind other fields for a number of reasons. These include a requirement for a large amount of input probe RNA in fluorescent-glass based array systems and the cellular complexity introduced by multicellular brain and neural tissues. An additional factor that impacts the general use of microarrays in neuropsychiatry is the lack of availability of sequenced clone sets from model systems. While human cDNA clones have been widely available, high-quality rat, mouse, and Drosophila clone sets, among others, are just becoming widely available. A final factor in the application of cDNA microarrays in neuropsychiatry is the cost of commercial arrays. As academic microarray facilities become more commonplace, custom-made arrays will become more widely available at a lower cost, allowing more widespread applications. In summary, microarray technology is rapidly having an impact on many areas of biomedical research. Radioisotope-nylon based microarrays offer alternatives that may in some cases be more sensitive, flexible, inexpensive, and universal as compared to other array formats, such as fluorescent-glass arrays. In some situations of limited RNA or exotic species, radioactive membrane microarrays may be the most

  20. Radioactive cDNA microarray in neuropsychiatry

    Energy Technology Data Exchange (ETDEWEB)

    Choe, Jae Gol; Shin, Kyung Ho; Lee, Min Soo; Kim, Meyoung Kon [Korea University Medical School, Seoul (Korea, Republic of)

    2003-02-01

    Microarray technology allows the simultaneous analysis of gene expression patterns of thousands of genes, in a systematic fashion, under a similar set of experimental conditions, thus making the data highly comparable. In some cases arrays are used simply as a primary screen leading to downstream molecular characterization of individual gene candidates. In other cases, the goal of expression profiling is to begin to identify complex regulatory networks underlying developmental processes and disease states. Microarrays were originally used with cell lines or other simple model systems. More recently, microarrays have been used in the analysis of more complex biological tissues including neural systems and the brain. The application of cDNA arrays in neuropsychiatry has lagged behind other fields for a number of reasons. These include a requirement for a large amount of input probe RNA in fluorescent-glass based array systems and the cellular complexity introduced by multicellular brain and neural tissues. An additional factor that impacts the general use of microarrays in neuropsychiatry is the lack of availability of sequenced clone sets from model systems. While human cDNA clones have been widely available, high-quality rat, mouse, and Drosophila clone sets, among others, are just becoming widely available. A final factor in the application of cDNA microarrays in neuropsychiatry is the cost of commercial arrays. As academic microarray facilities become more commonplace, custom-made arrays will become more widely available at a lower cost, allowing more widespread applications. In summary, microarray technology is rapidly having an impact on many areas of biomedical research. Radioisotope-nylon based microarrays offer alternatives that may in some cases be more sensitive, flexible, inexpensive, and universal as compared to other array formats, such as fluorescent-glass arrays. In some situations of limited RNA or exotic species, radioactive membrane microarrays may be the most

  1. An overview of the design and analysis of simulation experiments for sensitivity analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys 'classic' and 'modern' designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs
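    As a concrete instance of a 'classic' design, the sketch below generates a full 2^3 factorial design and estimates main effects from a simulated response; the response function is an arbitrary assumption for illustration.

    ```python
    import itertools
    import numpy as np

    # Full 2^3 factorial design: every combination of three factors at levels -1/+1
    design = np.array(list(itertools.product([-1, 1], repeat=3)))

    # Hypothetical simulation response: factor A and the A*B interaction matter
    rng = np.random.default_rng(1)
    y = (10 + 3 * design[:, 0] + 1.5 * design[:, 0] * design[:, 1]
         + rng.normal(0, 0.1, len(design)))

    # Main effect of each factor = mean(y at +1) - mean(y at -1)
    for j, name in enumerate("ABC"):
        effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
        print(f"main effect of {name}: {effect:+.2f}")
    ```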

  2. Liquid metal cooled reactors: Experience in design and operation

    International Nuclear Information System (INIS)

    2007-12-01

    on key fast reactor technology aspects in an integrative sense useful to engineers, scientists, managers, university students and professors. This publication has been prepared to contribute toward the IAEA activity to preserve the knowledge gained in the liquid metal cooled fast reactor (LMFR) technology development. This technology development and experience include aspects addressing not only experimental and demonstration reactors, but also all activities from reactor construction to decommissioning. This publication provides a survey of worldwide experience gained over the past five decades in LMFR development, design, operation and decommissioning, which has been accumulated through the IAEA programmes carried out within the framework of the TWG-FR and the Agency's INIS and NKMS

  3. DNA Microarrays: a Powerful Genomic Tool for Biomedical and Clinical Research

    OpenAIRE

    Trevino, Victor; Falciani, Francesco; Barrera-Saldaña, Hugo A

    2007-01-01

    Among the many benefits of the Human Genome Project are new and powerful tools such as the genome-wide hybridization devices referred to as microarrays. Initially designed to measure gene transcriptional levels, microarray technologies are now used for comparing other genome features among individuals and their tissues and cells. Results provide valuable information on disease subcategories, disease prognosis, and treatment outcome. Likewise, they reveal differences in genetic makeup, regulat...

  4. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aimed at microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
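    The paper's own algorithm is not reproduced here; as a hedged illustration of the general idea (learning a transformation of all genes instead of selecting a subset), the sketch below uses scikit-learn's NeighborhoodComponentsAnalysis as a stand-in metric learner ahead of a nearest-neighbor classifier, on a stand-in dataset.

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Stand-in data; a real study would use expression profiles and subtype labels
    X, y = load_breast_cancer(return_X_y=True)

    # Learn a linear transform of all features, instead of selecting a subset,
    # so that nearest-neighbor classification improves in the transformed space
    pipe = make_pipeline(StandardScaler(),
                         NeighborhoodComponentsAnalysis(n_components=10, random_state=0),
                         KNeighborsClassifier(n_neighbors=5))
    print(f"CV accuracy: {cross_val_score(pipe, X, y, cv=5).mean():.3f}")
    ```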

  5. Distributed Experiments in Design Sciences, a Next Step in Design Observation Studies?

    CERN Document Server

    Kriesi, Carlo; Aalto-Setala, Laura; Anvik, Anders; Balters, Stephanie; Baracchi, Alessia; Jensen, Bisballe Matilde; Bjorkli, Leif Erik; Buzzaccaro, Nicolo; Cortesi, Dario; D'Onghia, Francesco; Dosi, Clio; Franchini, Giulia; Fuchs, Matt; Gerstenberg, Achim; Hansen, Erik; Hiekkanen, Karri Matias; Hyde, David; Ituarte, Inigo; Kalasniemi, Jani; Kurikka, Joona; Lanza, Irene; Laurila, Anssi; Lee, Tik Ho; Lonvik, Siri; Mansikka-Aho, Anniina; Nordberg, Markus; Oinonen, Paivi; Pedrelli, Luca; Pekuri, Anna; Rane, Enna; Reime, Thov; Repokari, Lauri; Ronningen, Martin; Rowlands, Stephanie; Sjoman, Heikki; Slattsveen, Kristoffer; Strachan, Andy; Stromstad, Kirsti; Suren, Stian; Tapio, Peter; Utriainen, Tuuli; Vignoli, Matteo; Vijaykumar, Saurabh; Welo, Torgeir; Wulvik, Andreas

    2015-01-01

    This paper describes and proposes a new method for conducting globally distributed design research. Instead of using, for example, software, we tried out a completely analogue approach: five carefully prepared packages, containing all the necessary materials and instructions for a design challenge, were sent out to supervisors in Norway, Finland, Italy, and Australia. These local supervisors then conducted the egg-drop exercise with students that are part of an international course held at CERN. As the task is conducted according to a previously tested protocol, the results gathered with this new method can be benchmarked against the available data. This new approach to globally conducted engineering design activities avoids local bias and enables the gathering of large amounts of diverse data points. One can also think of a research community where every member can send out one experiment per year and, in return, receives data points from across the world. Based on the feedback from the supervisors we can say that ...

  6. Conceptual designs parameters for MURR LEU U-Mo fuel conversion design demonstration experiment. Revision 1

    International Nuclear Information System (INIS)

    Stillman, J.; Feldman, E.; Stevens, J.

    2013-01-01

    The design parameters for the conceptual design of a fuel assembly containing U-10Mo fuel foils with low-enriched uranium (LEU) for the University of Missouri Research Reactor (MURR) are described. The Design Demonstration Experiment (MURR-DDE) will use a prototypic MURR-LEU element manufactured according to the parameters specified here. Also provided are calculated performance parameters for the LEU element in the MURR, and a set of goals for the MURR-DDE related to those parameters. The conversion objectives are to develop a fuel element design that will ensure safe reactor operations, as well as maintain existing performance. The element was designed by staff members of the Global Threat Reduction Initiative (GTRI) Reactor Conversion Program at the Argonne National Laboratory (ANL) and the MURR Facility. A set of manufacturing assumptions was provided by the Fuel Development (FD) and Fuel Fabrication Capability (FFC) pillars of the GTRI Reduced Enrichment for Research and Test Reactors (RERTR) program to reliably manufacture the fuel plates. The proposed LEU fuel element has an overall design and exterior dimensions that are similar to those of the current highly-enriched uranium (HEU) fuel elements. There are 23 fuel plates in the LEU design. The overall thickness of each plate is 44 mil, except for the exterior plate that is furthest from the center flux trap (plate 23), which is 49 mil thick. The proposed LEU fuel plates have U-10Mo monolithic fuel foils with a 235U enrichment of 19.75%, varying from 9 mil to 20 mil thick, and clad with Al-6061 aluminum. A thin layer of zirconium exists between the fuel foils and the aluminum as a diffusion barrier. The thinnest nominal combined zirconium and aluminum clad thickness on each side of the fuel plates is 12 mil. The LEU U-10Mo monolithic fuel is not yet qualified as driver fuel in research reactors, but is under intense development under the auspices of the GTRI FD and FFC programs.

  7. Experience in design and construction of the Log tunnel

    Directory of Open Access Journals (Sweden)

    Jovičić Vojkan

    2017-09-01

    Full Text Available A twin highway Log tunnel is a part of a new motorway connection between Maribor and Zagreb, section Draženci-Gruškovje, which is located towards the border crossing between Slovenia and Croatia. The tunnel is currently under construction, and only the excavation works have been completed during the writing of this paper. The terrain in the area of the Log tunnel is diverse, and the route of the highway in its vicinity is characterised by deep excavations, bridges or viaducts. The Log tunnel is approximately 250 m long, partly constructed as a gallery. The geological conditions are dominated by Miocene base rock, featuring layers of well-connected clastic rocks, which are covered by diluvium clays, silts, sands and gravels of different thicknesses. Due to the short length of the tunnel, the usual separation of the motorway route to the left and the right tunnel axes was not carried out. Thus, the tunnel was constructed with an intermediate pillar and was designed as a three-lane tunnel, including the stopping lane. The construction of the tunnel was carried out using the New Austrian tunnelling method (NATM), in which the central adit was excavated first and the intermediate pillar was constructed within it. The excavation of the main tubes followed and was divided into the top heading, bench and the invert, enabling the intermediate pillar to take the load off the top heading of both tubes. The secondary lining of the tunnel is currently under construction. The experience of the tunnel construction gathered so far is presented in the paper. The main emphasis is on the construction of the intermediate pillar, which had to take the significant and asymmetrical ground load.

  8. Experience in design and construction of the Log tunnel

    Science.gov (United States)

    Jovičić, Vojkan; Goleš, Niko; Tori, Matija; Peternel, Miha; Vajović, Stanojle; Muhić, Elvir

    2017-09-01

    A twin highway Log tunnel is a part of a new motorway connection between Maribor and Zagreb, section Draženci-Gruškovje, which is located towards the border crossing between Slovenia and Croatia. The tunnel is currently under construction, and only the excavation works have been completed during the writing of this paper. The terrain in the area of the Log tunnel is diverse, and the route of the highway in its vicinity is characterised by deep excavations, bridges or viaducts. The Log tunnel is approximately 250 m long, partly constructed as a gallery. The geological conditions are dominated by Miocene base rock, featuring layers of well-connected clastic rocks, which are covered by diluvium clays, silts, sands and gravels of different thicknesses. Due to the short length of the tunnel, the usual separation of the motorway route to the left and the right tunnel axes was not carried out. Thus, the tunnel was constructed with an intermediate pillar and was designed as a three-lane tunnel, including the stopping lane. The construction of the tunnel was carried out using the New Austrian tunnelling method (NATM), in which the central adit was excavated first and the intermediate pillar was constructed within it. The excavation of the main tubes followed and was divided into the top heading, bench and the invert, enabling the intermediate pillar to take the load off the top heading of both tubes. The secondary lining of the tunnel is currently under construction. The experience of the tunnel construction gathered so far is presented in the paper. The main emphasis is on the construction of the intermediate pillar, which had to take the significant and asymmetrical ground load.

  9. Controller tuning of district heating networks using experiment design techniques

    International Nuclear Information System (INIS)

    Dobos, Laszlo; Abonyi, Janos

    2011-01-01

    There are various governmental policies aimed at reducing the dependence on fossil fuels for space heating and the reduction in its associated emission of greenhouse gases. DHNs (district heating networks) could provide an efficient method for house and space heating by utilizing residual industrial waste heat. In such systems, heat is produced and/or thermally upgraded in a central plant and then distributed to the end users through a pipeline network. Control of these networks is rather difficult due to the non-linearity of the system and the strong interconnection between the controlled variables. That is why an NMPC (non-linear model predictive controller) could be applied to fulfill the heat demand of the consumers. The main objective of this paper is to propose a tuning method for the applied NMPC to fulfill the control goal as soon as possible. The performance of the controller is characterized by an economic cost function based on pre-defined operation ranges. A methodology from the field of experiment design is applied to tune the model predictive controller to reach the best performance. The efficiency of the proposed methodology is proven through a case study of a simulated NMPC-controlled DHN. -- Highlights: → To improve the energetic and economic efficiency of a DHN an appropriate control system is necessary. → The time consumption of transitions can be shortened with the proper control system. → An NLMPC is proposed as the control system. → The NLMPC is tuned using the simplex methodology, with an economically oriented cost function. → The proposed NLMPC needs a detailed model of the DHN based on a physical description.
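    The highlights mention tuning the controller with the simplex methodology against an economic cost function. The sketch below illustrates that pattern with scipy's Nelder-Mead simplex search; the plant, the PI controller and the cost weights are toy assumptions standing in for the paper's NMPC and DHN model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def closed_loop_cost(gains):
        """Toy stand-in for the economic cost: simulate a first-order plant under a
        PI controller and accumulate tracking error plus weighted control effort."""
        kp, ki = gains
        x, integral, cost, setpoint, dt = 0.0, 0.0, 0.0, 1.0, 0.1
        for _ in range(300):
            err = setpoint - x
            integral += err * dt
            u = kp * err + ki * integral
            x += dt * (-x + u)            # first-order plant dynamics
            cost += dt * (err * err + 0.01 * u * u)
        return cost

    # Simplex (Nelder-Mead) search over the controller tuning parameters
    result = minimize(closed_loop_cost, x0=[1.0, 0.5], method="Nelder-Mead")
    print("tuned gains (kp, ki):", np.round(result.x, 3), "cost:", round(result.fun, 4))
    ```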

  10. Design of the plutonium facility for animal experiments and its management experience

    International Nuclear Information System (INIS)

    Koizumi, Akira; Fukuda, Satoshi

    1998-01-01

    The design and radiation control of the authors' facility, which was built as a nuclear fuel laboratory for animal experiments, are described. Before construction, the animals expected to be used were rats, mice, beagle dogs and monkeys. 239Pu and certain other radioisotopes were to be used. At present, 200 dogs and 1800 small animals can be maintained. The key design points were earthquake tolerance, reduced-pressure management and permanent storage of waste containing Pu. The facility building is composed of the 2nd, 4th, and 6th laboratory floors and, between them, the so-called mechanical floors, which are spaces for ducts. The latter floors are quite useful. The reduced-pressure system follows 3 patterns: rooms without a hood, with an ordinary hood, and with an air-curtain hood. For animal maintenance, there are 3 types of containment: glove box, hood and ordinary animal room. There is drainage equipment in which Pu can be removed by precipitation and charcoal adsorption, and incineration equipment, which is necessary for reducing the waste volume. In the latter, HEPA filters are finally used before releasing the gas. There is no particular problem in the radiation control. For personnel control, lung monitoring is performed before and at the end of personnel registration. Environmental monitoring of Pu is optionally performed. Removal of Pu particles generated in the inhalation experiments could be attained by the use of ULPA and HEPA filters, to a level less than 1/10^17 of the reference level. Keeping the technology level high enough for facility maintenance and management was considered to be important at present and in the future. (K.H.)

  11. Designing fractional factorial split-plot experiments using integer programming

    DEFF Research Database (Denmark)

    Capehart, Shay R.; Keha, Ahmet; Kulahci, Murat

    2011-01-01

    factorial (FF) design, with the restricted randomisation structure to account for the whole plots and subplots. We discuss the formulation of FFSP designs using integer programming (IP) to achieve various design criteria. We specifically look at the maximum number of clear two-factor interactions...

  12. Ten ways to design for disgust, sadness, and other enjoyments: A design approach to enrich product experiences with negative emotions

    OpenAIRE

    Fokkinga, S.F.; Desmet, P.M.A.

    2013-01-01

    This paper demonstrates how designers can enrich user experiences by purposefully involving negative emotions in user-product interaction. This approach is derived from a framework of rich experience, which explains how and under what circumstances negative emotions make a product experience richer and enjoyable. The approach consists of three steps, where the designer decides 1) which negative emotion is most appropriate for the user context; 2) how and when this emotion is best elicited; an...

  13. Printing Proteins as Microarrays for High-Throughput Function Determination

    Science.gov (United States)

    MacBeath, Gavin; Schreiber, Stuart L.

    2000-09-01

    Systematic efforts are currently under way to construct defined sets of cloned genes for high-throughput expression and purification of recombinant proteins. To facilitate subsequent studies of protein function, we have developed miniaturized assays that accommodate extremely low sample volumes and enable the rapid, simultaneous processing of thousands of proteins. A high-precision robot designed to manufacture complementary DNA microarrays was used to spot proteins onto chemically derivatized glass slides at extremely high spatial densities. The proteins attached covalently to the slide surface yet retained their ability to interact specifically with other proteins, or with small molecules, in solution. Three applications for protein microarrays were demonstrated: screening for protein-protein interactions, identifying the substrates of protein kinases, and identifying the protein targets of small molecules.

  14. Ten ways to design for disgust, sadness, and other enjoyments : A design approach to enrich product experiences with negative emotions

    NARCIS (Netherlands)

    Fokkinga, S.F.; Desmet, P.M.A.

    2013-01-01

    This paper demonstrates how designers can enrich user experiences by purposefully involving negative emotions in user-product interaction. This approach is derived from a framework of rich experience, which explains how and under what circumstances negative emotions make a product experience richer

  15. Validation of scaffold design optimization in bone tissue engineering: finite element modeling versus designed experiments.

    Science.gov (United States)

    Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra

    2017-02-21

    This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression test. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty-four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of DE and COMSOL models (4.5-9.9 MPa measured moduli). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting a greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for
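    The agreement metric quoted above is the mean absolute percentage error (MAPE). A minimal sketch, with purely hypothetical moduli rather than the study's data:

    ```python
    import numpy as np

    def mape(measured, predicted):
        """Mean absolute percentage error between measurements and model predictions."""
        measured, predicted = np.asarray(measured), np.asarray(predicted)
        return 100.0 * np.mean(np.abs((measured - predicted) / measured))

    # Hypothetical moduli (MPa) for a handful of validation scaffolds
    measured = [4.5, 6.1, 7.3, 8.2, 9.9]
    de_model = [5.9, 7.4, 9.0, 10.8, 13.6]      # illustrative only, not study data
    print(f"DE model MAPE: {mape(measured, de_model):.0f}%")
    ```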

  16. The design and analysis of integral assembly experiments for CTR neutronics

    International Nuclear Information System (INIS)

    Beynon, T.D.; Curtis, R.H.; Lambert, C.

    1978-01-01

    The use of simple-geometry integral assemblies of lithium metal or lithium compounds for the study of the neutronics of various CTR designs is considered and four recent experiments are analysed. The relatively long mean free path of neutrons in these assemblies produces significantly different design problems from those encountered in similar experiments for fission reactor design. By considering sensitivity profiles for various parameters it is suggested that experiments can be designed to be optimised for data adjustments. (author)

  17. Designing a future Conditions Database based on LHC experience

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00064378; Formica, Andrea; Gallas, Elizabeth; Lehmann Miotto, Giovanna; Pfeiffer, A.; Govi, G.

    2015-01-01

    We describe a proposal for a new Conditions Database infrastructure that ATLAS and CMS (or other) experiments could use starting on the timescale of Run 3. This proposal is based on the experience that both experiments accumulated during Run 1. We will present the identified relevant data flows for conditions data and underline the common use cases that lead to a joint effort for the development of a new system. Conditions data are needed in any scientific experiment. It includes any ancillary data associated with primary data taking such as detector configuration, state or calibration or the environment in which the detector is operating. In any non-trivial experiment, conditions data typically reside outside the primary data store for various reasons (size, complexity or availability) and are best accessed at the point of processing or analysis (including for Monte Carlo simulations). The ability of any experiment to produce correct and timely results depends on the complete and efficient availability of ne...
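    The abstract does not specify an access API; as an illustration of the usual conditions-data pattern, the sketch below models payloads keyed by an interval of validity (IOV) and looked up by run number. The class and field names are hypothetical.

    ```python
    import bisect
    from dataclasses import dataclass

    @dataclass
    class ConditionsTag:
        """Illustrative conditions store: payloads keyed by interval of validity (IOV).
        A production system would add tags, versioning, schemas and caching."""
        since_runs: list   # sorted run numbers at which each payload becomes valid
        payloads: list     # calibration/configuration payloads, one per interval

        def lookup(self, run):
            i = bisect.bisect_right(self.since_runs, run) - 1
            if i < 0:
                raise KeyError(f"no conditions valid for run {run}")
            return self.payloads[i]

    tag = ConditionsTag(since_runs=[1, 1000, 5000],
                        payloads=["calib_v1", "calib_v2", "calib_v3"])
    print(tag.lookup(4321))   # -> calib_v2
    ```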

  18. Gene Expression and Microarray Investigation of Dendrobium ...

    African Journals Online (AJOL)

    blood glucose > 16.7 mmol/L were used as the model group and treated with Dendrobium mixture (DEN ... Keywords: Diabetes, Gene expression, Dendrobium mixture, Microarray testing ...

  19. SLIMarray: Lightweight software for microarray facility management

    Directory of Open Access Journals (Sweden)

    Marzolf Bruz

    2006-10-01

    Full Text Available Abstract Background Microarray core facilities are commonplace in biological research organizations, and need systems for accurately tracking various logistical aspects of their operation. Although these different needs could be handled separately, an integrated management system provides benefits in organization, automation and reduction in errors. Results We present SLIMarray (System for Lab Information Management of Microarrays), an open source, modular database web application capable of managing microarray inventories, sample processing and usage charges. The software allows modular configuration and is well suited for further development, providing users the flexibility to adapt it to their needs. SLIMarray Lite, a version of the software that is especially easy to install and run, is also available. Conclusion SLIMarray addresses the previously unmet need for free and open source software for managing the logistics of a microarray core facility.

  20. Radiological safety design considerations for fusion research experiments

    International Nuclear Information System (INIS)

    Crase, K.W.; Singh, M.S.

    1979-01-01

    A wide variety of fusion research experiments are in the planning or construction stages. Two such experiments, the Nova Laser Fusion Facility and the Mirror Fusion Test Facility (MFTF), are currently under construction at Lawrence Livermore Laboratory. Although the plasma chamber vault for MFTF and the Nova target room will have thick concrete walls and roofs, the radiation safety problems are made complex by the numerous requirements for shield wall penetrations. This paper addresses radiation safety considerations for the MFTF and Nova experiments, and the need for integrated safety considerations and safety technology development during the planning stages of fusion experiments

  1. The Local Maximum Clustering Method and Its Application in Microarray Gene Expression Data Analysis

    Directory of Open Access Journals (Sweden)

    Chen Yidong

    2004-01-01

    Full Text Available An unsupervised data clustering method, called the local maximum clustering (LMC) method, is proposed for identifying clusters in experiment data sets based on research interest. A magnitude property is defined according to research purposes, and data sets are clustered around each local maximum of the magnitude property. By properly defining a magnitude property, this method can overcome many difficulties in microarray data clustering such as reduced projection in similarities, noises, and arbitrary gene distribution. To critically evaluate the performance of this clustering method in comparison with other methods, we designed three model data sets with known cluster distributions and applied the LMC method as well as the hierarchical clustering method, the k-means clustering method, and the self-organized map method to these model data sets. The results show that the LMC method produces the most accurate clustering results. As an example of application, we applied the method to cluster the leukemia samples reported in the microarray study of Golub et al. (1999).
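    A minimal sketch of the idea as summarized above, with local point density standing in for the user-defined magnitude property: each point is assigned to the highest-magnitude point among itself and its k nearest neighbors, and chains are followed until they terminate at a local maximum. This is an illustrative reading of the method, not the authors' code.

    ```python
    import numpy as np

    def local_maximum_clustering(X, k=10):
        """Magnitude = local density (inverse mean kNN distance); each point climbs
        toward its highest-magnitude neighbor until it reaches a local maximum."""
        n = len(X)
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        knn = np.argsort(d, axis=1)[:, 1:k + 1]              # k nearest neighbors
        magnitude = 1.0 / d[np.arange(n)[:, None], knn].mean(axis=1)

        parent = np.empty(n, dtype=int)
        for i in range(n):
            neighborhood = np.append(knn[i], i)
            parent[i] = neighborhood[np.argmax(magnitude[neighborhood])]

        labels = parent.copy()            # follow chains until they reach a local max
        for _ in range(n):
            labels = parent[labels]
        return labels

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    print("clusters found:", len(set(local_maximum_clustering(X))))
    ```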

  2. BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.

    Science.gov (United States)

    Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron

    2009-06-01

    BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license along with additional documentation and a tutorial from (http://bioinf.nuigalway.ie).

  3. Construction of a cDNA microarray derived from the ascidian Ciona intestinalis.

    Science.gov (United States)

    Azumi, Kaoru; Takahashi, Hiroki; Miki, Yasufumi; Fujie, Manabu; Usami, Takeshi; Ishikawa, Hisayoshi; Kitayama, Atsusi; Satou, Yutaka; Ueno, Naoto; Satoh, Nori

    2003-10-01

    A cDNA microarray was constructed from a basal chordate, the ascidian Ciona intestinalis. The draft genome of Ciona has been read and inferred to contain approximately 16,000 protein-coding genes, and cDNAs for transcripts of 13,464 genes have been characterized and compiled as the "Ciona intestinalis Gene Collection Release I". In the present study, we constructed a cDNA microarray of these 13,464 Ciona genes. A preliminary experiment with Cy3- and Cy5-labeled probes showed extensive differential gene expression between fertilized eggs and larvae. In addition, there was a good correlation between results obtained by the present microarray analysis and those from previous EST analyses. This first microarray of a large collection of Ciona intestinalis cDNA clones should facilitate the analysis of global gene expression and gene networks during the embryogenesis of basal chordates.

  4. The IronChip evaluation package: a package of perl modules for robust analysis of custom microarrays

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2010-03-01

    Full Text Available Abstract Background Gene expression studies greatly contribute to our understanding of complex relationships in gene regulatory networks. However, the complexity of array design, production and manipulation is a limiting factor affecting data quality. The use of customized DNA microarrays improves overall data quality in many situations; however, only if analysis tools are available for these specifically designed microarrays. Results The IronChip Evaluation Package (ICEP) is a collection of Perl utilities and an easy-to-use data evaluation pipeline for the analysis of microarray data, with a focus on data quality of custom-designed microarrays. The package has been developed for the statistical and bioinformatical analysis of the custom cDNA microarray IronChip but can be easily adapted for other cDNA or oligonucleotide-based custom microarray platforms. ICEP uses decision tree-based algorithms to assign quality flags and performs robust analysis based on chip design properties regarding multiple repetitions, ratio cut-off, background and negative controls. Conclusions ICEP is a stand-alone Windows application to obtain optimal data quality from custom-designed microarrays and is freely available here (see "Additional Files" section and at: http://www.alice-dsl.net/evgeniy.vainshtein/ICEP/

  5. Evaluation of gene expression data generated from expired Affymetrix GeneChip® microarrays using MAQC reference RNA samples

    Directory of Open Access Journals (Sweden)

    Tong Weida

    2010-10-01

    Full Text Available Abstract Background The Affymetrix GeneChip® system is a commonly used platform for microarray analysis but the technology is inherently expensive. Unfortunately, changes in experimental planning and execution, such as the unavailability of previously anticipated samples or a shift in research focus, may render significant numbers of pre-purchased GeneChip® microarrays unprocessed before their manufacturer’s expiration dates. Researchers and microarray core facilities wonder whether expired microarrays are still useful for gene expression analysis. In addition, it was not clear whether the two human reference RNA samples established by the MAQC project in 2005 still maintained their transcriptome integrity over a period of four years. Experiments were conducted to answer these questions. Results Microarray data were generated in 2009 in three replicates for each of the two MAQC samples with either expired Affymetrix U133A or unexpired U133Plus2 microarrays. These results were compared with data obtained in 2005 on the U133Plus2 microarray. The percentage of overlap between the lists of differentially expressed genes (DEGs) from U133Plus2 microarray data generated in 2009 and in 2005 was 97.44%. While there was some degree of fold change compression in the expired U133A microarrays, the percentage of overlap between the lists of DEGs from the expired and unexpired microarrays was as high as 96.99%. Moreover, the microarray data generated using the expired U133A microarrays in 2009 were highly concordant with microarray and TaqMan® data generated by the MAQC project in 2005. Conclusions Our results demonstrated that microarray data generated using U133A microarrays, which were more than four years past the manufacturer’s expiration date, were highly specific and consistent with those from unexpired microarrays in identifying DEGs despite some appreciable fold change compression and decrease in sensitivity. Our data also suggested that the
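    The concordance figures above are percentages of overlap between DEG lists. A trivial sketch with hypothetical probe-set IDs (the paper's exact overlap definition is not stated in the abstract; the version below normalizes by the smaller list):

    ```python
    def overlap_percentage(list_a, list_b):
        """Percentage of overlap between two differentially-expressed-gene lists,
        relative to the smaller list (one common convention)."""
        a, b = set(list_a), set(list_b)
        return 100.0 * len(a & b) / min(len(a), len(b))

    # Hypothetical probe-set IDs, for illustration only
    degs_2005 = ["201746_at", "204490_s_at", "209875_s_at", "212014_x_at"]
    degs_2009 = ["201746_at", "204490_s_at", "209875_s_at", "219787_s_at"]
    print(f"overlap: {overlap_percentage(degs_2005, degs_2009):.1f}%")
    ```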

  6. Workspace experiments: a journey on planning participatory design

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Broberg, Ole

    2017-01-01

    Summative Statement: This paper presents resource material for planning and performing participatory workspace design processes. This material brings design dialogues into focus and gives insights on how to stage them, bridging the gap of merging user involvement with the well-defined design work-practice. Problem statement: There is a widespread interest in implementing user involvement in major building and construction projects. Nevertheless, it is also often difficult to translate the contributions from users to workspace design that seriously takes on board the employees’ specific work practices as a platform for a desired change. There is a need for a tool that manages to travel into a well-defined design work-practice and merge with it. Research Objective: We developed resource material to merge user involvement with current designers’ practices when designing new workspaces. The aim

  7. PATMA: parser of archival tissue microarray

    Directory of Open Access Journals (Sweden)

    Lukasz Roszkowiak

    2016-12-01

    Full Text Available Tissue microarrays are commonly used in modern pathology for cancer tissue evaluation, as it is a very potent technique. Tissue microarray slides are often scanned to perform computer-aided histopathological analysis of the tissue cores. For processing the image, splitting the whole virtual slide into images of individual cores is required. The only way to distinguish cores corresponding to specimens in the tissue microarray is through their arrangement. Unfortunately, distinguishing the correct order of cores is not a trivial task as they are not labelled directly on the slide. The main aim of this study was to create a procedure capable of automatically finding and extracting cores from archival images of the tissue microarrays. This software supports the work of scientists who want to perform further image processing on single cores. The proposed method is an efficient and fast procedure, working in fully automatic or semi-automatic mode. A total of 89% of punches were correctly extracted with automatic selection. With the addition of manual correction, it is possible to fully prepare the whole-slide image for extraction in 2 min per tissue microarray. The proposed technique requires minimal skill and time to parse a big array of cores from a tissue microarray whole-slide image into individual core images.
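    PATMA itself is not reproduced here; the sketch below illustrates the core steps such a procedure needs (segment the cores, order them by their grid arrangement, crop each one), using scikit-image and assuming cores are darker than the background. The threshold choice and the row-binning heuristic are assumptions.

    ```python
    import numpy as np
    from skimage import filters, measure, morphology

    def extract_cores(image, min_area=500):
        """Sketch of TMA core extraction: threshold, label blobs, then sort the
        detected cores into row-major grid order using their centroids."""
        mask = image < filters.threshold_otsu(image)      # cores darker than glass
        mask = morphology.remove_small_objects(mask, min_area)
        regions = measure.regionprops(measure.label(mask))

        # Bin centroid rows by a typical core diameter so rows sort cleanly
        diameter = np.median([r.equivalent_diameter for r in regions])
        ordered = sorted(regions, key=lambda r: (round(r.centroid[0] / diameter),
                                                 r.centroid[1]))
        crops = []
        for r in ordered:
            minr, minc, maxr, maxc = r.bbox
            crops.append(image[minr:maxr, minc:maxc])
        return crops
    ```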

  8. Design considerations for the Cornell megavolt ion coil experiment (MICE)

    International Nuclear Information System (INIS)

    Jayakumar, R.; Podulka, B.; Keller, S.; Milks, J.; Fleischmann, H.H.

    1986-01-01

    Field-reversing ion rings offer an attractive alternative plasma confinement scheme in which a Compact Torus is formed with axis-encircling ion currents. An experiment on forming MeV ion rings, MICE, is under construction in this group. This experiment will prepare the physics base for application of ion rings to the tilt stabilization and/or heating of the larger near-future CT plasma experiments, which will need ion rings with energies of 1-2 MeV. The MICE experiments will therefore extend other experiments to stronger rings with MeV ions. The MICE experiment will employ a Marx generator operating at about 1 MV, coupled to a magnetically insulated ion diode. The ion beam so generated will be passed through a magnetic cusp region, where the axial beam energy will be converted into rotational energy. Gas will be puffed into the trapping region for charge neutralization of the beam. Various methods, including resistive image currents, pulsed fields and phase focussing, are being considered for ring trapping. In the present first stage of the experiment, investigation of ion diode behavior and ring formation will be emphasized. A schematic of the proposed experimental arrangement is shown, the major parameters of the experiment are given, and the various subsystems are described.

  9. Augmented Reality Learning Experiences: Survey of Prototype Design and Evaluation

    Science.gov (United States)

    Santos, Marc Ericson C.; Chen, Angie; Taketomi, Takafumi; Yamamoto, Goshiro; Miyazaki, Jun; Kato, Hirokazu

    2014-01-01

    Augmented reality (AR) technology is mature for creating learning experiences for K-12 (pre-school, grade school, and high school) educational settings. We reviewed the applications intended to complement traditional curriculum materials for K-12. We found 87 research articles on augmented reality learning experiences (ARLEs) in the IEEE Xplore…

  10. Applied orthogonal experiment design for the optimum microwave ...

    African Journals Online (AJOL)

    An experiment on polysaccharides from Rhodiolae Radix (PRR) extraction was carried out using microwave-assisted extraction (MAE) method with an objective to establishing the optimum MAE conditions of PRR. Single factor experiments were performed to determine the appropriate range of extraction conditions, and the ...
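    Orthogonal experiment design of this kind typically uses a standard array such as L9(3^4): nine runs covering four three-level factors, with the best level of each factor read off from the mean response per level. The factor names and yields below are hypothetical placeholders, not the study's data.

    ```python
    import numpy as np

    # Standard L9(3^4) orthogonal array: 9 runs, four 3-level factors
    L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
                   [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
                   [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])

    # Hypothetical extraction yields for the nine runs (illustration only)
    yield_pct = np.array([3.1, 4.0, 3.6, 4.4, 5.2, 4.1, 4.8, 3.9, 4.5])

    # Mean response at each level of each factor; the best level maximizes the mean
    for j, factor in enumerate(["power", "time", "ratio", "temperature"]):
        means = [yield_pct[L9[:, j] == lvl].mean() for lvl in (1, 2, 3)]
        print(f"{factor}: level means = {np.round(means, 2)}, "
              f"best = level {int(np.argmax(means)) + 1}")
    ```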

  11. Design of wave breaking experiments and A-Posteriori Simulations

    NARCIS (Netherlands)

    Kurnia, R.; Kurnia, Ruddy; van Groesen, Embrecht W.C.

    2014-01-01

    This report presents results of 30 wave breaking experiments conducted in the long wave tank of TU Delft, Department of Maritime and Transport Technology (6,7 and 10-12 March 2014), together with simulations performed before the experiment to determine the required wave maker motion and a-posteriori

  12. Design of wave breaking experiments and A-Posteriori Simulations

    NARCIS (Netherlands)

    Kurnia, Ruddy; van Groesen, Embrecht W.C.

    This report presents results of 30 wave breaking experiments conducted in the long wave tank of TU Delft, Department of Maritime and Transport Technology (6,7 and 10-12 March 2014), together with simulations performed before the experiment to determine the required wave maker motion and a-posteriori

  13. Cooling water for SSC experiments: Supplemental Conceptual Design Report (SCDR)

    International Nuclear Information System (INIS)

    Doyle, R.E.

    1989-01-01

    This paper discusses the following topics in cooling water design for the Superconducting Super Collider: low-conductivity water, industrial cooling water, chilled water systems, and radioactive water systems.

  14. How is brand experience designed in practice? : Results of a multiple-case study

    NARCIS (Netherlands)

    Bakker-Wu, S.; Calabretta, G.; Hultink, H.J.; Bohemia, E.; de Bont, C.; Svengren Holm, L.

    2017-01-01

    Brand experience is an important concept in marketing because it can affect brand loyalty, brand recall, and brand attitude. Brand experience design is therefore an important practice for companies to create favourable and meaningful experiences, through the design of various touchpoints that are in

  15. Aespoe Pillar Stability Experiment. Final experiment design, monitoring results and observations

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Christer [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Eng, Anders [Acuo Engineering AB, Linkoeping (Sweden)

    2005-12-15

    The field part of the Aespoe Pillar Stability Experiment at the Aespoe Hard Rock Laboratory (HRL) was finished in 2004. The experiment was designed to induce and monitor the process of brittle failure, spalling, in a fractured rock mass under controlled conditions. The field part was successfully conducted and a large data set was obtained. This report presents the final design of the experiment, the results of the monitoring, and the observations made during the spalling process and when the spalled rock was removed. When heating of the rock was initiated the rock responded quickly. After only a few days the spalling process was activated in the notch, as indicated by the acoustic emission system, and shortly thereafter displacement readings were recorded. Contraction (radial expansion) of the rock was recorded by several instruments before the notch reached the instrument levels. This contraction is probably the result of a 3D re-distribution of the stresses. The temperature increase in the system was slower than predicted by the numerical models, and a steady state was reached much earlier. The propagation of the notch therefore halted after approximately one month of heating, and the power to the electrical heaters was doubled. Spalling then started up again, and in one month's time it had propagated to a depth of approximately five metres in the hole. A second steady state was now reached, but this time the heater power was kept constant for a while to let the rock settle before the confinement pressure was reduced from 700 kPa to 0 in decrements of 50 kPa. The rock mass response to the pressure drop was very limited until the pressure was lowered to approximately 200 kPa (the atmospheric pressure is not included in the given pressure values). Large displacements and a high acoustic emission hit frequency were then measured in the open hole. After the de-pressurization of the confined hole, the heaters were left on for approximately one week

  16. High Powered Rocketry: Design, Construction, and Launching Experience and Analysis

    Science.gov (United States)

    Paulson, Pryce; Curtis, Jarret; Bartel, Evan; Cyr, Waycen Owens; Lamsal, Chiranjivi

    2018-01-01

    In this study, the nuts and bolts of designing and building a high powered rocket have been presented. A computer simulation program called RockSim was used to design the rocket. Simulation results are consistent with time variations of altitude, velocity, and acceleration obtained in the actual flight. The actual drag coefficient was determined…

  17. Developing Teachers' Competences for Designing Inclusive Learning Experiences

    Science.gov (United States)

    Navarro, Silvia Baldiris; Zervas, Panagiotis; Gesa, Ramon Fabregat; Sampson, Demetrios G.

    2016-01-01

    Inclusive education, namely the process of providing all learners with equal educational opportunities, is a major challenge for many educational systems worldwide. In order to address this issue, a widely used framework has been developed, namely the Universal Design for Learning (UDL), which aims to provide specific educational design guidelines…

  18. Coping with drought: the experience of water sensitive urban design ...

    African Journals Online (AJOL)

    This study investigated the extent of Water Sensitive Urban Design (WSUD) activities in the George Municipality in the Western Cape Province, South Africa, and its impact on water consumption. The WSUD approach aims to influence design and planning from the moment rainwater is captured in dams, to when it is treated, ...

  19. Physics design requirements for the Tokamak Physics Experiment (TPX)

    International Nuclear Information System (INIS)

    Neilson, G.H.; Goldston, R.J.; Jardin, S.C.; Reiersen, W.T.; Porkolab, M.; Ulrickson, M.

    1993-01-01

    The design of TPX is driven by physics requirements that follow from its mission. The tokamak and heating systems provide the performance and profile controls needed to study advanced steady state tokamak operating modes. The magnetic control systems provide substantial flexibility for the study of regimes with high beta and bootstrap current. The divertor is designed for high steady state power and particle exhaust

  20. The Prototype as Mediator of Embodied Experience in Fashion Design

    DEFF Research Database (Denmark)

    Kristensen, Tore; Ræbild, Ulla

    It is based on photographic material obtained in design studios during prototype development. The prototype is considered a core fashion design competence. Yet, companies increasingly cut costs by reducing or omitting prototype development. We intend to show how the garment prototype acts as an important...

  1. Experiences with a Course on Collaborative Design on Distance

    NARCIS (Netherlands)

    Gassel, van F.J.M.; Leeuwen, van J.P.; Otter, den A.F.H.J.

    2004-01-01

    In conceptual design of architectural artefacts, designers from different disciplines work together. Multi-disciplinary collaboration is required when buildings and their construction have a complex nature. If this collaboration is not effective and efficient, it will lead to the construction of

  2. Experiences with strain based limit state design in The Netherlands

    NARCIS (Netherlands)

    Gresnigt, A.M.; Foeken, R.J. van

    1996-01-01

    Limit state design differs from conventional design methods in that each failure mode is specifically addressed (e.g. burst, collapse, local buckling, fracture due to insufficient strain capacity of the pipe wall, fatigue). Based on an extensive theoretical and experimental research programme,

  3. Experimenting on how to create a sustainable gamified learning design that supports adult students when learning through designing learning games

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2014-01-01

    This paper presents and discusses the first iteration of a design‐based research experiment focusing on how to create an overall gamified learning design (big Game) facilitating the learning process for adult students by letting them be their own learning designers through designing their own digital learning games (small games) in cross‐disciplinary subject matters. The experiment has focused on creating a game‐based learning design that enables the students to implement the learning goals into their games, and on making the game design process motivating and engaging. Another focus of the study has been to create a sustainable learning design that supports the learning game design process and gives teachers the ability to evaluate whether the students have been successful in learning their subject matter through this learning game design process. The findings are that this initial

  4. Structural Design and Analysis of a Rigidizable Space Shuttle Experiment

    National Research Council Canada - National Science Library

    Holstein

    2004-01-01

    .... Once in space, the experiment will inflate and rigidize three composite structures and perform a vibration analysis on each by exciting the tubes using piezoelectric patches and collecting data via an accelerometer...

  5. The Experience City and challenges for Architects and Urban Designers

    DEFF Research Database (Denmark)

    Marling, Gitte

    2008-01-01

    The article discusses the challenges of the experience economy from a Nordic welfare perspective. It argues that the challenges of the experience economy must be combined with the ambition that our cities are not reduced to entertainment engines. The urban life in the Nordic "welfare cities" must emphasise experiences that challenge, that urge reflection and that contain elements of learning, just as the Nordic welfare city must strive for a socially and culturally inclusive urban life which includes offers for many different lifestyles and cultures in its diversity. Consequently, it is not simply a matter of creating a framework for entertainment and "Fun" or of creating architectural icon buildings that can bring fame to the city. The question is whether or not the experience economy can provide for a more versatile urban development in which architectural innovation goes hand in hand...

  6. Small Probes for Orbital Return of Experiments Mission Design

    Data.gov (United States)

    National Aeronautics and Space Administration — Currently the Georgia Tech Small Probes for Orbital Return of Experiments (SPORE) team is collaborating with Aurora Flight Sciences to provide a launch, re-entry,...

  7. Using and Designing Platforms for In Vivo Education Experiments

    OpenAIRE

    Williams, Joseph Jay; Ostrow, Korinn; Xiong, Xiaolu; Glassman, Elena; Kim, Juho; Maldonado, Samuel G.; Li, Na; Reich, Justin; Hefferman, Neil

    2015-01-01

    In contrast to typical laboratory experiments, the everyday use of online educational resources by large populations and the prevalence of software infrastructure for A/B testing leads us to consider how platforms can embed in vivo experiments that do not merely support research, but ensure practical improvements to their educational components. Examples are presented of randomized experimental comparisons conducted by subsets of the authors in three widely used online educational platforms K...

  8. Experiential Marketing A Designer of Pleasurable and Memorable Experiences

    OpenAIRE

    Muthiah, Dr. Krishnaveni; Suja, S

    2013-01-01

    The rapid growth of globalisation, the economic crisis, and changes in consumer lifestyles pose a challenge for marketers in the present era. Today's consumers have an insight beyond satisfying their needs and wants. Business firms today need to create long-lasting impressions on their clients, which are converted into memorable experiences as a result of the pleasures derived. An experience occurs when consumers become involved to such an extent that a lasting impression is made on t...

  9. Designing experiments and analyzing data a model comparison perspective

    CERN Document Server

    Maxwell, Scott E

    2013-01-01

    Through this book's unique model comparison approach, students and researchers are introduced to a set of fundamental principles for analyzing data. After seeing how these principles can be applied in simple designs, students are shown how these same principles also apply in more complicated designs. Drs. Maxwell and Delaney believe that the model comparison approach better prepares students to understand the logic behind a general strategy of data analysis appropriate for various designs; and builds a stronger foundation, which allows for the introduction of more complex topics omitt
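    The core of the model comparison approach is testing a full model against a restricted (nested) model via the general F statistic. A minimal sketch, with hypothetical sums of squared errors:

    ```python
    from scipy import stats

    def model_comparison_f(sse_restricted, sse_full, df_restricted, df_full):
        """F = [(SSE_R - SSE_F) / (df_R - df_F)] / (SSE_F / df_F), the general
        model-comparison statistic for nested linear models."""
        f = ((sse_restricted - sse_full) / (df_restricted - df_full)) / (sse_full / df_full)
        p = stats.f.sf(f, df_restricted - df_full, df_full)
        return f, p

    # Hypothetical example: does adding a group effect improve on the grand mean?
    f, p = model_comparison_f(sse_restricted=120.0, sse_full=80.0,
                              df_restricted=29, df_full=27)
    print(f"F(2, 27) = {f:.2f}, p = {p:.4f}")
    ```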

  10. Design and construction of a basic principle simulator: an experiment

    International Nuclear Information System (INIS)

    Fernandez, O.; Galdoz, E.; Flury, C.; Fontanini, H.; Maciel, F.; Rovere, L.; Carpio, R.

    1992-01-01

    This paper describes activities in the design and construction of a Basic Principle Simulator for nuclear power plants. The simulator was developed in the Process Control Division of the Bariloche Atomic Center, Argentina. The project was sponsored jointly by CNEA and the International Atomic Energy Agency, through the United Nations Development Programme. The paper especially emphasizes aspects such as: the architecture design methodology of real-time simulators; the design of the graphic environment and interfaces for user and instructor interaction, and for displaying information; testing and validation of the models used; and the formation of human resources. Finally, it describes the actual implementation of the simulator to be used at the Embalse Nuclear Power Plant. (author)

  11. Dynamic, electronically switchable surfaces for membrane protein microarrays.

    Science.gov (United States)

    Tang, C S; Dusseiller, M; Makohliso, S; Heuschkel, M; Sharma, S; Keller, B; Vörös, J

    2006-02-01

    Microarray technology is a powerful tool that provides a high throughput of bioanalytical information within a single experiment. These miniaturized and parallelized binding assays are highly sensitive and have found widespread popularity especially during the genomic era. However, as drug diagnostics studies are often targeted at membrane proteins, the current arraying technologies are ill-equipped to handle the fragile nature of the protein molecules. In addition, to understand the complex structure and functions of proteins, different strategies to immobilize the probe molecules selectively onto a platform for protein microarray are required. We propose a novel approach to create a (membrane) protein microarray by using an indium tin oxide (ITO) microelectrode array with an electronic multiplexing capability. A polycationic, protein- and vesicle-resistant copolymer, poly(l-lysine)-grafted-poly(ethylene glycol) (PLL-g-PEG), is exposed to and adsorbed uniformly onto the microelectrode array, as a passivating adlayer. An electronic stimulation is then applied onto the individual ITO microelectrodes resulting in the localized release of the polymer thus revealing a bare ITO surface. Different polymer and biological moieties are specifically immobilized onto the activated ITO microelectrodes while the other regions remain protein-resistant as they are unaffected by the induced electrical potential. The desorption process of the PLL-g-PEG is observed to be highly selective, rapid, and reversible without compromising on the integrity and performance of the conductive ITO microelectrodes. As such, we have successfully created a stable and heterogeneous microarray of biomolecules by using selective electronic addressing on ITO microelectrodes. Both pharmaceutical diagnostics and biomedical technology are expected to benefit directly from this unique method.

  12. Kinetics experiments and bench-scale system: Background, design, and preliminary experiments

    International Nuclear Information System (INIS)

    Rofer, C.K.

    1987-10-01

    The project, Supercritical Water Oxidation of Hazardous Chemical Waste, is a Hazardous Waste Remedial Actions Program (HAZWRAP) Research and Development task being carried out by the Los Alamos National Laboratory. Its objective is to obtain information for use in understanding the basic technology and for scaling up and applying oxidation in supercritical water as a viable process for treating a variety of DOE-DP waste streams. This report gives the background and rationale for kinetics experiments on oxidation in supercritical water being carried out as a part of this HAZWRAP Research and Development task. It discusses supercritical fluid properties and their relevance to applying this process to the destruction of hazardous wastes. An overview is given of the small emerging industry based on applications of supercritical water oxidation. Factors that could lead to additional applications are listed. Modeling studies are described as a basis for the experimental design. The report describes plug flow reactor and batch reactor systems, and presents preliminary results. 28 refs., 4 figs., 5 tabs
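
    The plug-flow modeling mentioned above as a basis for the experimental design can be illustrated with a first-order conversion estimate. The sketch below is only a back-of-the-envelope aid; the Arrhenius parameters, temperatures, and residence time in it are placeholder assumptions, not values from the report.

        # First-order destruction in an ideal plug-flow reactor, X = 1 - exp(-k*tau),
        # with an Arrhenius rate constant. All parameter values are placeholders
        # for illustration, not numbers from the report.
        import math

        R = 8.314  # gas constant [J/(mol K)]

        def conversion(T, tau, A=1.0e8, Ea=1.2e5):
            """Fractional destruction of a waste species at temperature T [K]
            and residence time tau [s], assuming first-order kinetics."""
            k = A * math.exp(-Ea / (R * T))  # Arrhenius rate constant [1/s]
            return 1.0 - math.exp(-k * tau)

        for T in (650.0, 700.0, 750.0):  # near/above the critical point of water
            print(f"T = {T} K, tau = 10 s -> X = {conversion(T, 10.0):.4f}")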

  13. Biological data warehousing system for identifying transcriptional regulatory sites from gene expressions of microarray data.

    Science.gov (United States)

    Tsou, Ann-Ping; Sun, Yi-Ming; Liu, Chia-Lin; Huang, Hsien-Da; Horng, Jorng-Tzong; Tsai, Meng-Feng; Liu, Baw-Juine

    2006-07-01

    Identification of transcriptional regulatory sites plays an important role in the investigation of gene regulation. For this purpose, we designed and implemented a data warehouse to integrate multiple heterogeneous biological data sources with data types such as text file, XML, image, MySQL database model, and Oracle database model. The utility of the biological data warehouse in predicting transcriptional regulatory sites of coregulated genes was explored using a synexpression group derived from a microarray study. Both binding sites of known transcription factors and predicted over-represented (OR) oligonucleotides were identified for the gene group, and the potential biological roles of the known sites and of one OR oligonucleotide were demonstrated using bioassays. The results from these wet-lab experiments reinforce the power and utility of the data warehouse as an approach to the genome-wide search for important transcriptional regulatory elements that are key to many complex biological systems.
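
    The over-represented (OR) oligonucleotide search described above can be pictured with a short stand-alone script. The sketch below is not the warehouse's pipeline; it simply counts k-mers in the promoters of a coregulated gene group and ranks them against a background set with a binomial tail probability. All sequences and parameters are invented for illustration.

        # Minimal sketch of an over-represented (OR) oligonucleotide scan over
        # promoters of a coregulated gene group vs. a background set.
        from collections import Counter
        from math import comb

        def kmer_counts(seqs, k):
            counts = Counter()
            for s in seqs:
                s = s.upper()
                for i in range(len(s) - k + 1):
                    counts[s[i:i + k]] += 1
            return counts

        def binom_sf(x, n, p):
            """P(X >= x) for X ~ Binomial(n, p); fine for small n."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(x, n + 1))

        def over_represented(group_seqs, background_seqs, k=6, min_count=2):
            fg, bg = kmer_counts(group_seqs, k), kmer_counts(background_seqs, k)
            n_fg, n_bg = sum(fg.values()), sum(bg.values())
            scored = []
            for kmer, x in fg.items():
                if x < min_count:
                    continue
                p = bg.get(kmer, 1) / n_bg   # background frequency, with pseudo-count
                scored.append((binom_sf(x, n_fg, p), kmer, x))
            return sorted(scored)            # smallest tail probability first

        group = ["TATAAGCGCGTGACGTCAGG", "GGACGTCATTTACGTCATAA"]    # invented promoters
        background = ["ATCGATCGATCGATCGATCG", "GGGCCCGGGCCCGGGCCCAA"]
        print(over_represented(group, background)[:3])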

  14. Geothermal FIT Design: International Experience and U.S. Considerations

    Energy Technology Data Exchange (ETDEWEB)

    Rickerson, W.; Gifford, J.; Grace, R.; Cory, K.

    2012-08-01

    Developing power plants is a risky endeavor, whether for conventional or renewable generation. Feed-in tariff (FIT) policies can be designed to address some of these risks, and their design can be tailored to geothermal electric plant development. Geothermal projects face risks similar to other generation project development, including finding buyers for power, ensuring adequate transmission capacity, competing to supply electricity and/or renewable energy certificates (RECs), securing reliable revenue streams, navigating the legal issues related to project development, and reacting to changes in existing regulations or incentives. Although FITs have not been created specifically for geothermal in the United States to date, a variety of FIT design options that could reduce geothermal power plant development risks are explored. This analysis focuses on the design of FIT incentive policies for geothermal electric projects and how FITs can be used to reduce risks (excluding the risk of drilling unproductive exploratory wells).

  15. KiloPower Project - KRUSTY Experiment Nuclear Design

    Energy Technology Data Exchange (ETDEWEB)

    Poston, David Irvin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Godfroy, Thomas [NASA Marshall Space Flight Center (MSFC), Huntsville, AL (United States); Mcclure, Patrick Ray [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sanchez, Rene Gerardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-07-20

    This PowerPoint presentation covers the following topics: Reference Kilopower configuration; Reference KRUSTY configuration; KRUSTY design sensitivities; KRUSTY reactivity coefficients; KRUSTY criticality safety and control; KRUSTY core activation/dose; and KRUSTY shielding, room activation/dose.

  16. TVA experience in BWR reload design and licensing

    International Nuclear Information System (INIS)

    Robertson, J.D.

    1986-01-01

    TVA has developed and implemented the capability to perform BWR reload core design and licensing analyses. The advantages accruing from this capability include the tangible cost savings from performing reload analyses in-house. Also, "intangible" benefits such as increased operating flexibility and the ability to accommodate multivendor fuel designs have been demonstrated. The major disadvantage of performing in-house analyses is the cost associated with development and maintenance of the analytical methods and staff expertise.

  17. Developing Teachers’ Competences for Designing Inclusive Learning Experiences

    OpenAIRE

    Baldiris Navarro, Silvia Margarita; Zervas, Panagiotis; Fabregat Gesa, Ramon; Sampson, Demetrios G.

    2016-01-01

    Inclusive education, namely the process of providing all learners with equal educational opportunities, is a major challenge for many educational systems worldwide. In order to address this issue, a widely used framework has been developed: the Universal Design for Learning (UDL), which aims to provide specific educational design guidelines to ensure accessibility of all learner types to the learning environment. On the other hand, nowadays teachers are provided with ample ...

  18. Genotyping microarray (gene chip) for the ABCR (ABCA4) gene.

    Science.gov (United States)

    Jaakson, K; Zernant, J; Külm, M; Hutchinson, A; Tonisson, N; Glavac, D; Ravnik-Glavac, M; Hawlina, M; Meltzer, M R; Caruso, R C; Testa, F; Maugeri, A; Hoyng, C B; Gouras, P; Simonelli, F; Lewis, R A; Lupski, J R; Cremers, F P M; Allikmets, R

    2003-11-01

    Genetic variation in the ABCR (ABCA4) gene has been associated with five distinct retinal phenotypes, including Stargardt disease/fundus flavimaculatus (STGD/FFM), cone-rod dystrophy (CRD), and age-related macular degeneration (AMD). Comparative genetic analyses of ABCR variation and diagnostics have been complicated by substantial allelic heterogeneity and by differences in screening methods. To overcome these limitations, we designed a genotyping microarray (gene chip) for ABCR that includes all approximately 400 disease-associated and other variants currently described, enabling simultaneous detection of all known ABCR variants. The ABCR genotyping microarray (the ABCR400 chip) was constructed by the arrayed primer extension (APEX) technology. Each sequence change in ABCR was included on the chip by synthesis and application of sequence-specific oligonucleotides. We validated the chip by screening 136 confirmed STGD patients and 96 healthy controls, each of whom we had analyzed previously by single strand conformation polymorphism (SSCP) technology and/or heteroduplex analysis. The microarray was >98% effective in determining the existing genetic variation and was comparable to direct sequencing in that it yielded many sequence changes undetected by SSCP. In STGD patient cohorts, the efficiency of the array to detect disease-associated alleles was between 54% and 78%, depending on the ethnic composition and degree of clinical and molecular characterization of a cohort. In addition, chip analysis suggested a high carrier frequency (up to 1:10) of ABCR variants in the general population. The ABCR genotyping microarray is a robust, cost-effective, and comprehensive screening tool for variation in one gene in which mutations are responsible for a substantial fraction of retinal disease. The ABCR chip is a prototype for the next generation of screening and diagnostic tools in ophthalmic genetics, bridging clinical and scientific research. Copyright 2003 Wiley
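
    In the simplest view, calling a genotype from an APEX-style array comes down to comparing the signals of the two allele-specific extension products at each queried site. The toy caller below illustrates that logic only; the signal threshold and heterozygote band are invented, and the ABCR400 chip's actual calling rules are not reproduced here.

        def call_genotype(sig_ref, sig_alt, min_signal=200.0, het_band=(0.3, 0.7)):
            """Toy two-channel genotype call; thresholds are invented."""
            total = sig_ref + sig_alt
            if total < min_signal:
                return "no_call"               # too dim to trust
            frac_alt = sig_alt / total
            if frac_alt < het_band[0]:
                return "ref/ref"
            if frac_alt > het_band[1]:
                return "alt/alt"
            return "ref/alt"                   # heterozygous

        # Example sites (ABCA4 variant names used as labels; signals made up).
        sites = {"c.2588G>C": (1500.0, 1400.0), "c.5882G>A": (1800.0, 90.0)}
        for site, (ref, alt) in sites.items():
            print(site, call_genotype(ref, alt))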

  19. DNA microarray technique for detecting food-borne pathogens

    Directory of Open Access Journals (Sweden)

    Xing GAO

    2012-08-01

    Objective To study the application of the DNA microarray technique for screening and identifying multiple food-borne pathogens. Methods Oligonucleotide probes were designed with Clustal X and Oligo 6.0 against the conserved regions of specific genes of multiple food-borne pathogens, and were validated by bioinformatic analyses. The 5' end of each probe was modified with an amino group and a 10-mer poly(T) spacer, and the optimized probes were synthesized and spotted onto aldehyde-coated slides. Bacterial template DNA incubated with Klenow enzyme was amplified by arbitrarily primed PCR, and PCR products incorporating aminoallyl-dUTP were coupled with fluorescent dye. After hybridization of the purified PCR products to the DNA microarray, hybridization images and fluorescence intensities were acquired with ScanArray and GenePix Pro 5.1 software. Detection conditions such as the arbitrarily primed PCR and the microarray hybridization were optimized. The specificity of the approach was evaluated with DNA from 16 different bacteria, and the sensitivity and reproducibility were verified with DNA from 4 food-borne pathogens. Samples of mixed bacterial DNA and simulated water samples of Shigella dysenteriae were tested. Results Nine different food-borne bacteria were successfully discriminated under the same conditions. The sensitivity for genomic DNA was 10^2-10^3 pg/μl, and the coefficient of variation (CV) of the assay's reproducibility was less than 15%. The corresponding specific hybridization maps of the mixed bacterial DNA samples were obtained, and the detection limit for the simulated water sample of Shigella dysenteriae was 3.54 x 10^5 cfu/ml. Conclusions The DNA microarray detection system based on arbitrarily primed PCR can be employed for effective detection of multiple food-borne pathogens, and this assay may offer a new high-throughput platform for detecting bacteria.
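
    The probe-design step, selecting oligonucleotides from conserved regions of marker genes, can be loosely illustrated as a sliding-window scan over a multiple alignment. The authors used Clustal X and Oligo 6.0; the sketch below is a simplified stand-in, and the toy alignment, window length, and GC bounds are assumptions for illustration.

        def conserved_windows(alignment, length=25, gc_min=0.40, gc_max=0.60):
            """Yield (start, window) where every aligned sequence agrees exactly."""
            ncol = len(alignment[0])
            for start in range(ncol - length + 1):
                window = alignment[0][start:start + length]
                if "-" in window:
                    continue  # skip gapped columns
                if all(seq[start:start + length] == window for seq in alignment[1:]):
                    gc = (window.count("G") + window.count("C")) / length
                    if gc_min <= gc <= gc_max:
                        yield start, window

        # Invented toy alignment of a gene fragment from three isolates.
        aln = [
            "ATGGCGTACGTTGACCTGAAGGCGTACGTTGAC",
            "ATGGCGTACGTTGACCTGAAGGCGTACGTTGAC",
            "ATGGCGTACGTTGACTTGAAGGCGTACGTTGAC",
        ]
        for pos, probe in conserved_windows(aln, length=12):
            print(pos, probe)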

  20. On the design and implementation of environmental conservation mechanisms : Evidence from field experiments

    NARCIS (Netherlands)

    Kitesa, Rahel

    2018-01-01

    This doctoral dissertation consists of three chapters on the design and implementation of environmental conservation mechanisms using economic experiments. The first chapter examines how variations in information and context affect the outcomes of valuation using a field experiment. The chapter shows

  1. Generalization of DNA microarray dispersion properties: microarray equivalent of t-distribution

    DEFF Research Database (Denmark)

    Novak, Jaroslav P; Kim, Seon-Young; Xu, Jun

    2006-01-01

    BACKGROUND: DNA microarrays are a powerful technology that can provide a wealth of gene expression data for disease studies, drug development, and a wide scope of other investigations. Because of the large volume and inherent variability of DNA microarray data, many new statistical methods have...
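
    Although the record above is truncated, its subject, modeling the dispersion of replicated microarray measurements with a t-like statistic, can be loosely illustrated. The sketch below computes a gene-wise two-sample statistic with a small variance offset of the kind SAM-style microarray tests add to stabilize low-variance genes; it is an illustration, not the authors' derivation, and the offset value and simulated data are assumptions.

        # Gene-wise two-sample t-type statistic across replicate arrays.
        import numpy as np

        def gene_t(group_a, group_b, s0=0.05):
            """Rows are genes, columns are replicate arrays (log expression)."""
            ma, mb = group_a.mean(axis=1), group_b.mean(axis=1)
            va, vb = group_a.var(axis=1, ddof=1), group_b.var(axis=1, ddof=1)
            na, nb = group_a.shape[1], group_b.shape[1]
            se = np.sqrt(va / na + vb / nb)
            return (ma - mb) / (se + s0)   # s0 tames near-zero variance genes

        rng = np.random.default_rng(0)
        a = rng.normal(0.0, 1.0, size=(1000, 4))  # 1000 genes, 4 replicates
        b = rng.normal(0.2, 1.0, size=(1000, 4))  # group with a small shift
        print(gene_t(a, b)[:5])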

  2. BASE - 2nd generation software for microarray data management and analysis

    Directory of Open Access Journals (Sweden)

    Nordborg Nicklas

    2009-10-01

    Background: Microarray experiments are increasing in size, and samples are collected asynchronously over long periods. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments, we reimplemented and improved BASE version 1.2. Results: The new BASE presented in this report is a comprehensive, annotatable, local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample, and through extraction and labelling to raw data and analysis. All items in BASE can be annotated, and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continuing to improve and extend BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs by leveraging the infrastructure provided by BASE. Conclusion: BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license.
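
    The biosource-to-raw-data chain that BASE tracks can be pictured with a minimal data model. The sketch below is an illustrative reconstruction of the idea, not BASE's actual schema; all class and field names are invented.

        # Minimal model of a tracked item chain with free-form annotations
        # usable as experimental factors. Illustrative only; the real BASE
        # schema (http://base.thep.lu.se) is far richer.
        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class Item:
            """One tracked item; parents form the biosource -> raw data chain."""
            name: str
            parent: Optional["Item"] = None
            annotations: dict = field(default_factory=dict)

            def lineage(self):
                node, chain = self, []
                while node is not None:
                    chain.append(node.name)
                    node = node.parent
                return " <- ".join(chain)

        biosource = Item("patient_42", annotations={"tissue": "breast"})
        sample = Item("biopsy_A", parent=biosource)
        extract = Item("rna_A1", parent=sample, annotations={"RIN": 8.9})
        labelled = Item("rna_A1_cy3", parent=extract, annotations={"dye": "Cy3"})
        rawdata = Item("scan_001", parent=labelled)
        print(rawdata.lineage())  # scan_001 <- rna_A1_cy3 <- rna_A1 <- biopsy_A <- patient_42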

  3. BASE--2nd generation software for microarray data management and analysis.

    Science.gov (United States)

    Vallon-Christersson, Johan; Nordborg, Nicklas; Svensson, Martin; Häkkinen, Jari

    2009-10-12

    Microarray experiments are increasing in size, and samples are collected asynchronously over long periods. Available data are re-analysed as more samples are hybridized. Systematic use of collected data requires tracking of biomaterials, array information, raw data, and assembly of annotations. To meet the information tracking and data analysis challenges in microarray experiments, we reimplemented and improved BASE version 1.2. The new BASE presented in this report is a comprehensive, annotatable, local microarray data repository and analysis application providing researchers with an efficient information management and analysis tool. The information management system tracks all material from biosource, via sample, and through extraction and labelling to raw data and analysis. All items in BASE can be annotated, and the annotations can be used as experimental factors in downstream analysis. BASE stores all microarray experiment related data regardless of whether analysis tools for specific techniques or data formats are readily available. The BASE team is committed to continuing to improve and extend BASE to make it usable for even more experimental setups and techniques, and we encourage other groups to target their specific needs by leveraging the infrastructure provided by BASE. BASE is a comprehensive management application for information, data, and analysis of microarray experiments, available as free open source software at http://base.thep.lu.se under the terms of the GPLv3 license.

  4. The prototype design of the Stanford Relativity Gyro Experiment

    Science.gov (United States)

    Parkinson, Bradford W.; Everitt, C. W. Francis; Turneaure, John P.; Parmley, Richard T.

    1987-01-01

    The Stanford Relativity Gyroscope Experiment constitutes a fundamental test of Einstein's General Theory of Relativity, probing such heretofore untested aspects of the theory as those that relate to spin by means of drag-free satellite-borne gyroscopes. General Relativity's prediction of two orthogonal precessions (motional and geodetic) for a perfect Newtonian gyroscope in polar orbit has not yet been experimentally assessed; its measurement will mark a significant advance in experimental gravitation. The technology employed in the experiment has been under development for 25 years at NASA's Marshall Space Flight Center. Four fused-quartz gyroscopes will be used.

  5. Preliminary design and definition of field experiments for welded tuff rock mechanics program

    International Nuclear Information System (INIS)

    Zimmerman, R.M.

    1982-06-01

    The preliminary design contains objectives, typical experiment layouts, definitions of equipment and instrumentation, test matrices, preliminary design predictive modeling results for five experiments, and a definition of the G-Tunnel Underground Facility (GTUF) at the Nevada Test Site where the experiments are to be located. Experiments described for investigations in welded tuff are the Small Diameter Heater, Unit Cell-Canister Scale, Heated Block, Rocha Slot, and Miniature Heater

  6. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

    Introductory Statistical Inference and Regression Analysis; Elementary Statistical Inference; Regression Analysis; Experiments, the Completely Randomized Design (CRD)-Classical and Regression Approaches; Experiments; Experiments to Compare Treatments; Some Basic Ideas; Requirements of a Good Experiment; One-Way Experimental Layout or the CRD: Design and Analysis; Analysis of Experimental Data (Fixed Effects Model); Expected Values for the Sums of Squares; The Analysis of Variance (ANOVA) Table; Follow-Up Analysis to Check fo

  7. DNA microarrays : a molecular cloning manual

    National Research Council Canada - National Science Library

    Sambrook, Joseph; Bowtell, David

    2002-01-01

    This manual, designed to extend and complement the information in the best-selling Molecular Cloning, is a synthesis of the expertise and experience of more than 30 contributors, all innovators in a fast-moving field...

  8. Designing for experience : arousing boredom to evoke predefined user behaviour

    NARCIS (Netherlands)

    van Aart, J.; Salem, B.I.; Bartneck, C.; Hu, J.; Rauterberg, G.W.M.; Desmet, P.; Tzvetanov, S.; Hekkert, P.; Justice, L.

    2008-01-01

    In the light of Cultural Computing, this study influences user affect and behaviour by touching upon core values of Western culture. We created an augmented reality environment in which users experience a predefined sequence of emotional states and events. This study concerns two typically Western

  9. Exhibition contribution: AN EXPERIMENT WITH THE VOICE TO DESIGN CERAMICS

    DEFF Research Database (Denmark)

    2013-01-01

    The artefacts show how the experiential knowledge that the craftsman gains through direct physical interaction with a responding material can be transformed and utilized in the use of digital technologies. The exhibition presents an experiment with a 3D interactive and dynamic system to create ceramics ...

  10. How (not) to design procurement mechanisms: A laboratory experiment

    NARCIS (Netherlands)

    Onderstal, S.; van de Meerendonk, A.

    2008-01-01

    In this paper, we examine the relative performance of three commonly used procurement mechanisms: price-only auctions, scoring auctions, and benchmarking. We do so both in theory and in a laboratory experiment. We find that the auctions yield the same level of welfare, and welfare dominate

  11. Microarray Data Processing Techniques for Genome-Scale Network Inference from Large Public Repositories.

    Science.gov (United States)

    Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas

    2016-09-19

    Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
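
    The partitioning idea, grouping chips by tissue or process before normalization and network construction, can be sketched in a few lines. The category labels, toy data, and the choice of quantile normalization below are illustrative assumptions, not the authors' exact pipeline (which is available at the URL above).

        # Partition a chip collection by biological category, then normalize
        # only the selected slice before downstream network inference.
        import numpy as np

        def quantile_normalize(matrix):
            """Columns are chips; force all chips onto one intensity distribution."""
            ranks = np.argsort(np.argsort(matrix, axis=0), axis=0)
            mean_sorted = np.sort(matrix, axis=0).mean(axis=1)
            return mean_sorted[ranks]

        def partition_and_normalize(expr, chip_labels, category):
            """Keep only chips annotated with `category`, then normalize them."""
            cols = [i for i, lab in enumerate(chip_labels) if lab == category]
            return quantile_normalize(expr[:, cols])

        rng = np.random.default_rng(1)
        expr = rng.lognormal(size=(200, 6))            # 200 probes, 6 chips
        labels = ["root", "leaf", "root", "leaf", "root", "leaf"]
        root_expr = partition_and_normalize(expr, labels, "root")
        print(root_expr.shape)                         # (200, 3): tissue-specific slice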

  12. Learning design thinking online : studying students' learning experience in shared virtual reality

    OpenAIRE

    Lau, Kung Wong

    2010-01-01

    Learning Design Thinking Online: Studying Students' Learning Experience in Shared Virtual Reality. My study attempts to deepen understanding about the learning experiences of design students in undertaking design-thinking exercises in a shared virtual reality. This study has identified the areas of an appropriate pedagogy for E-Learning and the use of a shared virtual environment for students in tertiary design education. Specific questions arising from this research are: (1...

  13. Mechanical engineering and design criteria for the Magnetically Insulated Transmission Experiment Accelerator

    International Nuclear Information System (INIS)

    Staller, G.E.; Hamilton, I.D.; Aker, M.F.; Fifer, H.G.

    1978-02-01

    A single-unit electron beam accelerator was designed, fabricated, and assembled in Sandia's Technical Area V to conduct magnetically insulated transmission experiments. Results of these experiments will be utilized in the future design of larger, more complex accelerators. This design makes optimum use of existing facilities and equipment. In the design of new components, possible future applications were considered, as well as compatibility with existing facilities and hardware

  14. Being in the Users' Shoes: Anticipating Experience while Designing Online Courses

    Science.gov (United States)

    Rapanta, Chrysi; Cantoni, Lorenzo

    2014-01-01

    While user-centred design and user experience are given much attention in the e-learning design field, no research has been found on how users are actually represented in the discussions during the design of online courses. In this paper we identify how and when end-users' experience--be they students or tutors--emerges in designers'…

  15. Optimal Experimental Design of Furan Shock Tube Kinetic Experiments

    KAUST Repository

    Kim, Daesang

    2015-01-07

    A Bayesian optimal experimental design methodology has been developed and applied to refine the rate coefficients of elementary reactions in Furan combustion. Furans are considered potential renewable fuels. We focus on the Arrhenius rates of Furan + OH ↔ Furyl-2 + H2O and Furan + OH ↔ Furyl-3 + H2O, and rely on the OH consumption rate as the experimental observable. A polynomial chaos (PC) surrogate is first constructed using an adaptive pseudo-spectral projection algorithm. The PC surrogate is then exploited, in conjunction with a fast estimate of the expected information gain, to determine the optimal design in the space of initial temperatures and OH concentrations.
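
    The expected-information-gain criterion at the heart of this methodology can be illustrated with a brute-force nested Monte Carlo estimator over candidate temperatures. In the actual work a polynomial chaos surrogate replaces the forward model; the Arrhenius forward model, priors, noise level, and candidate designs below are assumptions for illustration only.

        # Nested Monte Carlo estimate of expected information gain (EIG) for
        # choosing an experiment temperature. Illustration only: the real study
        # used a polynomial chaos surrogate, and the forward model, priors,
        # and noise level here are invented assumptions.
        import numpy as np

        R = 8.314  # gas constant [J/(mol K)]

        def forward(lnA, Ea, T):
            """Log OH-consumption rate under an assumed Arrhenius law."""
            return lnA - Ea / (R * T)

        def eig(T, n_outer=300, n_inner=300, sigma=0.1, seed=0):
            rng = np.random.default_rng(seed)

            def prior(n):  # assumed priors on ln A and Ea [J/mol]
                return rng.normal(25.0, 1.0, n), rng.normal(1.5e5, 1.0e4, n)

            lnA_o, Ea_o = prior(n_outer)
            y = forward(lnA_o, Ea_o, T) + rng.normal(0.0, sigma, n_outer)
            log_like = -0.5 * ((y - forward(lnA_o, Ea_o, T)) / sigma) ** 2
            lnA_i, Ea_i = prior(n_inner)
            pred = forward(lnA_i, Ea_i, T)[:, None]           # (n_inner, 1)
            log_ev = np.log(np.mean(
                np.exp(-0.5 * ((y[None, :] - pred) / sigma) ** 2), axis=0))
            return float(np.mean(log_like - log_ev))          # EIG in nats

        for T in (900.0, 1200.0, 1500.0):  # candidate initial temperatures [K]
            print(T, round(eig(T), 3))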

  16. Windfarm design in the light of previous experience

    International Nuclear Information System (INIS)

    Lloyd, A.

    1997-01-01

    Significant impacts to birds have been claimed at both Californian and Spanish windfarms, but in the United Kingdom the current evidence is that avian impact, where it does occur, is minimal. Examination of differences in the design and location of wind energy installations worldwide, in relation to differences in avian impact, is leading towards the establishment of design principles which can help ensure that impacts remain negligible. These principles are based on locational, technical, and ecological factors, taken either singly or in combination. (author)

  17. AGC-1 Experiment and Final Preliminary Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Robert L. Bratton; Tim Burchell

    2006-08-01

    This report details the experimental plan and design, as of the preliminary design review, for the Advanced Test Reactor Graphite Creep-1 (AGC-1) graphite compressive creep capsule. The capsule will contain five graphite grades that will be irradiated in the Advanced Test Reactor at the Idaho National Laboratory to determine the irradiation-induced creep constants. Seven other grades of graphite will be irradiated to determine irradiated physical properties. The capsule will have an irradiation temperature of 900 °C and a peak irradiation dose of 5.8 x 10^21 n/cm^2 [E > 0.1 MeV], or 4.2 displacements per atom.

  18. Will the alphabet soup of design criteria affect discrete choice experiment results?

    DEFF Research Database (Denmark)

    Olsen, Søren Bøye; Meyerhoff, Jürgen

    2017-01-01

    Every discrete choice experiment needs a statistical design, but the impacts of that design on the results are still not well understood. Comparative studies have found that efficient designs outperform especially orthogonal designs. What has been little studied is whether efficient designs come at a cost …

  19. Review Committee report on the conceptual design of the Tokamak Physics Experiment

    International Nuclear Information System (INIS)

    1993-04-01

    This report discusses the following topics on the conceptual design of the Tokamak Physics Experiment: the role and mission of TPX; an overview of the design; a physics design assessment; an engineering design assessment; an evaluation of cost, schedule, and management plans; and environment, safety, and health

  20. Nanotechnology: moving from microarrays toward nanoarrays.

    Science.gov (United States)

    Chen, Hua; Li, Jun

    2007-01-01

    Microarrays are important tools for high-throughput analysis of biomolecules. The use of microarrays for parallel screening of nucleic acid and protein profiles has become an industry standard. A few limitations of microarrays are the requirement for relatively large sample volumes and elongated incubation time, as well as the limit of detection. In addition, traditional microarrays make use of bulky instrumentation for the detection, and sample amplification and labeling are quite laborious, which increase analysis cost and delays the time for obtaining results. These problems limit microarray techniques from point-of-care and field applications. One strategy for overcoming these problems is to develop nanoarrays, particularly electronics-based nanoarrays. With further miniaturization, higher sensitivity, and simplified sample preparation, nanoarrays could potentially be employed for biomolecular analysis in personal healthcare and monitoring of trace pathogens. In this chapter, it is intended to introduce the concept and advantage of nanotechnology and then describe current methods and protocols for novel nanoarrays in three aspects: (1) label-free nucleic acids analysis using nanoarrays, (2) nanoarrays for protein detection by conventional optical fluorescence microscopy as well as by novel label-free methods such as atomic force microscopy, and (3) nanoarray for enzymatic-based assay. These nanoarrays will have significant applications in drug discovery, medical diagnosis, genetic testing, environmental monitoring, and food safety inspection.