WorldWideScience

Sample records for genomic evaluation methods

  1. Finding function: evaluation methods for functional genomic data

    Directory of Open Access Journals (Sweden)

    Barrett Daniel R

    2006-07-01

    Full Text Available Abstract Background Accurate evaluation of the quality of genomic or proteomic data and computational methods is vital to our ability to use them for formulating novel biological hypotheses and directing further experiments. There is currently no standard approach to evaluation in functional genomics. Our analysis of existing approaches shows that they are inconsistent and contain substantial functional biases that render the resulting evaluations misleading both quantitatively and qualitatively. These problems make it essentially impossible to compare computational methods or large-scale experimental datasets and also result in conclusions that generalize poorly in most biological applications. Results We reveal issues with current evaluation methods here and suggest new approaches to evaluation that facilitate accurate and representative characterization of genomic methods and data. Specifically, we describe a functional genomics gold standard based on curation by expert biologists and demonstrate its use as an effective means of evaluation of genomic approaches. Our evaluation framework and gold standard are freely available to the community through our website. Conclusion Proper methods for evaluating genomic data and computational approaches will determine how much we, as a community, are able to learn from the wealth of available data. We propose one possible solution to this problem here but emphasize that this topic warrants broader community discussion.

  2. Evaluation of methods and marker systems in genomic selection of oil palm (Elaeis guineensis Jacq.).

    Science.gov (United States)

    Kwong, Qi Bin; Teh, Chee Keng; Ong, Ai Ling; Chew, Fook Tim; Mayes, Sean; Kulaveerasingam, Harikrishna; Tammi, Martti; Yeoh, Suat Hui; Appleton, David Ross; Harikrishna, Jennifer Ann

    2017-12-11

    Genomic selection (GS) uses genome-wide markers in an attempt to accelerate genetic gain in breeding programs of both animals and plants. This approach is particularly useful for perennial crops such as oil palm, which have long breeding cycles and for which the optimal method for GS is still under debate. In this study, we evaluated the effect of different marker systems and modeling methods for implementing GS in an introgressed dura family derived from a Deli dura x Nigerian dura (Deli x Nigerian) cross with 112 individuals. This family is an important breeding source for developing new mother palms for superior oil yield and bunch characters. The traits of interest selected for this study were fruit-to-bunch (F/B), shell-to-fruit (S/F), kernel-to-fruit (K/F), mesocarp-to-fruit (M/F), oil per palm (O/P) and oil-to-dry mesocarp (O/DM). The marker systems evaluated were simple sequence repeats (SSRs) and single nucleotide polymorphisms (SNPs). RR-BLUP, Bayesian A, B, Cπ, LASSO, Ridge Regression and two machine learning methods (SVM and Random Forest) were used to evaluate GS accuracy for the traits. The kinship coefficient between individuals in this family ranged from 0.35 to 0.62. S/F and O/DM had the highest genomic heritability, whereas F/B and O/P had the lowest. Accuracies using 135 SSRs were low, at around 0.20 across traits. The average accuracy of the machine learning methods was 0.24, compared to 0.20 achieved by the other methods. The trait with the highest mean accuracy was F/B (0.28), while the lowest were both M/F and O/P (0.18). Using whole-genome SNPs improved the accuracies for all traits, especially O/DM (0.43), S/F (0.39) and M/F (0.30). The average accuracy of the machine learning methods was then 0.32, compared to 0.31 achieved by the other methods. Due to its high genomic resolution, the use of whole-genome SNPs improved the efficiency of GS dramatically for oil palm and is recommended for dura breeding programs. Machine learning methods slightly outperformed the other methods.
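
The RR-BLUP-style regression at the heart of studies like this one can be sketched in a few lines. Everything below is simulated for illustration; the marker count, heritability and shrinkage parameter are assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a family of 112 palms genotyped at 1,000 SNPs coded 0/1/2
n, p = 112, 1000
X = rng.integers(0, 3, size=(n, p)).astype(float)
beta = rng.normal(0.0, 0.05, size=p)          # true marker effects (simulated)
g = X @ beta                                  # true breeding values
y = g + rng.normal(0.0, g.std(), size=n)      # phenotypes, heritability ~0.5

# RR-BLUP-style ridge solution: beta_hat = (Z'Z + lambda*I)^-1 Z'y
Z = X - X.mean(axis=0)                        # centered allele coding
yc = y - y.mean()
lam = float(p)                                # shrinkage parameter (assumed)
beta_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ yc)

gebv = Z @ beta_hat                           # genomic breeding values
# In-sample correlation with the simulated true values, for illustration only
accuracy = float(np.corrcoef(gebv, g)[0, 1])
print(round(accuracy, 2))
```

With centered coding and a single shrinkage parameter this is plain ridge regression; the Bayesian and machine learning methods compared in the abstract differ mainly in how marker effects are shrunk or selected.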

  3. Allele coding in genomic evaluation

    DEFF Research Database (Denmark)

    Strandén, Ismo; Christensen, Ole Fredslund

    2011-01-01

    Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call...

  4. COMPARATIVE EVALUATION OF CONVENTIONAL VERSUS RAPID METHODS FOR AMPLIFIABLE GENOMIC DNA ISOLATION OF CULTURED Azospirillum sp. JG3

    Directory of Open Access Journals (Sweden)

    Stalis Norma Ethica

    2013-12-01

    Full Text Available As an initial attempt to reveal genetic information on the Azospirillum sp. JG3 strain, which is still lacking despite the strain's ability to produce valued enzymes, two conventional methods (lysis-enzyme and column-kit) and two rapid methods (thermal disruption and intact colony) were evaluated. The aim was to determine the most practical method for obtaining a high-grade PCR product using degenerate primers, as part of routine protocols for studying the molecular genetics of Azospirillal bacteria. The evaluation included assessment of electrophoresis gel visualization, pellet appearance, preparation time, and the PCR result for genomic DNA extracted by each method. Our results confirmed that the conventional methods were superior to the rapid methods in generating genomic DNA isolates visible on an electrophoresis gel. However, a modification made to a previously developed DNA isolation protocol gave the simplest and most rapid method of all those used in this study for extracting PCR-amplifiable DNA of Azospirillum sp. JG3. Intact bacterial cells (intact colony) loaded on an electrophoresis gel could present a genomic DNA band, but could not be completely amplified by PCR without thermal treatment. It can also be inferred from our results that the 3 to 5-min heating in dH2O step is critical for the pre-treatment of colony PCR of Azospirillal cells.

  5. Statistical Methods in Integrative Genomics

    Science.gov (United States)

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biology questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions. PMID:27482531

  6. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. 
Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being the best.
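
The two codings discussed above are easy to make concrete. A minimal sketch, assuming a small 0/1/2 genotype matrix and one common (VanRaden-style) construction of the genomic relationship matrix; the paper compares several such choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Genotypes of 5 animals at 8 markers: copies of the second allele (0/1/2)
M = rng.integers(0, 3, size=(5, 8)).astype(float)

# Centered allele coding: subtract 2p_j per marker, so column means are zero
p_freq = M.mean(axis=0) / 2.0                # observed allele frequencies
Z = M - 2.0 * p_freq

# A common genomic relationship matrix built from the centered coding.
# Different codings give different G matrices, but (per the abstract) the
# same genomic breeding values once the general mean is handled consistently.
denom = 2.0 * np.sum(p_freq * (1.0 - p_freq))
G = Z @ Z.T / denom

print(np.round(G, 2))
```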

  7. Sizing up arthropod genomes: an evaluation of the impact of environmental variation on genome size estimates by flow cytometry and the use of qPCR as a method of estimation.

    Science.gov (United States)

    Gregory, T Ryan; Nathwani, Paula; Bonnett, Tiffany R; Huber, Dezene P W

    2013-09-01

    A study was undertaken to evaluate both a pre-existing method and a newly proposed approach for the estimation of nuclear genome sizes in arthropods. First, concerns regarding the reliability of the well-established method of flow cytometry, relating to impacts of rearing conditions on genome size estimates, were examined. Contrary to previous reports, a more carefully controlled test found negligible environmental effects on genome size estimates in the fly Drosophila melanogaster. Second, a more recently touted method based on quantitative real-time PCR (qPCR) was examined in terms of ease of use, efficiency, and (most importantly) accuracy using four test species: the flies Drosophila melanogaster and Musca domestica and the beetles Tribolium castaneum and Dendroctonus ponderosae. The results of this analysis demonstrated that qPCR tends to produce genome size estimates that differ substantially from those of other established techniques, while also being far less efficient than existing methods.
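
The arithmetic underlying qPCR-based genome size estimation is straightforward: the copy number of a single-copy gene in a known mass of DNA fixes the genome size, via the average molar mass of a base pair (~660 g/mol for dsDNA). A minimal sketch with purely illustrative numbers, not measurements from the study:

```python
AVOGADRO = 6.022e23        # molecules per mole
BP_WEIGHT = 660.0          # approximate g/mol per base pair of dsDNA

def genome_size_bp(dna_mass_ng: float, copies: float) -> float:
    """Haploid genome size (bp) implied by the qPCR copy number of a
    single-copy locus measured in a known mass of genomic DNA."""
    mass_g = dna_mass_ng * 1e-9
    return mass_g * AVOGADRO / (copies * BP_WEIGHT)

# Hypothetical measurement: 5.6e3 copies of a single-copy gene in 1 ng of DNA
est = genome_size_bp(dna_mass_ng=1.0, copies=5.6e3)
print(f"{est / 1e6:.0f} Mb")
```

Errors in quantification propagate directly into the estimate, which is one way the substantial disagreements reported above can arise.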

  8. A method to evaluate genome-wide methylation in archival formalin-fixed, paraffin-embedded ovarian epithelial cells.

    Directory of Open Access Journals (Sweden)

    Qiling Li

    Full Text Available The use of DNA from archival formalin-fixed, paraffin-embedded (FFPE) tissue for genetic and epigenetic analyses may be problematic, since the DNA is often degraded and only limited amounts may be available. Thus, it is currently not known whether genome-wide methylation can be reliably assessed in DNA from archival FFPE tissue. Ovarian tissues, which were obtained, formalin-fixed and paraffin-embedded in either 1999 or 2011, were sectioned and stained with hematoxylin-eosin (H&E). Epithelial cells were captured by laser microdissection, and their DNA was subjected to whole-genome bisulfite conversion, whole-genome polymerase chain reaction (PCR) amplification, and purification. Sequencing and software analyses were performed to determine the extent of genomic methylation. We observed that 31.7% of sequence reads from the DNA in the 1999 archival FFPE tissue, and 70.6% of the reads from the 2011 sample, could be matched to the genome. Methylation rates of CpG on the Watson and Crick strands were 32.2% and 45.5%, respectively, in the 1999 sample, and 65.1% and 42.7% in the 2011 sample. We have developed an efficient method that allows DNA methylation to be assessed in archival FFPE tissue samples.

  9. Evaluation of genome-enabled selection for bacterial cold water disease resistance using progeny performance data in Rainbow Trout: Insights on genotyping methods and genomic prediction models

    Science.gov (United States)

    Bacterial cold water disease (BCWD) causes significant economic losses in salmonid aquaculture, and traditional family-based breeding programs aimed at improving BCWD resistance have been limited to exploiting only between-family variation. We used genomic selection (GS) models to predict genomic breeding...

  10. Genomic evaluations with many more genotypes

    Directory of Open Access Journals (Sweden)

    Wiggans George R

    2011-03-01

    Full Text Available Abstract Background Genomic evaluations in Holstein dairy cattle have quickly become more reliable over the last two years in many countries as more animals have been genotyped for 50,000 markers. Evaluations can also include animals genotyped with more or fewer markers using new tools such as the 777,000 or 2,900 marker chips recently introduced for cattle. Gains from more markers can be predicted using simulation, whereas strategies to use fewer markers have been compared using subsets of actual genotypes. The overall cost of selection is reduced by genotyping most animals at less than the highest density and imputing their missing genotypes using haplotypes. Algorithms to combine different densities need to be efficient because numbers of genotyped animals and markers may continue to grow quickly. Methods Genotypes for 500,000 markers were simulated for the 33,414 Holsteins that had 50,000 marker genotypes in the North American database. Another 86,465 non-genotyped ancestors were included in the pedigree file, and linkage disequilibrium was generated directly in the base population. Mixed density datasets were created by keeping 50,000 markers (every tenth) for most animals. Missing genotypes were imputed using a combination of population haplotyping and pedigree haplotyping. Reliabilities of genomic evaluations using linear and nonlinear methods were compared. Results Differing marker sets for a large population were combined with just a few hours of computation. About 95% of paternal alleles were determined correctly, and > 95% of missing genotypes were called correctly. Reliability of breeding values was already high (84.4%) with 50,000 simulated markers. The gain in reliability from increasing the number of markers to 500,000 was only 1.6%, but more than half of that gain resulted from genotyping just 1,406 young bulls at higher density.
Linear genomic evaluations had reliabilities 1.5% lower than the nonlinear evaluations with 50,000 markers...

  11. A network-based method to evaluate quality of reproducibility of differential expression in cancer genomics studies.

    Science.gov (United States)

    Li, Robin; Lin, Xiao; Geng, Haijiang; Li, Zhihui; Li, Jiabing; Lu, Tao; Yan, Fangrong

    2015-12-29

    Personalized cancer treatments depend on the determination of a patient's genetic status according to known genetic profiles for which targeted treatments exist. Such genetic profiles must be scientifically validated before they are applied to the general patient population, and reproducibility of the findings that support them is a fundamental challenge in validation studies. The percentage of overlapping genes (POG) criterion and derivative methods produce unstable and misleading results. Furthermore, in a complex disease, comparisons between different tumor subtypes can produce high POG scores that do not capture the consistencies in the functions. We focused on the quality rather than the quantity of the overlapping genes. We defined the rank value of each gene according to importance or quality by PageRank, on the basis of a particular topological structure. Then, we used the p-value of the rank-sum of the overlapping genes (PRSOG) to evaluate the quality of reproducibility. Although the POG scores were low in different studies of the same disease, the PRSOG was statistically significant, which suggests that sets of differentially expressed genes might be highly reproducible. Evaluations of eight datasets from breast cancer, lung cancer and four other disorders indicate that the quality-based PRSOG method performs better than a quantity-based method. Our analysis of the components of the sets of overlapping genes supports the utility of the PRSOG method.
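
The PRSOG idea can be sketched in a few lines: score genes by PageRank on a network, then apply a rank-sum test to the scores of the overlapping genes. This toy version (random network, numpy power iteration, scipy's Mann-Whitney U as the rank-sum test) illustrates the concept only and is not the authors' implementation:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)

# Toy gene network: 50 genes with random edges (a real analysis would use a
# protein-protein interaction or co-expression network)
n = 50
A = (rng.random((n, n)) < 0.1).astype(float)
np.fill_diagonal(A, 0.0)
A[A.sum(axis=1) == 0] = 1.0 / n            # dangling nodes link everywhere

# Row-normalize, transpose to column-stochastic, power-iterate PageRank
P = (A / A.sum(axis=1, keepdims=True)).T
r = np.full(n, 1.0 / n)
for _ in range(100):
    r = 0.85 * P @ r + 0.15 / n            # damping factor 0.85

# PRSOG idea: test whether the genes shared by two studies rank higher than
# chance. Here the 'overlap' is deliberately the 5 top-ranked genes, so the
# rank-sum p-value should be small.
overlap = np.argsort(r)[-5:]
rest = np.setdiff1d(np.arange(n), overlap)
stat, pval = mannwhitneyu(r[overlap], r[rest], alternative="greater")
print(pval < 0.05)
```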

  12. Benchmarking of methods for genomic taxonomy

    DEFF Research Database (Denmark)

    Larsen, Mette Voldby; Cosentino, Salvatore; Lukjancenko, Oksana

    2014-01-01

    ... Nevertheless, the method has been found to have a number of shortcomings. In the current study, we trained and benchmarked five methods for whole-genome sequence-based prokaryotic species identification on a common data set of complete genomes: (i) SpeciesFinder, which is based on the complete 16S rRNA gene...

  13. Genomic methods take the plunge

    DEFF Research Database (Denmark)

    Cammen, Kristina M.; Andrews, Kimberly R.; Carroll, Emma L.

    2016-01-01

    The dramatic increase in the application of genomic techniques to non-model organisms (NMOs) over the past decade has yielded numerous valuable contributions to evolutionary biology and ecology, many of which would not have been possible with traditional genetic markers. We review this recent...

  14. Technical note: Rapid calculation of genomic evaluations for new animals.

    Science.gov (United States)

    Wiggans, G R; VanRaden, P M; Cooper, T A

    2015-03-01

    A method was developed to calculate preliminary genomic evaluations daily or weekly before the release of official monthly evaluations by processing only newly genotyped animals using estimates of single nucleotide polymorphism effects from the previous official evaluation. To minimize computing time, reliabilities and genomic inbreeding are not calculated, and fixed weights are used to combine genomic and traditional information. Correlations of preliminary and September official monthly evaluations for animals with genotypes that became usable after the extraction of genotypes for August 2014 evaluations were >0.99 for most Holstein traits. Correlations were lower for breeds with smaller population size. Earlier access to genomic evaluations benefits producers by enabling earlier culling decisions and genotyping laboratories by making workloads more uniform across the month. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
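
The core of such a preliminary evaluation is computationally cheap: multiply each new animal's genotype by the SNP effects from the last official run and blend with traditional information using fixed weights. A minimal sketch with simulated numbers; the 0.7/0.3 weights are an assumption for illustration, not the official values:

```python
import numpy as np

rng = np.random.default_rng(3)

# SNP effects carried over from the previous official evaluation (simulated)
n_snp = 500
snp_effects = rng.normal(0.0, 0.1, size=n_snp)

# Genotypes (0/1/2) of four newly genotyped animals arriving mid-month
new_genotypes = rng.integers(0, 3, size=(4, n_snp)).astype(float)

# Direct genomic values: apply the stored effects to the new genotypes
dgv = new_genotypes @ snp_effects

# Blend with traditional information using fixed weights (0.7/0.3 assumed)
parent_average = rng.normal(0.0, 1.0, size=4)
preliminary = 0.7 * dgv + 0.3 * parent_average
print(preliminary.shape)
```

Because no reliabilities or genomic inbreeding are computed, this step can run daily without re-solving the full evaluation system.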

  15. Evaluation of three methods of DNA extraction from paraffin-embedded material for the amplification of genomic DNA by means of the PCR technique

    Directory of Open Access Journals (Sweden)

    MESQUITA Ricardo Alves

    2001-01-01

    Full Text Available There are several protocols reported in the literature for the extraction of genomic DNA from formalin-fixed paraffin-embedded samples. Genomic DNA is utilized in molecular analyses, including PCR. This study compares three different methods for the extraction of genomic DNA from formalin-fixed paraffin-embedded (inflammatory fibrous hyperplasia) and non-formalin-fixed (normal oral mucosa) samples: phenol with enzymatic digestion, and silica with and without enzymatic digestion. The amplification of DNA by means of the PCR technique was carried out with primers for exon 7 of human keratin type 14. Amplicons were analyzed by means of electrophoresis in an 8% polyacrylamide gel with 5% glycerol, followed by silver-staining visualization. The phenol/enzymatic digestion and the silica/enzymatic digestion methods provided amplicons from both tissue samples. The method described is a potential aid in the establishment of the histopathologic diagnosis and in retrospective studies with archival paraffin-embedded samples.

  16. GI-SVM: A sensitive method for predicting genomic islands based on unannotated sequence of a single genome.

    Science.gov (United States)

    Lu, Bingxin; Leong, Hon Wai

    2016-02-01

    Genomic islands (GIs) are clusters of functionally related genes acquired by lateral genetic transfer (LGT), and they are present in many bacterial genomes. GIs are extremely important for bacterial research, because they not only promote genome evolution but also contain genes that enhance adaptation and enable antibiotic resistance. Many methods have been proposed to predict GIs, but most of them rely on either annotations or comparisons with other closely related genomes, and hence cannot easily be applied to new genomes. As the number of newly sequenced bacterial genomes rapidly increases, there is a need for methods to detect GIs based solely on the sequence of a single genome. In this paper, we propose a novel method, GI-SVM, to predict GIs given only the unannotated genome sequence. GI-SVM is based on a one-class support vector machine (SVM), utilizing composition bias in terms of k-mer content. In our evaluations on three real genomes, GI-SVM achieved higher recall than current methods, without much loss of precision. In addition, GI-SVM allows flexible parameter tuning to obtain optimal results for each genome. In short, GI-SVM provides a more sensitive method for researchers interested in a first-pass detection of GIs in newly sequenced genomes.
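
The central idea, scoring windows of a single genome by k-mer composition with a one-class SVM, can be sketched with scikit-learn. This toy version (synthetic AT-biased "host" windows plus GC-biased "island" windows; all parameters assumed) is an illustration, not the published GI-SVM implementation:

```python
from itertools import product

import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(4)

def kmer_freqs(seq: str, k: int = 3) -> np.ndarray:
    """Relative k-mer frequency vector of a DNA string."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        v[index[seq[i:i + k]]] += 1.0
    return v / max(v.sum(), 1.0)

# Toy genome windows: AT-biased 'host' plus a few GC-biased 'island' windows
host = ["".join(rng.choice(list("AATTCG"), size=500)) for _ in range(30)]
island = ["".join(rng.choice(list("GGCCAT"), size=500)) for _ in range(3)]
X = np.array([kmer_freqs(s) for s in host + island])

# One-class SVM trained on all windows; predicted outliers (-1) are the
# compositionally atypical windows, i.e. genomic island candidates
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X)
labels = clf.predict(X)
print(labels)
```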

  17. Genomic prediction based on data from three layer lines: a comparison between linear methods

    NARCIS (Netherlands)

    Calus, M.P.L.; Huang, H.; Vereijken, J.; Visscher, J.; Napel, ten J.; Windig, J.J.

    2014-01-01

    Background The prediction accuracy of several linear genomic prediction models, which have previously been used for within-line genomic prediction, was evaluated for multi-line genomic prediction. Methods Compared to a conventional BLUP (best linear unbiased prediction) model using pedigree data, we

  18. Evaluation of phylogenetic reconstruction methods using bacterial whole genomes: a simulation based study [version 1; referees: 1 approved, 2 approved with reservations]

    Directory of Open Access Journals (Sweden)

    John A. Lees

    2018-03-01

    Full Text Available Background: Phylogenetic reconstruction is a necessary first step in many analyses which use whole genome sequence data from bacterial populations. There are many available methods to infer phylogenies, and these have various advantages and disadvantages, but few unbiased comparisons of the range of approaches have been made. Methods: We simulated data from a defined “true tree” using a realistic evolutionary model. We built phylogenies from this data using a range of methods, and compared reconstructed trees to the true tree using two measures, noting the computational time needed for different phylogenetic reconstructions. We also used real data from Streptococcus pneumoniae alignments to compare individual core gene trees to a core genome tree. Results: We found that, as expected, maximum likelihood trees from good quality alignments were the most accurate, but also the most computationally intensive. Using less accurate phylogenetic reconstruction methods, we were able to obtain results of comparable accuracy; we found that approximate results can rapidly be obtained using genetic distance based methods. In real data we found that highly conserved core genes, such as those involved in translation, gave an inaccurate tree topology, whereas genes involved in recombination events gave inaccurate branch lengths. We also show a tree-of-trees, relating the results of different phylogenetic reconstructions to each other. Conclusions: We recommend three approaches, depending on requirements for accuracy and computational time. Quicker approaches that do not perform full maximum likelihood optimisation may be useful for many analyses requiring a phylogeny, as generating a high quality input alignment is likely to be the major limiting factor of accurate tree topology. We have publicly released our simulated data and code to enable further comparisons.
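
The "genetic distance based methods" mentioned above can be illustrated with a toy version: pairwise Hamming distances over an alignment, clustered hierarchically into an approximate tree. This sketch (simulated alignment, scipy average linkage standing in for a proper neighbour-joining implementation) shows the speed and simplicity of the approach, not the study's actual pipeline:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, to_tree
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)

def mutate(seq: np.ndarray, n_mut: int) -> np.ndarray:
    """Copy a sequence (sites coded 0-3) with n_mut random substitutions."""
    s = seq.copy()
    pos = rng.choice(len(s), size=n_mut, replace=False)
    s[pos] = (s[pos] + rng.integers(1, 4, size=n_mut)) % 4
    return s

# Toy alignment: 6 taxa x 200 sites, two clades diverged from one ancestor
ancestral = rng.integers(0, 4, size=200)
clade_a = mutate(ancestral, 40)
clade_b = mutate(ancestral, 40)
aln = np.array([mutate(clade_a, 5) for _ in range(3)]
               + [mutate(clade_b, 5) for _ in range(3)])

# Pairwise Hamming distances -> average-linkage dendrogram: a fast,
# approximate stand-in for a distance-based phylogeny (not maximum likelihood)
dist = pdist(aln, metric="hamming")
tree = linkage(dist, method="average")
root = to_tree(tree)
print(root.get_count())
```

Distance methods like this recover the topology well when divergence is clear, which is consistent with the study's finding that they give rapid, approximately accurate results.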

  19. An evaluation of Comparative Genome Sequencing (CGS) by comparing two previously-sequenced bacterial genomes

    Directory of Open Access Journals (Sweden)

    Herring Christopher D

    2007-08-01

    Full Text Available Abstract Background With the development of new technology, it has recently become practical to resequence the genome of a bacterium after experimental manipulation. It is critical though to know the accuracy of the technique used, and to establish confidence that all of the mutations were detected. Results In order to evaluate the accuracy of genome resequencing using the microarray-based Comparative Genome Sequencing service provided by Nimblegen Systems Inc., we resequenced the E. coli strain W3110 Kohara using MG1655 as a reference, both of which have been completely sequenced using traditional sequencing methods. CGS detected 7 of 8 small sequence differences, one large deletion, and 9 of 12 IS element insertions present in W3110, but did not detect a large chromosomal inversion. In addition, we confirmed that CGS also detected 2 SNPs, one deletion and 7 IS element insertions that are not present in the genome sequence, which we attribute to changes that occurred after the creation of the W3110 lambda clone library. The false positive rate for SNPs was one per 244 Kb of genome sequence. Conclusion CGS is an effective way to detect multiple mutations present in one bacterium relative to another, and while highly cost-effective, is prone to certain errors. Mutations occurring in repeated sequences or in sequences with a high degree of secondary structure may go undetected. It is also critical to follow up on regions of interest in which SNPs were not called because they often indicate deletions or IS element insertions.

  20. Genomic evaluation of both purebred and crossbred performances

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Madsen, Per; Nielsen, Bjarne

    2014-01-01

    ... relationship matrices for the two breeds; (2) marker-based partial relationship matrices are constructed; (3) marker-based partial relationship matrices are adjusted to be compatible with pedigree-based partial relationship matrices; and (4) combined partial relationship matrices are constructed using information from both pedigree and marker genotypes. The extension of the Wei van der Werf model can be implemented using software that allows inverse covariance matrices in sparse format as input. A method for genomic evaluation of both purebred and crossbred performances was developed for a two...

  1. GenoSets: visual analytic methods for comparative genomics.

    Directory of Open Access Journals (Sweden)

    Aurora A Cain

    Full Text Available Many important questions in biology are, fundamentally, comparative, and this extends to our analysis of a growing number of sequenced genomes. Existing genomic analysis tools are often organized around literal views of genomes as linear strings. Even when information is highly condensed, these views grow cumbersome as larger numbers of genomes are added. Data aggregation and summarization methods from the field of visual analytics can provide abstracted comparative views, suitable for sifting large multi-genome datasets to identify critical similarities and differences. We introduce a software system for visual analysis of comparative genomics data. The system automates the process of data integration, and provides the analysis platform to identify and explore features of interest within these large datasets. GenoSets borrows techniques from business intelligence and visual analytics to provide a rich interface of interactive visualizations supported by a multi-dimensional data warehouse. In GenoSets, visual analytic approaches are used to enable querying based on orthology, functional assignment, and taxonomic or user-defined groupings of genomes. GenoSets links this information together with coordinated, interactive visualizations for both detailed and high-level categorical analysis of summarized data. GenoSets has been designed to simplify the exploration of multiple genome datasets and to facilitate reasoning about genomic comparisons. Case examples are included showing the use of this system in the analysis of 12 Brucella genomes. GenoSets software and the case study dataset are freely available at http://genosets.uncc.edu. We demonstrate that the integration of genomic data using a coordinated multiple view approach can simplify the exploration of large comparative genomic data sets, and facilitate reasoning about comparisons and features of interest.

  2. Genomics protocols [Methods in molecular biology, v. 175

    National Research Council Canada - National Science Library

    Starkey, Michael P; Elaswarapu, Ramnath

    2001-01-01

    ... exploiting the potential of gene therapy. Highlights include methods for the analysis of differential gene expression, SNP detection, comparative genomic hybridization, and the functional analysis of genes, as well as the use of bio...

  3. An assessment on epitope prediction methods for protozoa genomes

    Directory of Open Access Journals (Sweden)

    Resende Daniela M

    2012-11-01

    Full Text Available Abstract Background Epitope prediction using computational methods represents one of the most promising approaches to vaccine development. Reduction of time and cost, and the availability of completely sequenced genomes, are key points and highly motivating regarding the use of reverse vaccinology. Parasites of the genus Leishmania are widely spread and are the etiologic agents of leishmaniasis. Currently, there is no efficient vaccine against this pathogen and the drug treatment is highly toxic. The lack of sufficiently large datasets of experimentally validated parasite epitopes represents a serious limitation, especially for trypanosomatid genomes. In this work we highlight the predictive performance of several algorithms that were evaluated through the development of a MySQL database built with the purpose of: (a) evaluating individual algorithms' prediction performance, and their combination, for CD8+ T cell epitopes, B-cell epitopes and subcellular localization, by means of AUC (Area Under Curve) performance and a threshold-dependent method that employs a confusion matrix; (b) integrating data from experimentally validated and in silico predicted epitopes; and (c) integrating the subcellular localization predictions and experimental data. NetCTL, NetMHC, BepiPred, BCPred12 and AAP12 algorithms were used for in silico epitope prediction, and WoLF PSORT, Sigcleave and TargetP for in silico subcellular localization prediction against trypanosomatid genomes. Results A database-driven epitope prediction method was developed with built-in functions that were capable of: (a) removing experimental data redundancy; (b) parsing algorithm predictions and storing experimentally validated and predicted data; and (c) evaluating algorithm performance. Results show that better performance is achieved when the combined prediction is considered. This is particularly true for B-cell epitope predictors, where the combined prediction of AAP12 and BCPred12 reached an AUC value...

  4. Modified risk evaluation method

    International Nuclear Information System (INIS)

    Udell, C.J.; Tilden, J.A.; Toyooka, R.T.

    1993-08-01

    The purpose of this paper is to provide a structured and cost-oriented process to determine risks associated with nuclear material and other security interests. Financial loss is a continuing concern for US Department of Energy contractors. In this paper risk is equated with uncertainty of cost impacts to material assets or human resources. The concept provides a method for assessing the effectiveness of an integrated protection system, which includes operations, safety, emergency preparedness, and safeguards and security. The concept is suitable for application to sabotage evaluations. The protection of assets is based on risk associated with cost impacts to assets and the potential for undesirable events. This will allow managers to establish protection priorities in terms of the cost and the potential for the event, given the current level of protection

  5. Selection of assessment methods for evaluating banana weevil Cosmopolites sordidus (Coleoptera: Curculionidae) damage on highland cooking banana (Musa spp., genome group AAA-EA).

    Science.gov (United States)

    Gold, C S; Ragama, P E; Coe, R; Rukazambuga, N D T M

    2005-04-01

    Cosmopolites sordidus (Germar) is an important pest on bananas and plantains. Population build-up is slow and damage becomes increasingly important in successive crop cycles (ratoons). Yield loss results from plant loss, mat disappearance and reduced bunch size. Damage assessment requires destructive sampling and is most often done on corms of recently harvested plants. A wide range of damage assessment methods exist and there are no agreed protocols. It is critical to know what types of damage best reflect C. sordidus pest status through their relationships with yield loss. Multiple damage assessment parameters (i.e. for the corm periphery, cortex and central cylinder) were employed in two yield loss trials and a cultivar-screening trial in Uganda. Damage to the central cylinder had a greater effect on plant size and yield loss than damage to the cortex or corm periphery. In some cases, a combined assessment of damage to the central cylinder and cortex showed a better relationship with yield loss than an assessment of the central cylinder alone. Correlation, logistic and linear regression analyses showed weak to modest correlations between damage to the corm periphery and damage to the central cylinder. Thus, damage to the corm periphery is not a strong predictor of the more important damage to the central cylinder. Therefore, C. sordidus damage assessment should target the central cylinder and cortex.

  6. A tailing genome walking method suitable for genomes with high local GC content.

    Science.gov (United States)

    Liu, Taian; Fang, Yongxiang; Yao, Wenjuan; Guan, Qisai; Bai, Gang; Jing, Zhizhong

    2013-10-15

    The tailing genome walking strategies are simple and efficient. However, they sometimes can be restricted due to the low stringency of homo-oligomeric primers. Here we modified their conventional tailing step by adding polythymidine and polyguanine to the target single-stranded DNA (ssDNA). The tailed ssDNA was then amplified exponentially with a specific primer in the known region and a primer comprising 5' polycytosine and 3' polyadenosine. The successful application of this novel method for identifying integration sites mediated by φC31 integrase in goat genome indicates that the method is more suitable for genomes with high complexity and local GC content. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Kernel methods for large-scale genomic data analysis

    Science.gov (United States)

    Xing, Eric P.; Schaid, Daniel J.

    2015-01-01

    Machine learning, particularly kernel methods, has been demonstrated as a promising new tool to tackle the challenges imposed by today’s explosive data growth in genomics. They provide a practical and principled approach to learning how a large number of genetic variants are associated with complex phenotypes, to help reveal the complexity in the relationship between the genetic markers and the outcome of interest. In this review, we highlight the potential key role it will have in modern genomic data processing, especially with regard to integration with classical methods for gene prioritizing, prediction and data fusion. PMID:25053743
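    As a concrete illustration of the kernel idea, here is a minimal kernel ridge regression on made-up genotype data: phenotypes are predicted through a kernel over individuals rather than per-variant coefficients. The linear kernel, the toy genotypes, and the ridge parameter are all assumptions for this sketch, not the review's own example.

    ```python
    def linear_kernel(X):
        """K[i][j] = x_i . x_j over all pairs of individuals."""
        return [[sum(a * b for a, b in zip(xi, xj)) for xj in X] for xi in X]

    def solve(A, b):
        """Gaussian elimination with partial pivoting (small dense systems)."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    # toy SNP allele counts (rows = individuals) and phenotypes
    X = [[0, 1, 2], [1, 1, 0], [2, 0, 1], [0, 2, 2]]
    y = [1.0, 0.2, -0.5, 1.4]
    lam = 0.1                      # ridge (regularisation) parameter, assumed
    K = linear_kernel(X)
    Kreg = [[K[i][j] + (lam if i == j else 0.0) for j in range(len(X))]
            for i in range(len(X))]
    alpha = solve(Kreg, y)         # dual coefficients: (K + lam*I) alpha = y

    def predict(x_new):
        """Prediction is a kernel-weighted sum over training individuals."""
        return sum(a * sum(u * v for u, v in zip(x_new, xi))
                   for a, xi in zip(alpha, X))
    ```

    Swapping `linear_kernel` for a nonlinear kernel changes the notion of similarity between individuals without changing the rest of the machinery, which is the flexibility the review emphasizes.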

  8. Genome analysis methods - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods Genome analysis methods Data detail Data name Genome analysis methods DOI 10.18908/lsdba.nbdc01194-01-005 Description of data contents The current status and related information of the genomic analysis about each organism (March, 2014). In the case of organisms carried out genomic analysis, the d...e File name: pgdbj_dna_marker_linkage_map_genome_analysis_methods_en.zip File URL: ftp://ftp.biosciencedbc.j

  9. A Novel Method to Predict Genomic Islands Based on Mean Shift Clustering Algorithm.

    Directory of Open Access Journals (Sweden)

    Daniel M de Brito

    Full Text Available Genomic Islands (GIs) are regions of bacterial genomes that are acquired from other organisms by the phenomenon of horizontal transfer. These regions are often responsible for many important acquired adaptations of the bacteria, with great impact on their evolution and behavior. In particular, these adaptations are usually associated with pathogenicity, antibiotic resistance, degradation and metabolism. Identification of such regions is of medical and industrial interest. For this reason, different approaches for genomic island prediction have been proposed. However, none of them are capable of precisely predicting the complete repertoire of GIs in a genome. The difficulties arise due to the changes in performance of different algorithms in the face of the variety of nucleotide distributions in different species. In this paper, we present a novel method to predict GIs that is built upon the mean shift clustering algorithm. It does not require any information regarding the number of clusters, and the bandwidth parameter is automatically calculated based on a heuristic approach. The method was implemented in a new user-friendly tool named MSGIP--Mean Shift Genomic Island Predictor. Genomes of bacteria with GIs discussed in other papers were used to evaluate the proposed method. The application of this tool revealed the same GIs predicted by other methods and also novel, previously unpredicted islands. A detailed investigation of the different features related to typical GI elements inserted in these new regions confirmed its effectiveness. Stand-alone and user-friendly versions of this new methodology are available at http://msgip.integrativebioinformatics.me.
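    To make the core idea concrete, a bare-bones one-dimensional mean shift with a flat window is sketched below. MSGIP itself operates on genomic signature vectors and derives its bandwidth heuristically, so both the data and the fixed bandwidth here are invented.

    ```python
    def mean_shift_1d(points, bandwidth, iters=50):
        """Shift every point to the mean of its window until it settles on a mode."""
        modes = []
        for p in points:
            x = p
            for _ in range(iters):
                window = [q for q in points if abs(q - x) <= bandwidth]
                x = sum(window) / len(window)
            modes.append(round(x, 6))   # points sharing a mode form one cluster
        return sorted(set(modes))

    data = [1.0, 1.2, 0.8, 5.0, 5.1, 4.9]
    modes = mean_shift_1d(data, bandwidth=1.0)   # two clusters, near 1.0 and 5.0
    ```

    No cluster count is supplied anywhere, which is the property the abstract highlights: the number of modes falls out of the data and the bandwidth alone.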

  10. Nuclear data evaluation method and evaluation system

    International Nuclear Information System (INIS)

    Liu Tingjin

    1995-01-01

    The evaluation methods and the Nuclear Data Evaluation System have been developed in China. A new version of the system has been established on a Micro-VAX2 computer, supported by the IAEA under its technical assistance program. The flow chart of the Chinese Nuclear Data Evaluation System is shown. Over the last ten years, the main efforts have been put into double differential cross sections, covariance data and evaluated data library validation. The evaluation methods and the Chinese Nuclear Data Evaluation System have been widely used at CNDC and in the Chinese Nuclear Data Network for CENDL. (1 tab., 15 figs.)

  11. A universal, rapid, and inexpensive method for genomic DNA ...

    Indian Academy of Sciences (India)

    MOHAMMED BAQUR SAHIB A. AL-SHUHAIB

    gels, containing 7% glycerol, and 1×TBE buffer. The gels were run under 200 .... Inc. Germany, GeneaidTM DNA Isolation Kit, Geneaid. Biotech., New Taipei City, .... C. L. and Arsenos G. 2015 Comparison of eleven methods for genomic DNA ...

  12. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...
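    A minimal sketch of the cross-validation idea: predict each fold of animals from the rest, then take the squared correlation of out-of-fold predictions with observations as the reliability proxy. The single-covariate regressor and the data are stand-ins, not the actual genomic evaluation model.

    ```python
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        return cov / (va * vb) ** 0.5

    def kfold_reliability(x, y, k=2):
        """Squared correlation between out-of-fold predictions and phenotypes."""
        n = len(x)
        preds = [0.0] * n
        for fold in range(k):
            train = [i for i in range(n) if i % k != fold]
            mx = sum(x[i] for i in train) / len(train)
            my = sum(y[i] for i in train) / len(train)
            b = (sum((x[i] - mx) * (y[i] - my) for i in train)
                 / sum((x[i] - mx) ** 2 for i in train))
            a = my - b * mx
            for i in range(n):
                if i % k == fold:        # held-out animals only
                    preds[i] = a + b * x[i]
        return pearson(preds, y) ** 2

    x = [0, 1, 2, 3, 4, 5, 6, 7]         # stand-in predictor (e.g. one marker)
    y = [0.1, 0.9, 2.2, 2.8, 4.1, 5.2, 5.9, 7.1]
    rel = kfold_reliability(x, y)        # close to 1 for this near-linear data
    ```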

  13. Development and evaluation of a genomics training program for community health workers in Texas.

    Science.gov (United States)

    Chen, Lei-Shih; Zhao, Shixi; Stelzig, Donaji; Dhar, Shweta U; Eble, Tanya; Yeh, Yu-Chen; Kwok, Oi-Man

    2018-01-04

    Purpose Genomics services have the potential to reduce incidence and mortality of diseases by providing individualized, family health history (FHH)-based prevention strategies to clients. These services may benefit from the involvement of community health workers (CHWs) in the provision of FHH-based genomics education and services, as CHWs are frontline public health workers and lay health educators, who share similar ethnicities, languages, socioeconomic statuses, and life experiences with the communities they serve. We developed, implemented, and evaluated the FHH-based genomics training program for CHWs. Methods This theory- and evidence-based FHH-focused genomics curriculum was developed by an interdisciplinary team. Full-day workshops in English and Spanish were delivered to 145 Texas CHWs (91.6% were Hispanic/black). Preworkshop, postworkshop, and 3-month follow-up data were collected. Results CHWs significantly improved their attitudes, intention, self-efficacy, and knowledge regarding adopting FHH-based genomics into their practice after the workshops. At 3-month follow-up, these scores remained higher, and there was a significant increase in CHWs' genomics practices. Conclusion This FHH-based genomics training successfully educated Texas CHWs, and the outcomes were promising. Dissemination of training to CHWs in and outside of Texas is needed to promote better access to and delivery of personalized genomics services for the lay and underserved communities. GENETICS in MEDICINE advance online publication, 4 January 2018; doi:10.1038/gim.2017.236.

  14. Genome-wide comparison of ultraviolet and ethyl methanesulphonate mutagenesis methods for the brown alga Ectocarpus.

    Science.gov (United States)

    Godfroy, Olivier; Peters, Akira F; Coelho, Susana M; Cock, J Mark

    2015-12-01

    Ectocarpus has emerged as a model organism for the brown algae and a broad range of genetic and genomic resources are being generated for this species. The aim of the work presented here was to evaluate two mutagenesis protocols based on ultraviolet irradiation and ethyl methanesulphonate treatment using genome resequencing to measure the number, type and distribution of mutations generated by the two methods. Ultraviolet irradiation generated a greater number of genetic lesions than ethyl methanesulphonate treatment, with more than 400 mutations being detected in the genome of the mutagenised individual. This study therefore confirms that the ultraviolet mutagenesis protocol is suitable for approaches that require a high density of mutations, such as saturation mutagenesis or Targeting Induced Local Lesions in Genomes (TILLING). Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Genomic Selection in Plant Breeding: Methods, Models, and Perspectives.

    Science.gov (United States)

    Crossa, José; Pérez-Rodríguez, Paulino; Cuevas, Jaime; Montesinos-López, Osval; Jarquín, Diego; de Los Campos, Gustavo; Burgueño, Juan; González-Camacho, Juan M; Pérez-Elizalde, Sergio; Beyene, Yoseph; Dreisigacker, Susanne; Singh, Ravi; Zhang, Xuecai; Gowda, Manje; Roorkiwal, Manish; Rutkoski, Jessica; Varshney, Rajeev K

    2017-11-01

    Genomic selection (GS) facilitates the rapid selection of superior genotypes and accelerates the breeding cycle. In this review, we discuss the history, principles, and basis of GS and genomic-enabled prediction (GP) as well as the genetics and statistical complexities of GP models, including genomic genotype×environment (G×E) interactions. We also examine the accuracy of GP models and methods for two cereal crops and two legume crops based on random cross-validation. GS applied to maize breeding has shown tangible genetic gains. Based on GP results, we speculate how GS in germplasm enhancement (i.e., prebreeding) programs could accelerate the flow of genes from gene bank accessions to elite lines. Recent advances in hyperspectral image technology could be combined with GS and pedigree-assisted breeding. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Genome Target Evaluator (GTEvaluator): A workflow exploiting genome dataset to measure the sensitivity and specificity of genetic markers.

    Directory of Open Access Journals (Sweden)

    Arnaud Felten

    Full Text Available Most of the bacterial typing methods used to discriminate isolates in medical or food safety microbiology are based on genetic markers used as targets in PCR or hybridization experiments. These DNA typing methods are important tools for studying prevalence and epidemiology, for conducting surveillance, investigations and control of biological hazard sources. From that perspective, it is crucial to ensure that the chosen genetic markers have the greatest specificity and sensitivity. The wealth of whole-genome sequences available for many bacterial species offers the opportunity to evaluate the performance of these genetic markers. In the present study, we have developed GTEvaluator, a bioinformatics workflow which ranks genetic markers depending on their sensitivity and specificity towards groups of well-defined genomes. GTEvaluator identifies the best-performing genetic markers to target individuals among a population. The individuals (i.e. a group of genomes within a collection) are defined by any kind of particular phenotypic or biological properties inside a related population (i.e. a collection of genomes). The performance of the genetic markers is computed by a distance value which takes into account both sensitivity and specificity. In this study we report two examples of GTEvaluator application. In the first example Bacillus phenotypic markers were evaluated for their capacity to distinguish B. cereus from B. thuringiensis. In the second experiment, GTEvaluator measured the performance of genetic markers dedicated to the molecular serotyping of Salmonella enterica. In one in silico experiment it was possible to test 64 markers against 134 genomes corresponding to 14 different serotypes.
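    The ranking GTEvaluator performs can be sketched as follows: for each candidate marker, sensitivity is the fraction of target genomes carrying it and specificity the fraction of non-target genomes lacking it. The abstract only says the ranking distance "takes into account both sensitivity and specificity", so the Euclidean distance to the ideal point (1, 1) used here, like the genome/marker data, is an assumption of this sketch.

    ```python
    def rank_markers(presence, targets):
        """presence: {genome: set of markers found in it}; targets: genomes to detect.
        Rank markers by distance to the ideal point (sensitivity=1, specificity=1)."""
        non_targets = [g for g in presence if g not in targets]
        score = {}
        for m in set().union(*presence.values()):
            sens = sum(m in presence[g] for g in targets) / len(targets)
            spec = sum(m not in presence[g] for g in non_targets) / len(non_targets)
            score[m] = ((1 - sens) ** 2 + (1 - spec) ** 2) ** 0.5  # lower is better
        return sorted(score, key=score.get)

    presence = {
        "g1": {"A", "B"}, "g2": {"A"},   # target group (e.g. one serotype)
        "g3": {"B", "C"}, "g4": {"C"},   # rest of the collection
    }
    best_first = rank_markers(presence, {"g1", "g2"})   # "A" is a perfect marker here
    ```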

  17. Bayesian methods for jointly estimating genomic breeding values of one continuous and one threshold trait.

    Directory of Open Access Journals (Sweden)

    Chonglong Wang

    Full Text Available Genomic selection has become a useful tool for animal and plant breeding. Currently, genomic evaluation is usually carried out using a single-trait model. However, a multi-trait model has the advantage of using information on the correlated traits, leading to more accurate genomic prediction. To date, joint genomic prediction for a continuous and a threshold trait using a multi-trait model is scarce and needs more attention. Based on the previously proposed methods BayesCπ for single continuous trait and BayesTCπ for single threshold trait, we developed a novel method based on a linear-threshold model, i.e., LT-BayesCπ, for joint genomic prediction of a continuous trait and a threshold trait. Computing procedures of LT-BayesCπ using Markov Chain Monte Carlo algorithm were derived. A simulation study was performed to investigate the advantages of LT-BayesCπ over BayesCπ and BayesTCπ with regard to the accuracy of genomic prediction on both traits. Factors affecting the performance of LT-BayesCπ were addressed. The results showed that, in all scenarios, the accuracy of genomic prediction obtained from LT-BayesCπ was significantly increased for the threshold trait compared to that from single trait prediction using BayesTCπ, while the accuracy for the continuous trait was comparable with that from single trait prediction using BayesCπ. The proposed LT-BayesCπ could be a method of choice for joint genomic prediction of one continuous and one threshold trait.
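    The threshold-trait half of the linear-threshold model links an unobserved continuous liability to the observed binary phenotype through a fixed cutoff. A small simulation sketch of that link (the heritability, cutoff, and sample size are arbitrary choices, not the paper's settings):

    ```python
    import random

    random.seed(1)

    def simulate_threshold_trait(n, h2=0.4, cutoff=0.0):
        """Liability = genetic + environmental parts; phenotype = 1 above the cutoff."""
        records = []
        for _ in range(n):
            g = random.gauss(0.0, h2 ** 0.5)            # genetic liability
            e = random.gauss(0.0, (1.0 - h2) ** 0.5)    # environmental deviation
            liability = g + e
            records.append((liability, 1 if liability > cutoff else 0))
        return records

    records = simulate_threshold_trait(2000)
    incidence = sum(obs for _, obs in records) / len(records)  # near 0.5 for cutoff 0
    ```

    In the joint model, the MCMC machinery samples these latent liabilities for the threshold trait while the continuous trait is modelled on its observed scale.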

  18. Voltammetry Method Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Hoyt, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Pereira, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Willit, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Williamson, M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-07-29

    The purpose of the ANL MPACT Voltammetry project is to evaluate the suitability of previously developed cyclic voltammetry techniques to provide electroanalytical measurements of actinide concentrations in realistic used fuel processing scenarios. The molten salts in these scenarios are very challenging as they include high concentrations of multiple electrochemically active species, thereby creating a variety of complications. Some of the problems that arise therein include issues related to uncompensated resistance, cylindrical diffusion, and alloying of the electrodeposited metals. Improvements to the existing voltammetry technique to account for these issues have been implemented, resulting in good measurements of actinide concentrations across a wide range of adverse conditions.

  19. Evaluating genomic tests from bench to bedside: a practical framework

    Directory of Open Access Journals (Sweden)

    Lin Jennifer S

    2012-10-01

    Full Text Available Abstract The development of genomic tests is one of the most significant technological advances in medical testing in recent decades. As these tests become increasingly available, so does the need for a pragmatic framework to evaluate the evidence base and evidence gaps in order to facilitate informed decision-making. In this article we describe such a framework that can provide a common language and benchmarks for different stakeholders of genomic testing. Each stakeholder can use this framework to specify their respective thresholds for decision-making, depending on their perspective and particular needs. This framework is applicable across a broad range of test applications and can be helpful in the application and communication of a regulatory science for genomic testing. Our framework builds upon existing work and incorporates principles familiar to researchers involved in medical testing (both diagnostic and prognostic) generally, as well as those involved in genomic testing. This framework is organized around six phases in the development of genomic tests, beginning with marker identification and ending with population impact, and highlights the important knowledge gaps that need to be filled in establishing the clinical relevance of a test. Our framework focuses on the clinical appropriateness of the four main dimensions of test research questions (population/setting, intervention/index test, comparators/reference test, and outcomes) rather than prescribing a hierarchy of study designs that should be used to address each phase.

  20. Including αs1 casein gene information in genomic evaluations of French dairy goats.

    Science.gov (United States)

    Carillier-Jacquin, Céline; Larroque, Hélène; Robert-Granié, Christèle

    2016-08-04

    Genomic best linear unbiased prediction methods assume that all markers explain the same fraction of the genetic variance and do not account effectively for genes with major effects such as the αs1 casein polymorphism in dairy goats. In this study, we investigated methods to include the available αs1 casein genotype effect in genomic evaluations of French dairy goats. First, the αs1 casein genotype was included as a fixed effect in genomic evaluation models based only on bucks that were genotyped at the αs1 casein locus. Less than 1% of the females with phenotypes were genotyped at the αs1 casein gene. Thus, to incorporate these female phenotypes in the genomic evaluation, two methods that allow for this large number of missing αs1 casein genotypes were investigated. Probabilities for each possible αs1 casein genotype were first estimated for each female of unknown genotype based on iterative peeling equations. The second method is based on a multiallelic gene content approach. For each model tested, we used three datasets, each divided into a training and a validation set: (1) two-breed population (Alpine + Saanen), (2) Alpine population, and (3) Saanen population. The αs1 casein genotype had a significant effect on milk yield, fat content and protein content. Including an αs1 casein effect in genetic and genomic evaluations based only on males with known αs1 casein genotypes improved accuracies (from 6 to 27%). In genomic evaluations based on all female phenotypes, the gene content approach performed better than the other tested methods, but the improvement in accuracy was only slightly better (from 1 to 14%) than that of a genomic model without the αs1 casein effect. Including the αs1 casein effect in a genomic evaluation model for French dairy goats is possible and useful for improving accuracy. Difficulties in predicting the genotypes of ungenotyped animals limited the improvement in accuracy of the obtained estimated breeding values.
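    The gene content device mentioned above codes each animal as its number of copies of a given αs1 casein allele; for ungenotyped females, an expected content can be formed from the estimated genotype probabilities. A sketch with invented probabilities:

    ```python
    def expected_gene_content(genotype_probs):
        """genotype_probs maps copies of the allele (0, 1 or 2) to its probability."""
        return sum(copies * p for copies, p in genotype_probs.items())

    # genotyped buck: the content is known exactly (2 copies)
    known = expected_gene_content({2: 1.0})
    # ungenotyped female: probabilities from pedigree peeling (invented values)
    imputed = expected_gene_content({0: 0.1, 1: 0.5, 2: 0.4})
    ```

    The expected content then enters the evaluation model as a covariate, so animals with and without an observed genotype are handled by the same equation.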

  1. Evaluation methods for hospital facilities

    DEFF Research Database (Denmark)

    Fronczek-Munter, Aneta

    2013-01-01

    according to focus areas and proposes which evaluation methods to use in different building phases of healthcare facilities. Hospital evaluations with experts and users are also considered, including their subjective views on space, function, technology, usability and aesthetics. Results & solutions: This paper presents the different methods for evaluating buildings in use in a new model, the Evaluation Focus Flower, and proposes which evaluation methods are suitable for various aims and building phases, i.e. which give the best input for the initial briefing process of new hospital facilities with the ambition of creating buildings with enhanced usability. Additionally, various evaluation methods used in hospital cases in Denmark and Norway are presented. Involvement of users is proposed, not just in defining requirements but also in co-creation/design and evaluation of solutions. The theories and preliminary

  2. Mapping the sensory perception of apple using descriptive sensory evaluation in a genome wide association study.

    Science.gov (United States)

    Amyotte, Beatrice; Bowen, Amy J; Banks, Travis; Rajcan, Istvan; Somers, Daryl J

    2017-01-01

    Breeding apples is a long-term endeavour and it is imperative that new cultivars are selected to have outstanding consumer appeal. This study has taken the approach of merging sensory science with genome wide association analyses in order to map the human perception of apple flavour and texture onto the apple genome. The goal was to identify genomic associations that could be used in breeding apples for improved fruit quality. A collection of 85 apple cultivars was examined over two years through descriptive sensory evaluation by a trained sensory panel. The trained sensory panel scored randomized sliced samples of each apple cultivar for seventeen taste, flavour and texture attributes using controlled sensory evaluation practices. In addition, the apple collection was subjected to genotyping by sequencing for marker discovery. A genome wide association analysis suggested significant genomic associations for several sensory traits including juiciness, crispness, mealiness and fresh green apple flavour. The findings include previously unreported genomic regions that could be used in apple breeding and suggest that similar sensory association mapping methods could be applied in other plants.

  3. Mapping the sensory perception of apple using descriptive sensory evaluation in a genome wide association study

    Science.gov (United States)

    Amyotte, Beatrice; Bowen, Amy J.; Banks, Travis; Rajcan, Istvan; Somers, Daryl J.

    2017-01-01

    Breeding apples is a long-term endeavour and it is imperative that new cultivars are selected to have outstanding consumer appeal. This study has taken the approach of merging sensory science with genome wide association analyses in order to map the human perception of apple flavour and texture onto the apple genome. The goal was to identify genomic associations that could be used in breeding apples for improved fruit quality. A collection of 85 apple cultivars was examined over two years through descriptive sensory evaluation by a trained sensory panel. The trained sensory panel scored randomized sliced samples of each apple cultivar for seventeen taste, flavour and texture attributes using controlled sensory evaluation practices. In addition, the apple collection was subjected to genotyping by sequencing for marker discovery. A genome wide association analysis suggested significant genomic associations for several sensory traits including juiciness, crispness, mealiness and fresh green apple flavour. The findings include previously unreported genomic regions that could be used in apple breeding and suggest that similar sensory association mapping methods could be applied in other plants. PMID:28231290
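    The association step in studies like this one can be caricatured as a per-marker scan relating genotypes to a sensory score. The correlation-based test, SNP names, genotypes, and juiciness scores below are all invented for the sketch; real GWAS models additionally account for population structure.

    ```python
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        sa = sum((x - ma) ** 2 for x in a) ** 0.5
        sb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (sa * sb)

    def association_scan(genotypes, phenotype):
        """genotypes: {marker: allele counts per cultivar}; strongest marker first."""
        return sorted(genotypes, key=lambda m: -abs(pearson(genotypes[m], phenotype)))

    snps = {
        "snp_juicy": [0, 1, 1, 2, 2],   # hypothetical marker tracking the trait
        "snp_null": [2, 0, 1, 2, 0],    # unrelated marker
    }
    juiciness = [3.1, 4.9, 5.2, 7.0, 6.8]   # trained-panel scores (invented)
    hits = association_scan(snps, juiciness)
    ```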

  4. Predicting human height by Victorian and genomic methods.

    Science.gov (United States)

    Aulchenko, Yurii S; Struchalin, Maksim V; Belonogova, Nadezhda M; Axenovich, Tatiana I; Weedon, Michael N; Hofman, Albert; Uitterlinden, Andre G; Kayser, Manfred; Oostra, Ben A; van Duijn, Cornelia M; Janssens, A Cecile J W; Borodin, Pavel M

    2009-08-01

    In the Victorian era, Sir Francis Galton showed that 'when dealing with the transmission of stature from parents to children, the average height of the two parents, ... is all we need care to know about them' (1886). One hundred and twenty-two years after Galton's work was published, 54 loci showing strong statistical evidence for association to human height were described, providing us with potential genomic means of human height prediction. In a population-based study of 5748 people, we find that a 54-loci genomic profile explained 4-6% of the sex- and age-adjusted height variance, and had limited ability to discriminate tall/short people, as characterized by the area under the receiver-operating characteristic curve (AUC). In a family-based study of 550 people, with both parents having height measurements, we find that the Galtonian mid-parental prediction method explained 40% of the sex- and age-adjusted height variance, and showed high discriminative accuracy. We have also explored how much variance a genomic profile should explain to reach certain AUC values. For highly heritable traits such as height, we conclude that in applications in which parental phenotypic information is available (eg, medicine), the Victorian Galton's method will long stay unsurpassed, in terms of both discriminative accuracy and costs. For less heritable traits, and in situations in which parental information is not available (eg, forensics), genomic methods may provide an alternative, given that the variants determining an essential proportion of the trait's variation can be identified.
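    The two predictors compared in this study can be sketched side by side. The population mean, sex adjustment, shrinkage factor, and locus effects below are invented illustrative numbers, not estimates from the paper.

    ```python
    POP_MEAN = 170.0     # population mean height in cm (assumed)
    SEX_ADJUST = 13.0    # added to the mother's height to put parents on one scale (assumed)

    def galton_predict(father_cm, mother_cm, shrink=0.7):
        """Mid-parental prediction: regress the mid-parent value toward the mean."""
        midparent = (father_cm + mother_cm + SEX_ADJUST) / 2
        return POP_MEAN + shrink * (midparent - POP_MEAN)

    def profile_predict(allele_counts, effects, intercept=POP_MEAN):
        """Genomic-profile prediction: intercept plus summed per-locus allele effects."""
        return intercept + sum(g * e for g, e in zip(allele_counts, effects))

    child = galton_predict(180.0, 167.0)            # 177.0 with these toy numbers
    profiled = profile_predict([1, 2], [0.4, -0.2]) # 170.0 with these toy effects
    ```

    The study's point is that the first function, with only two inputs, explained far more height variance than a 54-locus version of the second.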

  5. Performance comparison of two efficient genomic selection methods (gsbay & MixP) applied in aquacultural organisms

    Science.gov (United States)

    Su, Hailin; Li, Hengde; Wang, Shi; Wang, Yangfan; Bao, Zhenmin

    2017-02-01

    Genomic selection is increasingly popular in animal and plant breeding industries around the world, as it can be applied early in life without affecting selection candidates. The objective of this study was to bring the advantages of genomic selection to scallop breeding. Two genomic selection tools, MixP and gsbay, were applied to genomic evaluation of simulated data and Zhikong scallop (Chlamys farreri) field data. The results were compared with the genomic best linear unbiased prediction (GBLUP) method, which has been applied widely. Our results showed that both MixP and gsbay could accurately estimate single-nucleotide polymorphism (SNP) marker effects, and thereby could be applied for the analysis of genomic estimated breeding values (GEBV). In simulated data from different scenarios, the accuracy of GEBV ranged from 0.20 to 0.78 with MixP, from 0.21 to 0.67 with gsbay, and from 0.21 to 0.61 with GBLUP. Estimates made by MixP and gsbay are expected to be more reliable than those made by GBLUP. Predictions made by gsbay were more robust, while MixP computed much faster, especially on large-scale data. These results suggest that the algorithms implemented by MixP and gsbay are both feasible for genomic selection in scallop breeding, and that more genotype data will be necessary to produce genomic estimated breeding values with higher accuracy for the industry.
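    For reference, the GBLUP baseline used in the comparison can be sketched in a few lines: build a genomic relationship matrix G from centred genotypes and obtain GEBVs as u = G (G + lam*I)^-1 (y - ybar). The marker-count scaling of G, the toy genotypes, and lam are simplifying assumptions of this sketch, not the authors' settings.

    ```python
    def grm(genos):
        """Simplified genomic relationship matrix from centred allele counts
        (scaled by marker count; VanRaden's G instead scales by 2*sum p(1-p))."""
        m = len(genos[0])
        means = [sum(row[j] for row in genos) / len(genos) for j in range(m)]
        Z = [[row[j] - means[j] for j in range(m)] for row in genos]
        return [[sum(a * b for a, b in zip(zi, zj)) / m for zj in Z] for zi in Z]

    def solve(A, b):
        """Gaussian elimination with partial pivoting (small dense systems)."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    genos = [[0, 1, 2, 0], [1, 1, 0, 2], [2, 0, 1, 1], [0, 2, 2, 0]]  # toy SNPs
    y = [1.1, 0.3, -0.4, 1.5]                                         # toy phenotypes
    lam = 0.5                                  # shrinkage parameter (assumed)
    n = len(genos)
    G = grm(genos)
    ybar = sum(y) / n
    yc = [v - ybar for v in y]
    A = [[G[i][j] + (lam if i == j else 0.0) for j in range(n)] for i in range(n)]
    alpha = solve(A, yc)
    gebv = [sum(G[i][j] * alpha[j] for j in range(n)) for i in range(n)]
    ```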

  6. Methods for Optimizing CRISPR-Cas9 Genome Editing Specificity

    Science.gov (United States)

    Tycko, Josh; Myer, Vic E.; Hsu, Patrick D.

    2016-01-01

    Summary Advances in the development of delivery, repair, and specificity strategies for the CRISPR-Cas9 genome engineering toolbox are helping researchers understand gene function with unprecedented precision and sensitivity. CRISPR-Cas9 also holds enormous therapeutic potential for the treatment of genetic disorders by directly correcting disease-causing mutations. Although the Cas9 protein has been shown to bind and cleave DNA at off-target sites, the field of Cas9 specificity is rapidly progressing with marked improvements in guide RNA selection, protein and guide engineering, novel enzymes, and off-target detection methods. We review important challenges and breakthroughs in the field as a comprehensive practical guide to interested users of genome editing technologies, highlighting key tools and strategies for optimizing specificity. The genome editing community should now strive to standardize such methods for measuring and reporting off-target activity, while keeping in mind that the goal for specificity should be continued improvement and vigilance. PMID:27494557

  7. Defining and Evaluating a Core Genome Multilocus Sequence Typing Scheme for Genome-Wide Typing of Clostridium difficile.

    Science.gov (United States)

    Bletz, Stefan; Janezic, Sandra; Harmsen, Dag; Rupnik, Maja; Mellmann, Alexander

    2018-06-01

    Clostridium difficile, recently renamed Clostridioides difficile, is the most common cause of antibiotic-associated nosocomial gastrointestinal infections worldwide. To differentiate endogenous infections and transmission events, highly discriminatory subtyping is necessary. Today, methods based on whole-genome sequencing data are increasingly used to subtype bacterial pathogens; however, a standardized methodology and typing nomenclature are frequently missing. Here we report a core genome multilocus sequence typing (cgMLST) approach developed for C. difficile. Initially, we determined the breadth of the C. difficile population based on all available MLST sequence types with Bayesian inference (BAPS). The resulting BAPS partitions were used in combination with C. difficile clade information to select representative isolates that were subsequently used to define cgMLST target genes. Finally, we evaluated the novel cgMLST scheme with genomes from 3,025 isolates. BAPS grouping (n = 6 groups) together with the clade information led to a total of 11 representative isolates that were included in the cgMLST definition, resulting in 2,270 cgMLST genes that were present in all isolates. Overall, 2,184 to 2,268 cgMLST targets were detected in the genome sequences of 70 outbreak-associated and reference strains, and on average 99.3% of cgMLST targets (1,116 to 2,270 targets) were present in 2,954 genomes downloaded from the NCBI database, underlining the representativeness of the cgMLST scheme. Moreover, reanalyses of different cluster scenarios with cgMLST were concordant with published single nucleotide variant analyses. In conclusion, the novel cgMLST is representative of the whole C. difficile population, is highly discriminatory in outbreak situations, and provides a unique nomenclature facilitating interlaboratory exchange. Copyright © 2018 American Society for Microbiology.
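    The comparison behind such a scheme can be sketched simply: each isolate is a profile of allele numbers over the core loci, and the distance between two isolates is the count of differing alleles among loci called in both. The tiny profiles below are invented stand-ins for the ~2,270-locus C. difficile profiles.

    ```python
    def cgmlst_distance(p1, p2):
        """Count allele differences over loci with a valid call in both profiles."""
        shared = [locus for locus in p1
                  if locus in p2 and p1[locus] is not None and p2[locus] is not None]
        return sum(1 for locus in shared if p1[locus] != p2[locus])

    # allele numbers per locus; None marks a missing call
    isolate_a = {"locus1": 3, "locus2": 1, "locus3": 7, "locus4": None}
    isolate_b = {"locus1": 3, "locus2": 2, "locus3": 7, "locus4": 5}
    d = cgmlst_distance(isolate_a, isolate_b)   # 1: they differ only at locus2
    ```

    Because profiles are just integer vectors keyed by a shared locus nomenclature, they can be exchanged between laboratories without exchanging raw reads.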

  8. A simple and inexpensive method for genomic restriction mapping analysis

    International Nuclear Information System (INIS)

    Huang, C.H.; Lam, V.M.S.; Tam, J.W.O.

    1988-01-01

    The Southern blotting procedure for the transfer of DNA fragments from agarose gels to nitrocellulose membranes has revolutionized nucleic acid detection methods, and it forms the cornerstone of research in molecular biology. Basically, the method involves the denaturation of DNA fragments that have been separated on an agarose gel, the immobilization of the fragments by transfer to a nitrocellulose membrane, and the identification of the fragments of interest through hybridization to ³²P-labeled probes and autoradiography. While the method is sensitive and applicable to both genomic and cloned DNA, it suffers from the disadvantages of being time consuming and expensive, and fragments of greater than 15 kb are difficult to transfer. Moreover, although theoretically the nitrocellulose membrane can be washed and hybridized repeatedly using different probes, in practice the membrane becomes brittle and difficult to handle after a few cycles. A direct hybridization method for pure DNA clones was developed in 1975 but has not been widely exploited. The authors report here a modification of that procedure as applied to genomic DNA. The method is simple, rapid, and inexpensive, and it does not involve transfer to nitrocellulose membranes.

  9. Comparison of methods for genomic localization of gene trap sequences

    Directory of Open Access Journals (Sweden)

    Ferrin Thomas E

    2006-09-01

    Full Text Available Abstract Background Gene knockouts in a model organism such as mouse provide a valuable resource for the study of basic biology and human disease. Determining which gene has been inactivated by an untargeted gene trapping event poses a challenging annotation problem, because gene trap sequence tags, which represent sequence near the vector insertion site of a trapped gene, are typically short and often contain unresolved residues. To better understand the localization of these sequences on the mouse genome, we compared stand-alone versions of the alignment programs BLAT, SSAHA, and MegaBLAST. A set of 3,369 sequence tags was aligned to build 34 of the mouse genome using default parameters for each algorithm. Known genome coordinates for the cognate set of full-length genes (1,659 sequences) were used to evaluate localization results. Results In general, all three programs performed well in terms of localizing sequences to a general region of the genome, with only relatively subtle errors identified for a small proportion of the sequence tags. However, large differences in performance were noted with regard to correctly identifying exon boundaries. BLAT correctly identified the vast majority of exon boundaries, while SSAHA and MegaBLAST missed the majority of exon boundaries. SSAHA consistently reported the fewest false positives and is the fastest algorithm. MegaBLAST was comparable to BLAT in speed, but was the most susceptible to localizing sequence tags incorrectly to pseudogenes. Conclusion The differences in performance for sequence tags and full-length reference sequences were surprisingly small. Characteristic variations in localization results for each program were noted that affect the localization of sequence at exon boundaries, in particular.
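    Evaluations of this kind reduce to comparing predicted alignment coordinates against the known coordinates of the cognate gene. A minimal sketch (the coordinates and the slack parameter are hypothetical, not the study's actual criteria):

```python
def localized_correctly(pred, known, slack=10_000):
    """A sequence tag counts as localized to the right region if its
    predicted interval falls within the known gene interval, allowing
    some slack in base pairs at either end."""
    chrom_p, start_p, end_p = pred
    chrom_k, start_k, end_k = known
    return (chrom_p == chrom_k
            and start_p >= start_k - slack
            and end_p <= end_k + slack)

# Hypothetical coordinates: (chromosome, start, end)
prediction = ("chr7", 100_500, 101_200)
known_gene = ("chr7", 95_000, 130_000)
print(localized_correctly(prediction, known_gene))  # → True
```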

  10. Will genomic selection be a practical method for plant breeding?

    Science.gov (United States)

    Nakaya, Akihiro; Isobe, Sachiko N

    2012-11-01

    Genomic selection or genome-wide selection (GS) has been highlighted as a new approach for marker-assisted selection (MAS) in recent years. GS is a form of MAS that selects favourable individuals based on genomic estimated breeding values. Previous studies have suggested the utility of GS, especially for capturing small-effect quantitative trait loci, but GS has not become a popular methodology in the field of plant breeding, possibly because there is insufficient information available on GS for practical use. In this review, GS is discussed from a practical breeding viewpoint. Statistical approaches employed in GS are briefly described, before the recent progress in GS studies is surveyed. GS practices in plant breeding are then reviewed before future prospects are discussed. Statistical concepts used in GS are discussed with genetic models and variance decomposition, heritability, breeding value and linear models. Recent progress in GS studies is reviewed with a focus on empirical studies. For the practice of GS in plant breeding, several specific points are discussed, including linkage disequilibrium, features of populations and genotyped markers, and breeding schemes. Currently, GS is not perfect, but it is a potent, attractive and valuable approach for plant breeding. This method will be integrated into many practical breeding programmes in the near future with further advances and the maturing of its theory.

  11. Evaluation Methods for Prevention Education.

    Science.gov (United States)

    Blue, Amy V.; Barnette, J. Jackson; Ferguson, Kristi J.; Garr, David R.

    2000-01-01

    Discusses the importance of assessing medical students' competence in prevention knowledge, skills, and attitudes. Provides general guidance for programs interested in evaluating their prevention instructional efforts, and gives specific examples of possible methods for evaluating prevention education. Stresses the need to tailor assessment…

  12. A comparison of multivariate genome-wide association methods

    DEFF Research Database (Denmark)

    Galesloot, Tessel E; Van Steen, Kristel; Kiemeney, Lambertus A L M

    2014-01-01

    Joint association analysis of multiple traits in a genome-wide association study (GWAS), i.e. a multivariate GWAS, offers several advantages over analyzing each trait in a separate GWAS. In this study we directly compared a number of multivariate GWAS methods using simulated data. We focused on six...... methods that are implemented in the software packages PLINK, SNPTEST, MultiPhen, BIMBAM, PCHAT and TATES, and also compared them to standard univariate GWAS, analysis of the first principal component of the traits, and meta-analysis of univariate results. We simulated data (N = 1000) for three...... for scenarios with an opposite sign of genetic and residual correlation. All multivariate analyses resulted in a higher power than univariate analyses, even when only one of the traits was associated with the QTL. Hence, use of multivariate GWAS methods can be recommended, even when genetic correlations between...

  13. Genome-wide engineering of an infectious clone of herpes simplex virus type 1 using synthetic genomics assembly methods.

    Science.gov (United States)

    Oldfield, Lauren M; Grzesik, Peter; Voorhies, Alexander A; Alperovich, Nina; MacMath, Derek; Najera, Claudia D; Chandra, Diya Sabrina; Prasad, Sanjana; Noskov, Vladimir N; Montague, Michael G; Friedman, Robert M; Desai, Prashant J; Vashee, Sanjay

    2017-10-17

    Here, we present a transformational approach to genome engineering of herpes simplex virus type 1 (HSV-1), which has a large DNA genome, using synthetic genomics tools. We believe this method will enable more rapid and complex modifications of HSV-1 and other large DNA viruses than previous technologies, facilitating many useful applications. Yeast transformation-associated recombination was used to clone 11 fragments comprising the HSV-1 strain KOS 152 kb genome. Using overlapping sequences between the adjacent pieces, we assembled the fragments into a complete virus genome in yeast, transferred it into an Escherichia coli host, and reconstituted infectious virus following transfection into mammalian cells. The virus derived from this yeast-assembled genome, KOS YA , replicated with kinetics similar to wild-type virus. We demonstrated the utility of this modular assembly technology by making numerous modifications to a single gene, making changes to two genes at the same time and, finally, generating individual and combinatorial deletions to a set of five conserved genes that encode virion structural proteins. While the ability to perform genome-wide editing through assembly methods in large DNA virus genomes raises dual-use concerns, we believe the incremental risks are outweighed by potential benefits. These include enhanced functional studies, generation of oncolytic virus vectors, development of delivery platforms of genes for vaccines or therapy, as well as more rapid development of countermeasures against potential biothreats.

  14. Evaluating Phylogenetic Congruence in the Post-Genomic Era

    Science.gov (United States)

    Leigh, Jessica W.; Lapointe, François-Joseph; Lopez, Philippe; Bapteste, Eric

    2011-01-01

    Congruence is a broadly applied notion in evolutionary biology used to justify multigene phylogeny or phylogenomics, as well as in studies of coevolution, lateral gene transfer, and as evidence for common descent. Existing methods for identifying incongruence or heterogeneity using character data were designed for data sets that are both small and expected to be rarely incongruent. At the same time, methods that assess incongruence using comparison of trees test a null hypothesis of uncorrelated tree structures, which may be inappropriate for phylogenomic studies. As such, they are ill-suited for the growing number of available genome sequences, most of which are from prokaryotes and viruses, either for phylogenomic analysis or for studies of the evolutionary forces and events that have shaped these genomes. Specifically, many existing methods scale poorly with large numbers of genes, cannot accommodate high levels of incongruence, and do not adequately model patterns of missing taxa for different markers. We propose the development of novel incongruence assessment methods suitable for the analysis of the molecular evolution of the vast majority of life and support the investigation of homogeneity of evolutionary process in cases where markers do not share identical tree structures. PMID:21712432

  15. A comparative evaluation of genome assembly reconciliation tools.

    Science.gov (United States)

    Alhakami, Hind; Mirebrahim, Hamid; Lonardi, Stefano

    2017-05-18

    The majority of eukaryotic genomes are unfinished due to the algorithmic challenges of assembling them. A variety of assembly and scaffolding tools are available, but it is not always obvious which tool or parameters to use for a specific genome size and complexity. It is, therefore, common practice to produce multiple assemblies using different assemblers and parameters, then select the best one for public release. A more compelling approach would allow one to merge multiple assemblies with the intent of producing a higher quality consensus assembly, which is the objective of assembly reconciliation. Several assembly reconciliation tools have been proposed in the literature, but their strengths and weaknesses have never been compared on a common dataset. We fill this need with this work, in which we report on an extensive comparative evaluation of several tools. Specifically, we evaluate contiguity, correctness, coverage, and the duplication ratio of the merged assembly compared to the individual assemblies provided as input. None of the tools we tested consistently improved the quality of the input GAGE and synthetic assemblies. Our experiments show an increase in contiguity in the consensus assembly when the original assemblies already have high quality. In terms of correctness, the quality of the results depends on the specific tool, as well as on the quality and the ranking of the input assemblies. In general, the number of misassemblies ranges from being comparable to the best of the input assembly to being comparable to the worst of the input assembly.
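    Contiguity, one of the metrics evaluated above, is conventionally summarized by N50. A minimal sketch:

```python
def n50(contig_lengths):
    """N50: the length of the shortest contig in the minimal set of
    longest contigs that together cover at least half of the total
    assembly size."""
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half:
            return length

print(n50([100, 200, 300, 400]))  # → 300
```

    A merged assembly that increases N50 without increasing the number of misassemblies is the outcome reconciliation tools aim for; the study shows this is far from guaranteed.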

  16. LNG Safety Assessment Evaluation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Muna, Alice Baca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-05-01

    Sandia National Laboratories evaluated published safety assessment methods across a variety of industries including Liquefied Natural Gas (LNG), hydrogen, land and marine transportation, as well as the US Department of Defense (DOD). All the methods were evaluated for their potential applicability for use in the LNG railroad application. After reviewing the documents included in this report, as well as others not included because of repetition, the Department of Energy (DOE) Hydrogen Safety Plan Checklist is most suitable to be adapted to the LNG railroad application. This report was developed to survey industries related to rail transportation for methodologies and tools that can be used by the FRA to review and evaluate safety assessments submitted by the railroad industry as a part of their implementation plans for liquefied or compressed natural gas storage (on-board or tender) and engine fueling delivery systems. The main sections of this report provide an overview of various methods found during this survey. In most cases, the reference document is quoted directly. The final section provides discussion and a recommendation for the most appropriate methodology that will allow efficient and consistent evaluations to be made. The DOE Hydrogen Safety Plan Checklist was then revised to adapt it as a methodology for the Federal Railroad Administration’s use in evaluating safety plans submitted by the railroad industry.

  17. A novel method of genomic DNA extraction for Cactaceae

    Science.gov (United States)

    Fehlberg, Shannon D.; Allen, Jessica M.; Church, Kathleen

    2013-01-01

    • Premise of the study: Genetic studies of Cactaceae can at times be impeded by difficult sampling logistics and/or high mucilage content in tissues. Simplifying sampling and DNA isolation through the use of cactus spines has not previously been investigated. • Methods and Results: Several protocols for extracting DNA from spines were tested and modified to maximize yield, amplification, and sequencing. Sampling of and extraction from spines resulted in a simplified protocol overall and complete avoidance of mucilage as compared to typical tissue extractions. Sequences from one nuclear and three plastid regions were obtained across eight genera and 20 species of cacti using DNA extracted from spines. • Conclusions: Genomic DNA useful for amplification and sequencing can be obtained from cactus spines. The protocols described here are valuable for any cactus species, but are particularly useful for investigators interested in sampling living collections, extensive field sampling, and/or conservation genetic studies. PMID:25202521

  18. Evaluation Using Sequential Trials Methods.

    Science.gov (United States)

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  19. Evaluation of Kjeldahl digestion method

    International Nuclear Information System (INIS)

    Amin, M.; Flowers, T.H.

    2004-01-01

    The Kjeldahl digestion method was evaluated by comparing measured values of total nitrogen, phosphorus and potassium, obtained using three salt/catalyst mixtures in the standard Kjeldahl digestion method and in the salicylic acid modification, with certified values for plant material. Comparisons were also made for total nitrogen determined by steam distillation versus the Technicon Auto-Analyzer, and for phosphorus determined by the ascorbic acid/molybdate method versus the molybdate/metavanadate method on the Technicon Auto-Analyzer. The 1 g salt/catalyst mixture recovered less nitrogen than the 2.5 g mixture in the standard Kjeldahl method, due to the lower temperature and incomplete digestion, in both plant and soil samples. The 2.5 g catalyst mixture only partially recovered nitrate in the standard Kjeldahl method, and the salicylic acid modification failed to recover all of the nitrate in plant material. Use of the 2.5 g salt/catalyst mixture and selenium appears to promote nitrogen losses in the salicylic acid modification but not in the standard Kjeldahl digestion of soil samples. No interference of selenium or copper with the colorimetric determination of nitrogen and phosphorus was observed. The standard Kjeldahl method with 2.5 g of a 10:1 sodium sulphate/copper sulphate salt/catalyst mixture in 5 ml sulfuric acid was found suitable for the determination of total nitrogen, phosphorus and potassium. The steam distillation and Technicon Auto-Analyzer techniques measured similar amounts of ammonium nitrogen; however, the Technicon Auto-Analyzer technique is easier, more rapid, more reproducible, precise, accurate, reliable and free from human error. The amount of phosphorus measured by the ascorbic acid/molybdate method was more accurate than that measured by the molybdate/metavanadate method on the Technicon Auto-Analyzer. (author)
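    The nitrogen figure from a Kjeldahl digestion is obtained from the distillation and back-titration step; a sketch of the standard calculation (volumes, acid normality and sample mass below are invented; the constant 1.4007 follows from 14.007 mg of N per milliequivalent of acid, times 100 for percent, divided by 1000 mg/g):

```python
def kjeldahl_percent_n(v_sample_ml, v_blank_ml, acid_normality, sample_mass_g):
    """Percent nitrogen from a Kjeldahl titration:
    %N = (V_sample - V_blank) * N_acid * 1.4007 / sample mass (g)."""
    return (v_sample_ml - v_blank_ml) * acid_normality * 1.4007 / sample_mass_g

# Invented titration figures: 10.0 ml of 0.1 N acid for the sample, 0.5 ml for the blank
print(round(kjeldahl_percent_n(10.0, 0.5, 0.1, 0.5), 3))  # → 2.661
```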

  20. Classes evaluation: Methods and tools

    Directory of Open Access Journals (Sweden)

    Grabiński Tadeusz

    2013-01-01

    Full Text Available This study presents the method, tools, course and results of a foreign language class evaluation conducted in the summer semester of 2012/2013 at the Andrzej Frycz Modrzewski Krakow University. Because a new evaluation procedure had been implemented at the University, the former method, based on paper forms filled in by the students, was abandoned. A free account was registered on the surveyanyplace.com website and the evaluation questionnaire form was inserted there. This report presents the results of a taxometric analysis aimed at checking the degree of mutual correspondence (correlation) between certain criteria, and gives a graphic presentation of the evaluation results in a multidimensional perspective. To classify the grading criteria, Ward's agglomerative method was used, with the Euclidean metric as the measure of criteria similarity. Calculations were made with the Statistica package. The questionnaire results show that foreign language teaching at the Andrzej Frycz Modrzewski Krakow University is conducted professionally and at a high substantive level.
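    The grouping step described above can be sketched in pure Python: agglomerative clustering with Ward's criterion over squared Euclidean distances, via the Lance-Williams update. The questionnaire scores here are invented for illustration; the original analysis used the Statistica package.

```python
import math

def ward_clusters(points, n_clusters):
    """Agglomerative clustering with Ward's criterion, using the
    Lance-Williams update on squared Euclidean distances."""
    clusters = {i: [i] for i in range(len(points))}
    size = {i: 1 for i in clusters}
    d2 = {(i, j): math.dist(points[i], points[j]) ** 2
          for i in clusters for j in clusters if i < j}
    next_id = len(points)
    while len(clusters) > n_clusters:
        i, j = min(d2, key=d2.get)  # merge the closest pair of clusters
        dij = d2.pop((i, j))
        merged = clusters.pop(i) + clusters.pop(j)
        ni, nj = size.pop(i), size.pop(j)
        for k in list(clusters):
            nk = size[k]
            dik = d2.pop((min(i, k), max(i, k)))
            djk = d2.pop((min(j, k), max(j, k)))
            # Lance-Williams recurrence for Ward's method
            d2[(k, next_id)] = ((ni + nk) * dik + (nj + nk) * djk
                                - nk * dij) / (ni + nj + nk)
        clusters[next_id] = merged
        size[next_id] = ni + nj
        next_id += 1
    return list(clusters.values())

# Invented mean ratings: rows = grading criteria, columns = questionnaire items
scores = [[4.5, 4.4], [4.4, 4.5], [2.1, 2.3], [2.2, 2.1]]
print(ward_clusters(scores, 2))  # → [[0, 1], [2, 3]]
```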

  1. Genomics protocols [Methods in molecular biology, v. 175

    National Research Council Canada - National Science Library

    Starkey, Michael P; Elaswarapu, Ramnath

    2001-01-01

    ... to the larger community of researchers who have recognized the potential of genomics research and may themselves be beginning to explore the technologies involved. Some of the techniques described in Genomics Protocols are clearly not restricted to the genomics field; indeed, a prerequisite for many procedures in this discipline is that they require an extremely high throughput, beyond the scope of the average investigator. However, what we have endeavored here to achieve is both to compile a collection of...

  2. Will genomic selection be a practical method for plant breeding?

    OpenAIRE

    Nakaya, Akihiro; Isobe, Sachiko N.

    2012-01-01

    Background Genomic selection or genome-wide selection (GS) has been highlighted as a new approach for marker-assisted selection (MAS) in recent years. GS is a form of MAS that selects favourable individuals based on genomic estimated breeding values. Previous studies have suggested the utility of GS, especially for capturing small-effect quantitative trait loci, but GS has not become a popular methodology in the field of plant breeding, possibly because there is insufficient information avail...

  3. Genomics protocols [Methods in molecular biology, v. 175

    National Research Council Canada - National Science Library

    Starkey, Michael P; Elaswarapu, Ramnath

    2001-01-01

    .... Drawing on emerging technologies in the fields of bioinformatics and proteomics, these protocols cover not only those traditionally recognized as genomics, but also early therapeutic approaches...

  4. TEGS-CN: A Statistical Method for Pathway Analysis of Genome-wide Copy Number Profile.

    Science.gov (United States)

    Huang, Yen-Tsung; Hsu, Thomas; Christiani, David C

    2014-01-01

    The effects of copy number alterations make up a significant part of the tumor genome profile, but pathway analyses of these alterations are still not well established. We proposed a novel method to analyze multiple copy numbers of genes within a pathway, termed Test for the Effect of a Gene Set with Copy Number data (TEGS-CN). TEGS-CN was adapted from TEGS, a method that we previously developed for gene expression data using a variance component score test. With additional development, we extend the method to analyze DNA copy number data, accounting for different gene sizes and thus varying numbers of copy number probes per gene. The test statistic follows a mixture of χ² distributions that can be obtained using permutation with a scaled χ² approximation. We conducted simulation studies to evaluate the size and the power of TEGS-CN and to compare its performance with TEGS. We analyzed genome-wide copy number data from 264 patients with non-small-cell lung cancer. With the Molecular Signatures Database (MSigDB) pathway database, the genome-wide copy number data can be classified into 1,814 biological pathways or gene sets. We investigated associations of the copy number profile of the 1,814 gene sets with pack-years of cigarette smoking. Our analysis revealed five pathways with significant P values after Bonferroni adjustment; the causal mechanisms of the five pathways require further study.
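    The variance component score test at the core of TEGS-CN can be illustrated with a simplified stand-in (the actual method additionally accounts for gene sizes and probe counts and approximates the null with a scaled χ² mixture; here a plain permutation null is used, and all data are invented):

```python
import random

def score_statistic(y, G):
    """Score-type statistic Q = r' G G' r, where r holds the centered
    phenotypes and G[i][g] is the copy number of gene g in sample i."""
    n = len(y)
    mean_y = sum(y) / n
    r = [yi - mean_y for yi in y]
    q = 0.0
    for g in range(len(G[0])):
        s = sum(r[i] * G[i][g] for i in range(n))
        q += s * s
    return q

def permutation_pvalue(y, G, n_perm=999, seed=1):
    """Permutation null: shuffle phenotypes relative to the copy numbers."""
    rng = random.Random(seed)
    observed = score_statistic(y, G)
    hits = sum(score_statistic(rng.sample(y, len(y)), G) >= observed
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

# Invented data: pack-years (y) tracking the copy numbers of a two-gene set
G = [[0, 0], [0, 1], [1, 1], [1, 2], [2, 2], [2, 3]]
y = [1.0, 4.0, 11.0, 14.0, 21.0, 24.0]
print(permutation_pvalue(y, G))  # small p-value: the gene set tracks the phenotype
```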

  5. TSSer: an automated method to identify transcription start sites in prokaryotic genomes from differential RNA sequencing data.

    Science.gov (United States)

    Jorjani, Hadi; Zavolan, Mihaela

    2014-04-01

    Accurate identification of transcription start sites (TSSs) is an essential step in the analysis of transcription regulatory networks. In higher eukaryotes, the cap analysis of gene expression (CAGE) technology enabled comprehensive annotation of TSSs in genomes such as those of mice and humans. In bacteria, an equivalent approach, termed differential RNA sequencing (dRNA-seq), has recently been proposed, but the application of this approach to a large number of genomes is hindered by the paucity of computational analysis methods. With few exceptions, when the method has been used, annotation of TSSs has been done largely manually. In this work, we present a computational method called 'TSSer' that enables the automatic inference of TSSs from dRNA-seq data. The method rests on a probabilistic framework for identifying genomic positions that are both preferentially enriched in the dRNA-seq data and preferentially captured relative to neighboring genomic regions. Evaluating our approach for TSS calling on several publicly available datasets, we find that TSSer achieves high consistency with the curated lists of annotated TSSs, but identifies many additional TSSs. Therefore, TSSer can accelerate genome-wide identification of TSSs in bacterial genomes and can aid in further characterization of bacterial transcription regulatory networks. TSSer is freely available under the GPL license at http://www.clipz.unibas.ch/TSSer/index.php
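    The core idea, flagging positions enriched both against the untreated library and against their genomic neighborhood, can be sketched with a simplified ratio-based stand-in (TSSer itself uses a probabilistic model; the counts and thresholds below are invented):

```python
def candidate_tss(treated, untreated, window=2, min_ratio=3.0):
    """Flag positions whose 5'-end read counts in the enriched dRNA-seq
    library stand out both vs. the untreated library and vs. neighbors."""
    calls = []
    n = len(treated)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        neighbors = [treated[j] for j in range(lo, hi) if j != i]
        local_bg = max(1, sum(neighbors) / len(neighbors))
        library_bg = max(1, untreated[i])
        if treated[i] / library_bg >= min_ratio and treated[i] / local_bg >= min_ratio:
            calls.append(i)
    return calls

# Invented 5'-end coverage along a short genomic stretch
treated   = [2, 1, 50, 3, 2, 2, 1]
untreated = [2, 1,  4, 3, 2, 2, 1]
print(candidate_tss(treated, untreated))  # → [2]
```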

  6. Methods for evaluating information sources

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2012-01-01

    The article briefly presents and discusses 12 different approaches to the evaluation of information sources (for example a Wikipedia entry or a journal article): (1) the checklist approach; (2) classical peer review; (3) modified peer review; (4) evaluation based on examining the coverage...... of controversial views; (5) evidence-based evaluation; (6) comparative studies; (7) author credentials; (8) publisher reputation; (9) journal impact factor; (10) sponsoring: tracing the influence of economic, political, and ideological interests; (11) book reviews and book reviewing; and (12) broader criteria....... Reading a text is often not a simple process. All the methods discussed here are steps on the way on learning how to read, understand, and criticize texts. According to hermeneutics it involves the subjectivity of the reader, and that subjectivity is influenced, more or less, by different theoretical...

  7. Genomic breeding value prediction:methods and procedures

    NARCIS (Netherlands)

    Calus, M.P.L.

    2010-01-01

    Animal breeding faces one of the most significant changes of the past decades – the implementation of genomic selection. Genomic selection uses dense marker maps to predict the breeding value of animals with reported accuracies that are up to 0.31 higher than those of pedigree indexes, without the

  8. Evaluation of FTA® paper for storage of oral meta-genomic DNA.

    Science.gov (United States)

    Foitzik, Magdalena; Stumpp, Sascha N; Grischke, Jasmin; Eberhard, Jörg; Stiesch, Meike

    2014-10-01

    The purpose of the present study was to evaluate the short-term storage of meta-genomic DNA from native oral biofilms on FTA® paper. Thirteen volunteers of both sexes received an acrylic splint for intraoral biofilm formation over a period of 48 hours. The biofilms were collected, resuspended in phosphate-buffered saline, and either stored on FTA® paper or directly processed by standard laboratory DNA extraction. The nucleic acid extraction efficiencies were evaluated by 16S rDNA targeted SSCP fingerprinting. The acquired banding pattern of FTA-derived meta-genomic DNA was compared to a standard DNA preparation protocol. Sensitivity and positive predictive values were calculated. The volunteers showed inter-individual differences in their bacterial species composition. A total of 200 bands were found for both methods and 85% of the banding patterns were equal, representing a sensitivity of 0.941 and a false-negative predictive value of 0.059. Meta-genomic DNA sampling, extraction, and adhesion using FTA® paper is a reliable method for storage of microbial DNA for a short period of time.
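    The reported concordance figures follow directly from comparing banding patterns; a minimal sketch with invented band sets (the study's actual counts differ):

```python
def band_concordance(reference_bands, test_bands):
    """Sensitivity and false-negative rate of a fingerprint's bands
    relative to a reference banding pattern."""
    tp = len(reference_bands & test_bands)
    fn = len(reference_bands - test_bands)
    return tp / (tp + fn), fn / (tp + fn)

# Invented SSCP band identifiers: the FTA preparation misses 6 of 100 bands
standard_prep = {f"band{i}" for i in range(100)}
fta_prep = standard_prep - {f"band{i}" for i in range(94, 100)}
sensitivity, fnr = band_concordance(standard_prep, fta_prep)
print(sensitivity, fnr)  # → 0.94 0.06
```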

  9. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
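    The abstract does not give the exact updating formula, but the flavor of an empirical Bayes update can be conveyed by a generic precision-weighted shrinkage of one study's linkage statistic toward the mean of prior scans (all numbers invented, not the paper's model):

```python
def eb_update(stat, var, prior_stats):
    """Shrink one study's linkage statistic toward the mean of prior
    genome scans, weighting by precision (1/variance)."""
    n = len(prior_stats)
    prior_mean = sum(prior_stats) / n
    prior_var = sum((s - prior_mean) ** 2 for s in prior_stats) / (n - 1)
    w = (1 / var) / (1 / var + 1 / prior_var)  # weight on the new study's own estimate
    return w * stat + (1 - w) * prior_mean

# Invented LOD-like statistics at one locus: this scan vs. three prior scans
print(round(eb_update(3.2, 1.0, [2.0, 2.6, 2.3]), 3))  # → 2.374
```

    Because the invented prior scans agree closely (small prior variance), the update pulls the new, noisier estimate strongly toward them; with heterogeneous prior scans the shrinkage would be weaker, consistent with the method's accounting for between-study heterogeneity.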

  10. An Evaluation Framework for Lossy Compression of Genome Sequencing Quality Values

    OpenAIRE

    Alberti, Claudio; Daniels, Noah; Hernaez, Mikel; Voges, Jan; Goldfeder, Rachel L.; Hernandez-Lopez, Ana A.; Mattavelli, Marco; Berger, Bonnie

    2016-01-01

    This paper provides the specification and an initial validation of an evaluation framework for the comparison of lossy compressors of genome sequencing quality values. The goal is to define reference data, test sets, tools and metrics that shall be used to evaluate the impact of lossy compression of quality values on human genome variant calling. The functionality of the framework is validated referring to two state-of-the-art genomic compressors. This work has been spurred by the current act...

  11. Evaluation of Genomic Instability in the Abnormal Prostate

    National Research Council Canada - National Science Library

    Haaland-Pullus, Christina; Griffith, Jeffrey K

    2006-01-01

    ...: prognosis and diagnosis. Several tools are being used to investigate this effect, specifically the assessment of telomere length, allelic imbalance, and methylation status, all markers of genomic instability...

  12. Evaluation of Genomic Instability in the Abnormal Prostate

    National Research Council Canada - National Science Library

    Haaland-Pullus, Christina; Griffth, Jeffrey K

    2008-01-01

    ...: prognosis and diagnosis. Several tools are being used to investigate this effect, specifically the assessment of telomere length, allelic imbalance, and methylation status, all markers of genomic instability...

  13. Genomic comparisons of Brucella spp. and closely related bacteria using base compositional and proteome based methods

    DEFF Research Database (Denmark)

    Bohlin, Jon; Snipen, Lars; Cloeckaert, Axel

    2010-01-01

    BACKGROUND: Classification of bacteria within the genus Brucella has been difficult due in part to considerable genomic homogeneity between the different species and biovars, in spite of clear differences in phenotypes. Therefore, many different methods have been used to assess Brucella taxonomy....... In the current work, we examine 32 sequenced genomes from genus Brucella representing the six classical species, as well as more recently described species, using bioinformatical methods. Comparisons were made at the level of genomic DNA using oligonucleotide based methods (Markov chain based genomic signatures...... between the oligonucleotide based methods used. Whilst the Markov chain based genomic signatures grouped the different species in genus Brucella according to host preference, the codon and amino acid frequencies based methods reflected small differences between the Brucella species. Only minor differences...

  14. Two efficient methods for isolation of high-quality genomic DNA from entomopathogenic fungi.

    Science.gov (United States)

    Serna-Domínguez, María G; Andrade-Michel, Gilda Y; Arredondo-Bernal, Hugo C; Gallou, Adrien

    2018-03-27

    Conventional and commercial methods for isolation of nucleic acids are available for fungal samples, including entomopathogenic fungi (EPF); however, there is no single optimal method for all organisms. The cell wall structure and the wide range of secondary metabolites of EPF can broadly interfere with the efficiency of a DNA extraction protocol. This study compares three commercial protocols, the DNeasy® Plant Mini Kit (Qiagen), the Wizard® Genomic DNA Purification Kit (Promega), and the Axygen™ Multisource Genomic DNA Miniprep Kit (Axygen), and three conventional methods based on different buffers (SDS, CTAB/PVPP, and CTAB/β-mercaptoethanol), each combined with three cell lysis procedures, namely liquid nitrogen homogenization and two bead-beating materials (tungsten-carbide and stainless-steel), for four representative species of EPF (Beauveria bassiana, Hirsutella citriformis, Isaria javanica, and Metarhizium anisopliae). Liquid nitrogen homogenization combined with the DNeasy® Plant Mini Kit (QN) or SDS buffer (SN) significantly improved the yield, with good purity (~1.8) and high integrity (>20,000 bp) of genomic DNA, in contrast with the other methods; these results were also better than those obtained with the two bead-beating materials. The purified DNA was evaluated by PCR-based techniques: amplification of translation elongation factor 1-α (TEF) and two highly sensitive molecular markers (ISSR and AFLP), with reliable and reproducible results. Although the yield, purity, and integrity of the extracted DNA varied across the four EPF species and extraction methods, the SN and QN protocols maintained the high quality of DNA required for downstream molecular applications. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Genomes

    National Research Council Canada - National Science Library

    Brown, T. A. (Terence A.)

    2002-01-01

    ... of genome expression and replication processes, and transcriptomics and proteomics. This text is richly illustrated with clear, easy-to-follow, full color diagrams, which are downloadable from the book's website...

  16. A simple and effective method for construction of Escherichia coli strains proficient for genome engineering.

    Directory of Open Access Journals (Sweden)

    Young Shin Ryu

    Full Text Available Multiplex genome engineering is a standalone recombineering tool for large-scale programming and accelerated evolution of cells. However, this advanced genome engineering technique has been limited to use in selected bacterial strains. We developed a simple and effective strain-independent method for effective genome engineering in Escherichia coli. The method involves introducing a suicide plasmid carrying the λ Red recombination system into the mutS gene. The suicide plasmid can be excised from the chromosome via selection in the absence of antibiotics, thus allowing transient inactivation of the mismatch repair system during genome engineering. In addition, we developed another suicide plasmid that enables integration of large DNA fragments into the lacZ genomic locus. These features make the system well suited to exploiting the benefits of genome engineering in synthetic biology, as well as in the metabolic engineering of different strains of E. coli.

  17. Controversy and debate on clinical genomics sequencing-paper 1: genomics is not exceptional: rigorous evaluations are necessary for clinical applications of genomic sequencing.

    Science.gov (United States)

    Wilson, Brenda J; Miller, Fiona Alice; Rousseau, François

    2017-12-01

    Next generation genomic sequencing (NGS) technologies, namely whole genome and whole exome sequencing, are now cheap enough to be within the grasp of many health care organizations. To many, NGS is symbolic of cutting-edge health care, offering the promise of "precision" and "personalized" medicine. Historically, research and clinical application have been a two-way street in clinical genetics: research is often driven directly by the desire to understand and solve immediate clinical problems affecting real, identifiable patients and families, accompanied by a low threshold of willingness to apply research-driven interventions without formal empirical evaluation. However, NGS technologies are not simple substitutes for older technologies and need careful evaluation for use as screening, diagnostic, or prognostic tools. We have concerns in three areas. First, at the moment, analytic validity is unknown because technical platforms are not yet stable, laboratory quality assurance programs are in their infancy, and data interpretation capabilities are badly underdeveloped. Second, the clinical validity of genomic findings for patient populations without pre-existing high genetic risk is doubtful, as most clinical experience with NGS technologies relates to patients with a high prior likelihood of a genetic etiology. Finally, we are concerned that proponents argue not only for clinically driven approaches to assessing a patient's genome, but also for seeking out variants associated with unrelated conditions or susceptibilities, so-called "secondary targets"; this is screening on a genomic scale. We argue that clinical uses of genomic sequencing should remain limited to specialist and research settings, that screening for secondary findings in clinical testing should be limited to the maximum extent possible, and that the benefits, harms, and economic implications of their routine use be systematically evaluated. All stakeholders have a responsibility to ensure that

  18. Codon usage bias: causative factors, quantification methods and genome-wide patterns: with emphasis on insect genomes.

    Science.gov (United States)

    Behura, Susanta K; Severson, David W

    2013-02-01

    Codon usage bias refers to the phenomenon where specific codons are used more often than other synonymous codons during translation of genes, the extent of which varies within and among species. Molecular evolutionary investigations suggest that codon bias is manifested as a result of balance between mutational and translational selection of such genes and that this phenomenon is widespread across species and may contribute to genome evolution in a significant manner. With the advent of whole-genome sequencing of numerous species, both prokaryotes and eukaryotes, genome-wide patterns of codon bias are emerging in different organisms. Various factors such as expression level, GC content, recombination rates, RNA stability, codon position, gene length and others (including environmental stress and population size) can influence codon usage bias within and among species. Moreover, there has been a continuous quest towards developing new concepts and tools to measure the extent of codon usage bias of genes. In this review, we outline the fundamental concepts of evolution of the genetic code, discuss various factors that may influence biased usage of synonymous codons and then outline different principles and methods of measurement of codon usage bias. Finally, we discuss selected studies performed using whole-genome sequences of different insect species to show how codon bias patterns vary within and among genomes. We conclude with generalized remarks on specific emerging aspects of codon bias studies and highlight the recent explosion of genome-sequencing efforts on arthropods (such as twelve Drosophila species, species of ants, honeybee, Nasonia and Anopheles mosquitoes as well as the recent launch of a genome-sequencing project involving 5000 insects and other arthropods) that may help us to understand better the evolution of codon bias and its biological significance. © 2012 The Authors. Biological Reviews © 2012 Cambridge Philosophical Society.
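    One of the quantification measures covered by reviews of this kind is relative synonymous codon usage (RSCU); the sketch below is an illustration of that general measure, not code from the paper, and the two codon families shown are an illustrative subset of the genetic code.

```python
from collections import Counter

# Synonymous codon families for two example amino acids (standard code).
# A full implementation would cover all degenerate families; this subset
# is an illustrative assumption to keep the sketch short.
FAMILIES = {
    "Phe": ["TTT", "TTC"],
    "Gly": ["GGT", "GGC", "GGA", "GGG"],
}

def rscu(coding_seq):
    """Relative synonymous codon usage: the observed count of a codon
    divided by the count expected if all synonyms in its family were
    used equally. RSCU = 1 means no bias within the family."""
    codons = [coding_seq[i:i + 3] for i in range(0, len(coding_seq) - 2, 3)]
    counts = Counter(codons)
    result = {}
    for family in FAMILIES.values():
        total = sum(counts[c] for c in family)
        if total == 0:
            continue  # amino acid absent from this sequence
        expected = total / len(family)
        for c in family:
            result[c] = counts[c] / expected
    return result

# 2x TTT, 1x TTC, 2x GGT, 1x GGA, 1x GGG
vals = rscu("TTTTTTTTCGGTGGTGGAGGG")
```

    Values above 1 mark codons used more often than their synonyms, the pattern whose genome-wide distribution the review surveys.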

  19. An Evaluation Framework for Lossy Compression of Genome Sequencing Quality Values.

    Science.gov (United States)

    Alberti, Claudio; Daniels, Noah; Hernaez, Mikel; Voges, Jan; Goldfeder, Rachel L; Hernandez-Lopez, Ana A; Mattavelli, Marco; Berger, Bonnie

    2016-01-01

    This paper provides the specification and an initial validation of an evaluation framework for comparing lossy compressors of genome sequencing quality values. The goal is to define the reference data, test sets, tools, and metrics that shall be used to evaluate the impact of lossy compression of quality values on human genome variant calling. The functionality of the framework is validated with reference to two state-of-the-art genomic compressors. This work has been spurred by the current activity within the ISO/IEC SC29/WG11 technical committee (a.k.a. MPEG), which is investigating the possibility of starting a standardization activity for genomic information representation.

  20. Accuracy of multi-trait genomic selection using different methods

    NARCIS (Netherlands)

    Calus, M.P.L.; Veerkamp, R.F.

    2011-01-01

    Background Genomic selection has become a very important tool in animal genetics and is rapidly emerging in plant genetics. It holds the promise to be particularly beneficial to select for traits that are difficult or expensive to measure, such as traits that are measured in one environment and

  1. Multiple-trait genetic evaluation using genomic matrix

    African Journals Online (AJOL)

    Jane

    2011-07-06

    Jul 6, 2011 ... relationships was estimated through computer simulation and was compared with the accuracy of ... programs, detect animals with superior genetic and select ... genomic matrices in the mixed model equations of BLUP.

  2. Evaluation of three automated genome annotations for Halorhabdus utahensis.

    Directory of Open Access Journals (Sweden)

    Peter Bakke

    2009-07-01

    Full Text Available Genome annotations are accumulating rapidly and depend heavily on automated annotation systems. Many genome centers offer annotation systems but no one has compared their output in a systematic way to determine accuracy and inherent errors. Errors in the annotations are routinely deposited in databases such as NCBI and used to validate subsequent annotation errors. We submitted the genome sequence of halophilic archaeon Halorhabdus utahensis to be analyzed by three genome annotation services. We have examined the output from each service in a variety of ways in order to compare the methodology and effectiveness of the annotations, as well as to explore the genes, pathways, and physiology of the previously unannotated genome. The annotation services differ considerably in gene calls, features, and ease of use. We had to manually identify the origin of replication and the species-specific consensus ribosome-binding site. Additionally, we conducted laboratory experiments to test H. utahensis growth and enzyme activity. Current annotation practices need to improve in order to more accurately reflect a genome's biological potential. We make specific recommendations that could improve the quality of microbial annotation projects.

  3. Effect of genotyped cows in the reference population on the genomic evaluation of Holstein cattle.

    Science.gov (United States)

    Uemoto, Y; Osawa, T; Saburi, J

    2017-03-01

    This study evaluated how reliability and prediction bias depend on the prediction method, on the contribution of the animals included (bulls or cows), and on genetic relatedness, when genotyped cows are included in a progeny-tested bull reference population. We performed genomic evaluation using a Japanese Holstein population and assessed the accuracy of genomic enhanced breeding values (GEBV) for three production traits and 13 linear conformation traits. A total of 4564 animals for production traits and 4172 animals for conformation traits were genotyped using the Illumina BovineSNP50 array. Single- and multi-step methods were compared for predicting GEBV in genotyped bull-only and genotyped bull-cow reference populations. No large differences in realized reliability or regression coefficient were found between the two reference populations; however, a slight difference was found between the two methods for production traits. The accuracy of GEBV determined by the single-step method increased slightly when genotyped cows were included in the bull reference population, but decreased slightly with the multi-step method. A validation study was used to evaluate the accuracy of GEBV when 800 additional genotyped bulls (POPbull) or cows (POPcow) were included in a base reference population of 2000 genotyped bulls. The realized reliabilities of POPbull were higher than those of POPcow for all traits. For the gain in realized reliability over the base reference population, the average ratios of POPbull gain to POPcow gain for production and conformation traits were 2.6 and 7.2, respectively, and the ratios depended on the heritabilities of the traits. For the regression coefficient, no large differences were found between the results for POPbull and POPcow. Another validation study investigated the effect of genetic relatedness between cows and bulls in the reference and test populations. The effect of genetic relationship among bulls in the reference

  4. Comparison of methods used to identify superior individuals in genomic selection in plant breeding.

    Science.gov (United States)

    Bhering, L L; Junqueira, V S; Peixoto, L A; Cruz, C D; Laviola, B G

    2015-09-10

    The aim of this study was to evaluate different methods used in genomic selection and to identify those that select the highest proportion of individuals with superior genotypes. F2 populations of different sizes were simulated (100, 200, 500, and 1000 individuals) with 10 replications each. Each genome consisted of 10 linkage groups (LG) of 100 cM, with 100 equally spaced markers per linkage group; 200 markers controlled the trait, defined as the first 20 markers of each LG. Genetic and phenotypic values were simulated assuming a binomial distribution of effects for each LG and the absence of dominance. For phenotypic values, heritabilities of 20, 50, and 80% were considered. To compare methodologies, the analysis processing time, coefficient of coincidence (selection of 5, 10, and 20% of superior individuals), and Spearman correlation between the true genetic values and the genomic values predicted by each methodology were determined. In processing time, the three methodologies differed statistically: rrBLUP was the fastest and Bayesian LASSO the slowest. Spearman correlations revealed that the rrBLUP and GBLUP methodologies were equivalent, while Bayesian LASSO gave the lowest correlation values. Similar results were obtained for the coincidence among selected individuals, for which Bayesian LASSO differed statistically and presented lower values than the other methodologies. Therefore, for the scenarios evaluated, rrBLUP is the best methodology for the selection of genetically superior individuals.
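    The rrBLUP methodology favored above amounts to ridge regression of phenotypes on marker covariates. The following minimal sketch illustrates the idea on simulated data; treating the shrinkage parameter as known is a simplifying assumption (in practice it is estimated, e.g. by REML), and all simulation settings are illustrative.

```python
import numpy as np

def rrblup(Z, y, lam):
    """Ridge-regression BLUP of marker effects: solve
    (Z'Z + lambda * I) u = Z' (y - mean(y)), where lambda is the ratio
    of residual variance to per-marker genetic variance."""
    m = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ (y - y.mean()))

rng = np.random.default_rng(1)
Z = rng.integers(0, 3, size=(200, 50)).astype(float) - 1.0  # markers coded -1/0/1
true_u = rng.normal(0, 0.5, size=50)                        # simulated marker effects
y = Z @ true_u + rng.normal(0, 1.0, size=200)               # phenotypes

u_hat = rrblup(Z, y, lam=4.0)   # lam = 1.0 / 0.25 (noise var / effect var)
gebv = Z @ u_hat                # genomic estimated breeding values
accuracy = np.corrcoef(gebv, Z @ true_u)[0, 1]
```

    Candidates are then ranked by `gebv`; the coincidence statistics in the study compare such rankings against rankings on the true genetic values.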

  5. Reliability and applications of statistical methods based on oligonucleotide frequencies in bacterial and archaeal genomes

    DEFF Research Database (Denmark)

    Bohlin, J; Skjerve, E; Ussery, David

    2008-01-01

    The statistical methods dealt with here are mainly used to examine similarities between archaeal and bacterial DNA from different genomes. These methods compare observed genomic frequencies of fixed-size oligonucleotides with expected values, which can be determined from genomic nucleotide content or smaller oligonucleotide frequencies, or be based on specific statistical distributions. Advantages of these statistical methods include measurement of phylogenetic relationships from relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore the reliability and best-suited applications of some popular methods, including relative oligonucleotide frequencies (ROF), di- to hexanucleotide zeroth-order Markov methods (ZOM), and the second-order Markov chain method (MCM). Tests were performed on distant homology searches with large DNA sequences, detection…
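    The zeroth-order Markov (ZOM) idea described above amounts to comparing observed k-mer frequencies with those expected from mononucleotide content alone. A pure-Python sketch of that general idea (k = 2 for brevity; not code from the paper):

```python
from collections import Counter
from itertools import product

def zom_signature(seq, k=2):
    """Zeroth-order Markov (ZOM) oligonucleotide signature: the ratio of
    each observed k-mer frequency to the frequency expected from the
    mononucleotide content alone. Ratios far from 1 flag over- or
    under-represented words."""
    n = len(seq)
    base_freq = {b: seq.count(b) / n for b in "ACGT"}
    kmers = Counter(seq[i:i + k] for i in range(n - k + 1))
    total = n - k + 1
    sig = {}
    for word in map("".join, product("ACGT", repeat=k)):
        expected = 1.0
        for b in word:
            expected *= base_freq[b]
        sig[word] = (kmers[word] / total) / expected if expected > 0 else float("nan")
    return sig

sig = zom_signature("ATATATATATGCGCGCGCGC")  # AT and GC strongly over-represented
```

    Comparing such signature vectors between sequence fragments is the basis of the genome-similarity measures evaluated in the paper.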

  6. Estimated allele substitution effects underlying genomic evaluation models depend on the scaling of allele counts

    NARCIS (Netherlands)

    Bouwman, Aniek C.; Hayes, Ben J.; Calus, Mario P.L.

    2017-01-01

    Background: Genomic evaluation is used to predict direct genomic values (DGV) for selection candidates in breeding programs, but also to estimate allele substitution effects (ASE) of single nucleotide polymorphisms (SNPs). Scaling of allele counts influences the estimated ASE, because scaling of

  7. Genomes correction and assembling: present methods and tools

    Science.gov (United States)

    Wojcieszek, Michał; Pawełkowicz, Magdalena; Nowak, Robert; Przybecki, Zbigniew

    2014-11-01

    The recent rapid development of next-generation sequencing (NGS) technologies has had a significant impact on genomics, enabling many de novo sequencing projects for new species that were previously precluded by the cost of sequencing. Advances in NGS in turn require adjustments in assembly programs: new algorithms must cope with massive amounts of data within reasonable time limits, so processing power and hardware are also important factors. In this paper, we address the assembly pipeline for de novo genome assembly as provided by programs presently available to scientists, both commercial and open-source. The implementation of four different approaches, namely Greedy, Overlap-Layout-Consensus (OLC), De Bruijn, and integrated assembly, and the resulting variation in performance is the main focus of our discussion, with additional insight into the issue of short- and long-read correction.
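    Of the four approaches listed, the De Bruijn strategy is the easiest to sketch: reads are decomposed into k-mers and assembly reduces to walking a graph of k-mer overlaps. The toy illustration below shows only the core graph construction; real assemblers add error correction, coverage handling, and graph simplification.

```python
from collections import defaultdict

def de_bruijn(reads, k=4):
    """Minimal De Bruijn graph: each read is decomposed into k-mers and an
    edge is drawn from each k-mer's (k-1)-prefix to its (k-1)-suffix; an
    assembly then corresponds to an Eulerian walk through the graph."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return dict(graph)

g = de_bruijn(["ACGTAC", "GTACGT"], k=4)
```

    Repeated edges (here ACG -> CGT appearing twice) carry the coverage information that assemblers use to separate sequencing errors from genuine repeats.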

  8. Genome-wide prediction of cis-regulatory regions using supervised deep learning methods.

    Science.gov (United States)

    Li, Yifeng; Shi, Wenqiang; Wasserman, Wyeth W

    2018-05-31

    In the human genome, 98% of DNA sequences are non-protein-coding regions that were previously disregarded as junk DNA. In fact, non-coding regions host a variety of cis-regulatory regions which precisely control the expression of genes. Thus, identifying active cis-regulatory regions in the human genome is critical for understanding gene regulation and assessing the impact of genetic variation on phenotype. Developments in high-throughput sequencing and machine learning technologies make it possible to predict cis-regulatory regions genome wide. Building on rich data resources such as the Encyclopedia of DNA Elements (ENCODE) and the Functional Annotation of the Mammalian Genome (FANTOM) projects, we introduce DECRES, a supervised deep learning approach for the identification of enhancer and promoter regions in the human genome. Due to their ability to discover patterns in large and complex data, deep learning methods enable a significant advance in our knowledge of the genomic locations of cis-regulatory regions. Using models for well-characterized cell lines, we identify key experimental features that contribute to predictive performance. Applying DECRES, we delineate the locations of 300,000 candidate enhancers genome wide (6.8% of the genome, of which 40,000 are supported by bidirectional transcription data) and 26,000 candidate promoters (0.6% of the genome). The predicted annotations of cis-regulatory regions will provide broad utility for genome interpretation, from functional genomics to clinical applications. The DECRES model demonstrates the potential of deep learning technologies when combined with high-throughput sequencing data, and should inspire the development of other advanced neural network models to further improve genome annotations.

  9. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    Science.gov (United States)

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. The results showed that the modification of the original boosting algorithm could be run in 1% of the time used with the original algorithm and with negligible differences in accuracy.
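    The random-boosting idea described above (weak learners fitted sequentially to residuals, each chosen from a random subset of markers, with shrinkage) can be sketched as follows. Parameter names and all simulation settings are illustrative assumptions, not those of the paper.

```python
import numpy as np

def random_boosting(Z, y, n_rounds=200, shrinkage=0.1, subset=10, seed=0):
    """Each round samples a random subset of markers, fits a single-marker
    least-squares regression to the current residuals, keeps the
    best-fitting marker as the weak learner, and adds its effect with
    shrinkage."""
    rng = np.random.default_rng(seed)
    n, m = Z.shape
    effects = np.zeros(m)
    resid = y - y.mean()
    for _ in range(n_rounds):
        best_j, best_sse, best_b = None, np.inf, 0.0
        for j in rng.choice(m, size=subset, replace=False):
            zj = Z[:, j]
            b = (zj @ resid) / (zj @ zj)
            sse = np.sum((resid - b * zj) ** 2)
            if sse < best_sse:
                best_j, best_sse, best_b = j, sse, b
        effects[best_j] += shrinkage * best_b
        resid = resid - shrinkage * best_b * Z[:, best_j]
    return effects

rng = np.random.default_rng(3)
Z = rng.integers(0, 3, size=(150, 40)).astype(float)  # SNPs coded 0/1/2
u = np.zeros(40)
u[:5] = 1.0                                           # five causal markers
y = Z @ u + rng.normal(0, 0.5, size=150)
effects = random_boosting(Z, y)
accuracy = np.corrcoef(Z @ effects, Z @ u)[0, 1]
```

    Sampling only `subset` markers per round is what cuts the computation relative to scanning all markers, at little cost in accuracy.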

  10. A simple method for the parallel deep sequencing of full influenza A genomes

    DEFF Research Database (Denmark)

    Kampmann, Marie-Louise; Fordyce, Sarah Louise; Avila Arcos, Maria del Carmen

    2011-01-01

    Given the major threat of influenza A to human and animal health, and its ability to evolve rapidly through mutation and reassortment, tools that enable its timely characterization are necessary to help monitor its evolution and spread. For this purpose, deep sequencing can be a very valuable tool. This study reports a comprehensive method that enables deep sequencing of the complete genomes of influenza A subtypes using the Illumina Genome Analyzer IIx (GAIIx). By using this method, the complete genomes of nine viruses were sequenced in parallel, representing the 2009 pandemic H1N1 virus, H5N1 virus…

  11. Pragmatism, Evidence, and Mixed Methods Evaluation

    Science.gov (United States)

    Hall, Jori N.

    2013-01-01

    Mixed methods evaluation has a long-standing history of enhancing the credibility of evaluation findings. However, using mixed methods in a utilitarian way implicitly emphasizes convenience over engaging with its philosophical underpinnings (Denscombe, 2008). Because of this, some mixed methods evaluators and social science researchers have been…

  12. Benchmarking Relatedness Inference Methods with Genome-Wide Data from Thousands of Relatives.

    Science.gov (United States)

    Ramstetter, Monica D; Dyer, Thomas D; Lehman, Donna M; Curran, Joanne E; Duggirala, Ravindranath; Blangero, John; Mezey, Jason G; Williams, Amy L

    2017-09-01

    Inferring relatedness from genomic data is an essential component of genetic association studies, population genetics, forensics, and genealogy. While numerous methods exist for inferring relatedness, thorough evaluation of these approaches in real data has been lacking. Here, we report an assessment of 12 state-of-the-art pairwise relatedness inference methods using a data set with 2485 individuals contained in several large pedigrees that span up to six generations. We find that all methods have high accuracy (92-99%) when detecting first- and second-degree relationships, but their accuracy dwindles to 76% for more distant relative pairs. Overall, the most accurate methods are Estimation of Recent Shared Ancestry (ERSA) and approaches that compute total IBD sharing using the output from GERMLINE and Refined IBD to infer relatedness. Combining information from the most accurate methods provides little accuracy improvement, indicating that novel approaches, such as new methods that leverage relatedness signals from multiple samples, are needed to achieve a sizeable jump in performance. Copyright © 2017 Ramstetter et al.
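    As a point of reference for what such methods estimate, a simple VanRaden-style genomic relationship matrix can be computed from genotypes directly. This is one common estimator, shown for orientation only; it is not one of the twelve methods benchmarked in the paper, and the simulation below is illustrative.

```python
import numpy as np

def kinship(G):
    """VanRaden-style genomic relationship matrix: genotypes coded 0/1/2
    are centered by twice the allele frequency, and the cross-product is
    scaled by the total expected heterozygosity. Diagonal ~ 1 for
    non-inbred individuals; parent-offspring pairs ~ 0.5; unrelated ~ 0."""
    p = G.mean(axis=0) / 2.0              # estimated allele frequencies
    W = G - 2.0 * p                       # centered genotypes
    return (W @ W.T) / (2.0 * np.sum(p * (1.0 - p)))

rng = np.random.default_rng(11)
n, m = 100, 2000
G = rng.binomial(2, 0.5, size=(n, m)).astype(float)
# Make individual 1 a child of individuals 0 and 2: one allele from each parent.
G[1] = rng.binomial(1, G[0] / 2.0) + rng.binomial(1, G[2] / 2.0)
K = kinship(G)
```

    The benchmarked methods go beyond such moment estimators by using the lengths of shared IBD segments, which is what allows them to reach more distant degrees of relatedness.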

  13. A novel bioinformatics method for efficient knowledge discovery by BLSOM from big genomic sequence data.

    Science.gov (United States)

    Bai, Yu; Iwasaki, Yuki; Kanaya, Shigehiko; Zhao, Yue; Ikemura, Toshimichi

    2014-01-01

    With the remarkable increase in genomic sequence data across a wide range of species, novel tools are needed for comprehensive analyses of these big sequence data. The Self-Organizing Map (SOM) is an effective tool for clustering and visualizing high-dimensional data, such as oligonucleotide composition, on one map. By modifying the conventional SOM, we previously developed the Batch-Learning SOM (BLSOM), which allows classification of sequence fragments according to species, depending solely on oligonucleotide composition. In the present study, we introduce the oligonucleotide BLSOM for characterization of vertebrate genome sequences. We first analyzed pentanucleotide compositions in 100 kb sequences derived from a wide range of vertebrate genomes, and then the compositions in the human and mouse genomes, in order to investigate an efficient method for detecting differences between closely related genomes. BLSOM can recognize the species-specific key combination of oligonucleotide frequencies in each genome, called a "genome signature," as well as regions specifically enriched in transcription-factor-binding sequences. Because its classification and visualization power is very high, BLSOM is an efficient and powerful tool for extracting a wide range of information from massive amounts of genomic sequence (i.e., big sequence data).
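    The fixed-length oligonucleotide composition vectors that a SOM/BLSOM clusters can be computed directly. A pure-Python sketch (the paper uses pentanucleotides on 100 kb windows; k = 3 here keeps the example small):

```python
from collections import Counter
from itertools import product

def kmer_composition(seq, k=3):
    """Normalized k-mer frequency vector for one sequence fragment: the
    fixed-length feature vector that a SOM/BLSOM clusters on one map."""
    words = ["".join(w) for w in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(len(seq) - k + 1, 1)
    return [counts[w] / total for w in words]

vec = kmer_composition("ACGTACGTACGTACGT")
```

    One such vector per genomic window, across many genomes, forms the high-dimensional input whose species-specific structure (the "genome signature") BLSOM visualizes.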

  14. Evaluation of approaches for estimating the accuracy of genomic prediction in plant breeding.

    Science.gov (United States)

    Ould Estaghvirou, Sidi Boubacar; Ogutu, Joseph O; Schulz-Streeck, Torben; Knaak, Carsten; Ouzunova, Milena; Gordillo, Andres; Piepho, Hans-Peter

    2013-12-06

    In genomic prediction, an important measure of accuracy is the correlation between the predicted and the true breeding values. Direct computation of this quantity for real datasets is not possible, because the true breeding value is unknown. Instead, the correlation between the predicted breeding values and the observed phenotypic values, called predictive ability, is often computed. In order to indirectly estimate predictive accuracy, this latter correlation is usually divided by an estimate of the square root of heritability. In this study we use simulation to evaluate estimates of predictive accuracy for seven methods, four (1 to 4) of which use an estimate of heritability to divide predictive ability computed by cross-validation. Between them the seven methods cover balanced and unbalanced datasets as well as correlated and uncorrelated genotypes. We propose one new indirect method (4) and two direct methods (5 and 6) for estimating predictive accuracy and compare their performances and those of four other existing approaches (three indirect (1 to 3) and one direct (7)) with simulated true predictive accuracy as the benchmark and with each other. The size of the estimated genetic variance and hence heritability exerted the strongest influence on the variation in the estimated predictive accuracy. Increasing the number of genotypes considerably increases the time required to compute predictive accuracy by all seven methods, most notably for the five methods that require cross-validation (Methods 1, 2, 3, 4 and 6). A new method that we propose (Method 5) and an existing method (Method 7) used in animal breeding programs were the fastest and gave the least biased, most precise and stable estimates of predictive accuracy. Of the methods that use cross-validation, Methods 4 and 6 were often the best. The estimated genetic variance and the number of genotypes had the greatest influence on predictive accuracy. Methods 5 and 7 were the fastest and produced the least biased estimates.
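    The indirect estimate discussed above (predictive ability divided by the square root of heritability) can be checked on simulated data, where the true breeding values are known. All variances below are illustrative assumptions, not values from the study.

```python
import numpy as np

# True breeding values (TBV) are known in simulation, so the true accuracy
# r(prediction, TBV) can be compared with predictive ability / sqrt(h2).
rng = np.random.default_rng(7)
n, h2 = 20000, 0.4
tbv = rng.normal(0.0, 1.0, n)                             # true breeding values
pheno = tbv + rng.normal(0.0, np.sqrt((1 - h2) / h2), n)  # heritability = h2
pred = tbv + rng.normal(0.0, 1.0, n)                      # an imperfect predictor

ability = np.corrcoef(pred, pheno)[0, 1]                  # predictive ability
accuracy_true = np.corrcoef(pred, tbv)[0, 1]              # true accuracy
accuracy_est = ability / np.sqrt(h2)                      # indirect estimate
```

    With the true heritability plugged in, the two quantities agree closely; the study's point is that using an *estimated* heritability, as real analyses must, makes the indirect estimate sensitive to error in the estimated genetic variance.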

  15. Multiplexed genome engineering and genotyping methods: applications for synthetic biology and metabolic engineering.

    Science.gov (United States)

    Wang, Harris H; Church, George M

    2011-01-01

    Engineering at the scale of whole genomes requires fundamentally new molecular biology tools. Recent advances in recombineering using synthetic oligonucleotides enable the rapid generation of mutants at high efficiency and specificity and can be implemented at the genome scale. With these techniques, libraries of mutants can be generated, from which individuals with functionally useful phenotypes can be isolated. Furthermore, populations of cells can be evolved in situ by directed evolution using complex pools of oligonucleotides. Here, we discuss ways to utilize these multiplexed genome engineering methods, with special emphasis on experimental design and implementation. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Systematic evaluation of bias in microbial community profiles induced by whole genome amplification.

    NARCIS (Netherlands)

    Direito, S.; Zaura, E.; Little, M.; Ehrenfreund, P.; Roling, W.F.M.

    2014-01-01

    Whole genome amplification methods facilitate the detection and characterization of microbial communities in low biomass environments. We examined the extent to which the actual community structure is reliably revealed and factors contributing to bias. One widely used [multiple displacement

  18. A systematic study of genome context methods: calibration, normalization and combination

    Directory of Open Access Journals (Sweden)

    Dale Joseph M

    2010-10-01

    Full Text Available Abstract Background Genome context methods have been introduced in the last decade as automatic methods to predict functional relatedness between genes in a target genome, using the patterns of existence and relative locations of the homologs of those genes in a set of reference genomes. Much work has been done in applying these methods to different bioinformatics tasks, but few papers present a systematic study of the methods and of the combinations necessary for their optimal use. Results We present a thorough study of the four main families of genome context methods found in the literature: phylogenetic profile, gene fusion, gene cluster, and gene neighbor. We find that for most organisms the gene neighbor method outperforms the phylogenetic profile method by as much as 40% in sensitivity, and is competitive with the gene cluster method at low sensitivities. Gene fusion is generally the worst performing of the four methods. A thorough exploration of the parameter space for each method is performed, and results across different target organisms are presented. We propose the use of normalization procedures, such as those used on microarray data, for the genome context scores. We show that substantial gains can be achieved from the use of a simple normalization technique. In particular, the sensitivity of the phylogenetic profile method is improved by around 25% after normalization, resulting, to our knowledge, in the best-performing phylogenetic profile system in the literature. Finally, we show results from combining the various genome context methods into a single score. When using a cross-validation procedure to train the combiners, with both original and normalized scores as input, a decision tree combiner results in gains of up to 20% with respect to the gene neighbor method. Overall, this represents a gain of around 15% over what can be considered the state of the art in this area: the four original genome context methods combined using a
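    The phylogenetic profile method referred to above scores gene pairs by the similarity of their presence/absence patterns across reference genomes. A minimal sketch using simple profile agreement as the score, which is one illustrative choice among the several scoring functions used in practice; the gene names and profiles are hypothetical.

```python
import numpy as np

def profile_similarity(profiles):
    """Phylogenetic-profile scoring sketch: each gene is a binary vector of
    homolog presence/absence across reference genomes, and gene pairs with
    similar profiles are predicted to be functionally related. Similarity
    here is the fraction of genomes on which the two profiles agree."""
    genes = list(profiles)
    scores = {}
    for i, a in enumerate(genes):
        for b in genes[i + 1:]:
            pa, pb = np.asarray(profiles[a]), np.asarray(profiles[b])
            scores[(a, b)] = float(np.mean(pa == pb))
    return scores

profiles = {                      # hypothetical genes x six reference genomes
    "geneA": [1, 1, 0, 1, 0, 1],
    "geneB": [1, 1, 0, 1, 0, 1],  # same profile as geneA -> predicted related
    "geneC": [0, 0, 1, 0, 1, 0],  # complementary profile
}
scores = profile_similarity(profiles)
```

    The normalization the paper proposes would be applied to such raw scores before thresholding or combining them with the other genome context methods.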

  19. Evaluation of Mammalian Interspersed Repeats to investigate the goat genome

    Directory of Open Access Journals (Sweden)

    P. Mariani

    2010-01-01

    Full Text Available Among the repeated sequences present in most eukaryotic genomes, SINEs (Short Interspersed Nuclear Elements) are widely used to investigate evolution in the mammalian order (Buchanan et al., 1999). One family of these repetitive sequences, the MIR (Mammalian Interspersed Repeats; Jurka et al., 1995), is ubiquitous in all mammals. MIR elements are tRNA-derived SINEs and are identifiable by a conserved core region of about 70 nucleotides.

  20. Discount method for programming language evaluation

    DEFF Research Database (Denmark)

    Kurtev, Svetomir; Christensen, Tommy Aagaard; Thomsen, Bent

    2016-01-01

    This paper presents work in progress on developing a Discount Method for Programming Language Evaluation, inspired by the Discount Usability Evaluation method (Benyon 2010) and the Instant Data Analysis method (Kjeldskov et al. 2004). The method is intended to bridge the gap between small-scale internal language design evaluation methods and large-scale surveys and quantitative evaluation methods, and is designed to be applicable even before a compiler or IDE has been developed for a new language. To test the method, a usability evaluation experiment was carried out on the Quorum programming language (Stefik et al. 2016) using programmers with experience in C and C#. When comparing our results with previous studies of Quorum, most of the data was comparable though not strictly in agreement; the discrepancies were mainly related to the programmers' pre-existing expectations…

  1. Computational methods for data evaluation and assimilation

    CERN Document Server

    Cacuci, Dan Gabriel

    2013-01-01

    Data evaluation and data combination require the use of a wide range of probability theory concepts and tools, from deductive statistics mainly concerning frequencies and sample tallies to inductive inference for assimilating non-frequency data and a priori knowledge. Computational Methods for Data Evaluation and Assimilation presents interdisciplinary methods for integrating experimental and computational information. This self-contained book shows how the methods can be applied in many scientific and engineering areas. After presenting the fundamentals underlying the evaluation of experiment

  2. Evaluation of nitrate destruction methods

    International Nuclear Information System (INIS)

    Taylor, P.A.; Kurath, D.E.; Guenther, R.

    1993-01-01

    A wide variety of high nitrate-concentration aqueous mixed [radioactive and Resource Conservation and Recovery Act (RCRA) hazardous] wastes are stored at various US Department of Energy (DOE) facilities. These wastes will ultimately be solidified for final disposal, although the waste acceptance criteria for the final waste form are still being determined. Because the nitrates in the wastes will normally increase the volume or reduce the integrity of all of the waste forms under consideration for final disposal, nitrate destruction before solidification of the waste will generally be beneficial. This report describes and evaluates various technologies that could be used to destroy the nitrates in the stored wastes. This work was funded by the Department of Energy's Office of Technology Development, through the Chemical/Physical Technology Support Group of the Mixed Waste Integrated Program. All the nitrate destruction technologies will require further development work before a facility could be designed and built to treat the majority of the stored wastes. Several of the technologies have particularly attractive features: the nitrate to ammonia and ceramic (NAC) process produces an insoluble waste form with a significant volume reduction, electrochemical reduction destroys nitrates without any chemical addition, and the hydrothermal process can simultaneously treat nitrates and organics in both acidic and alkaline wastes. These three technologies have been tested using lab-scale equipment and surrogate solutions. At their current state of development, it is not possible to predict which process will be the most beneficial for a particular waste stream.

  3. An overview of recent developments in genomics and associated statistical methods.

    Science.gov (United States)

    Bickel, Peter J; Brown, James B; Huang, Haiyan; Li, Qunhua

    2009-11-13

    The landscape of genomics has changed drastically in the last two decades. Increasingly inexpensive sequencing has shifted the primary focus from the acquisition of biological sequences to the study of biological function. Assays have been developed to study many intricacies of biological systems, and publicly available databases have given rise to integrative analyses that combine information from many sources to draw complex conclusions. Such research was the focus of the recent workshop at the Isaac Newton Institute, 'High dimensional statistics in biology'. Many computational methods from modern genomics and related disciplines were presented and discussed. Using, as much as possible, the material from these talks, we give an overview of modern genomics: from the essential assays that make data-generation possible, to the statistical methods that yield meaningful inference. We point to current analytical challenges, where novel methods, or novel applications of extant methods, are presently needed.

  4. Evidence-based design and evaluation of a whole genome sequencing clinical report for the reference microbiology laboratory.

    Science.gov (United States)

    Crisan, Anamaria; McKee, Geoffrey; Munzner, Tamara; Gardy, Jennifer L

    2018-01-01

    Microbial genome sequencing is now being routinely used in many clinical and public health laboratories. Understanding how to report complex genomic test results to stakeholders who may have varying familiarity with genomics-including clinicians, laboratorians, epidemiologists, and researchers-is critical to the successful and sustainable implementation of this new technology; however, there are no evidence-based guidelines for designing such a report in the pathogen genomics domain. Here, we describe an iterative, human-centered approach to creating a report template for communicating tuberculosis (TB) genomic test results. We used Design Study Methodology-a human centered approach drawn from the information visualization domain-to redesign an existing clinical report. We used expert consults and an online questionnaire to discover various stakeholders' needs around the types of data and tasks related to TB that they encounter in their daily workflow. We also evaluated their perceptions of and familiarity with genomic data, as well as its utility at various clinical decision points. These data shaped the design of multiple prototype reports that were compared against the existing report through a second online survey, with the resulting qualitative and quantitative data informing the final, redesigned, report. We recruited 78 participants, 65 of whom were clinicians, nurses, laboratorians, researchers, and epidemiologists involved in TB diagnosis, treatment, and/or surveillance. Our first survey indicated that participants were largely enthusiastic about genomic data, with the majority agreeing on its utility for certain TB diagnosis and treatment tasks and many reporting some confidence in their ability to interpret this type of data (between 58.8% and 94.1%, depending on the specific data type). When we compared our four prototype reports against the existing design, we found that for the majority (86.7%) of design comparisons, participants preferred the

  5. Evidence-based design and evaluation of a whole genome sequencing clinical report for the reference microbiology laboratory

    Science.gov (United States)

    Crisan, Anamaria; McKee, Geoffrey; Munzner, Tamara

    2018-01-01

    Background Microbial genome sequencing is now being routinely used in many clinical and public health laboratories. Understanding how to report complex genomic test results to stakeholders who may have varying familiarity with genomics—including clinicians, laboratorians, epidemiologists, and researchers—is critical to the successful and sustainable implementation of this new technology; however, there are no evidence-based guidelines for designing such a report in the pathogen genomics domain. Here, we describe an iterative, human-centered approach to creating a report template for communicating tuberculosis (TB) genomic test results. Methods We used Design Study Methodology—a human centered approach drawn from the information visualization domain—to redesign an existing clinical report. We used expert consults and an online questionnaire to discover various stakeholders’ needs around the types of data and tasks related to TB that they encounter in their daily workflow. We also evaluated their perceptions of and familiarity with genomic data, as well as its utility at various clinical decision points. These data shaped the design of multiple prototype reports that were compared against the existing report through a second online survey, with the resulting qualitative and quantitative data informing the final, redesigned, report. Results We recruited 78 participants, 65 of whom were clinicians, nurses, laboratorians, researchers, and epidemiologists involved in TB diagnosis, treatment, and/or surveillance. Our first survey indicated that participants were largely enthusiastic about genomic data, with the majority agreeing on its utility for certain TB diagnosis and treatment tasks and many reporting some confidence in their ability to interpret this type of data (between 58.8% and 94.1%, depending on the specific data type). When we compared our four prototype reports against the existing design, we found that for the majority (86.7%) of design

  6. Evidence-based design and evaluation of a whole genome sequencing clinical report for the reference microbiology laboratory

    Directory of Open Access Journals (Sweden)

    Anamaria Crisan

    2018-01-01

    Full Text Available Background Microbial genome sequencing is now being routinely used in many clinical and public health laboratories. Understanding how to report complex genomic test results to stakeholders who may have varying familiarity with genomics—including clinicians, laboratorians, epidemiologists, and researchers—is critical to the successful and sustainable implementation of this new technology; however, there are no evidence-based guidelines for designing such a report in the pathogen genomics domain. Here, we describe an iterative, human-centered approach to creating a report template for communicating tuberculosis (TB) genomic test results. Methods We used Design Study Methodology—a human centered approach drawn from the information visualization domain—to redesign an existing clinical report. We used expert consults and an online questionnaire to discover various stakeholders’ needs around the types of data and tasks related to TB that they encounter in their daily workflow. We also evaluated their perceptions of and familiarity with genomic data, as well as its utility at various clinical decision points. These data shaped the design of multiple prototype reports that were compared against the existing report through a second online survey, with the resulting qualitative and quantitative data informing the final, redesigned, report. Results We recruited 78 participants, 65 of whom were clinicians, nurses, laboratorians, researchers, and epidemiologists involved in TB diagnosis, treatment, and/or surveillance. Our first survey indicated that participants were largely enthusiastic about genomic data, with the majority agreeing on its utility for certain TB diagnosis and treatment tasks and many reporting some confidence in their ability to interpret this type of data (between 58.8% and 94.1%, depending on the specific data type). When we compared our four prototype reports against the existing design, we found that for the majority (86.7%) of

  7. Systematic differences in the response of genetic variation to pedigree and genome-based selection methods

    NARCIS (Netherlands)

    Heidaritabar, M.; Vereijken, A.; Muir, W.M.; Meuwissen, T.H.E.; Cheng, H.; Megens, H.J.W.C.; Groenen, M.; Bastiaansen, J.W.M.

    2014-01-01

    Genomic selection (GS) is a DNA-based method of selecting for quantitative traits in animal and plant breeding, and offers a potentially superior alternative to traditional breeding methods that rely on pedigree and phenotype information. Using a 60K SNP chip with markers spaced throughout the

  8. Methods for evaluating imaging methods of limited reproducibility

    International Nuclear Information System (INIS)

    Krummenauer, F.

    2005-01-01

    Just like new drugs, new or modified imaging methods must be subjected to objective clinical tests, including tests on humans. In this, it must be ensured that the principles of Good Clinical Practice (GCP) are followed with regard to medical, administrative and methodical quality. Innovative methods of clinical epidemiology and medical biometry should be applied from the planning stage to the final statistical evaluation. The author presents established and new methods for the planning, evaluation and reporting of clinical tests of diagnostic methods, and especially imaging methods, in clinical medicine and illustrates these by means of current research projects in the various medical disciplines. The strategies presented are summarized in a recommendation based on the concept of phases I - IV of clinical drug testing in order to enable standardisation of the clinical evaluation of imaging methods. (orig.)

  9. Optimizing Usability Studies by Complementary Evaluation Methods

    NARCIS (Netherlands)

    Schmettow, Martin; Bach, Cedric; Scapin, Dominique

    2014-01-01

    This paper examines combinations of complementary evaluation methods as a strategy for efficient usability problem discovery. A data set from an earlier study is re-analyzed, involving three evaluation methods applied to two virtual environment applications. Results of a mixed-effects logistic

  10. Whole-Genome Regression and Prediction Methods Applied to Plant and Animal Breeding

    Science.gov (United States)

    de los Campos, Gustavo; Hickey, John M.; Pong-Wong, Ricardo; Daetwyler, Hans D.; Calus, Mario P. L.

    2013-01-01

    Genomic-enabled prediction is becoming increasingly important in animal and plant breeding and is also receiving attention in human genetics. Deriving accurate predictions of complex traits requires implementing whole-genome regression (WGR) models where phenotypes are regressed on thousands of markers concurrently. Methods exist that allow implementing these large-p with small-n regressions, and genome-enabled selection (GS) is being implemented in several plant and animal breeding programs. The list of available methods is long, and the relationships between them have not been fully addressed. In this article we provide an overview of available methods for implementing parametric WGR models, discuss selected topics that emerge in applications, and present a general discussion of lessons learned from simulation and empirical data analysis in the last decade. PMID:22745228

  11. Empirical evaluation methods in computer vision

    CERN Document Server

    Christensen, Henrik I

    2002-01-01

    This book provides comprehensive coverage of methods for the empirical evaluation of computer vision techniques. The practical use of computer vision requires empirical evaluation to ensure that the overall system has a guaranteed performance. The book contains articles that cover the design of experiments for evaluation, range image segmentation, the evaluation of face recognition and diffusion methods, image matching using correlation methods, and the performance of medical image processing algorithms.

  12. A genomic background based method for association analysis in related individuals.

    Directory of Open Access Journals (Sweden)

    Najaf Amin

    Full Text Available BACKGROUND: The feasibility of genotyping hundreds of thousands of single nucleotide polymorphisms (SNPs) in thousands of study subjects has triggered the need for fast, powerful, and reliable methods for genome-wide association analysis. Here we consider a situation when study participants are genetically related (e.g. due to systematic sampling of families or because a study was performed in a genetically isolated population). Of the available methods that account for relatedness, the Measured Genotype (MG) approach is considered the 'gold standard'. However, MG is not efficient with respect to time taken for the analysis of genome-wide data. In this context we proposed a fast two-step method called Genome-wide Association using Mixed Model and Regression (GRAMMAR) for the analysis of pedigree-based quantitative traits. This method overcomes the time limitation of the measured genotype (MG) approach, but pays in power. One of the major drawbacks of both MG and GRAMMAR is that they crucially depend on the availability of complete and correct pedigree data, which is rarely available. METHODOLOGY: In this study we first explore type 1 error and relative power of the MG, GRAMMAR, and Genomic Control (GC) approaches for genetic association analysis. Secondly, we propose an extension to GRAMMAR, i.e. GRAMMAR-GC. Finally, we propose application of GRAMMAR-GC using the kinship matrix estimated from genomic marker data, instead of (possibly missing and/or incorrect) genealogy. CONCLUSION: Through simulations we show that the MG approach maintains high power across a range of heritabilities and possible pedigree structures, and always outperforms other contemporary methods. We also show that the power of our proposed GRAMMAR-GC approaches that of the 'gold standard' MG for all models and pedigrees studied. We show that this method is both feasible and powerful and has correct type 1 error in the context of genome-wide association analysis.
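As a sketch of the kind of marker-based kinship estimate GRAMMAR-GC can substitute for pedigree data, the following computes a VanRaden-style genomic relationship matrix from SNP allele counts; the function name, toy genotypes, and estimator details are illustrative assumptions, not the authors' code.

```python
import numpy as np

def genomic_kinship(genotypes):
    """Estimate a genomic relationship (kinship) matrix from marker data,
    as used in place of possibly missing or incorrect genealogy.
    `genotypes` is an (individuals x SNPs) array of 0/1/2 allele counts.
    VanRaden-style estimator; the paper's exact estimator may differ."""
    G = np.asarray(genotypes, dtype=float)
    p = G.mean(axis=0) / 2.0                 # per-SNP allele frequencies
    Z = G - 2.0 * p                          # center each SNP by 2p
    denom = 2.0 * np.sum(p * (1.0 - p))      # standard scaling factor
    return Z @ Z.T / denom

# Toy data: individuals 0 and 1 carry identical genotypes,
# so their off-diagonal kinship equals their self-relationship.
G = np.array([[0, 1, 2, 1],
              [0, 1, 2, 1],
              [2, 1, 0, 0]])
K = genomic_kinship(G)
```

The resulting matrix is symmetric, and genetically identical individuals score as related as each is to itself, which is the property the mixed model exploits.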

  13. Protecting and Evaluating Genomic Privacy in Medical Tests and Personalized Medicine

    OpenAIRE

    Ayday, Erman; Raisaro, Jean Louis; Rougemont, Jacques; Hubaux, Jean-Pierre

    2013-01-01

    In this paper, we propose privacy-enhancing technologies for medical tests and personalized medicine methods that use patients' genomic data. Focusing on genetic disease-susceptibility tests, we develop a new architecture (between the patient and the medical unit) and propose a "privacy-preserving disease susceptibility test" (PDS) by using homomorphic encryption and proxy re-encryption. Assuming the whole genome sequencing to be done by a certified institution, we propose to store patients' ...

  14. Inverse PCR-based method for isolating novel SINEs from genome.

    Science.gov (United States)

    Han, Yawei; Chen, Liping; Guan, Lihong; He, Shunping

    2014-04-01

    Short interspersed elements (SINEs) are moderately repetitive DNA sequences in eukaryotic genomes. Although eukaryotic genomes contain numerous SINE copies, it is very difficult and laborious to isolate and identify them by previously reported methods. In this study, inverse PCR was successfully applied to isolate SINEs from the genome of Opsariichthys bidens, an East Asian cyprinid. A group of SINEs derived from a tRNA(Ala) molecule was identified and named Opsar after Opsariichthys. Opsar exhibits typical SINE characteristics: a tRNA(Ala)-derived region at the 5' end, a tRNA-unrelated region, and an AT-rich region at the 3' end. The tRNA-derived region of Opsar shares 76 % sequence similarity with the tRNA(Ala) gene, indicating that Opsar could derive from an inactive copy or pseudogene of tRNA(Ala). The reliability of the method was tested by obtaining C-SINE, Ct-SINE, and M-SINEs from the Ctenopharyngodon idellus, Megalobrama amblycephala, and Cyprinus carpio genomes. This method is simpler than those previously reported, omitting many steps such as probe preparation, genomic library construction, and hybridization.

  15. Challenges in Using Mix Methods in Evaluation

    OpenAIRE

    Stefan COJOCARU

    2010-01-01

    This article explores the debates between quantitative and qualitative methods in the evaluation process, analyzes the challenges about methodological mix in terms of credibility and validity of data and tools, and the evaluation findings. Beyond the epistemological contradictions, it seems that, in terms of usefulness, the mixing of methods is a practical solution, along with hybrids theories, able to provide information to improve the sufficiency of the program. Mixing methods is also a wa...

  16. Evaluation methods for neutron cross section standards

    International Nuclear Information System (INIS)

    Bhat, M.R.

    1980-01-01

    Methods used to evaluate the neutron cross section standards are reviewed and their relative merits assessed. These include phase-shift analysis, R-matrix fit, and a number of other methods by Poenitz, Bhat, Kon'shin and the Bayesian or generalized least-squares procedures. The problems involved in adopting these methods for future cross section standards evaluations are considered, and the prospects for their use discussed. 115 references, 5 figures, 3 tables

  17. Genomic DNA extraction method from pearl millet ( Pennisetum ...

    African Journals Online (AJOL)

    DNA extraction is difficult in a variety of plants because of the presence of metabolites that interfere with DNA isolation procedures and downstream applications such as DNA restriction, amplification, and cloning. Here we describe a modified procedure based on the hexadecyltrimethylammonium bromide (CTAB) method to ...

  18. Genomic DNA extraction method from Annona senegalensis Pers ...

    African Journals Online (AJOL)

    Extraction of DNA in many plants is difficult because of the presence of metabolites that interfere with DNA isolation procedures and downstream applications such as DNA restriction, replications, amplification, as well as cloning. Modified procedure based on the hexadecyltrimethyl ammoniumbromide (CTAB) method is ...

  19. Geophysical methods for evaluation of plutonic rocks

    International Nuclear Information System (INIS)

    Gibb, R.A.; Scott, J.S.

    1986-04-01

    Geophysical methods are systematically described according to the physical principle and operational mode of each method, the type of information produced, limitations of a technical and/or economic nature, and the applicability of the method to rock-mass evaluation at Research Areas of the Nuclear Fuel Waste Management Program. The geophysical methods fall into three categories: (1) airborne and other reconnaissance surveys, (2) detailed or surface (ground) surveys, and (3) borehole or subsurface surveys. The possible roles of each method in the site-screening and site-evaluation processes of disposal vault site selection are summarized

  20. Safeguards Evaluation Method for evaluating vulnerability to insider threats

    International Nuclear Information System (INIS)

    Al-Ayat, R.A.; Judd, B.R.; Renis, T.A.

    1986-01-01

    As protection of DOE facilities against outsiders increases to acceptable levels, attention is shifting toward achieving comparable protection against insiders. Since threats and protection measures for insiders are substantially different from those for outsiders, new perspectives and approaches are needed. One such approach is the Safeguards Evaluation Method. This method helps in assessing safeguards vulnerabilities to theft or diversion of special nuclear material (SNM) by insiders. The Safeguards Evaluation Method-Insider Threat is a simple model that can be used by safeguards and security planners to evaluate safeguards and proposed upgrades at their own facilities. The method is used to evaluate the effectiveness of safeguards in both timely detection (in time to prevent theft) and late detection (after-the-fact). The method considers the various types of potential insider adversaries working alone or in collusion with other insiders. The approach can be used for a wide variety of facilities with various quantities and forms of SNM. An Evaluation Workbook provides documentation of the baseline assessment; this simplifies subsequent on-site appraisals. Quantitative evaluation is facilitated by an accompanying computer program. The method significantly increases an evaluation team's on-site analytical capabilities, thereby producing a more thorough and accurate safeguards evaluation.

  1. Nondestructive evaluation ultrasonic methods for construction materials

    International Nuclear Information System (INIS)

    Chilibon, I.; Zisu, T.; Raetchi, V.

    2002-01-01

    The paper presents some ultrasonic methods for evaluation of the physical-mechanical properties of construction materials (bricks, concrete, BCA), such as: the pulse method, examination methods, direct measurement of the propagation velocity, and the impact-echo method. Using these nondestructive ultrasonic evaluation methods, the main material parameters and characteristics (elasticity coefficients, density, propagation velocity, ultrasound attenuation, etc.) of construction materials can be determined. These methods are suitable for construction materials because the defectoscopy methods used for metallic materials cannot be applied, due to the rugged, non-homogeneous structures and large attenuation coefficients of ultrasound propagation through these materials. The impact-echo method, in particular, is a technique for flaw detection in concrete based on stress wave propagation. Studies have shown that the impact-echo method is effective for locating voids, honeycombing, delaminations, and the depth of surface-opening cracks, and for measuring member thickness.

  2. Efficient Server-Aided Secure Two-Party Function Evaluation with Applications to Genomic Computation

    Directory of Open Access Journals (Sweden)

    Blanton Marina

    2016-10-01

    Full Text Available Computation based on genomic data is becoming increasingly popular today, be it for medical or other purposes. Non-medical uses of genomic data in a computation often take place in a server-mediated setting where the server offers the ability for joint genomic testing between the users. Undeniably, genomic data is highly sensitive, which, in contrast to other biometry types, discloses a plethora of information not only about the data owner, but also about his or her relatives. Thus, there is an urgent need to protect genomic data. This is particularly true when the data is used in computation for what we call recreational non-health-related purposes. Towards this goal, in this work we put forward a framework for server-aided secure two-party computation with the security model motivated by genomic applications. One particular security setting that we treat in this work provides stronger security guarantees with respect to malicious users than the traditional malicious model. In particular, we incorporate certified inputs into secure computation based on garbled circuit evaluation to guarantee that a malicious user is unable to modify her inputs in order to learn unauthorized information about the other user’s data. Our solutions are general in the sense that they can be used to securely evaluate arbitrary functions and offer attractive performance compared to the state of the art. We apply the general constructions to three specific types of genomic tests: paternity, genetic compatibility, and ancestry testing and implement the constructions. The results show that all such private tests can be executed within a matter of seconds or less despite the large size of one’s genomic data.

  3. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth.
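The two metrics could be computed along the following lines; this is a hypothetical sketch of the idea only, with set overlap standing in for whatever agreement measure the authors actually use, and all names invented for illustration.

```python
def recall(full_pathways, subset_pathways):
    """Consistency: fraction of pathways flagged on the full dataset that
    are recovered when the same PA method runs on a sub-dataset."""
    full, sub = set(full_pathways), set(subset_pathways)
    return len(full & sub) / len(full) if full else 0.0

def discrimination(pathways_exp1, pathways_exp2):
    """Specificity: how distinct the pathways reported for two unrelated
    experiments are (1.0 = fully distinct, 0.0 = identical), here as
    one minus the Jaccard overlap of the two result sets."""
    a, b = set(pathways_exp1), set(pathways_exp2)
    if not (a | b):
        return 1.0
    return 1.0 - len(a & b) / len(a | b)

# A method that finds {p1..p4} on the full data but only {p1, p2} on a
# sub-dataset recovers half its findings; two experiments sharing one of
# three distinct pathways are mostly, but not fully, discriminated.
r = recall({"p1", "p2", "p3", "p4"}, {"p1", "p2"})
d = discrimination({"p1", "p2"}, {"p2", "p3"})
```

A reliable method would score high on both: stable under subsampling, yet reporting different pathways for unrelated experiments.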

  4. Color image definition evaluation method based on deep learning method

    Science.gov (United States)

    Liu, Di; Li, YingChun

    2018-01-01

    In order to evaluate different blurring levels of color images and improve image definition evaluation, this paper proposes a no-reference color image clarity evaluation method based on a deep learning framework and a BP neural network classification model. First, the VGG16 net is used as the feature extractor to extract 4,096-dimensional features from the images; then the extracted features and image labels are used to train a BP neural network, which performs the final color image definition evaluation. The method is evaluated using images from the CSIQ database, blurred at different levels, giving 4,000 images after processing. The 4,000 images are divided into three categories, each representing a blur level. Of every 400 high-dimensional feature samples, 300 are used to train the VGG16 net and BP neural network, and the remaining 100 samples are used for testing. The experimental results show that the method can take full advantage of the learning and characterization capability of deep learning. In contrast to the major existing image clarity evaluation methods, which rely on manually designed and extracted features, the method in this paper extracts image features automatically and achieves excellent image quality classification accuracy on the test data set: the accuracy rate is 96%. Moreover, the predicted quality levels of the original color images are similar to the perception of the human visual system.
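The described pipeline (fixed deep features feeding a small trainable classifier) can be sketched with stand-in data; real VGG16 features would require the pretrained network, so randomly clustered vectors and a one-layer softmax classifier trained by gradient descent stand in here purely as an assumption-laden illustration of the approach.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the 4,096-D VGG16 features of each image: three well-separated
# clusters, one per blur level (hypothetical toy data, not CSIQ features).
n_per_class, n_classes, dim = 40, 3, 16
centers = rng.normal(size=(n_classes, dim)) * 3.0
X = np.vstack([centers[c] + rng.normal(size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Minimal backpropagation-style classifier: one softmax layer trained by
# batch gradient descent on the cross-entropy loss.
W = np.zeros((dim, n_classes))
for _ in range(200):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(n_classes)[y]
    W -= 0.1 * X.T @ (p - onehot) / len(X)

accuracy = float((np.argmax(X @ W, axis=1) == y).mean())
```

The design point is the same as in the paper: the expensive representation is computed once by a fixed feature extractor, and only a small classifier is trained on top of it.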

  5. Methods of ecological capability evaluation of forest

    International Nuclear Information System (INIS)

    Hosseini, M.; Makhdoum, M.F.; Akbarnia, M.; Saghebtalebi, Kh.

    2000-01-01

    In this research, common methods of ecological capability evaluation of forests were reviewed and their performance limitations analysed. The ecological capability of forests is an index that shows site potential across several roles: wood production, soil conservation, flood control, biodiversity conservation, and water supply. This index is related to ecological characteristics of the land, such as soil, microclimate, elevation, slope, and aspect, that affect the potential of sites. A suitable method of ecological capability evaluation must be chosen according to the objective of forestry. Common methods for ecological capability evaluation include plant and animal diversity, site index curves, soil and landform, inter branches, index plants, leaf analysis, regeneration analysis, and ecological mapping.

  6. Registered plant list - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods. This entry records the presence or absence of genome analysis methods information in this database, with a link to the genome analysis methods information and to the Plant DB link list on the simple search page.

  7. Comparison of variations detection between whole-genome amplification methods used in single-cell resequencing

    DEFF Research Database (Denmark)

    Hou, Yong; Wu, Kui; Shi, Xulian

    2015-01-01

    methods, focusing particularly on variations detection. Low-coverage whole-genome sequencing revealed that DOP-PCR had the highest duplication ratio, but an even read distribution and the best reproducibility and accuracy for detection of copy-number variations (CNVs). However, MDA had significantly...... performance using SCRS amplified by different WGA methods. It will guide researchers to determine which WGA method is best suited to individual experimental needs at single-cell level....

  8. Evaluation and comparison of mammalian subcellular localization prediction methods

    Directory of Open Access Journals (Sweden)

    Fink J Lynn

    2006-12-01

    Full Text Available Abstract Background Determination of the subcellular location of a protein is essential to understanding its biochemical function. This information can provide insight into the function of hypothetical or novel proteins. These data are difficult to obtain experimentally but have become especially important since many whole genome sequencing projects have been finished and many resulting protein sequences are still lacking detailed functional information. In order to address this paucity of data, many computational prediction methods have been developed. However, these methods have varying levels of accuracy and perform differently based on the sequences that are presented to the underlying algorithm. It is therefore useful to compare these methods and monitor their performance. Results In order to perform a comprehensive survey of prediction methods, we selected only methods that accepted large batches of protein sequences, were publicly available, and were able to predict localization to at least nine of the major subcellular locations (nucleus, cytosol, mitochondrion, extracellular region, plasma membrane, Golgi apparatus, endoplasmic reticulum (ER), peroxisome, and lysosome). The selected methods were CELLO, MultiLoc, Proteome Analyst, pTarget and WoLF PSORT. These methods were evaluated using 3763 mouse proteins from SwissProt that represent the source of the training sets used in development of the individual methods. In addition, an independent evaluation set of 2145 mouse proteins from LOCATE with a bias towards the subcellular localization underrepresented in SwissProt was used. The sensitivity and specificity were calculated for each method and compared to a theoretical value based on what might be observed by random chance. Conclusion No individual method had a sufficient level of sensitivity across both evaluation sets that would enable reliable application to hypothetical proteins. All methods showed lower performance on the LOCATE
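
    As a minimal illustration of the per-location scoring used in such comparisons, the sketch below computes one-vs-rest sensitivity and specificity for each subcellular location from paired true and predicted labels, with the class prior as a rough random-chance reference. The labels are toy data; none of the evaluated predictors is actually called here.

```python
from collections import Counter

locations = ["nucleus", "cytosol", "mitochondrion"]
true_locs = ["nucleus", "nucleus", "cytosol", "mitochondrion", "cytosol", "nucleus"]
pred_locs = ["nucleus", "cytosol", "cytosol", "mitochondrion", "nucleus", "nucleus"]

def sens_spec(truth, preds, loc):
    """One-vs-rest sensitivity and specificity for a single location."""
    tp = sum(t == loc and p == loc for t, p in zip(truth, preds))
    fn = sum(t == loc and p != loc for t, p in zip(truth, preds))
    tn = sum(t != loc and p != loc for t, p in zip(truth, preds))
    fp = sum(t != loc and p == loc for t, p in zip(truth, preds))
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity

priors = Counter(true_locs)
for loc in locations:
    se, sp = sens_spec(true_locs, pred_locs, loc)
    chance = priors[loc] / len(true_locs)    # chance-level sensitivity baseline
    print(f"{loc:14s} sensitivity={se:.2f} specificity={sp:.2f} (chance ~ {chance:.2f})")
```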

  9. Developing a common framework for evaluating the implementation of genomic medicine interventions in clinical care: the IGNITE Network's Common Measures Working Group.

    Science.gov (United States)

    Orlando, Lori A; Sperber, Nina R; Voils, Corrine; Nichols, Marshall; Myers, Rachel A; Wu, R Ryanne; Rakhra-Burris, Tejinder; Levy, Kenneth D; Levy, Mia; Pollin, Toni I; Guan, Yue; Horowitz, Carol R; Ramos, Michelle; Kimmel, Stephen E; McDonough, Caitrin W; Madden, Ebony B; Damschroder, Laura J

    2018-06-01

    Purpose: Implementation research provides a structure for evaluating the clinical integration of genomic medicine interventions. This paper describes the Implementing Genomics in Practice (IGNITE) Network's efforts to promote (i) a broader understanding of genomic medicine implementation research and (ii) the sharing of knowledge generated in the network. Methods: To facilitate this goal, the IGNITE Network Common Measures Working Group (CMG) members adopted the Consolidated Framework for Implementation Research (CFIR) to guide its approach to identifying constructs and measures relevant to evaluating genomic medicine as a whole, standardizing data collection across projects, and combining data in a centralized resource for cross-network analyses. Results: CMG identified 10 high-priority CFIR constructs as important for genomic medicine. Of those, eight did not have standardized measurement instruments. Therefore, we developed four survey tools to address this gap. In addition, we identified seven high-priority constructs related to patients, families, and communities that did not map to CFIR constructs. Both sets of constructs were combined to create a draft genomic medicine implementation model. Conclusion: We developed processes to identify constructs deemed valuable for genomic medicine implementation and codified them in a model. These resources are freely available to facilitate knowledge generation and sharing across the field.

  10. Development of a fluorescence-activated cell sorting method coupled with whole genome amplification to analyze minority and trace Dehalococcoides genomes in microbial communities.

    Science.gov (United States)

    Lee, Patrick K H; Men, Yujie; Wang, Shanquan; He, Jianzhong; Alvarez-Cohen, Lisa

    2015-02-03

    Dehalococcoides mccartyi are functionally important bacteria that catalyze the reductive dechlorination of chlorinated ethenes. However, these anaerobic bacteria are fastidious to isolate, making downstream genomic characterization challenging. In order to facilitate genomic analysis, a fluorescence-activated cell sorting (FACS) method was developed in this study to separate D. mccartyi cells from a microbial community, and the DNA of the isolated cells was processed by whole genome amplification (WGA) and hybridized onto a D. mccartyi microarray for comparative genomics against four sequenced strains. First, FACS was successfully applied to a D. mccartyi isolate as positive control, and then microarray results verified that WGA from 10^6 cells or ∼1 ng of genomic DNA yielded high-quality coverage detecting nearly all genes across the genome. As expected, some inter- and intrasample variability in WGA was observed, but these biases were minimized by performing multiple parallel amplifications. Subsequent application of the FACS and WGA protocols to two enrichment cultures containing ∼10% and ∼1% D. mccartyi cells successfully enabled genomic analysis. As proof of concept, this study demonstrates that coupling FACS with WGA and microarrays is a promising tool to expedite genomic characterization of target strains in environmental communities where the relative concentrations are low.

  11. Rapid methods for the extraction and archiving of molecular grade fungal genomic DNA.

    Science.gov (United States)

    Borman, Andrew M; Palmer, Michael; Johnson, Elizabeth M

    2013-01-01

    The rapid and inexpensive extraction of fungal genomic DNA that is of sufficient quality for molecular approaches is central to the molecular identification, epidemiological analysis, taxonomy, and strain typing of pathogenic fungi. Although many commercially available and in-house extraction procedures do eliminate the majority of contaminants that commonly inhibit molecular approaches, the inherent difficulties in breaking fungal cell walls lead to protocols that are labor intensive and that routinely take several hours to complete. Here we describe several methods that we have developed in our laboratory that allow the extremely rapid and inexpensive preparation of fungal genomic DNA.

  12. Methods to improve genomic prediction and GWAS using combined Holstein populations

    DEFF Research Database (Denmark)

    Li, Xiujin

    The thesis focuses on methods to improve GWAS and genomic prediction using combined Holstein populations and investigates G by E interaction. The conclusions are: 1) Prediction reliabilities for Brazilian Holsteins can be increased by adding Nordic and French genotyped bulls, and a large G by E...... interaction exists between populations. 2) Combining data from Chinese and Danish Holstein populations increases the power of GWAS and detects new QTL regions for milk fatty acid traits. 3) The novel multi-trait Bayesian model efficiently estimates region-specific genomic variances, covariances......

  13. Creating Alternative Methods for Educational Evaluation.

    Science.gov (United States)

    Smith, Nick L.

    1981-01-01

    A project supported by the National Institute of Education is adapting evaluation procedures from areas such as philosophy, geography, operations research, journalism, and film criticism. The need for such methods is reviewed, as is the context in which they function, and their contributions to evaluation methodology. (Author/GK)

  14. Consumer behavior changing: methods of evaluation

    Directory of Open Access Journals (Sweden)

    Elīna Gaile-Sarkane

    2013-11-01

    Full Text Available The article is devoted to methods for analysing consumer buying behavior and to the evaluation of the most important factors that influence it. The research investigates changes in consumer behavior caused by globalization and the development of information technologies, and helps identify the specific factors that should be taken into account when evaluating consumer behavior. The authors employ well-established quantitative and qualitative research methods: grouping, analysis, synthesis, the expert method, statistical methods, etc. The research findings disclosed that it is possible to introduce new methods for the evaluation of changing consumer behavior.

  15. A multi-sample based method for identifying common CNVs in normal human genomic structure using high-resolution aCGH data.

    Directory of Open Access Journals (Sweden)

    Chihyun Park

    Full Text Available BACKGROUND: It is difficult to identify copy number variations (CNVs) in normal human genomic data due to noise and non-linear relationships between different genomic regions and signal intensity. A high-resolution array comparative genomic hybridization (aCGH) containing 42 million probes, which is very large compared to previous arrays, was recently published. Most existing CNV detection algorithms do not work well because of noise associated with the large amount of input data and because most of the current methods were not designed to analyze normal human samples. Normal human genome analysis often requires a joint approach across multiple samples. However, the majority of existing methods can only identify CNVs from a single sample. METHODOLOGY AND PRINCIPAL FINDINGS: We developed a multi-sample-based genomic variations detector (MGVD) that uses segmentation to identify common breakpoints across multiple samples and a k-means-based clustering strategy. Unlike previous methods, MGVD simultaneously considers multiple samples with different genomic intensities and identifies CNVs and CNV zones (CNVZs); a CNVZ is a more precise measure of the location of a genomic variant than the CNV region (CNVR). CONCLUSIONS AND SIGNIFICANCE: We designed a specialized algorithm to detect common CNVs from extremely high-resolution multi-sample aCGH data. MGVD showed high sensitivity and a low false discovery rate for a simulated data set, and outperformed most current methods when real, high-resolution HapMap datasets were analyzed. MGVD also had the fastest runtime compared to the other algorithms evaluated when actual, high-resolution aCGH data were analyzed. The CNVZs identified by MGVD can be used in association studies for revealing relationships between phenotypes and genomic aberrations. Our algorithm was developed with standard C++ and is available in Linux and MS Windows format in the STL library.
It is freely available at: http://embio.yonsei.ac.kr/~Park/mgvd.php.
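
    The two core ideas in the abstract, shared breakpoints across samples followed by k-means labelling of segments, can be sketched roughly as below. This is not the published MGVD implementation: the jump threshold, the three-class labelling, and the toy log-ratio matrix are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_probes, n_samples = 120, 5
X = rng.normal(0.0, 0.05, size=(n_probes, n_samples))   # probes x samples log-ratios
X[40:70] += 0.6          # gain shared by all samples
X[90:110] -= 0.6         # loss shared by all samples

# 1) Common breakpoints: positions where the jump between adjacent probes,
#    averaged over samples, is large.
jump = np.abs(np.diff(X, axis=0)).mean(axis=1)
breaks = [0] + [int(i) + 1 for i in np.nonzero(jump > 0.3)[0]] + [n_probes]

# Per-segment mean intensity, averaged across samples.
seg_means = np.array([X[a:b].mean() for a, b in zip(breaks[:-1], breaks[1:])])

# 2) Tiny k-means (k=3) clustering segments into loss / neutral / gain.
centers = np.array([seg_means.min(), 0.0, seg_means.max()])
for _ in range(20):
    labels = np.argmin(np.abs(seg_means[:, None] - centers[None, :]), axis=1)
    for k in range(3):
        if np.any(labels == k):
            centers[k] = seg_means[labels == k].mean()

names = np.array(["loss", "neutral", "gain"])
for (a, b), lab in zip(zip(breaks[:-1], breaks[1:]), labels):
    print(f"probes {a:3d}-{b:3d}: {names[lab]}")
```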

  16. Evaluation of inbreeding depression in Holstein cattle using whole-genome SNP markers and alternative measures of genomic inbreeding.

    Science.gov (United States)

    Bjelland, D W; Weigel, K A; Vukasinovic, N; Nkrumah, J D

    2013-07-01

    The effects of increased pedigree inbreeding in dairy cattle populations have been well documented and result in a negative impact on profitability. Recent advances in genotyping technology have allowed researchers to move beyond pedigree analysis and study inbreeding at a molecular level. In this study, 5,853 animals were genotyped for 54,001 single nucleotide polymorphisms (SNP); 2,913 cows had phenotypic records including a single lactation for milk yield (from either lactation 1, 2, 3, or 4), reproductive performance, and linear type conformation. After removing SNP with poor call rates, low minor allele frequencies, and departure from Hardy-Weinberg equilibrium, 33,025 SNP remained for analyses. Three measures of genomic inbreeding were evaluated: percent homozygosity (FPH), inbreeding calculated from runs of homozygosity (FROH), and inbreeding derived from a genomic relationship matrix (FGRM). Average FPH was 60.5±1.1%, average FROH was 3.8±2.1%, and average FGRM was 20.8±2.3%, where animals with larger values for each of the genomic inbreeding indices were considered more inbred. Decreases in total milk yield to 205d postpartum of 53, 20, and 47kg per 1% increase in FPH, FROH, and FGRM, respectively, were observed. Increases in days open per 1% increase in FPH (1.76 d), FROH (1.72 d), and FGRM (1.06 d) were also noted, as well as increases in maternal calving difficulty (0.09, 0.03, and 0.04 on a 5-point scale for FPH, FROH, and FGRM, respectively). Several linear type traits, such as strength (-0.40, -0.11, and -0.19), rear legs rear view (-0.35, -0.16, and -0.14), front teat placement (0.35, 0.25, 0.18), and teat length (-0.24, -0.14, and -0.13) were also affected by increases in FPH, FROH, and FGRM, respectively. Overall, increases in each measure of genomic inbreeding in this study were associated with negative effects on production and reproductive ability in dairy cows. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc
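
    Two of the genomic inbreeding measures above can be sketched directly from a 0/1/2-coded genotype matrix: FPH as the percent of homozygous loci, and FROH from runs of homozygosity. The minimum run length and the toy genotypes below are illustrative assumptions, not the study's quality-control rules.

```python
import numpy as np

rng = np.random.default_rng(2)
n_animals, n_snps = 4, 200
G = rng.integers(0, 3, size=(n_animals, n_snps))    # genotypes coded 0/1/2
G[0, 20:80] = 0                                     # plant a long homozygous run

# F_PH: percent of genotyped loci that are homozygous (genotype != 1).
fph = (G != 1).mean(axis=1) * 100.0

def roh_fraction(genos, min_len=25):
    """Fraction of loci lying in homozygous runs of at least min_len SNPs."""
    run, covered = 0, 0
    for homozygous in (genos != 1):
        run = run + 1 if homozygous else 0
        if run == min_len:
            covered += min_len      # count the whole run once it qualifies
        elif run > min_len:
            covered += 1
    return covered / len(genos)

froh = np.array([roh_fraction(G[i]) * 100.0 for i in range(n_animals)])
for i in range(n_animals):
    print(f"animal {i}: F_PH = {fph[i]:5.1f}%   F_ROH = {froh[i]:5.1f}%")
```

FGRM would additionally require a genomic relationship matrix; its diagonal elements minus one give the third inbreeding measure used in the study.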

  17. Nonparametric method for genomics-based prediction of performance of quantitative traits involving epistasis in plant breeding.

    Directory of Open Access Journals (Sweden)

    Xiaochun Sun

    Full Text Available Genomic selection (GS) procedures have proven useful in estimating breeding value and predicting phenotype with genome-wide molecular marker information. However, issues of high dimensionality, multicollinearity, and the inability to deal effectively with epistasis can jeopardize accuracy and predictive ability. We, therefore, propose a new nonparametric method, pRKHS, which combines the features of supervised principal component analysis (SPCA) and reproducing kernel Hilbert spaces (RKHS) regression, with versions for traits with no/low epistasis, pRKHS-NE, to high epistasis, pRKHS-E. Instead of assigning a specific relationship to represent the underlying epistasis, the method maps genotype to phenotype in a nonparametric way, thus requiring fewer genetic assumptions. SPCA decreases the number of markers needed for prediction by filtering out low-signal markers with the optimal marker set determined by cross-validation. Principal components are computed from the reduced marker matrix (called supervised principal components, SPC) and included in the smoothing spline ANOVA model as independent variables to fit the data. The new method was evaluated in comparison with current popular methods for practicing GS, specifically RR-BLUP, BayesA, BayesB, as well as a newer method by Crossa et al., RKHS-M, using both simulated and real data. Results demonstrate that pRKHS generally delivers greater predictive ability, particularly when epistasis impacts trait expression. Beyond prediction, the new method also facilitates inferences about the extent to which epistasis influences trait expression.

  18. Nonparametric method for genomics-based prediction of performance of quantitative traits involving epistasis in plant breeding.

    Science.gov (United States)

    Sun, Xiaochun; Ma, Ping; Mumm, Rita H

    2012-01-01

    Genomic selection (GS) procedures have proven useful in estimating breeding value and predicting phenotype with genome-wide molecular marker information. However, issues of high dimensionality, multicollinearity, and the inability to deal effectively with epistasis can jeopardize accuracy and predictive ability. We, therefore, propose a new nonparametric method, pRKHS, which combines the features of supervised principal component analysis (SPCA) and reproducing kernel Hilbert spaces (RKHS) regression, with versions for traits with no/low epistasis, pRKHS-NE, to high epistasis, pRKHS-E. Instead of assigning a specific relationship to represent the underlying epistasis, the method maps genotype to phenotype in a nonparametric way, thus requiring fewer genetic assumptions. SPCA decreases the number of markers needed for prediction by filtering out low-signal markers with the optimal marker set determined by cross-validation. Principal components are computed from reduced marker matrix (called supervised principal components, SPC) and included in the smoothing spline ANOVA model as independent variables to fit the data. The new method was evaluated in comparison with current popular methods for practicing GS, specifically RR-BLUP, BayesA, BayesB, as well as a newer method by Crossa et al., RKHS-M, using both simulated and real data. Results demonstrate that pRKHS generally delivers greater predictive ability, particularly when epistasis impacts trait expression. Beyond prediction, the new method also facilitates inferences about the extent to which epistasis influences trait expression.
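
    A rough sketch of the two-stage idea, supervised marker screening plus PCA followed by Gaussian-kernel ridge regression (an RKHS regression), is given below. It is not the authors' pRKHS code: screening by simple univariate correlation, the numbers of retained markers and components, the kernel bandwidth, and the ridge parameter are all illustrative assumptions, and the reported correlation is an in-sample fit rather than cross-validated accuracy.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 80, 300
M = rng.integers(0, 3, size=(n, p)).astype(float)   # marker matrix (0/1/2)
beta = np.zeros(p); beta[:10] = 1.0                 # 10 causal markers
y = M @ beta + rng.normal(0.0, 0.5, n)

# (1) SPCA: screen markers by |correlation| with y, then take top components.
cors = np.array([np.corrcoef(M[:, j], y)[0, 1] for j in range(p)])
keep = np.argsort(-np.abs(cors))[:30]
Mk = M[:, keep] - M[:, keep].mean(axis=0)
U, s, _ = np.linalg.svd(Mk, full_matrices=False)
SPC = U[:, :5] * s[:5]                              # supervised principal components

# (2) RKHS regression: Gaussian-kernel ridge on the components.
def gauss_kernel(A, B, h=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h**2 * A.shape[1]))

K = gauss_kernel(SPC, SPC)
alpha = np.linalg.solve(K + 0.1 * np.eye(n), y)     # ridge penalty 0.1
y_hat = K @ alpha

r = float(np.corrcoef(y, y_hat)[0, 1])
print(f"in-sample predictive correlation: {r:.2f}")
```

In the paper the second stage is a smoothing spline ANOVA model and the screening cutoff is chosen by cross-validation; the kernel ridge step here is a simpler stand-in for the RKHS regression.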

  19. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    Science.gov (United States)

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
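
    The positive semidefiniteness requirement stated above is easy to check numerically: build a kernel matrix over all pairs of subjects and inspect its smallest eigenvalue. The linear and Gaussian kernels below are standard textbook examples applied to toy genotype data, not kernels taken from the reviewed paper.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.integers(0, 3, size=(6, 40)).astype(float)   # 6 subjects x 40 markers
Xc = X - X.mean(axis=0)

K_linear = Xc @ Xc.T / Xc.shape[1]                   # similarity-type kernel

d2 = ((Xc[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
K_gauss = np.exp(-d2 / d2.mean())                    # distance-based kernel

for name, K in [("linear", K_linear), ("gaussian", K_gauss)]:
    min_eig = np.linalg.eigvalsh(K).min()
    print(f"{name:8s} min eigenvalue = {min_eig:+.2e}  PSD: {min_eig > -1e-9}")
```

Both matrices are positive semidefinite by construction (a Gram matrix and a Gaussian radial kernel), which is what makes them usable in the mixed models, score statistics, and support vector machines listed in the review.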

  20. Evaluation of winter pothole patching methods.

    Science.gov (United States)

    2014-01-01

    The main objective of this study was to evaluate the performance and cost-effectiveness of the tow-behind combination infrared asphalt heater/reclaimer patching method and compare it to the throw-and-roll and spray-injection methods. To achieve t...

  1. Evaluation of registration methods on thoracic CT

    DEFF Research Database (Denmark)

    Murphy, K.; van Ginneken, B.; Reinhardt, J.

    2011-01-01

    EMPIRE10 (Evaluation of Methods for Pulmonary Image REgistration 2010) is a public platform for fair and meaningful comparison of registration algorithms applied to a database of intra-patient thoracic CT image pairs. Evaluation of non-rigid registration techniques is a non-trivial task: registration is performed by each participant with their own method, and the evaluation is independent, using the same criteria for all participants. This article details the organisation of the challenge, the data and evaluation methods, and the outcome of the initial launch with 20 algorithms. All results are published on the EMPIRE10 website (http://empire10.isi.uu.nl). The challenge remains ongoing and open to new participants; full results from 24 algorithms had been published at the time of writing. The gain in knowledge and future work are discussed.

  2. Success tree method of resources evaluation

    International Nuclear Information System (INIS)

    Chen Qinglan; Sun Wenpeng

    1994-01-01

    By applying reliability theory from systems engineering, the success tree method transfers an expert's recognition of metallogenetic regularities into the form of a success tree. The aim of resources evaluation is achieved by calculating the metallogenetic probability or favorability of the top event of the success tree. This article introduces in detail the source and principle of the success tree method and three kinds of calculation methods, and explains concretely how to establish the success tree of comprehensive uranium metallogenesis as well as the procedure by which the resources evaluation is performed. Because this method places no restrictions on the number of known deposits or the size of the calculated area, it is applicable to resources evaluation for different mineral species, types and scales and possesses good prospects for development.
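
    Assuming independent basic events, the top-event calculation mentioned above works like fault-tree evaluation with the gate roles made explicit: an AND gate multiplies probabilities, and an OR gate combines them as one minus the product of complements. The event structure and probabilities below are invented for illustration; they are not the article's uranium metallogenesis tree.

```python
def and_gate(*ps):
    """AND gate: all sub-events must occur (independence assumed)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def or_gate(*ps):
    """OR gate: at least one sub-event occurs (independence assumed)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical favourability tree: the top event needs a favourable source
# AND a trap, where the trap may come from either of two structural settings.
p_source, p_struct_a, p_struct_b = 0.7, 0.4, 0.5
p_top = and_gate(p_source, or_gate(p_struct_a, p_struct_b))
print(f"favourability of the top event: {p_top:.2f}")   # 0.7 * (1 - 0.6*0.5) = 0.49
```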

  3. Calculation of evolutionary correlation between individual genes and full-length genome: a method useful for choosing phylogenetic markers for molecular epidemiology.

    Directory of Open Access Journals (Sweden)

    Shuai Wang

    Full Text Available Individual genes or regions are still commonly used to estimate the phylogenetic relationships among viral isolates. Genomic regions that can faithfully provide assessments consistent with those predicted from full-length genome sequences would be good candidates for phylogenetic markers in molecular epidemiological studies of many viruses. Here we employed a statistical method to evaluate the evolutionary relationships between individual viral genes and full-length genomes without tree construction, as a way to determine which genes match the genome well in phylogenetic analyses. The method calculates linear correlations between the genetic distance matrices of aligned individual gene sequences and aligned genome sequences. We applied this method to the phylogenetic analyses of porcine circovirus 2 (PCV2), measles virus (MV), hepatitis E virus (HEV) and Japanese encephalitis virus (JEV). Phylogenetic trees were constructed for comparison, and the possible factors affecting the method's accuracy were also discussed. The results revealed that this method could produce results consistent with those of previous studies on which sequences can successfully serve as phylogenetic markers. Our results also suggested that these evolutionary correlations could provide useful information for identifying genes that can be used effectively to infer genetic relationships.
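
    The correlation step described above can be sketched without any tree construction: compute a pairwise genetic distance matrix from the aligned gene sequences, another from the aligned genome sequences, and correlate their upper triangles. The sequences below are toy strings, and p-distance (fraction of mismatched sites) stands in for whatever distance measure is actually used.

```python
import numpy as np

def p_distance(a, b):
    """Fraction of mismatched sites between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def distance_matrix(seqs):
    n = len(seqs)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = p_distance(seqs[i], seqs[j])
    return D

# Toy aligned sequences: a "gene" region and a "genome" that contains it.
genes   = ["ACGTACGT", "ACGTACGA", "ACGAACGA", "TCGAACGA"]
genomes = [g + "ACCGGTTA" for g in genes]       # gene plus a conserved region

Dg, DG = distance_matrix(genes), distance_matrix(genomes)

# Linear correlation over the upper triangles of the two distance matrices.
iu = np.triu_indices(len(genes), k=1)
r = float(np.corrcoef(Dg[iu], DG[iu])[0, 1])
print(f"gene-vs-genome distance correlation: r = {r:.3f}")
```

With this toy data r is exactly 1, because the appended region is identical across genomes and simply halves every pairwise distance; real alignments give intermediate correlations, and the genes with the highest such correlation are the better marker candidates.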

  4. Evaluating genome-wide DNA methylation changes in mice by Methylation Specific Digital Karyotyping

    Directory of Open Access Journals (Sweden)

    Maruoka Shuichiro

    2008-12-01

    Full Text Available Abstract Background The study of genome-wide DNA methylation changes has become more accessible with the development of various array-based technologies, though when studying species other than human the choice of applications is limited and not always within reach. In this study, we adapted and tested the applicability of Methylation Specific Digital Karyotyping (MSDK), a non-array-based method, for the prospective analysis of epigenetic changes after perinatal nutritional modifications in a mouse model of allergic airway disease. MSDK is a sequence-based method that allows comprehensive and unbiased methylation profiling. The method generates sequence tags 21 base pairs long derived from specific locations in the genome. The resulting tag frequencies determine in a quantitative manner the methylation level of the corresponding loci. Results Genomic DNA from whole lung was isolated and subjected to MSDK analysis using the methylation-sensitive enzyme Not I as the mapping enzyme and Nla III as the fragmenting enzyme. In a pairwise comparison of the generated mouse MSDK libraries we identified 158 loci that are significantly differentially methylated (P-value = 0.05) after perinatal dietary changes in our mouse model. Quantitative methylation-specific PCR and sequence analysis of bisulfite-modified genomic DNA confirmed changes in methylation at specific loci. Differences in genomic MSDK tag counts for a selected set of genes correlated well with changes in transcription levels as measured by real-time PCR. Furthermore, serial analysis of gene expression profiling demonstrated a dramatic difference in expressed transcripts in mice exposed to perinatal nutritional changes. Conclusion The genome-wide methylation survey applied in this study allowed for an unbiased methylation profiling, revealing subtle changes in DNA methylation in mice maternally exposed to dietary changes in methyl-donor content.
The MSDK method is applicable for mouse models

  5. Should we use the single nucleotide polymorphism linked to in genomic evaluation of French trotter?

    Science.gov (United States)

    Brard, S; Ricard, A

    2015-10-01

    An A/C mutation responsible for the ability to pace in horses was recently discovered in the gene. It has also been proven that allele C has a negative effect on trotters' performances. However, in French trotters (FT), the frequency of allele A is only 77% due to an unexpected positive effect of allele C in late-career FT performances. Here we set out to ascertain whether the genotype at SNP (linked to ) should be used to compute EBV for FT. We used the genotypes of 630 horses, with 41,711 SNP retained. The pedigree comprised 5,699 horses. Qualification status (trotters need to complete a 2,000-m race within a limited time to begin their career) and earnings at different ages were precorrected for fixed effects and evaluated with a multitrait model. Estimated breeding values were computed with and without the genotype at SNP as a fixed effect in the model. The analyses were performed using pedigree only via BLUP and using the genotypes via genomic BLUP (GBLUP). The genotype at SNP was removed from the file of genotypes when already taken into account as a fixed effect. Alternatively, 3 groups of 100 candidates were used for validation. Validations were also performed on 50 random-clustered groups of 126 candidates and compared against the results of the 3 disjoint sets. For performances on which has a minor effect, the coefficients of correlation were not improved when the genotype at SNP was a fixed effect in the model (earnings at 3 and 4 yr). However, for traits proven strongly related to , the accuracy of evaluation was improved, increasing +0.17 for earnings at 2 yr, +0.04 for earnings at 5 yr and older, and +0.09 for qualification status (with the GBLUP method). For all traits, the bias was reduced when the SNP linked to was a fixed effect in the model. This work finds a clear rationale for using the genotype at for this multitrait evaluation. Genomic selection seemed to achieve better results than classic selection.

  6. Methods for evaluation of industry training programs

    International Nuclear Information System (INIS)

    Morisseau, D.S.; Roe, M.L.; Persensky, J.J.

    1987-01-01

    The NRC Policy Statement on Training and Qualification endorses the INPO-managed Training Accreditation Program in that it encompasses the elements of effective performance-based training. Those elements are: analysis of the job, performance-based learning objectives, training design and implementation, trainee evaluation, and program evaluation. As part of the NRC independent evaluation of utilities implementation of training improvement programs, the staff developed training review criteria and procedures that address all five elements of effective performance-based training. The staff uses these criteria to perform reviews of utility training programs that have already received accreditation. Although no performance-based training program can be said to be complete unless all five elements are in place, the last two, trainee and program evaluation, are perhaps the most important because they determine how well the first three elements have been implemented and ensure the dynamic nature of training. This paper discusses the evaluation elements of the NRC training review criteria. The discussion will detail the elements of evaluation methods and techniques that the staff expects to find as integral parts of performance-based training programs at accredited utilities. Further, the review of the effectiveness of implementation of the evaluation methods is discussed. The paper also addresses some of the qualitative differences between what is minimally acceptable and what is most desirable with respect to trainee and program evaluation mechanisms and their implementation

  7. Evaluation of whole genome sequencing for outbreak detection of Salmonella enterica

    DEFF Research Database (Denmark)

    Leekitcharoenphon, Pimlapas; Nielsen, Eva M.; Kaas, Rolf Sommer

    2014-01-01

    Salmonella enterica is a common cause of minor and large foodborne outbreaks. To achieve successful and nearly ‘real-time’ monitoring and identification of outbreaks, reliable sub-typing is essential. Whole genome sequencing (WGS) shows great promise for use as a routine epidemiological typing....... Enteritidis and 5 S. Derby were also sequenced and used for comparison. A number of different bioinformatics approaches were applied to the data, including pan-genome tree, k-mer tree, nucleotide difference tree and SNP tree. The outcome of each approach was evaluated in relation to the association...... of the isolates to specific outbreaks. The pan-genome tree clustered 65% of the S. Typhimurium isolates according to the pre-defined epidemiology, the k-mer tree 88%, the nucleotide difference tree 100% and the SNP tree 100% of the strains within S. Typhimurium. The resulting outcome of the four phylogenetic...
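
    One of the approaches compared above, the k-mer tree, rests on a k-mer distance between sequences; a minimal version is sketched below using Jaccard distance between k-mer sets. The sequences are short toy fragments, not Salmonella genomes, and real pipelines typically use k-mer counts and much longer k-mers.

```python
from collections import Counter
from itertools import combinations

def kmer_profile(seq, k=4):
    """Multiset of overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_distance(a, b, k=4):
    """Jaccard distance between the k-mer sets of two sequences."""
    pa, pb = set(kmer_profile(a, k)), set(kmer_profile(b, k))
    return 1.0 - len(pa & pb) / len(pa | pb)

outbreak_a  = "ACGGTTACGGATCCGTAGGCTA"
outbreak_a2 = "ACGGTTACGGATCCGTAGGCTT"   # one substitution from outbreak_a
unrelated   = "TTTTGCAGCAGCAGTTTACCAG"

strains = {"A1": outbreak_a, "A2": outbreak_a2, "B": unrelated}
for (n1, s1), (n2, s2) in combinations(strains.items(), 2):
    print(f"{n1}-{n2}: k-mer distance = {kmer_distance(s1, s2):.2f}")
```

A distance matrix built this way over all isolates is what a neighbour-joining or hierarchical clustering step would turn into the k-mer tree evaluated in the study.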

  8. Comparative Analysis of the Genomic DNA Isolation Methods on Inula sp. (Asteraceae)

    Directory of Open Access Journals (Sweden)

    Emre SEVİNDİK

    2016-12-01

    Full Text Available Simple, fast, low-cost and high-throughput protocols are required for DNA isolation from plant species. In this study, phenol-chloroform-isoamyl alcohol and commercial (Sigma) DNA isolation kit methods were applied to some Inula species belonging to the Asteraceae family. Genomic DNA amounts, A260, A280, A260/A230 and purity degrees (A260/A280) obtained through both methods were measured by electrophoresis and spectrophotometry. Additionally, PCR amplification was performed with primer pairs specific to the nrDNA ITS, cpDNA ndhF (972F-1603R) and trnL-F regions. Results showed that the maximum amount of genomic DNA in nanograms was obtained by the phenol-chloroform-isoamyl alcohol method. The study also revealed that I. macrocephala had the maximum and I. heterolepis the minimum DNA amount. A260/A280 purity degrees showed that the highest and lowest purity among gDNAs obtained through the phenol-chloroform-isoamyl alcohol method were found in I. aucheriana and I. salicina, respectively. The highest and lowest purity degrees of gDNAs obtained through the commercial kit were observed in the I. fragilis and I. macrocephala samples, respectively. PCR amplification results showed that band profiles for all three regions (ITS, trnL-F and ndhF) did not yield positive results from DNA prepared by the phenol-chloroform-isoamyl alcohol method, whereas PCR band profiles obtained with the commercial kit yielded positive results. As a result, although the maximum amount of genomic DNA was obtained through the phenol-chloroform-isoamyl alcohol method, the commercial kit proved more efficient for PCR.

  9. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    to point distance. T-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed-rank test). The performance influence of sampling density, sampling quantity, and norms is analyzed using a similar method.......We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against ICP, considered the gold standard for automated rigid registration. Furthermore

  10. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation networks is proposed, which reflects the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, an efficiency-oriented importance measure for network components is presented, which can help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the proposed method portrays the real operating situation of the transportation network as well as the effects of the main factors on network efficiency. We also find that the network efficiency and the importance values of the network components are both functions of demands and network structure in the transportation network.
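The abstract does not reproduce the paper's redefined efficiency measure, but the underlying idea, efficiency falling as travel costs rise, is captured by the widely used Latora-Marchiori global efficiency: the average of inverse shortest-path costs over all node pairs. A self-contained sketch on a toy network (graph and weights invented):

```python
import heapq

def shortest_paths(graph, src):
    """Dijkstra over a dict-of-dicts weighted graph; returns costs from src."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def global_efficiency(graph):
    """Mean of 1/cost over all ordered node pairs (unreachable pairs add 0)."""
    nodes = list(graph)
    n = len(nodes)
    total = 0.0
    for u in nodes:
        dist = shortest_paths(graph, u)
        for v in nodes:
            if v != u and v in dist and dist[v] > 0:
                total += 1.0 / dist[v]
    return total / (n * (n - 1))

# Triangle with unit travel costs: every pair one unit apart -> efficiency 1.
net = {"A": {"B": 1, "C": 1}, "B": {"A": 1, "C": 1}, "C": {"A": 1, "B": 1}}
assert global_efficiency(net) == 1.0
```

An importance measure for a link can then be obtained by recomputing the efficiency with that link removed and taking the drop.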

  11. Genomic Analysis of a Marine Bacterium: Bioinformatics for Comparison, Evaluation, and Interpretation of DNA Sequences

    Directory of Open Access Journals (Sweden)

    Bhagwan N. Rekadwad

    2016-01-01

    Full Text Available A total of five highly related strains of an unidentified marine bacterium were analyzed through their short genome sequences (AM260709–AM260713). Genome-to-Genome Distance (GGDC) showed high similarity to Pseudoalteromonas haloplanktis (X67024). The generated unique Quick Response (QR) codes indicated no identity to other microbial species or gene sequences. Chaos Game Representation (CGR) showed the concentration of bases in particular areas of the plot; guanine residues were the most numerous, followed by cytosine. Frequency of Chaos Game Representation (FCGR) indicated that CC and GG blocks have higher frequency in the sequences of the evaluated marine bacterium strains. Maximum GC content for the marine bacterium strains ranged from 53% to 54%. The use of QR codes, CGR, FCGR, and the GC dataset helped in identifying and interpreting short genome sequences from specific isolates. A phylogenetic tree was constructed with the bootstrap test (1,000 replicates) using MEGA6 software. Principal Component Analysis (PCA) was carried out using the EMBL-EBI MUSCLE program. Thus, the generated genomic data are of great assistance for hierarchical classification in bacterial systematics, which, combined with phenotypic features, represents a basic procedure for a polyphasic approach to unambiguous taxonomic classification of bacterial isolates.
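Two of the sequence statistics used above, GC content and FCGR-style k-mer frequencies, are straightforward to compute. A minimal sketch at k=2 on a toy sequence (an actual FCGR additionally arranges the counts geometrically in the CGR square):

```python
from collections import Counter

def gc_content(seq):
    """Fraction of G and C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def dinucleotide_freqs(seq):
    """Frequencies of overlapping 2-mers: the counts behind an FCGR at k=2."""
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    counts = Counter(pairs)
    total = len(pairs)
    return {p: c / total for p, c in counts.items()}

seq = "GGCCGGCCAATT"
assert round(gc_content(seq), 2) == 0.67   # 8 of 12 bases are G or C
freqs = dinucleotide_freqs(seq)
assert freqs["GG"] > freqs["AA"]           # GG blocks dominate this toy sequence
```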

  12. A method for accurate detection of genomic microdeletions using real-time quantitative PCR

    Directory of Open Access Journals (Sweden)

    Bassett Anne S

    2005-12-01

    Full Text Available Abstract Background Quantitative Polymerase Chain Reaction (qPCR) is a well-established method for quantifying levels of gene expression, but has not been routinely applied to the detection of constitutional copy number alterations of human genomic DNA. Microdeletions or microduplications of the human genome are associated with a variety of genetic disorders. Although clinical laboratories routinely use fluorescence in situ hybridization (FISH) to identify such cryptic genomic alterations, there remains a significant number of individuals in whom constitutional genomic imbalance is suspected, based on clinical parameters, but cannot be readily detected using current cytogenetic techniques. Results In this study, a novel application of real-time qPCR is presented that can be used to reproducibly detect chromosomal microdeletions and microduplications. This approach was applied to DNA from a series of patient samples and controls to validate genomic copy number alteration at cytoband 22q11. The study group comprised 12 patients with clinical symptoms of chromosome 22q11 deletion syndrome (22q11DS), 1 patient trisomic for 22q11 and 4 normal controls. 6 of the patients (group 1) had known hemizygous deletions, as detected by standard diagnostic FISH, whilst the remaining 6 patients (group 2) were classified as 22q11DS negative using the clinical FISH assay. Screening of the patients and controls with a set of 10 real-time qPCR primers, spanning the 22q11.2-deleted region and flanking sequence, confirmed the FISH assay results for all patients with 100% concordance. Moreover, this qPCR enabled a refinement of the region of deletion at 22q11. Analysis of DNA from the chromosome 22 trisomic sample demonstrated genomic duplication within 22q11. Conclusion In this paper we present a qPCR approach for the detection of chromosomal microdeletions and microduplications. The strategic use of in silico modelling for qPCR primer design to avoid regions of repetitive
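The abstract does not give the authors' exact quantification model, but qPCR copy number estimation is commonly done with the comparative Ct (2^-ddCt) calculation: a hemizygous deletion has half the template, so the target locus crosses the detection threshold about one cycle later than in a two-copy control. A minimal sketch (Ct values invented):

```python
def relative_copy_number(ct_target_sample, ct_ref_sample,
                         ct_target_control, ct_ref_control):
    """Comparative Ct: target copy number in a sample relative to a normal
    (two-copy) control, normalised to a reference locus assumed diploid
    in both. Assumes ~100% amplification efficiency."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_sample - d_ct_control)

# Target amplifies one cycle later in the patient than in the control:
# ratio ~0.5, i.e. one copy instead of two (a hemizygous microdeletion).
ratio = relative_copy_number(26.0, 24.0, 25.0, 24.0)
assert abs(ratio - 0.5) < 1e-9
```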

  13. "System evaluates system": method for evaluating the efficiency of IS

    Directory of Open Access Journals (Sweden)

    Dita Blazkova

    2016-10-01

    Full Text Available In this paper I deal with a possible approach to evaluating the efficiency of information systems in companies. Many of the existing methods used to assess the efficiency of information systems depend on subjective responses from users, which may distort the resulting evaluation. Therefore, I propose a method that eliminates the subjective opinion of the user as the primary data source. An application, which I suggest as part of the method, collects the relevant data. In this paper I describe the application in detail. It is a companion program that runs in parallel with any system under evaluation. The program automatically collects data for the evaluation, including mainly timing data, mouse cursor positions, screen captures, i-grams, etc. I propose a method for evaluating these data that identifies the degree of user-friendliness of the information system. Thus, the output of the method is a conclusion as to whether users who work with the information system can work with it effectively.

  14. A quantitative comparison of single-cell whole genome amplification methods.

    Directory of Open Access Journals (Sweden)

    Charles F A de Bourcy

    Full Text Available Single-cell sequencing is emerging as an important tool for studies of genomic heterogeneity. Whole genome amplification (WGA) is a key step in single-cell sequencing workflows and a multitude of methods have been introduced. Here, we compare three state-of-the-art methods on both bulk and single-cell samples of E. coli DNA: Multiple Displacement Amplification (MDA), Multiple Annealing and Looping Based Amplification Cycles (MALBAC), and the PicoPLEX single-cell WGA kit (NEB-WGA). We considered the effects of reaction gain on coverage uniformity, error rates and the level of background contamination. We compared the suitability of the different WGA methods for the detection of copy-number variations, for the detection of single-nucleotide polymorphisms and for de novo genome assembly. No single method performed best across all criteria and significant differences in characteristics were observed; the choice of which amplifier to use will depend strongly on the details of the type of question being asked in any given experiment.
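Coverage uniformity, one of the comparison criteria above, is often summarized by the coefficient of variation of per-base depth: the lower the CV, the more even the amplification. A minimal sketch (the depth vectors are invented; real comparisons also use Lorenz curves and power spectra):

```python
from statistics import mean, pstdev

def coverage_cv(depths):
    """Coefficient of variation of per-base depth: lower = more uniform WGA."""
    m = mean(depths)
    return pstdev(depths) / m

uniform = [30, 31, 29, 30, 30, 30]   # even coverage
biased = [5, 90, 2, 60, 1, 22]       # amplification-biased coverage
assert coverage_cv(uniform) < coverage_cv(biased)
```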

  15. Use of simulated data sets to evaluate the fidelity of metagenomic processing methods

    Energy Technology Data Exchange (ETDEWEB)

    Mavromatis, Konstantinos; Ivanova, Natalia; Barry, Kerri; Shapiro, Harris; Goltsman, Eugene; McHardy, Alice C.; Rigoutsos, Isidore; Salamov, Asaf; Korzeniewski, Frank; Land, Miriam; Lapidus, Alla; Grigoriev, Igor; Richardson, Paul; Hugenholtz, Philip; Kyrpides, Nikos C.

    2006-12-01

    Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene-finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (BLAST hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.
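The construction of such simulated data sets can be sketched as weighted sampling of fixed-length fragments from isolate genomes. The toy genomes and abundances below are invented, and the actual study sampled real sequencing reads (with their error profiles) rather than error-free substrings:

```python
import random

def simulate_metagenome(genomes, n_reads, read_len, abundances, seed=0):
    """Draw labelled reads from isolate genomes in proportion to the chosen
    community abundances, mimicking a simulated metagenomic data set."""
    rng = random.Random(seed)
    names = list(genomes)
    weights = [abundances[n] for n in names]
    reads = []
    for _ in range(n_reads):
        name = rng.choices(names, weights=weights)[0]
        g = genomes[name]
        start = rng.randrange(len(g) - read_len + 1)
        reads.append((name, g[start:start + read_len]))
    return reads

genomes = {"orgA": "ACGT" * 50, "orgB": "GGCC" * 50}
reads = simulate_metagenome(genomes, n_reads=100, read_len=20,
                            abundances={"orgA": 0.8, "orgB": 0.2})
assert len(reads) == 100
assert all(len(seq) == 20 for _, seq in reads)
```

Keeping the source label with each read is what later allows binning and assembly fidelity to be scored against the known answer.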

  16. Use of simulated data sets to evaluate the fidelity of metagenomic processing methods

    Energy Technology Data Exchange (ETDEWEB)

    Mavromatis, K [U.S. Department of Energy, Joint Genome Institute; Ivanova, N [U.S. Department of Energy, Joint Genome Institute; Barry, Kerrie [U.S. Department of Energy, Joint Genome Institute; Shapiro, Harris [U.S. Department of Energy, Joint Genome Institute; Goltsman, Eugene [U.S. Department of Energy, Joint Genome Institute; McHardy, Alice C. [IBM T. J. Watson Research Center; Rigoutsos, Isidore [IBM T. J. Watson Research Center; Salamov, Asaf [U.S. Department of Energy, Joint Genome Institute; Korzeniewski, Frank [U.S. Department of Energy, Joint Genome Institute; Land, Miriam L [ORNL; Lapidus, Alla L. [U.S. Department of Energy, Joint Genome Institute; Grigoriev, Igor [U.S. Department of Energy, Joint Genome Institute; Hugenholtz, Philip [U.S. Department of Energy, Joint Genome Institute; Kyrpides, Nikos C [U.S. Department of Energy, Joint Genome Institute

    2007-01-01

    Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene-finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (BLAST hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.

  17. Research into real-option evaluation method

    International Nuclear Information System (INIS)

    Shiba, Tsuyoshi; Wakamatsu, Hitoshi

    2002-03-01

    As a method for valuing a corporation, an investment project, research and development, or an enterprise strategy, real option analysis is attracting attention as an alternative to the conventional Discounted Cash Flow (DCF) method, because it adapts option-valuation techniques from financial engineering to decision-making processes that respond to changes in the investment environment. Related references, analysis tools, and application examples of decision-making with real option analysis were surveyed, and their application to research and development decisions at the Japan Nuclear Cycle Development Institute was considered. Because real option analysis assumes that business conditions, and the business itself, change over time, it is well suited to evaluating research and development whose business conditions are uncertain and whose management is highly flexible. It is also suitable for evaluating capital-intensive investments such as power plants. (author)
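The report does not give formulas, so the sketch below is a generic illustration of the technique it discusses: valuing the option to defer an investment with a Cox-Ross-Rubinstein binomial lattice, the standard financial-engineering tool adapted by real option analysis. All parameter values are invented:

```python
import math

def real_option_value(v0, cost, r, sigma, years, steps):
    """Value the option to invest (a call on project value V) on a
    Cox-Ross-Rubinstein binomial lattice with risk-neutral rollback."""
    dt = years / steps
    u = math.exp(sigma * math.sqrt(dt))      # up factor per step
    d = 1.0 / u                              # down factor
    disc = math.exp(-r * dt)                 # per-step discount
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    # Terminal payoffs: max(project value - investment cost, 0).
    values = [max(v0 * u**j * d**(steps - j) - cost, 0.0)
              for j in range(steps + 1)]
    # Roll back through the lattice to today.
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

flexible = real_option_value(v0=100, cost=110, r=0.05, sigma=0.4, years=3, steps=50)
static_npv = 100 - 110 * math.exp(-0.05 * 3)
assert flexible > max(static_npv, 0.0)   # flexibility adds value over static DCF
```

This is exactly the contrast drawn in the report: DCF values the project as if the decision were made today, while the option value rewards the ability to wait under uncertainty.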

  18. Veterinary and human vaccine evaluation methods

    Science.gov (United States)

    Knight-Jones, T. J. D.; Edmond, K.; Gubbins, S.; Paton, D. J.

    2014-01-01

    Despite the universal importance of vaccines, approaches to human and veterinary vaccine evaluation differ markedly. For human vaccines, vaccine efficacy is the proportion of vaccinated individuals protected by the vaccine against a defined outcome under ideal conditions, whereas for veterinary vaccines the term is used for a range of measures of vaccine protection. The evaluation of vaccine effectiveness, vaccine protection assessed under routine programme conditions, is largely limited to human vaccines. Challenge studies under controlled conditions and sero-conversion studies are widely used when evaluating veterinary vaccines, whereas human vaccines are generally evaluated in terms of protection against natural challenge assessed in trials or post-marketing observational studies. Although challenge studies provide a standardized platform on which to compare different vaccines, they do not capture the variation that occurs under field conditions. Field studies of vaccine effectiveness are needed to assess the performance of a vaccination programme. However, if vaccination is performed without central co-ordination, as is often the case for veterinary vaccines, evaluation will be limited. This paper reviews approaches to veterinary vaccine evaluation in comparison to evaluation methods used for human vaccines. Foot-and-mouth disease has been used to illustrate the veterinary approach. Recommendations are made for standardization of terminology and for rigorous evaluation of veterinary vaccines. PMID:24741009
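The human vaccine efficacy definition above is usually estimated as one minus the relative risk, i.e. one minus the ratio of attack rates in vaccinated and unvaccinated groups. A minimal sketch (the trial counts are invented for illustration):

```python
def vaccine_efficacy(cases_vacc, n_vacc, cases_unvacc, n_unvacc):
    """VE = 1 - (attack rate in vaccinated / attack rate in unvaccinated)."""
    arv = cases_vacc / n_vacc
    aru = cases_unvacc / n_unvacc
    return 1.0 - arv / aru

# 10 cases among 1,000 vaccinated vs 50 among 1,000 unvaccinated -> VE of 80%.
ve = vaccine_efficacy(10, 1000, 50, 1000)
assert abs(ve - 0.8) < 1e-12
```

The same arithmetic with field rather than trial data gives vaccine effectiveness, the routine-conditions measure the review contrasts with efficacy.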

  19. Systematic differences in the response of genetic variation to pedigree and genome-based selection methods.

    Science.gov (United States)

    Heidaritabar, M; Vereijken, A; Muir, W M; Meuwissen, T; Cheng, H; Megens, H-J; Groenen, M A M; Bastiaansen, J W M

    2014-12-01

    Genomic selection (GS) is a DNA-based method of selecting for quantitative traits in animal and plant breeding, and offers a potentially superior alternative to traditional breeding methods that rely on pedigree and phenotype information. Using a 60K SNP chip with markers spaced throughout the entire chicken genome, we compared the impact of GS and traditional BLUP (best linear unbiased prediction) selection methods applied side-by-side in three different lines of egg-laying chickens. Differences were demonstrated between methods, both in the level and in the genomic distribution of allele frequency changes. In all three lines, the average allele frequency changes were larger with GS, 0.056, 0.064 and 0.066, compared with BLUP, 0.044, 0.045 and 0.036 for lines B1, B2 and W1, respectively. With BLUP, 35 selected regions (empirical P selected regions were identified. Empirical thresholds for local allele frequency changes were determined from gene dropping, and differed considerably between GS (0.167-0.198) and BLUP (0.105-0.126). Between lines, the genomic regions with large changes in allele frequencies showed limited overlap. Our results show that GS applies selection pressure much more locally than BLUP, resulting in larger allele frequency changes. With these results, novel insights into the nature of selection on quantitative traits have been gained, and important questions regarding the long-term impact of GS are raised. The rapid changes to one part of the genetic architecture, while another part may not be selected, at least in the short term, require careful consideration, especially when selection occurs before phenotypes are observed.
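Gene dropping, used above to derive empirical thresholds, simulates neutral allele transmission and takes a high quantile of the resulting frequency changes as the cutoff beyond which change is attributed to selection. The sketch below substitutes simple binomial sampling per generation for transmission through the real pedigree, so it is an idealisation of the idea rather than the study's procedure:

```python
import random

def gene_drop_threshold(pop_size, generations, p0,
                        n_reps=2000, quantile=0.95, seed=1):
    """Simulate neutral drift from starting frequency p0 and return the
    `quantile` of |delta p| as an empirical significance threshold."""
    rng = random.Random(seed)
    changes = []
    for _ in range(n_reps):
        p = p0
        for _ in range(generations):
            # Offspring alleles drawn from the parental pool (binomial sampling).
            k = sum(rng.random() < p for _ in range(2 * pop_size))
            p = k / (2 * pop_size)
        changes.append(abs(p - p0))
    changes.sort()
    return changes[int(quantile * n_reps) - 1]

thr = gene_drop_threshold(pop_size=100, generations=4, p0=0.5)
assert 0.0 < thr < 0.5   # any observed change beyond thr exceeds neutral drift
```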

  20. Constructs and methods for genome editing and genetic engineering of fungi and protists

    Science.gov (United States)

    Hittinger, Christopher Todd; Alexander, William Gerald

    2018-01-30

    Provided herein are constructs for genome editing or genetic engineering in fungi or protists, methods of using the constructs and media for use in selecting cells. The construct include a polynucleotide encoding a thymidine kinase operably connected to a promoter, suitably a constitutive promoter; a polynucleotide encoding an endonuclease operably connected to an inducible promoter; and a recognition site for the endonuclease. The constructs may also include selectable markers for use in selecting recombinations.

  1. Evaluation of laboratory diagnostic methods for cryptosporidiosis ...

    African Journals Online (AJOL)

    The conventional method of modified Ziehl-Neelsen (MZN) is very laborious, and stool ... Objective: This research was set to evaluate the diagnostic efficacy of ... 93.1% (MZN to ELISA) with ELISA techniques showing the highest sensitivity at ...
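Sensitivity figures like the one quoted above come from a standard confusion-matrix calculation against a reference test. A minimal sketch with hypothetical counts (not the study's data):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for an antigen test scored against a reference method.
sens, spec = sensitivity_specificity(tp=27, fp=4, fn=2, tn=67)
assert sens > 0.9 and spec > 0.9
```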

  2. Simplified methods for evaluating road prism stability

    Science.gov (United States)

    William J. Elliot; Mark Ballerini; David Hall

    2003-01-01

    Mass failure is one of the most common failures of low-volume roads in mountainous terrain. Current methods for evaluating stability of these roads require a geotechnical specialist. A stability analysis program, XSTABL, was used to estimate the stability of 3,696 combinations of road geometry, soil, and groundwater conditions. A sensitivity analysis was carried out to...

  3. Evaluation of Abiotic Resource LCIA Methods

    Directory of Open Access Journals (Sweden)

    Rodrigo A. F. Alvarenga

    2016-02-01

    Full Text Available In a life cycle assessment (LCA), the impacts on resources are evaluated at the area of protection (AoP) of the same name, through life cycle impact assessment (LCIA) methods. Different LCIA methods are available in the literature to assess abiotic resources, and the goal of this study was to propose recommendations for that impact category. We evaluated 19 different LCIA methods against two criteria (scientific robustness and scope), divided into three assessment levels: resource accounting methods (RAM), midpoint, and endpoint. To support the assessment, we applied some of the LCIA methods to a case study of ethylene production. For RAM, the most suitable LCIA method was CEENE (Cumulative Exergy Extraction from the Natural Environment), though SED (Solar Energy Demand) and ICEC (Industrial Cumulative Exergy Consumption)/ECEC (Ecological Cumulative Exergy Consumption) may also be recommended; at the midpoint level it was ADP (Abiotic Depletion Potential), and at the endpoint level both the ReCiPe Endpoint and EPS2000 (Environmental Priority Strategies) methods. We note that the assessment for the AoP Resources is not yet well established in the LCA community, since new LCIA methods (with different approaches and assessment frameworks) keep appearing, and this trend may continue in the future.

  4. Evaluation of Dynamic Methods for Earthwork Assessment

    Directory of Open Access Journals (Sweden)

    Vlček Jozef

    2015-05-01

    Full Text Available The rapid development of road construction demands fast, high-quality methods for evaluating earthwork quality. Dynamic methods are now adopted in numerous civil engineering fields, and the evaluation of earthwork quality in particular can be sped up using dynamic equipment. This paper presents the results of parallel measurements with selected devices for determining the level of compaction of soils. The measurements were used to develop correlations between the values obtained from the various apparatuses. The correlations show that the examined apparatuses are suitable for assessing the compaction level of fine-grained soils, within the boundary conditions of the equipment used. The presented methods are quick, results are available immediately after measurement, and they are thus suitable when construction work must be performed in a short period of time.

  5. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    Science.gov (United States)

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
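The r2VIM selection rule described above can be sketched as follows, assuming per-run importance vectors have already been produced by some random forest implementation. The scores and threshold below are illustrative; the published method's exact threshold handling may differ:

```python
def r2vim_select(importance_runs, threshold=1.0):
    """Relative VIMs: divide each run's importances by the absolute value of
    that run's minimal (most negative) score, then keep only variables whose
    relative importance exceeds `threshold` in EVERY run."""
    n_vars = len(importance_runs[0])
    selected = set(range(n_vars))
    for run in importance_runs:
        floor = abs(min(run))   # a negative importance marks pure noise
        selected &= {i for i, imp in enumerate(run) if imp / floor > threshold}
    return sorted(selected)

# Three hypothetical RF runs over five SNPs: only SNP 0 is consistently large.
runs = [
    [5.0, 0.4, -0.2, 0.1, 0.05],
    [4.2, 0.3, 0.1, -0.25, 0.0],
    [6.1, -0.3, 0.2, 0.15, 0.1],
]
assert r2vim_select(runs) == [0]
```

Requiring the criterion in all runs is what controls false positives: a noise SNP rarely exceeds the floor repeatedly.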

  6. A Review of Study Designs and Statistical Methods for Genomic Epidemiology Studies using Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2015-04-01

    Full Text Available Results from numerous linkage and association studies have greatly deepened scientists’ understanding of the genetic basis of many human diseases, yet some important questions remain unanswered. For example, although a large number of disease-associated loci have been identified from genome-wide association studies (GWAS) in the past 10 years, it is challenging to interpret these results, as most disease-associated markers have no clear functional roles in disease etiology, and all the identified genomic factors together explain only a small portion of disease heritability. With the help of next-generation sequencing (NGS), diverse types of genomic and epigenetic variations can be detected with high accuracy. More importantly, instead of using linkage disequilibrium to detect association signals based on a set of pre-set probes, NGS allows researchers to study all the variants in each individual directly, and therefore promises opportunities for identifying functional variants and a more comprehensive dissection of disease heritability. Although the current scale of NGS studies is still limited due to the high cost, the success of several recent studies suggests great potential for applying NGS in genomic epidemiology, especially as the cost of sequencing continues to drop. In this review, we discuss several pioneering applications of NGS, summarize scientific discoveries for rare and complex diseases, and compare various study designs, including targeted sequencing and whole-genome sequencing, using population-based and family-based cohorts. Finally, we highlight recent advancements in statistical methods proposed for sequencing analysis, including group-based association tests, meta-analysis techniques, and annotation tools for variant prioritization.
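The group-based association tests mentioned above collapse the rare variants in a region into a single per-individual score, regaining power that single-variant tests lack. A minimal burden-style sketch (toy genotypes; real analyses assess significance via regression models or permutation):

```python
def burden_scores(genotype_matrix):
    """Per-individual burden: total count of rare alleles across a gene's
    variant sites (rows = variants, columns = individuals)."""
    n_ind = len(genotype_matrix[0])
    return [sum(row[i] for row in genotype_matrix) for i in range(n_ind)]

def case_control_burden_diff(genotype_matrix, is_case):
    """Difference in mean burden between cases and controls."""
    scores = burden_scores(genotype_matrix)
    cases = [s for s, c in zip(scores, is_case) if c]
    controls = [s for s, c in zip(scores, is_case) if not c]
    return sum(cases) / len(cases) - sum(controls) / len(controls)

# Toy gene with three rare variants in six individuals; cases carry more alleles.
geno = [
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 1, 1, 0, 1, 0],
]
diff = case_control_burden_diff(geno, [True, True, True, False, False, False])
assert diff > 0
```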

  7. A simple method of genomic DNA extraction suitable for analysis of bulk fungal strains.

    Science.gov (United States)

    Zhang, Y J; Zhang, S; Liu, X Z; Wen, H A; Wang, M

    2010-07-01

    A simple and rapid method (designated thermolysis) for extracting genomic DNA from bulk fungal strains is described. In the thermolysis method, a few mycelia or yeast cells are first rinsed with pure water to remove potential PCR inhibitors and then incubated in a lysis buffer at 85 degrees C to break down cell walls and membranes. This method was used to extract genomic DNA from large numbers of fungal strains (more than 92 species in 35 genera of three phyla) isolated from different sections of natural Ophiocordyceps sinensis specimens. Regions of interest from high-copy-number as well as single-copy genes were successfully amplified from the extracted DNA samples. The DNA samples obtained by this method can be stored at -20 degrees C for over 1 year. The method is effective, easy and fast, and allows batch DNA extraction from multiple fungal isolates. Use of the thermolysis method will allow researchers to obtain DNA from fungi quickly for use in molecular assays. This method requires only minute quantities of starting material and is suitable for diverse fungal species.

  8. Evaluation of nine popular de novo assemblers in microbial genome assembly.

    Science.gov (United States)

    Forouzan, Esmaeil; Maleki, Masoumeh Sadat Mousavi; Karkhane, Ali Asghar; Yakhchali, Bagher

    2017-12-01

    Next generation sequencing (NGS) technologies are revolutionizing biology, with Illumina being the most popular NGS platform. Short read assembly is a critical part of most genome studies using NGS. Hence, in this study, the performance of nine well-known assemblers was evaluated in the assembly of seven different microbial genomes. Effect of different read coverage and k-mer parameters on the quality of the assembly were also evaluated on both simulated and actual read datasets. Our results show that the performance of assemblers on real and simulated datasets could be significantly different, mainly because of coverage bias. According to outputs on actual read datasets, for all studied read coverages (of 7×, 25× and 100×), SPAdes and IDBA-UD clearly outperformed other assemblers based on NGA50 and accuracy metrics. Velvet is the most conservative assembler with the lowest NGA50 and error rate. Copyright © 2017. Published by Elsevier B.V.
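NGA50, used above as the headline metric, extends N50 by aligning contigs to the reference and breaking them at misassemblies; the underlying N50/NG50 computation can be sketched as follows (contig lengths invented):

```python
def n50(contig_lengths, genome_size=None):
    """N50 (or NG50 when `genome_size` is given): the length of the contig at
    which the length-sorted cumulative sum reaches half the assembly total
    (or half the reference genome size)."""
    target = (genome_size or sum(contig_lengths)) / 2
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running >= target:
            return length
    return 0  # assembly covers less than half the reference

contigs = [80, 70, 50, 30, 20, 10]
assert n50(contigs) == 70                    # total 260, half reached at 80+70
assert n50(contigs, genome_size=400) == 50   # half of 400 reached at 80+70+50
```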

  9. Ultrahigh-dimensional variable selection method for whole-genome gene-gene interaction analysis

    Directory of Open Access Journals (Sweden)

    Ueki Masao

    2012-05-01

    Full Text Available Abstract Background Genome-wide gene-gene interaction analysis using single nucleotide polymorphisms (SNPs) is an attractive way to identify genetic components that confer susceptibility to human complex diseases. Individual hypothesis testing for SNP-SNP pairs, as in a common genome-wide association study (GWAS), however, involves difficulty in setting an overall p-value due to the complicated correlation structure; namely, the multiple testing problem, which causes unacceptable false negative results. A larger number of SNP-SNP pairs than the sample size, the so-called large p small n problem, precludes simultaneous analysis using multiple regression. A method that overcomes the above issues is thus needed. Results We adopt an up-to-date method for ultrahigh-dimensional variable selection termed sure independence screening (SIS) for appropriate handling of the numerous SNP-SNP interactions by including them as predictor variables in logistic regression. We propose a ranking strategy using promising dummy coding methods and a subsequent variable selection procedure in the SIS method, suitably modified for gene-gene interaction analysis. We also implemented the procedures in a software program, EPISIS, using cost-effective GPGPU (general-purpose computing on graphics processing units) technology. EPISIS can complete an exhaustive search for SNP-SNP interactions in a standard GWAS dataset within several hours. The proposed method works successfully in simulation experiments and in application to real WTCCC (Wellcome Trust Case-Control Consortium) data. Conclusions Based on the machine-learning principle, the proposed method gives a powerful and flexible genome-wide search for various patterns of gene-gene interaction.
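Sure independence screening tames the ultrahigh-dimensional interaction space by ranking each SNP-SNP product term on a cheap marginal association score and keeping only the top candidates for a joint model. A minimal sketch using absolute Pearson correlation as the marginal score (toy data; the score choice and the exhaustive double loop are illustrative, and EPISIS accelerates the search on GPUs):

```python
from statistics import mean

def marginal_score(x, y):
    """Absolute Pearson correlation between a predictor and the phenotype
    (assumes neither vector is constant)."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return abs(num / den)

def sis_screen_interactions(genotypes, phenotype, keep):
    """Score every SNPxSNP product term marginally; keep the top `keep` pairs
    for a subsequent joint (e.g. logistic regression) model."""
    n_snps = len(genotypes)
    scored = []
    for i in range(n_snps):
        for j in range(i + 1, n_snps):
            term = [a * b for a, b in zip(genotypes[i], genotypes[j])]
            scored.append((marginal_score(term, phenotype), (i, j)))
    scored.sort(reverse=True)
    return [pair for _, pair in scored[:keep]]

# Toy data: the phenotype is driven by the product of SNP 0 and SNP 1.
g = [
    [0, 1, 2, 0, 1, 2, 2, 0],   # SNP 0
    [2, 0, 1, 1, 2, 2, 0, 1],   # SNP 1
    [1, 1, 0, 2, 0, 1, 2, 2],   # SNP 2 (noise)
]
y = [a * b for a, b in zip(g[0], g[1])]
assert sis_screen_interactions(g, y, keep=1) == [(0, 1)]
```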

  10. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop
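The CCGG motifs interrogated by MSAP-style assays (the recognition site of the methylation-sensitive HpaII/MspI isoschizomer pair) can be located in a sequence with a simple scan; this sketch is illustrative only, not part of the MSAP-Seq pipeline:

```python
def ccgg_sites(seq):
    """0-based positions of CCGG motifs, the restriction sites cleaved by the
    HpaII/MspI enzymes used in MSAP-style methylation assays."""
    seq = seq.upper()
    sites, i = [], seq.find("CCGG")
    while i != -1:
        sites.append(i)
        i = seq.find("CCGG", i + 1)
    return sites

assert ccgg_sites("ttCCGGaaccggTT") == [2, 8]
```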

  11. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Directory of Open Access Journals (Sweden)

    Karolina Chwialkowska

    2017-11-01

    Full Text Available Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation

  12. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop

  13. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    Science.gov (United States)

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments.
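The doublet idea above can be sketched in a few lines; note that the exact sequence-to-signal mapping and the spectral metric below are illustrative assumptions, not GAFD's published definitions. Each overlapping dinucleotide is mapped to one of 16 amplitude levels (single bases would give only 4), and an alignment-free distance is taken between magnitude spectra of the resulting signals:

```python
import numpy as np

# Hypothetical doublet mapping: each overlapping dinucleotide -> one of 16
# amplitude levels, enlarging the signal's amplitude alphabet from 4 to 16.
BASE = {"A": 0, "C": 1, "G": 2, "T": 3}

def dna_to_signal(seq):
    """Map a DNA string to a numeric signal using doublet (dinucleotide) values."""
    s = seq.upper()
    return np.array([4 * BASE[a] + BASE[b] for a, b in zip(s, s[1:])], dtype=float)

def spectral_distance(seq1, seq2, n=64):
    """Alignment-free distance: Euclidean distance between n-point magnitude
    spectra of the two signals (rfft zero-pads or truncates to length n,
    so sequences of different lengths are still comparable)."""
    f1 = np.abs(np.fft.rfft(dna_to_signal(seq1), n=n))
    f2 = np.abs(np.fft.rfft(dna_to_signal(seq2), n=n))
    return float(np.linalg.norm(f1 - f2))

print(spectral_distance("ACGTACGTACGT", "ACGTACGTACGT"))  # identical -> 0.0
print(spectral_distance("ACGTACGTACGT", "TTTTGGGGCCCC"))  # dissimilar -> > 0
```

The spectral comparison is one of several DSP distance metrics one could plug in here; the paper explores three.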

  14. Download - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods ...t_db_link_en.zip (36.3 KB) - 6 Genome analysis methods pgdbj_dna_marker_linkage_map_genome_analysis_methods_... of This Database Site Policy | Contact Us Download - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive ...

  15. Update History of This Database - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods ...B link & Genome analysis methods English archive site is opened. 2012/08/08 PGDBj... Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods is opened. About This...ate History of This Database - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive ...

  16. Utilities' nuclear fuel economic evaluation methods

    International Nuclear Information System (INIS)

    Sonz, L.A.

    1987-01-01

    This paper presents the typical perceptions, methods, considerations, and procedures used by an operating electric utility in the economic evaluation of nuclear fuel preparation and utilization scenarios. The means given are probably not an exhaustive review of those available, but are the author's recollection of systems employed to select and recommend preferable courses of action. Economic evaluation of proposed nuclear fuel scenarios is an important, but not exclusive, means of deciding on corporate action. If the economic evaluation is performed and coordinated with the other corporate considerations, such as technical and operational ability, electrical system operations management, tax effects, capital management, rates impact, etc., then the resultant recommendation may be employed to the benefit of the customers and, consequently, to the corporation.

  17. Methods of Evaluating Performances for Marketing Strategies

    OpenAIRE

    Ioan Cucu

    2005-01-01

    There are specific methods for assessing and improving the effectiveness of a marketing strategy. A marketer should state in the marketing plan what a marketing strategy is supposed to accomplish. These statements should set forth performance standards, which usually are stated in terms of profits, sales, or costs. Actual performance must be measured in similar terms so that comparisons are possible. This paper describes sales analysis and cost analysis, two general ways of evaluating the act...

  18. Assessment methods for the evaluation of vitiligo.

    Science.gov (United States)

    Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K

    2012-12-01

    There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials. © 2012 The Authors. Journal of the European Academy of Dermatology and Venereology © 2012 European Academy of Dermatology and Venereology.
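The VASI score mentioned above is, as commonly defined, a sum over body regions of affected area (in hand units, one hand unit being roughly 1% of body surface) multiplied by the residual depigmentation at that site. A minimal sketch, with illustrative patient numbers:

```python
def vasi(regions):
    """Vitiligo Area Scoring Index: sum over body regions of
    (area in hand units) x (depigmentation fraction).  The standard
    depigmentation steps are 0, 0.10, 0.25, 0.50, 0.75, 0.90 and 1.0."""
    return sum(hand_units * depig for hand_units, depig in regions)

# Illustrative patient: (hand units, depigmentation fraction) per region.
lesions = [(2.0, 1.00),   # complete depigmentation
           (1.5, 0.75),
           (3.0, 0.25)]
print(vasi(lesions))  # 2*1.00 + 1.5*0.75 + 3*0.25 = 3.875
```

This makes explicit why VASI is classed as semi-objective: the areas and depigmentation steps are still estimated visually, but the arithmetic combining them is fixed.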

  19. License - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods ...t list, Marker list, QTL list, Plant DB link & Genome analysis methods © Satoshi ... Policy | Contact Us License - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive ...

  20. Usability Evaluation Method for Agile Software Development

    Directory of Open Access Journals (Sweden)

    Saad Masood Butt

    2015-02-01

    Full Text Available Agile methods are a good fit for the rapidly growing software industry because of their flexible and dynamic nature. But does software developed using agile methods meet usability standards? To answer this question, note that the majority of agile software development projects currently involve interactive user interface designs, which are only possible by following User Centered Design (UCD) within agile methods. The question, then, is how to integrate UCD with agile models. Both agile models and UCD are iterative in nature, but agile models focus on coding and development of the software, whereas UCD focuses on its user interface. Similarly, both have testing features: the agile model involves automated testing of code, while UCD involves an expert or a user testing the user interface. In this paper, a new agile usability model is proposed, and an evaluation of the proposed model is presented by practically implementing it in three real-life projects. Key results from these projects clearly show that the proposed agile model incorporates usability evaluation methods, improves collaboration between usability experts and agile software experts, and allows agile developers to incorporate the results from UCD into subsequent iterations.

  1. Development of a transient criticality evaluation method

    International Nuclear Information System (INIS)

    Pain, C.C.; Eaton, M.D.; Miles, B.; Ziver, A.K.; Gomes, J.L.M.A.; Umpleby, A.P.; Piggott, M.D.; Goddard, A.J.H.; Oliveira, C.R.E. de

    2005-01-01

    In developing a transient criticality evaluation method we model, in full spatial/temporal detail, the neutron fluxes and consequent power and the evolving material properties - their flows, energies, phase changes, etc. These methods are embodied in the generic FETCH code, which is based as far as possible on basic principles and is capable of use in exploring safety-related situations somewhat beyond the range of experiment. FETCH is a general geometry code capable of addressing a range of criticality issues in fissile materials. The code embodies both transient radiation transport and transient fluid dynamics. Work on powders, granular materials, porous media and solutions is reviewed. The capability for modelling transient criticality for chemical plant, waste matrices and advanced reactors is also outlined. (author)

  2. Evaluation of three methods of platelet labelling

    International Nuclear Information System (INIS)

    Mortelmans, L.; Verbruggen, A.; Roo, M. de; Vermylen, J.

    1986-01-01

    The study of the kinetics of labelled platelets makes sense only when the platelets preserve their viability after separation and labelling. The separation and labelling procedures described in the manuals of two producers of 111In-oxinate (Amersham, Mallinckrodt) have been evaluated by in vitro aggregation tests. The method of Mallinckrodt diminished the aggregation capacities of the thrombocytes. The labelled platelets with normal in vitro aggregation response (Amersham) were tested in vivo in 11 patients who underwent peripheral bypass surgery. The platelet half-life and the platelet accumulation on bypass grafts were checked one week post-operatively. Because of the poor in vivo response of both methods (exponential half-life curve and bad graft visualization), a third method based on that described by W.A. Heaton et al. 1979 was optimized in the authors' laboratory with good in vitro and in vivo results in 12 patients. (author)

  3. Evaluation of three methods of platelet labelling.

    Science.gov (United States)

    Mortelmans, L; Verbruggen, A; De Roo, M; Vermylen, J

    1986-07-01

    The study of the kinetics of labelled platelets makes sense only when the platelets preserve their viability after separation and labelling. The separation and labelling procedures described in the manual of two producers of 111In-oxinate (Amersham, Mallinckrodt) have been evaluated by in vitro aggregation tests. The method of Mallinckrodt diminished the aggregation capacities of the thrombocytes. The labelled platelets with normal in vitro aggregation response (Amersham) were tested in vivo in 11 patients who underwent peripheral bypass surgery. The platelet half-life and the platelet accumulation on bypass grafts were checked one week post-operatively. Because of the poor in vivo response of both methods (exponential half-life curve and bad graft visualization), a third method was optimized in our laboratory with good in vitro and in vivo results in 12 patients.

  4. BIOETHICS METHODS IN THE ETHICAL, LEGAL, AND SOCIAL IMPLICATIONS OF THE HUMAN GENOME PROJECT LITERATURE

    Science.gov (United States)

    Walker, Rebecca; Morrissey, Clair

    2013-01-01

    While bioethics as a field has concerned itself with methodological issues since the early years, there has been no systematic examination of how ethics is incorporated into research on the Ethical, Legal and Social Implications (ELSI) of the Human Genome Project. Yet ELSI research may bear a particular burden of investigating and substantiating its methods given public funding, an explicitly cross-disciplinary approach, and the perceived significance of adequate responsiveness to advances in genomics. We undertook a qualitative content analysis of a sample of ELSI publications appearing between 2003-2008 with the aim of better understanding the methods, aims, and approaches to ethics that ELSI researchers employ. We found that the aims of ethics within ELSI are largely prescriptive and address multiple groups. We also found that the bioethics methods used in the ELSI literature are both diverse between publications and multiple within publications, but are usually not themselves discussed or employed as suggested by bioethics method proponents. Ethics in ELSI is also sometimes undistinguished from related inquiries (such as social, legal, or political investigations). PMID:23796275

  5. Evaluation of the Cow Rumen Metagenome: Assembly by Single Copy Gene Analysis and Single Cell Genome Assemblies (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Sczyrba, Alex

    2011-10-13

    DOE JGI's Alex Sczyrba on "Evaluation of the Cow Rumen Metagenome" and "Assembly by Single Copy Gene Analysis and Single Cell Genome Assemblies" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  6. Functional regression method for whole genome eQTL epistasis analysis with sequencing data.

    Science.gov (United States)

    Xu, Kelin; Jin, Li; Xiong, Momiao

    2017-05-18

    Epistasis plays an essential role in understanding the regulation mechanisms and is an essential component of the genetic architecture of gene expression. However, interaction analysis of gene expressions remains fundamentally unexplored due to great computational challenges and data availability. Due to variation in splicing, transcription start sites, polyadenylation sites, post-transcriptional RNA editing across the entire gene, and transcription rates of the cells, RNA-seq measurements generate large expression variability and collectively create the observed position-level read count curves. A single number for measuring gene expression, which is widely used for microarray-measured gene expression analysis, is highly unlikely to sufficiently account for large expression variation across the gene. Simultaneously analyzing epistatic architecture using the RNA-seq and whole genome sequencing (WGS) data poses enormous challenges. We develop a nonlinear functional regression model (FRGM) with functional responses, where the position-level read counts within a gene are taken as a function of genomic position, and functional predictors, where genotype profiles are viewed as a function of genomic position, for epistasis analysis with RNA-seq data. Instead of testing the interaction of all possible pairs of SNPs, the FRGM takes a gene as a basic unit for epistasis analysis: it tests for the interaction of all possible pairs of genes, using all the accessible information to collectively test interactions between all possible pairs of SNPs within two genome regions. By large-scale simulations, we demonstrate that the proposed FRGM for epistasis analysis can achieve the correct type 1 error and has higher power to detect the interactions between genes than the existing methods. The proposed methods are applied to the RNA-seq and WGS data from the 1000 Genomes Project. The numbers of pairs of significantly interacting genes after Bonferroni correction

  7. A Bayesian method and its variational approximation for prediction of genomic breeding values in multiple traits

    Directory of Open Access Journals (Sweden)

    Hayashi Takeshi

    2013-01-01

    Full Text Available Abstract Background Genomic selection is an effective tool for animal and plant breeding, allowing effective individual selection without phenotypic records through the prediction of genomic breeding value (GBV). To date, genomic selection has focused on a single trait. However, actual breeding often targets multiple correlated traits, and, therefore, joint analysis taking into consideration the correlation between traits, which might result in more accurate GBV prediction than analyzing each trait separately, is suitable for multi-trait genomic selection. This would require an extension of the prediction model for single-trait GBV to the multi-trait case. As the computational burden of multi-trait analysis is even higher than that of single-trait analysis, an effective computational method for constructing a multi-trait prediction model is also needed. Results We described a Bayesian regression model incorporating variable selection for jointly predicting GBVs of multiple traits and devised both an MCMC iteration and variational approximation for Bayesian estimation of parameters in this multi-trait model. The proposed Bayesian procedures with MCMC iteration and variational approximation were referred to as MCBayes and varBayes, respectively. Using simulated datasets of SNP genotypes and phenotypes for three traits with high and low heritabilities, we compared the accuracy in predicting GBVs between multi-trait and single-trait analyses as well as between MCBayes and varBayes. The results showed that, compared to single-trait analysis, multi-trait analysis enabled much more accurate GBV prediction for low-heritability traits correlated with high-heritability traits, by utilizing the correlation structure between traits, while the prediction accuracy for uncorrelated low-heritability traits was comparable to or lower than that of single-trait analysis, depending on the setting for prior probability that a SNP has zero
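The core prediction task underlying this record — estimate marker effects from genotyped, phenotyped individuals, then score unphenotyped individuals — can be illustrated with a minimal single-trait ridge (GBLUP-like) baseline. This is a sketch of the task, not the paper's MCBayes/varBayes procedures, and all data below are simulated toy numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 200 training individuals, 500 SNP markers coded 0/1/2.
n, m = 200, 500
X = rng.integers(0, 3, size=(n, m)).astype(float)
true_effects = np.zeros(m)
true_effects[rng.choice(m, 20, replace=False)] = rng.normal(0, 0.5, 20)  # sparse
y = X @ true_effects + rng.normal(0, 1.0, n)

# Ridge estimate of marker effects: solve (X'X + lambda*I) beta = X'y
# on centred data; lambda plays the role of the shrinkage prior.
lam = 50.0
mu = X.mean(axis=0)
Xc, yc = X - mu, y - y.mean()
beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(m), Xc.T @ yc)

# GBV of new, unphenotyped individuals = their (centred) genotypes x effects.
X_new = rng.integers(0, 3, size=(50, m)).astype(float)
gbv = (X_new - mu) @ beta
print(gbv[:5])
```

The multi-trait Bayesian model in the paper goes beyond this baseline by replacing the single shared shrinkage with per-SNP variable selection and a between-trait covariance, which is where the accuracy gain for correlated low-heritability traits comes from.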

  8. Genome scale models of yeast: towards standardized evaluation and consistent omic integration

    DEFF Research Database (Denmark)

    Sanchez, Benjamin J.; Nielsen, Jens

    2015-01-01

    Genome scale models (GEMs) have enabled remarkable advances in systems biology, acting as functional databases of metabolism, and as scaffolds for the contextualization of high-throughput data. In the case of Saccharomyces cerevisiae (budding yeast), several GEMs have been published and are curre... in which all levels of omics data (from gene expression to flux) have been integrated in yeast GEMs. Relevant conclusions and current challenges for both GEM evaluation and omic integration are highlighted....

  9. Evaluating Digital PCR for the Quantification of Human Genomic DNA: Accessible Amplifiable Targets.

    Science.gov (United States)

    Kline, Margaret C; Romsos, Erica L; Duewer, David L

    2016-02-16

    Polymerase chain reaction (PCR) multiplexed assays perform best when the input quantity of template DNA is controlled to within about a factor of √2. To help ensure that PCR assays yield consistent results over time and place, results from methods used to determine DNA quantity need to be metrologically traceable to a common reference. Many DNA quantitation systems can be accurately calibrated with solutions of DNA in aqueous buffer. Since they do not require external calibration, end-point limiting dilution technologies, collectively termed "digital PCR (dPCR)", have been proposed as suitable for value assigning such DNA calibrants. The performance characteristics of several commercially available dPCR systems have recently been documented using plasmid, viral, or fragmented genomic DNA; dPCR performance with more complex materials, such as human genomic DNA, has been less studied. With the goal of providing a human genomic reference material traceably certified for mass concentration, we are investigating the measurement characteristics of several dPCR systems. We here report results of measurements from multiple PCR assays, on four human genomic DNAs treated with four endonuclease restriction enzymes using both chamber and droplet dPCR platforms. We conclude that dPCR does not estimate the absolute number of PCR targets in a given volume but rather the number of accessible and amplifiable targets. While enzymatic restriction of human genomic DNA increases accessibility for some assays, in well-optimized PCR assays it can reduce the number of amplifiable targets and increase assay variability relative to uncut sample.
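The record's distinction between "absolute" and "accessible and amplifiable" targets follows from how dPCR actually counts: partitions are scored positive or negative, and the mean number of targets per partition is recovered from the negative fraction via Poisson statistics. A minimal sketch of that standard correction (the 0.85 nL partition volume is an assumed typical droplet volume, not a value from this study):

```python
import math

def dpcr_targets(positive, total_partitions, partition_volume_nl=0.85):
    """Estimate amplifiable target concentration from end-point dPCR counts.

    dPCR does not count molecules directly: the mean targets per partition
    (lambda) is inferred from the fraction of positive partitions, since
    a positive partition may contain more than one target.
    """
    p = positive / total_partitions
    lam = -math.log(1.0 - p)               # Poisson: P(negative) = exp(-lambda)
    copies_per_ul = lam / partition_volume_nl * 1000.0
    return lam, copies_per_ul

lam, conc = dpcr_targets(positive=12000, total_partitions=20000)
print(f"lambda = {lam:.3f} targets/partition, ~{conc:.0f} copies/uL")
```

Whatever this formula returns is still a count of *amplifiable* targets: a genomic copy that the assay cannot access (e.g. due to secondary structure or, after restriction digestion, a cut through the amplicon) never turns its partition positive and is invisible to the estimate.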

  10. Development of methods for evaluating active faults

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The report on long-term evaluation of active faults was published by the Headquarters for Earthquake Research Promotion in November 2010. After the 2011 Tohoku-oki earthquake, the safety review guide for the geology and ground of sites was revised by the Nuclear Safety Commission in March 2012 to incorporate scientific knowledge from the earthquake. The Nuclear Regulation Authority, established in September 2012, is now planning the New Safety Design Standard related to Earthquakes and Tsunamis of Light Water Nuclear Power Reactor Facilities. With respect to those guides and standards, our investigations for developing methods of evaluating active faults are as follows: (1) For better evaluation of offshore fault activity, we proposed a workflow to date marine terraces (indicators of offshore fault activity) over the last 400,000 years. We also developed fault-related fold analysis for evaluating blind faults. (2) To clarify the activity of active faults without superstratum, we carried out color analysis of fault gouge and classified activities into thousands of years versus tens of thousands. (3) To reduce uncertainties in fault activity and earthquake frequency, we compiled survey data and possible errors. (4) To improve seismic hazard analysis, we compiled the activity of the Yunotake and Itozawa faults, induced by the 2011 Tohoku-oki earthquake. (author)

  11. New method for evaluating liquefaction potential

    Energy Technology Data Exchange (ETDEWEB)

    Arulmoli, K.; Arulanandan, K.; Seed, H.B.

    1985-01-01

    A new method of indexing the grain and aggregate properties of sand using electrical parameters is described. Correlations are established between these parameters and relative density, D_r, cyclic stress ratio, τ/σ′_0, and K_2max. An electrical probe, used to predict these parameters from in-situ electrical measurements, is described. Evaluations are made of D_r and τ/σ′_0, which are compared with values measured independently from controlled laboratory tests. Reasonable agreement is found between predicted and measured values. The potential applicability of the electrical probe in the field is shown by evaluation of liquefaction and nonliquefaction at sites affected by the 1906 San Francisco, Niigata and Tangshan earthquakes.

  12. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  13. NRC methods for evaluation of industry training

    International Nuclear Information System (INIS)

    Morisseau, D.S.; Koontz, J.L.; Persensky, J.J.

    1987-01-01

    On March 20, 1985, the Nuclear Regulatory Commission published the Policy Statement on Training and Qualification. The Policy Statement endorsed the INPO-managed Training Accreditation Program because it encompasses the five elements of performance-based training. This paper describes the multiple methods that the NRC is using to monitor industry efforts to improve training and implement the NRC Policy Statement on Training and Qualification. The results of the evaluation of industry training improvement programs will be reviewed by the Commissioners in April 1987 to determine the nature of continuing NRC policy and programs for ensuring effective training for the US nuclear industry.

  14. Laboratory methods to evaluate therapeutic radiopharmaceuticals

    International Nuclear Information System (INIS)

    Arteaga de Murphy, C.; Rodriguez-Cortes, J.; Pedraza-Lopez, M.; Ramirez-Iglesias, MT.; Ferro-Flores, G.

    2007-01-01

    The overall aim of this coordinated research project was to develop in vivo and in vitro laboratory methods to evaluate therapeutic radiopharmaceuticals. Towards this end, the laboratory methods used in this study are described in detail. Two peptides - an 8 amino acid minigastrin analogue and octreotate - were labelled with 177Lu. Bombesin was labelled with 99mTc, and its diagnostic utility was proven. For comparison, 99mTc-TOC was used. The cell lines used in this study were AR42J cells, which overexpress somatostatin receptors found in neuroendocrine cancers, and PC3 cells, which overexpress gastrin-releasing peptide receptors (GRP-r) found in human prostate and breast cancers. The animal model chosen was athymic mice with implanted dorsal tumours of pathologically confirmed cell cancers. The methodology described for labelling, quality control, and in vitro and in vivo assays can be easily used with other radionuclides and other peptides of interest. (author)

  15. Comprehensive evaluation of genome-wide 5-hydroxymethylcytosine profiling approaches in human DNA.

    Science.gov (United States)

    Skvortsova, Ksenia; Zotenko, Elena; Luu, Phuc-Loi; Gould, Cathryn M; Nair, Shalima S; Clark, Susan J; Stirzaker, Clare

    2017-01-01

    The discovery that 5-methylcytosine (5mC) can be oxidized to 5-hydroxymethylcytosine (5hmC) by the ten-eleven translocation (TET) proteins has prompted wide interest in the potential role of 5hmC in reshaping the mammalian DNA methylation landscape. The gold-standard bisulphite conversion technologies to study DNA methylation do not distinguish between 5mC and 5hmC. However, new approaches to mapping 5hmC genome-wide have advanced rapidly, although it is unclear how the different methods compare in accurately calling 5hmC. In this study, we provide a comparative analysis on brain DNA using three 5hmC genome-wide approaches, namely whole-genome bisulphite/oxidative bisulphite sequencing (WG Bis/OxBis-seq), Infinium HumanMethylation450 BeadChip arrays coupled with oxidative bisulphite (HM450K Bis/OxBis) and antibody-based immunoprecipitation and sequencing of hydroxymethylated DNA (hMeDIP-seq). We also perform loci-specific TET-assisted bisulphite sequencing (TAB-seq) for validation of candidate regions. We show that whole-genome single-base resolution approaches are advantaged in providing precise 5hmC values but require high sequencing depth to accurately measure 5hmC, as this modification is commonly in low abundance in mammalian cells. HM450K arrays coupled with oxidative bisulphite provide a cost-effective representation of 5hmC distribution, at CpG sites with 5hmC levels >~10%. However, 5hmC analysis is restricted to the genomic location of the probes, which is an important consideration as 5hmC modification is commonly enriched at enhancer elements. Finally, we show that the widely used hMeDIP-seq method provides an efficient genome-wide profile of 5hmC and shows high correlation with WG Bis/OxBis-seq 5hmC distribution in brain DNA. However, in cell line DNA with low levels of 5hmC, hMeDIP-seq-enriched regions are not detected by WG Bis/OxBis or HM450K, either suggesting misinterpretation of 5hmC calls by hMeDIP or lack of sensitivity of the latter methods. We
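The Bis/OxBis comparison rests on a simple subtraction: bisulphite alone reads 5mC + 5hmC as "methylated", while oxidative bisulphite first converts 5hmC so only true 5mC survives, and the per-CpG difference of the two beta values estimates 5hmC. A minimal sketch with invented beta values:

```python
def hydroxymethylation(beta_bs, beta_oxbs):
    """Per-CpG 5hmC estimate from paired bisulphite / oxidative-bisulphite
    methylation calls: beta_BS ~ 5mC + 5hmC, beta_oxBS ~ 5mC, so the
    difference estimates 5hmC.  Floored at zero, since sampling noise at
    low-5hmC sites can make the raw difference slightly negative."""
    return [max(0.0, bs - ox) for bs, ox in zip(beta_bs, beta_oxbs)]

bs   = [0.80, 0.55, 0.10, 0.92]   # methylation betas, bisulphite library
oxbs = [0.70, 0.56, 0.05, 0.60]   # same CpGs, oxidative-bisulphite library
print(hydroxymethylation(bs, oxbs))
```

The subtraction also shows why high sequencing depth matters: each output value carries the combined sampling noise of two libraries, which swamps the signal when true 5hmC is only a few percent.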

  16. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
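The phenomenon described — that the number of variables a single run of m-fold cross-validation selects is itself a random variable — can be demonstrated without SCAD or the Adaptive Lasso. The sketch below substitutes a simple univariate-correlation screening rule for the penalized estimators (an illustrative stand-in, not the paper's methods) and shows that the CV-chosen model size depends on the random fold split:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 120, 300
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:15] = 0.25                      # sparse but *weak* signals, as in SNP data
y = X @ beta + rng.normal(size=n)

def abs_corr(Xs, ys):
    """Absolute univariate correlation of each column of Xs with ys."""
    Xc, yc = Xs - Xs.mean(axis=0), ys - ys.mean()
    return np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))

def cv_selected_count(seed, grid=(0.10, 0.15, 0.20, 0.25), m=10):
    """Tune the screening threshold by m-fold CV (re-screening within each
    training fold), then report how many variables the winner keeps."""
    folds = np.array_split(np.random.default_rng(seed).permutation(n), m)

    def cv_err(t):
        err = 0.0
        for k in range(m):
            test = folds[k]
            train = np.concatenate([folds[j] for j in range(m) if j != k])
            cols = np.where(abs_corr(X[train], y[train]) > t)[0]
            if len(cols) == 0:
                pred = np.full(len(test), y[train].mean())
            else:
                coef, *_ = np.linalg.lstsq(X[np.ix_(train, cols)], y[train], rcond=None)
                pred = X[np.ix_(test, cols)] @ coef
            err += float(np.sum((y[test] - pred) ** 2))
        return err

    best = min(grid, key=cv_err)
    return int(np.sum(abs_corr(X, y) > best))

counts = [cv_selected_count(seed) for seed in range(5)]
print(counts)   # model size chosen by a single CV run; may vary across splits
```

With strong signals the five runs would typically agree; with weak, SNP-like signals the tuning-parameter choice, and hence the selected-variable count, can shift from split to split, which is exactly the argument against relying on a single CV run.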

  17. Evaluation of plasmid and genomic DNA calibrants used for the quantification of genetically modified organisms.

    Science.gov (United States)

    Caprioara-Buda, M; Meyer, W; Jeynov, B; Corbisier, P; Trapmann, S; Emons, H

    2012-07-01

    The reliable quantification of genetically modified organisms (GMOs) by real-time PCR requires, besides thoroughly validated quantitative detection methods, sustainable calibration systems. The latter establish the anchor points for the measured value and the measurement unit, respectively. In this paper, the suitability of two types of DNA calibrants, i.e. plasmid DNA and genomic DNA extracted from plant leaves, for the certification of the GMO content in reference materials as the copy number ratio between two targeted DNA sequences was investigated. The PCR efficiencies and coefficients of determination of the calibration curves, as well as the measured copy number ratios for three powder certified reference materials (CRMs), namely ERM-BF415e (NK603 maize), ERM-BF425c (356043 soya), and ERM-BF427c (98140 maize), originally certified for their mass fraction of GMO, were compared for both types of calibrants. In all three systems investigated, the PCR efficiencies of plasmid DNA were slightly closer to those observed for the genomic DNA extracted from seed powders than to those of the genomic DNA extracted from leaves. Although the mean DNA copy number ratios for each CRM overlapped within their uncertainties, the DNA copy number ratios were significantly different between the two types of calibrants. Based on these observations, both plasmid and leaf genomic DNA calibrants would be technically suitable as anchor points for the calibration of the real-time PCR methods applied in this study. However, the most suitable approach to establish a sustainable traceability chain is to fix a reference system based on plasmid DNA.
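
    The calibration machinery referred to above rests on two standard relations: the amplification efficiency derived from the slope of a dilution-series standard curve, E = 10^(−1/slope) − 1, and the copy-number ratio obtained by inverting the calibration curve for two targets. A minimal sketch with idealised numbers, not the CRM data from the paper:

```python
import numpy as np

def pcr_efficiency(log10_copies, cq):
    """Efficiency from a dilution-series standard curve: fit Cq vs log10(copies);
    E = 10**(-1/slope) - 1, so a slope of -3.32 is ~100% efficiency."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    return 10.0 ** (-1.0 / slope) - 1.0, slope, intercept

def copies_from_cq(cq, slope, intercept):
    """Invert the calibration curve to an estimated copy number."""
    return 10.0 ** ((cq - intercept) / slope)

# Idealised dilution series (Cq = 38 - 3.32 * log10 copies)
logs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
cqs = 38.0 - 3.32 * logs
eff, slope, intercept = pcr_efficiency(logs, cqs)

# GMO content as the copy-number ratio of the event-specific target to the
# taxon-specific reference target, both read off the same curve here
ratio = copies_from_cq(28.0, slope, intercept) / copies_from_cq(26.0, slope, intercept)
```

    In practice each target has its own calibration curve; differences between the two curves' efficiencies are exactly the calibrant property the paper compares.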

  18. Methods and Metrics for Evaluating Environmental Dredging ...

    Science.gov (United States)

    This report documents the objectives, approach, methodologies, results, and interpretation of a collaborative research study conducted by the National Risk Management Research Laboratory (NRMRL) and the National Exposure Research Laboratory (NERL) of the U.S. Environmental Protection Agency’s (U.S. EPA’s) Office of Research and Development (ORD) and the U.S. EPA’s Great Lakes National Program Office (GLNPO). The objectives of the research study were to: 1) evaluate remedy effectiveness of environmental dredging as applied to contaminated sediments in the Ashtabula River in northeastern Ohio, and 2) monitor the recovery of the surrounding ecosystem. The project was carried out over 6 years from 2006 through 2011 and consisted of the development and evaluation of methods and approaches to assess river and ecosystem conditions prior to dredging (2006), during dredging (2006 and 2007), and following dredging, both short term (2008) and long term (2009-2011). This project report summarizes and interprets the results of this 6-year study to develop and assess methods for monitoring pollutant fate and transport and ecosystem recovery through the use of biological, chemical, and physical lines of evidence (LOEs) such as: 1) comprehensive sampling of and chemical analysis of contaminants in surface, suspended, and historic sediments; 2) extensive grab and multi-level real time water sampling and analysis of contaminants in the water column; 3) sampling, chemi

  19. Development of methods for evaluating active faults

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-08-15

    The HERP report on the long-term evaluation of active faults and the NSC safety review guide with regard to the geology and ground of sites were published in November 2010 and December 2010, respectively. With respect to those reports, our investigation is as follows: (1) For assessment of seismic hazard, we estimated seismic sources around NPPs based on information from tectonic geomorphology, earthquake distribution and subsurface geology. (2) For evaluation of the activity of blind faults, we calculated the slip rate on the 2008 Iwate-Miyagi Nairiku Earthquake fault, using information on late Quaternary fluvial terraces. (3) To evaluate the magnitude of earthquakes whose sources are difficult to identify, we proposed a new method for calculating the seismogenic layer thickness. (4) To clarify the activities of active faults without superstratum, we carried out color analysis of fault gouge and classified fault activity into timescales of thousands and of tens of thousands of years. (5) To improve the chronology of sediments, we detected new widespread cryptotephras using mineral chemistry and developed a late Quaternary cryptotephrostratigraphy around NPPs. (author)

  20. A haplotype regression approach for genetic evaluation using sequences from the 1000 bull genomes Project

    International Nuclear Information System (INIS)

    Lakhssassi, K.; González-Recio, O.

    2017-01-01

    Haplotypes from sequencing data may improve the prediction accuracy in genomic evaluations, as haplotypes are in stronger linkage disequilibrium with quantitative trait loci than markers from SNP chips. This study focuses, first, on the creation of haplotypes in a population sample of 450 Holstein animals with full-sequence data from the 1000 bull genomes project, and second, on incorporating them into the whole-genome prediction model. In total, 38,319,258 SNPs (and indels) from Next Generation Sequencing were included in the analysis. After filtering variants with minor allele frequency (MAF < 0.025), 13,912,326 SNPs were available for haplotype extraction with findhap.f90. The haploblocks contained on average 924 SNPs (166,552 bp). Around 97% of the haplotypes on all chromosomes were unique and were ignored, leaving 153,428 haplotypes. Estimated haplotypes had a large contribution to the total variance of genomic estimated breeding values for kilograms of protein, Global Type Index, Somatic Cell Score and Days Open (between 32 and 99.9%). Haploblocks containing haplotypes with large effects were selected by retaining, for each trait, haplotypes whose effect lay more than 3 standard deviations (SD), or more than 1 SD, above or below the mean of the haplotype effect distribution. Results showed that filtering by 3 SD would not be enough to capture a large proportion of the genetic variance, whereas filtering by 1 SD could be useful, although model convergence should be considered. Additionally, sequence haplotypes were able to capture genetic variance additional to the polygenic effect for traits undergoing lower selection intensity, such as fertility and health traits.
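
    The standard-deviation filtering step can be sketched directly: keep the haplotypes whose estimated effect lies more than k SD from the mean of the effect distribution. The function and the simulated effects below are illustrative, not the study's data.

```python
import numpy as np

def select_haplotypes(effects, k=1.0):
    """Indices of haplotypes whose estimated effect lies more than k standard
    deviations from the mean of the effect distribution."""
    effects = np.asarray(effects, dtype=float)
    mu, sd = effects.mean(), effects.std()
    return np.flatnonzero(np.abs(effects - mu) > k * sd)

rng = np.random.default_rng(1)
effects = rng.normal(0.0, 1.0, 10_000)        # stand-in haplotype effects
strict = select_haplotypes(effects, k=3.0)    # few survivors: variance lost
loose = select_haplotypes(effects, k=1.0)     # larger panel, heavier model
```

    For roughly normal effects, the 3-SD filter keeps well under 1% of the haplotypes, which matches the paper's observation that it discards too much genetic variance, while the 1-SD filter keeps around a third of them.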

  2. A simple and efficient total genomic DNA extraction method for individual zooplankton.

    Science.gov (United States)

    Fazhan, Hanafiah; Waiho, Khor; Shahreza, Md Sheriff

    2016-01-01

    Molecular approaches are widely applied in species identification and taxonomic studies of minute zooplankton. One of the most intensively studied zooplankton groups is the subclass Copepoda. Accurate species identification of all life stages of the generally small-sized copepods through molecular analysis is important, especially in taxonomic and systematic assessment of harpacticoid copepod populations and in understanding their dynamics within the marine community. However, total genomic DNA (TGDNA) extraction from individual harpacticoid copepods can be problematic due to their small size and epibenthic behavior. In this research, six TGDNA extraction methods applied to individual harpacticoid copepods were compared. A new simple, feasible, efficient and consistent TGDNA extraction method was designed and compared with a commercial kit and modified versions of available TGDNA extraction methods. The newly described "Incubation in PCR buffer" method yielded good and consistent results, based on its high rate of successful PCR amplification (82%) compared to the other methods. Given its consistency and low cost, the "Incubation in PCR buffer" method is highly recommended for TGDNA extraction from other minute zooplankton species.

  3. Facilitating comparative effectiveness research in cancer genomics: evaluating stakeholder perceptions of the engagement process.

    Science.gov (United States)

    Deverka, Patricia A; Lavallee, Danielle C; Desai, Priyanka J; Armstrong, Joanne; Gorman, Mark; Hole-Curry, Leah; O'Leary, James; Ruffner, B W; Watkins, John; Veenstra, David L; Baker, Laurence H; Unger, Joseph M; Ramsey, Scott D

    2012-07-01

    The Center for Comparative Effectiveness Research in Cancer Genomics completed a 2-year stakeholder-guided process for the prioritization of genomic tests for comparative effectiveness research studies. We sought to evaluate the effectiveness of engagement procedures in achieving project goals and to identify opportunities for future improvements. The evaluation included an online questionnaire, one-on-one telephone interviews and facilitated discussion. Responses to the online questionnaire were tabulated for descriptive purposes, while transcripts from key informant interviews were analyzed using a directed content analysis approach. A total of 11 out of 13 stakeholders completed both the online questionnaire and interview process, while nine participated in the facilitated discussion. Eighty-nine percent of questionnaire items received overall ratings of agree or strongly agree; 11% of responses were rated as neutral with the exception of a single rating of disagreement with an item regarding the clarity of how stakeholder input was incorporated into project decisions. Recommendations for future improvement included developing standard recruitment practices, role descriptions and processes for improved communication with clinical and comparative effectiveness research investigators. Evaluation of the stakeholder engagement process provided constructive feedback for future improvements and should be routinely conducted to ensure maximal effectiveness of stakeholder involvement.

  4. DNA immunoprecipitation semiconductor sequencing (DIP-SC-seq) as a rapid method to generate genome wide epigenetic signatures

    OpenAIRE

    Thomson, John P.; Fawkes, Angie; Ottaviano, Raffaele; Hunter, Jennifer M.; Shukla, Ruchi; Mjoseng, Heidi K.; Clark, Richard; Coutts, Audrey; Murphy, Lee; Meehan, Richard R.

    2015-01-01

    Modification of DNA resulting in 5-methylcytosine (5 mC) or 5-hydroxymethylcytosine (5hmC) has been shown to influence the local chromatin environment and affect transcription. Although recent advances in next generation sequencing technology allow researchers to map epigenetic modifications across the genome, such experiments are often time-consuming and cost prohibitive. Here we present a rapid and cost effective method of generating genome wide DNA modification maps utilising commercially ...

  5. A sensitive, support-vector-machine method for the detection of horizontal gene transfers in viral, archaeal and bacterial genomes.

    Science.gov (United States)

    Tsirigos, Aristotelis; Rigoutsos, Isidore

    2005-01-01

    In earlier work, we introduced and discussed a generalized computational framework for identifying horizontal transfers. This framework relied on a gene's nucleotide composition, obviated the need for knowledge of codon boundaries and database searches, and was shown to perform very well across a wide range of archaeal and bacterial genomes when compared with previously published approaches, such as Codon Adaptation Index and C + G content. Nonetheless, two considerations remained outstanding: we wanted to further increase the sensitivity of detecting horizontal transfers and also to be able to apply the method to increasingly smaller genomes. In the discussion that follows, we present such a method, Wn-SVM, and show that it exhibits a very significant improvement in sensitivity compared with earlier approaches. Wn-SVM uses a one-class support-vector machine and can learn using rather small training sets. This property makes Wn-SVM particularly suitable for studying small-size genomes, similar to those of viruses, as well as the typically larger archaeal and bacterial genomes. We show experimentally that the new method results in a superior performance across a wide range of organisms and that it improves even upon our own earlier method by an average of 10% across all examined genomes. As a small-genome case study, we analyze the genome of the human cytomegalovirus and demonstrate that Wn-SVM correctly identifies regions that are known to be conserved and prototypical of all beta-herpesvirinae, regions that are known to have been acquired horizontally from the human host and, finally, regions that had not up to now been suspected to be horizontally transferred. Atypical region predictions for many eukaryotic viruses, including the alpha-, beta- and gamma-herpesvirinae, and 123 archaeal and bacterial genomes, have been made available online at http://cbcsrv.watson.ibm.com/HGT_SVM/.
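
    The core idea, training a one-class SVM on per-gene nucleotide-composition vectors and flagging compositional outliers as candidate horizontal transfers, can be sketched with scikit-learn. This is not the authors' Wn-SVM implementation; the k-mer size, the simulated sequences and the parameters are illustrative.

```python
import itertools
import numpy as np
from sklearn.svm import OneClassSVM

KMERS = ["".join(p) for p in itertools.product("ACGT", repeat=3)]

def kmer_freqs(seq):
    """Trinucleotide composition vector (non-overlapping counts, for brevity)."""
    counts = np.array([seq.count(k) for k in KMERS], dtype=float)
    return counts / counts.sum() if counts.sum() else counts

rng = np.random.default_rng(2)
# "Native" genes share one base composition; "alien" genes have a skewed one
native = ["".join(rng.choice(list("ACGT"), 300, p=[0.2, 0.3, 0.3, 0.2]))
          for _ in range(40)]
alien = ["".join(rng.choice(list("ACGT"), 300, p=[0.45, 0.05, 0.05, 0.45]))
         for _ in range(3)]
X = np.array([kmer_freqs(s) for s in native + alien])

clf = OneClassSVM(nu=0.1, gamma="scale").fit(X[:40])  # learn native composition
pred = clf.predict(X)  # -1 flags compositionally atypical (candidate HGT) genes
```

    The one-class formulation is what makes the approach workable for small genomes: it needs only the genome's own genes as a training set, with no database of known transfers.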

  6. Evaluation of potential novel variations and their interactions related to bipolar disorders: analysis of genome-wide association study data.

    Science.gov (United States)

    Acikel, Cengizhan; Aydin Son, Yesim; Celik, Cemil; Gul, Husamettin

    2016-01-01

    Multifactor dimensionality reduction (MDR) is a nonparametric approach that can be used to detect relevant interactions between single-nucleotide polymorphisms (SNPs). The aim of this study was to build the best genomic model based on SNP associations and to identify candidate polymorphisms that are the underlying molecular basis of the bipolar disorders. This study was performed on Whole-Genome Association Study of Bipolar Disorder (dbGaP [database of Genotypes and Phenotypes] study accession number: phs000017.v3.p1) data. After preprocessing of the genotyping data, three classification-based data mining methods (i.e., random forest, naïve Bayes, and k-nearest neighbor) were performed. Additionally, as a nonparametric, model-free approach, the MDR method was used to evaluate the SNP profiles. The validity of these methods was evaluated using true classification rate, recall (sensitivity), precision (positive predictive value), and F-measure. Random forests, naïve Bayes, and k-nearest neighbors identified 16, 13, and ten candidate SNPs, respectively. Surprisingly, the top six SNPs were reported by all three methods. Random forests and k-nearest neighbors were more successful than naïve Bayes, with recall values >0.95. On the other hand, MDR generated a model with comparable predictive performance based on five SNPs. Although different SNP profiles were identified in MDR compared to the classification-based models, all models mapped SNPs to the DOCK10 gene. Three classification-based data mining approaches, random forests, naïve Bayes, and k-nearest neighbors, have prioritized similar SNP profiles as predictors of bipolar disorders, in contrast to MDR, which has found different SNPs through analysis of two-way and three-way interactions. The reduced number of associated SNPs discovered by MDR, without loss in the classification performance, would facilitate validation studies and decision support models, and would reduce the cost to develop predictive and
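
    The validity measures used above are standard confusion-matrix quantities, computable directly from the counts of true positives, false positives and false negatives:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision (positive predictive value), recall (sensitivity) and
    F-measure computed from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2.0 * precision * recall / (precision + recall)
    return precision, recall, f1

# e.g. 95 true cases recovered, 5 false alarms, 5 cases missed
p, r, f = precision_recall_f1(tp=95, fp=5, fn=5)
```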

  7. Genetic evaluation using single-step genomic best linear unbiased predictor in American Angus.

    Science.gov (United States)

    Lourenco, D A L; Tsuruta, S; Fragomeni, B O; Masuda, Y; Aguilar, I; Legarra, A; Bertrand, J K; Amen, T S; Wang, L; Moser, D W; Misztal, I

    2015-06-01

    using the APY and recursions on ref_8k gave 97% gains of full ssGBLUP. Genomic evaluation in beef cattle with ssGBLUP is feasible while keeping the models (maternal, multiple trait, and threshold) already used in regular BLUP. Gains in predictivity are dependent on the composition of the reference population. Indirect predictions via SNP effects derived from ssGBLUP allow for accurate genomic predictions on young animals, with no advantage of including PA in the index if the reference population is large. With the APY conditioning on about 10,000 reference animals, ssGBLUP is potentially applicable to a large number of genotyped animals without compromising predictive ability.
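
    The genomic ingredient of ssGBLUP is a genomic relationship matrix built from SNP genotypes, most commonly VanRaden's first method, G = ZZ'/(2 Σ pⱼ(1−pⱼ)), with genotypes centred by twice the allele frequency. A minimal sketch on random genotypes (illustrative only; a real single-step evaluation blends G with the pedigree relationship matrix):

```python
import numpy as np

def vanraden_g(M):
    """VanRaden genomic relationship matrix from an (animals x SNPs) genotype
    matrix coded 0/1/2: G = Z Z' / (2 * sum p_j (1 - p_j)), with Z = M - 2p."""
    p = M.mean(axis=0) / 2.0        # observed allele frequencies
    Z = M - 2.0 * p                 # centre each SNP by twice its frequency
    return (Z @ Z.T) / (2.0 * np.sum(p * (1.0 - p)))

rng = np.random.default_rng(3)
M = rng.integers(0, 3, size=(8, 500))   # 8 genotyped animals, 500 SNPs
G = vanraden_g(M)                        # symmetric 8 x 8 relationship matrix
```

    The APY algorithm mentioned in the abstract avoids inverting a large G directly by expressing non-core animals recursively in terms of a core subset (e.g. ~10,000 reference animals), which is what makes the method scale.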

  8. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

    A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching' (DEMIS), is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors in human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source relative to the importance of that source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs for all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled as being coupled with difficulties caused by poor operator mental models and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Difficulties caused by poor HMI design are then evaluated through a focused interview based on the FIR evaluation, and root causes leading to poor performance are identified in a systematic way.
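
    The two measures can be sketched as follows. The FIR definition follows the description above (share of fixations over share of importance); the SAE aggregate shown here is only an illustrative stand-in for the paper's formula, scoring how close all FIRs are to their ideal value of 1.

```python
import numpy as np

def fixation_importance_ratio(fixations, importance):
    """FIR per information source: share of eye fixations received divided by
    share of informational importance (1.0 == ideally allocated attention)."""
    f = np.asarray(fixations, dtype=float)
    w = np.asarray(importance, dtype=float)
    return (f / f.sum()) / (w / w.sum())

def selective_attention_effectiveness(fixations, importance):
    """Illustrative SAE stand-in (not the paper's exact formula):
    importance-weighted closeness of the FIRs to 1."""
    fir = fixation_importance_ratio(fixations, importance)
    w = np.asarray(importance, dtype=float)
    w = w / w.sum()
    return float(1.0 - np.sum(w * np.abs(fir - 1.0)))

# Fixations proportional to importance -> every FIR is 1
fir = fixation_importance_ratio([30, 20, 10], [3, 2, 1])
sae_good = selective_attention_effectiveness([30, 20, 10], [3, 2, 1])
sae_bad = selective_attention_effectiveness([10, 20, 30], [3, 2, 1])
```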

  9. Evaluation of determinative methods for sodium impurities

    International Nuclear Information System (INIS)

    Molinari, Marcelo; Guido, Osvaldo; Botbol, Jose; Ares, Osvaldo

    1988-01-01

    Sodium, universally accepted as the heat transfer fluid in fast breeder reactors, requires a special technology for every operation involved in any applicable methodology, owing to its well-known chemical reactivity. The purposes of this work are: a) to study the sources and effects of the chemical species which, as traces, accompany sodium used in the nuclear field; b) to classify, taking into account the present requirements and resources of the National Atomic Energy Commission (CNEA), the procedures found in the literature for determination of the most important impurities present in experimental liquid sodium systems; and c) to describe the principles of the methods and to evaluate them in order to make a selection. It was concluded that it would be convenient to develop, as a first stage, laboratory procedures to determine carbon, oxygen, hydrogen and non-volatile impurities, which, besides serving present needs, will provide a reference for direct methods with immediate response. The latter are needed in liquid sodium experimental loops and require, primarily, more complex and extended development. Additionally, a description is given of the experimental work performed up to now in this laboratory, consisting of a transfer device for sodium sampling and a sodium distillation device, adapted from a previous design, with associated vacuum and inert gas systems. The latter is intended as a separative technique for the indirect determination of oxygen and non-volatile impurities. (Author)

  10. Correcting for cryptic relatedness by a regression-based genomic control method

    Directory of Open Access Journals (Sweden)

    Yang Yaning

    2009-12-01

    Full Text Available Abstract Background The genomic control (GC) method is a useful tool to correct for cryptic relatedness in population-based association studies. It was originally proposed for correcting the variance inflation of the Cochran-Armitage additive trend test by using information from unlinked null markers, and was later generalized to be applicable to other tests, with the additional requirement that the null markers be matched with the candidate marker in allele frequencies. However, matching allele frequencies limits the number of available null markers and thus limits the applicability of the GC method. On the other hand, errors in genotype/allele frequencies may cause further bias and variance inflation and thereby aggravate the effect of GC correction. Results In this paper, we propose a regression-based GC method using null markers that are not necessarily matched in allele frequencies with the candidate marker. Variation in the allele frequencies of the null markers is adjusted by a regression method. Conclusion The proposed method can be readily applied to the Cochran-Armitage trend tests other than the additive trend test, the Pearson chi-square test and other robust efficiency tests. Simulation results show that the proposed method is effective in controlling type I error in the presence of population substructure.
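
    For context, the classic (non-regression) GC correction divides each candidate test statistic by an inflation factor estimated from the null markers, λ = median(χ²_null) / 0.4549, where 0.4549 is the median of a 1-df chi-square distribution. A minimal sketch; the regression-based adjustment proposed in the paper is not reproduced here.

```python
import numpy as np

CHI2_1DF_MEDIAN = 0.454936  # median of the chi-square distribution with 1 df

def genomic_control_lambda(null_stats):
    """Classic GC inflation factor from 1-df chi-square statistics at null markers."""
    return float(np.median(null_stats) / CHI2_1DF_MEDIAN)

rng = np.random.default_rng(4)
null_stats = 1.2 * rng.chisquare(df=1, size=5000)  # simulate ~20% inflation
lam = genomic_control_lambda(null_stats)
corrected = 4.8 / lam   # deflate a candidate marker's trend statistic
```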

  11. Evaluation of methods to assess physical activity

    Science.gov (United States)

    Leenders, Nicole Y. J. M.

    Epidemiological evidence has accumulated demonstrating that the amount of physical-activity-related energy expenditure during a week reduces the incidence of cardiovascular disease, diabetes, obesity, and all-cause mortality. To further understand the amount of daily physical activity and related energy expenditure necessary to maintain or improve functional health status and quality of life, instruments that estimate total (TDEE) and physical-activity-related energy expenditure (PAEE) under free-living conditions should be shown to be valid and reliable. Without evaluation of the various methods that estimate TDEE and PAEE against the doubly labeled water (DLW) method in females, there will be significant limitations on assessing the efficacy of physical activity interventions on health status in this population. A triaxial accelerometer (Tritrac-R3D (TT)), a uniaxial activity monitor (Computer Science and Applications Inc. (CSA)), a Yamax Digiwalker-500 step counter (YX-stepcounter), heart rate responses (HR method) and a 7-d Physical Activity Recall questionnaire (7-d PAR) were compared with the "criterion method" of DLW during a 7-d period in female adults. The DLW-TDEE was underestimated on average by 9, 11 and 15% using the 7-d PAR, HR method and TT. The underestimation of DLW-PAEE by the 7-d PAR was 21%, compared to 47% and 67% for the TT and YX-stepcounter. Approximately 56% of the variance in DLW-PAEE per kg is explained by the registration of body movement with accelerometry. A larger proportion of the variance in DLW-PAEE per kg was explained by jointly incorporating information from the vertical and horizontal movement measured with the CSA and Tritrac-R3D (r² = 0.87). Although only a small amount of the variance in DLW-PAEE per kg is explained by the number of steps taken per day, because of its low cost and ease of use the Yamax stepcounter is useful in studies promoting daily walking. Thus, studies involving the

  12. Methods for the comparative evaluation of pharmaceuticals

    Directory of Open Access Journals (Sweden)

    Busse, Reinhard

    2005-11-01

    Full Text Available Political background: As a German novelty, the Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen; IQWiG) was established in 2004 to, among other tasks, evaluate the benefit of pharmaceuticals. In this context it is of importance that patented pharmaceuticals are only excluded from the reference pricing system if they offer a therapeutic improvement. The institute is commissioned by the Federal Joint Committee (Gemeinsamer Bundesausschuss, G-BA) or by the Ministry of Health and Social Security. The German policy objective, expressed by the latest health care reform (Gesetz zur Modernisierung der Gesetzlichen Krankenversicherung, GMG), is to base decisions on a scientific assessment of pharmaceuticals in comparison to already available treatments. However, procedures and methods are still to be established. Research questions and methods: This health technology assessment (HTA) report was commissioned by the German Agency for HTA at the Institute for Medical Documentation and Information (DAHTA@DIMDI). It analysed criteria, procedures, and methods of comparative drug assessment in other EU/OECD countries. The research question was the following: How do national public institutions compare medicines in connection with pharmaceutical regulation, i.e. licensing, reimbursement and pricing of drugs? Institutions as well as documents concerning comparative drug evaluation (e.g. regulations, guidelines) were identified through internet, systematic literature, and hand searches. Publications were selected according to pre-defined inclusion and exclusion criteria. Documents were analysed in a qualitative manner following an analytic framework that had been developed in advance. Results were summarised narratively and presented in evidence tables. Results and discussion: Currently, licensing agencies do not systematically assess a new drug's added value for patients and society. This is why many

  13. Genome-scale Evaluation of the Biotechnological Potential of Red Sea Bacilli Strains

    KAUST Repository

    Othoum, Ghofran K.

    2018-02-01

    The increasing spectrum of multidrug-resistant bacteria has caused a major global public health concern, necessitating the discovery of novel antimicrobial agents. Additionally, recent advancements in the use of microbial cells for the scalable production of industrial enzymes have encouraged the screening of new environments for efficient microbial cell factories. The unique ecological niche of the Red Sea points to the promising metabolic and biosynthetic potential of its microbial system. Here, ten sequenced Bacilli strains, isolated from microbial mat and mangrove mud samples from the Red Sea, were evaluated for their use as platforms for protein production and the biosynthesis of bioactive compounds. Two of the species (B. paralicheniformis Bac48 and B. litoralis Bac94) were found to secrete twice as much protein as Bacillus subtilis 168, and B. litoralis Bac94 had complete Tat and Sec protein secretion systems. Additionally, four Red Sea species (B. paralicheniformis Bac48, Virgibacillus sp. Bac330, B. vallismortis Bac111 and B. amyloliquefaciens Bac57) showed capabilities for genetic transformation and possessed competence genes. More specifically, the distinctive biosynthetic potential evident in the genomes of B. paralicheniformis Bac48 and B. paralicheniformis Bac84 was assessed and compared to nine available complete genomes of B. licheniformis and three genomes of B. paralicheniformis. A uniquely structured trans-acyltransferase (trans-AT) polyketide synthase/nonribosomal peptide synthetase (PKS/NRPS) cluster was identified in the genome of B. paralicheniformis Bac48. In total, the two B. paralicheniformis Red Sea strains were found to be more enriched in modular clusters than B. licheniformis strains and B. paralicheniformis strains from other environments. These findings provided more insights into the potential of B. paralicheniformis Bac48 as a microbial cell factory and encouraged further focus on the strain

  14. Use of different marker pre-selection methods based on single SNP regression in the estimation of Genomic-EBVs

    Directory of Open Access Journals (Sweden)

    Corrado Dimauro

    2010-01-01

    Full Text Available Two methods of SNP pre-selection based on single-marker regression for the estimation of genomic breeding values (G-EBVs) were compared using simulated data provided by the XII QTL-MAS workshop: (i) Bonferroni correction of the significance threshold and (ii) a permutation test to obtain the reference distribution of the null hypothesis and identify significant markers at the P<0.01 and P<0.001 significance thresholds. From the set of markers significant at P<0.001, random subsets of 50% and 25% of the markers were extracted, to evaluate the effect of further reducing the number of significant SNPs on G-EBV predictions. The Bonferroni correction method allowed the identification of 595 significant SNPs, which gave the best G-EBV accuracies in the prediction generations (82.80%). The permutation methods gave slightly lower G-EBV accuracies even though a larger number of SNPs were significant (2,053 and 1,352 for the 0.01 and 0.001 significance thresholds, respectively). Interestingly, halving or dividing by four the number of SNPs significant at P<0.001 resulted in only a slight decrease in G-EBV accuracies. The genetic structure of the simulated population, with few QTL carrying large effects, might have favoured the Bonferroni method.
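
    The permutation approach can be sketched as follows: permute the phenotype to break any marker-trait association, record the maximum single-SNP association score in each permutation, and use an upper quantile of those maxima as the significance threshold. The function names and the simulated data are illustrative.

```python
import numpy as np

def snp_scores(X, y):
    """Absolute single-SNP correlation with the phenotype, one score per SNP."""
    Xc = (X - X.mean(0)) / X.std(0)
    yc = (y - y.mean()) / y.std()
    return np.abs(Xc.T @ yc) / len(y)

def permutation_threshold(X, y, n_perm=200, alpha=0.01, rng=None):
    """Empirical (1 - alpha) quantile of the maximum score under phenotype
    permutation -- the reference distribution of the null hypothesis."""
    rng = rng or np.random.default_rng(0)
    maxima = [snp_scores(X, rng.permutation(y)).max() for _ in range(n_perm)]
    return float(np.quantile(maxima, 1.0 - alpha))

rng = np.random.default_rng(5)
X = rng.standard_normal((200, 100))            # 200 animals, 100 SNPs
y = 0.8 * X[:, 0] + rng.standard_normal(200)   # only SNP 0 is truly associated
obs = snp_scores(X, y)
thr = permutation_threshold(X, y, rng=rng)
selected = np.flatnonzero(obs > thr)
```

    Using the maximum over all SNPs builds multiple testing directly into the null distribution, which is why the permutation threshold plays the same role as the Bonferroni-corrected one.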

  15. Digital image envelope: method and evaluation

    Science.gov (United States)

    Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang

    2003-05-01

    Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines (such as the DICOM standard) are beginning to focus on issues of data security. Guidelines continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission, and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
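
    The envelope logic, hashing the pixel data, bundling the signature with the header, protecting the bundle, and recomputing both checks at the receiving site, can be sketched with Python's standard library. This is a simplified stand-in: it uses an HMAC with a shared placeholder key where the paper's scheme encrypts the envelope and embeds it in the image itself.

```python
import hashlib
import hmac
import json

KEY = b"shared-secret-key"  # placeholder: a real deployment would use public-key crypto

def make_envelope(pixel_bytes, header):
    """Digital envelope: a hash signature of the pixel data bundled with the
    header, protected by a MAC so any alteration is detectable."""
    body = {"sig": hashlib.sha256(pixel_bytes).hexdigest(), "hdr": header}
    body_json = json.dumps(body, sort_keys=True)
    mac = hmac.new(KEY, body_json.encode(), hashlib.sha256).hexdigest()
    return json.dumps({"body": body, "mac": mac})

def verify(pixel_bytes, envelope):
    """Receiving site: check the envelope MAC, then recompute the image
    signature and compare it with the one sealed in the envelope."""
    env = json.loads(envelope)
    body_json = json.dumps(env["body"], sort_keys=True)
    mac = hmac.new(KEY, body_json.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, env["mac"]):
        return False  # envelope itself was altered in transit
    sig = hashlib.sha256(pixel_bytes).hexdigest()
    return hmac.compare_digest(sig, env["body"]["sig"])

envelope = make_envelope(b"\x00\x01\x02\x03", {"PatientID": "anon-001"})
ok = verify(b"\x00\x01\x02\x03", envelope)          # unaltered image
tampered = verify(b"\x00\x01\x02\xff", envelope)    # altered pixel data
```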

  16. Glycogenomics as a mass spectrometry-guided genome-mining method for microbial glycosylated molecules.

    Science.gov (United States)

    Kersten, Roland D; Ziemert, Nadine; Gonzalez, David J; Duggan, Brendan M; Nizet, Victor; Dorrestein, Pieter C; Moore, Bradley S

    2013-11-19

    Glycosyl groups are an essential mediator of molecular interactions in cells and on cellular surfaces. There are very few methods that directly relate sugar-containing molecules to their biosynthetic machineries. Here, we introduce glycogenomics as an experiment-guided genome-mining approach for fast characterization of glycosylated natural products (GNPs) and their biosynthetic pathways from genome-sequenced microbes by targeting glycosyl groups in microbial metabolomes. Microbial GNPs consist of aglycone and glycosyl structure groups, in which the sugar unit(s) are often critical for the GNP's bioactivity, e.g., by promoting binding to a target biomolecule. GNPs are a structurally diverse class of molecules with important pharmaceutical and agrochemical applications. Herein, O- and N-glycosyl groups are characterized at the level of their sugar monomers by tandem mass spectrometry (MS) and matched to corresponding glycosylation genes in secondary metabolic pathways by an MS-glycogenetic code. The associated aglycone biosynthetic genes of the GNP genotype then classify the natural product to further guide structure elucidation. We highlight the glycogenomic strategy by the characterization of several bioactive glycosylated molecules and their gene clusters, including the anticancer agent cinerubin B from Streptomyces sp. SPB74 and an antibiotic, arenimycin B, from Salinispora arenicola CNB-527.

  17. Methods for open innovation on a genome-design platform associating scientific, commercial, and educational communities in synthetic biology.

    Science.gov (United States)

    Toyoda, Tetsuro

    2011-01-01

    Synthetic biology requires both engineering efficiency and compliance with safety guidelines and ethics. Focusing on the rational construction of biological systems based on engineering principles, synthetic biology depends on a genome-design platform to explore the combinations of multiple biological components or BIO bricks for quickly producing innovative devices. This chapter explains the differences among various platform models and details a methodology for promoting open innovation within the scope of the statutory exemption of patent laws. The detailed platform adopts a centralized evaluation model (CEM), computer-aided design (CAD) bricks, and a freemium model. It is also important for the platform to support the legal aspects of copyrights as well as patent and safety guidelines, because intellectual work, including DNA sequences designed rationally by human intelligence, is basically copyrightable. An informational platform with high traceability, transparency, auditability, and security is required for copyright proof, safety compliance, and incentive management for open innovation in synthetic biology. GenoCon, which we have organized and explain here, is a competition-styled open-innovation method involving worldwide participants from scientific, commercial, and educational communities that aims to improve the designs of genomic sequences that confer a desired function on an organism. Using only a Web browser, a participating contributor proposes a design expressed with CAD bricks that generate a relevant DNA sequence, which is then experimentally and intensively evaluated by the GenoCon organizers. The CAD bricks, comprising programs and databases as a Semantic Web, are developed, executed, shared, reused, and well stocked on the secure Semantic Web platform called the Scientists' Networking System (SciNetS/SciNeS), based on which a CEM research center for synthetic biology and open innovation should be established. Copyright © 2011 Elsevier Inc.

  18. Genomic diversity of Saccharomyces cerevisiae yeasts associated with alcoholic fermentation of bacanora produced by artisanal methods.

    Science.gov (United States)

    Álvarez-Ainza, M L; Zamora-Quiñonez, K A; Moreno-Ibarra, G M; Acedo-Félix, E

    2015-03-01

    Bacanora is a spirituous beverage elaborated from Agave angustifolia Haw in an artisanal process. Natural fermentation is mostly performed by native yeasts and bacteria. In this study, 228 Saccharomyces-like yeast strains were isolated from the natural alcoholic fermentation used in the production of bacanora. Restriction analysis of the amplified ITS1-5.8S-ITS2 region of the ribosomal DNA genes (RFLPr) was used to confirm the genus, and 182 strains were identified as Saccharomyces cerevisiae. These strains displayed high genomic variability in their chromosome profiles by karyotyping. Electrophoretic profiles of the strains evaluated showed a large number of chromosomes, with sizes ranging between approximately 225 and 2,200 kbp.

  19. Systematic evaluation of bias in microbial community profiles induced by whole genome amplification.

    Science.gov (United States)

    Direito, Susana O L; Zaura, Egija; Little, Miranda; Ehrenfreund, Pascale; Röling, Wilfred F M

    2014-03-01

    Whole genome amplification methods facilitate the detection and characterization of microbial communities in low biomass environments. We examined the extent to which the actual community structure is reliably revealed and the factors contributing to bias. One widely used method [multiple displacement amplification (MDA)] and one new primer-free method [primase-based whole genome amplification (pWGA)] were compared using a polymerase chain reaction (PCR)-based method as control. Pyrosequencing of an environmental sample and principal component analysis revealed that MDA impacted community profiles more strongly than pWGA and indicated that this related to species GC content, although an influence of DNA integrity could not be excluded. Subsequently, biases by species GC content, DNA integrity and fragment size were separately analysed using defined mixtures of DNA from various species. We found significantly less amplification of species with the highest GC content for MDA-based templates and, to a lesser extent, for pWGA. DNA fragmentation also interfered severely: species with more fragmented DNA were less amplified with MDA and pWGA, and pWGA was unable to amplify low molecular weight DNA. These findings are relevant to the characterization of microbial communities in low-biomass environments and to currently planned astrobiological missions to Mars. © 2013 Society for Applied Microbiology and John Wiley & Sons Ltd.

  20. Methods for the comparative evaluation of pharmaceuticals.

    Science.gov (United States)

    Zentner, Annette; Velasco-Garrido, Marcial; Busse, Reinhard

    2005-11-15

    POLITICAL BACKGROUND: As a German novelty, the Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen, IQWiG) was established in 2004 to, among other tasks, evaluate the benefit of pharmaceuticals. In this context it is of importance that patented pharmaceuticals are only excluded from the reference pricing system if they offer a therapeutic improvement. The institute is commissioned by the Federal Joint Committee (Gemeinsamer Bundesausschuss, G-BA) or by the Ministry of Health and Social Security. The German policy objective, expressed by the latest health care reform (Gesetz zur Modernisierung der Gesetzlichen Krankenversicherung, GMG), is to base decisions on a scientific assessment of pharmaceuticals in comparison with already available treatments. However, procedures and methods are still to be established. This health technology assessment (HTA) report was commissioned by the German Agency for HTA at the Institute for Medical Documentation and Information (DAHTA@DIMDI). It analysed criteria, procedures, and methods of comparative drug assessment in other EU/OECD countries. The research question was the following: how do national public institutions compare medicines in connection with pharmaceutical regulation, i.e. licensing, reimbursement, and pricing of drugs? Institutions as well as documents concerning comparative drug evaluation (e.g. regulations, guidelines) were identified through internet searches, systematic literature searches, and hand searches. Publications were selected according to pre-defined inclusion and exclusion criteria. Documents were analysed in a qualitative manner following an analytic framework that had been developed in advance. Results were summarised narratively and presented in evidence tables. Currently, licensing agencies do not systematically assess a new drug's added value for patients and society. This is why many countries made post-licensing evaluation of pharmaceuticals a

  1. Database Description - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods. DOI: 10.18908/lsdba.nbdc01194-01-000. Markers and QTLs are curated manually from the published literature; the marker information includes marker sequences and genotyping methods. | LSDB Archive

  2. A comparison of statistical methods for genomic selection in a mice population

    Directory of Open Access Journals (Sweden)

    Neves Haroldo HR

    2012-11-01

    Full Text Available Abstract Background The availability of high-density panels of SNP markers has opened new perspectives for marker-assisted selection strategies, such that genotypes for these markers are used to predict the genetic merit of selection candidates. Because the number of markers is often much larger than the number of phenotypes, marker effect estimation is not a trivial task. The objective of this research was to compare the predictive performance of ten different statistical methods employed in genomic selection, by analyzing data from a heterogeneous stock mice population. Results For the five traits analyzed (W6W: weight at six weeks, WGS: growth slope, BL: body length, %CD8+: percentage of CD8+ cells, CD4+/CD8+: ratio between CD4+ and CD8+ cells), within-family predictions were more accurate than across-family predictions, although this superiority in accuracy varied markedly across traits. For within-family prediction, two kernel methods, Reproducing Kernel Hilbert Spaces Regression (RKHS) and Support Vector Regression (SVR), were the most accurate for W6W, while a polygenic model also had comparable performance. A form of ridge regression assuming that all markers contribute to the additive variance (RR_GBLUP) figured among the most accurate for WGS and BL, while two variable selection methods (LASSO and Random Forest, RF) had the greatest predictive abilities for %CD8+ and CD4+/CD8+. RF, RKHS, SVR and RR_GBLUP outperformed the remaining methods in terms of bias and inflation of predictions. Conclusions Methods with large conceptual differences reached very similar predictive abilities and a clear re-ranking of methods was observed as a function of the trait analyzed. Variable selection methods were more accurate than the others for %CD8+ and CD4+/CD8+, traits likely influenced by a smaller number of QTL than the others. Judged by their overall performance across traits and computational requirements, RR
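    As a rough illustration of the RR_GBLUP idea (every marker shrunk equally toward zero), here is a minimal closed-form ridge fit on simulated genotypes; the dimensions, effect sizes, and penalty below are arbitrary assumptions, not the mice data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 150, 400

# Genotype matrix (markers coded 0/1/2) and phenotypes from sparse effects
X = rng.integers(0, 3, size=(n, p)).astype(float)
true_beta = np.zeros(p)
true_beta[:20] = rng.normal(0.0, 0.5, 20)
y = X @ true_beta + rng.normal(0.0, 1.0, n)

def ridge_marker_effects(X, y, lam):
    """Closed-form ridge solution: all markers shrunk equally toward zero,
    mirroring the RR assumption that every marker contributes variance."""
    Xc = X - X.mean(axis=0)          # centre markers
    yc = y - y.mean()
    return np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ yc)

beta_hat = ridge_marker_effects(X, y, lam=50.0)
gebv = (X - X.mean(axis=0)) @ beta_hat   # genomic estimated breeding values

accuracy = np.corrcoef(gebv, X @ true_beta)[0, 1]
print(round(float(accuracy), 2))
```

The penalty `lam` plays the role of the variance-ratio hyperparameter; variable-selection methods such as LASSO would instead drive most of `beta_hat` exactly to zero.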

  3. A MITE-based genotyping method to reveal hundreds of DNA polymorphisms in an animal genome after a few generations of artificial selection

    Directory of Open Access Journals (Sweden)

    Tetreau Guillaume

    2008-10-01

    Full Text Available Abstract Background For most organisms, developing hundreds of genetic markers spanning the whole genome still requires excessive if not unrealistic efforts. In this context, there is an obvious need for methodologies allowing the low-cost, fast and high-throughput genotyping of virtually any species, such as the Diversity Arrays Technology (DArT). One of the crucial steps of the DArT technique is genome complexity reduction, which yields a genomic representation characteristic of the studied DNA sample and necessary for subsequent genotyping. In this article, using the mosquito Aedes aegypti as a study model, we describe a new genome complexity reduction method taking advantage of the abundance of miniature inverted-repeat transposable elements (MITEs) in the genome of this species. Results Ae. aegypti genomic representations were produced following a two-step procedure: (1) restriction digestion of the genomic DNA and simultaneous ligation of a specific adaptor to compatible ends, and (2) amplification of restriction fragments containing a particular MITE element called Pony using two primers, one annealing to the adaptor sequence and one annealing to a conserved sequence motif of the Pony element. Using this protocol, we constructed a library comprising more than 6,000 DArT clones, of which at least 5.70% were highly reliable polymorphic markers for two closely related mosquito strains separated by only a few generations of artificial selection. Within this dataset, linkage disequilibrium was low, and marker redundancy was evaluated at only 2.86%. Most of the detected genetic variability was observed between the two studied mosquito strains, but individuals of the same strain could still be clearly distinguished. Conclusion The new complexity reduction method was particularly efficient at revealing genetic polymorphisms in Ae. aegypti. Overall, our results testify to the flexibility of the DArT genotyping technique and open new
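    The two-step complexity reduction can be mimicked in silico: cut the genome at a restriction site and keep only the fragments carrying the conserved MITE motif, i.e. those the adaptor + Pony primer pair would amplify. The sequences, cutter site, and motif below are toy assumptions (and sticky ends are ignored):

```python
def complexity_reduction(genome: str, site: str, mite_motif: str):
    """Cut at every restriction site and keep only fragments containing
    the conserved MITE motif - the subset that the adaptor + MITE primer
    pair would amplify into the genomic representation."""
    fragments = genome.split(site)
    return [f for f in fragments if mite_motif in f]

# Toy genome with a 4-base cutter site and a motif standing in for Pony
site, motif = "GATC", "TTAACC"
genome = ("AAGGCC" + site + "GGTTAACCGT" + site + "CCGGAA" + site +
          "ATTTAACCAT" + site + "GGCC")
print(complexity_reduction(genome, site, motif))
```

Only the two motif-bearing fragments survive, illustrating how the representation is reduced to a reproducible, MITE-anchored subset of the genome.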

  4. In silico method for modelling metabolism and gene product expression at genome scale

    Energy Technology Data Exchange (ETDEWEB)

    Lerman, Joshua A.; Hyduke, Daniel R.; Latif, Haythem; Portnoy, Vasiliy A.; Lewis, Nathan E.; Orth, Jeffrey D.; Rutledge, Alexandra C.; Smith, Richard D.; Adkins, Joshua N.; Zengler, Karsten; Palsson, Bernard O.

    2012-07-03

    Transcription and translation use raw materials and energy generated metabolically to create the macromolecular machinery responsible for all cellular functions, including metabolism. A biochemically accurate model of molecular biology and metabolism will facilitate comprehensive and quantitative computations of an organism's molecular constitution as a function of genetic and environmental parameters. Here we formulate a model of metabolism and macromolecular expression. Prototyping it using the simple microorganism Thermotoga maritima, we show our model accurately simulates variations in cellular composition and gene expression. Moreover, through in silico comparative transcriptomics, the model allows the discovery of new regulons and improvement of the genome and transcription unit annotations. Our method presents a framework for investigating molecular biology and cellular physiology in silico and may allow quantitative interpretation of multi-omics data sets in the context of an integrated biochemical description of an organism.
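    The metabolic half of such a model reduces to a linear program (flux balance analysis): maximize a biomass flux subject to steady-state stoichiometry, S v = 0, and flux bounds. The three-reaction toy network below is purely illustrative, not the T. maritima reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A -> B -> biomass, at steady state S v = 0.
# Columns: v1 uptake (A_ext -> A), v2 (A -> B), v3 biomass drain (B ->)
S = np.array([
    [1, -1,  0],   # metabolite A balance
    [0,  1, -1],   # metabolite B balance
])
bounds = [(0, 10), (0, 10), (0, None)]   # uptake capped at 10 units

# linprog minimises, so negate the biomass flux to maximise it
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
fluxes = res.x
print(fluxes)
```

At the optimum all three fluxes equal the uptake limit, the LP analogue of growth being substrate-limited; ME-models like the one above add expression machinery as further coupled constraints.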

  5. G-MAPSEQ – a new method for mapping reads to a reference genome

    Directory of Open Access Journals (Sweden)

    Wojciechowski Pawel

    2016-06-01

    Full Text Available The problem of mapping reads to a reference genome is one of the most essential problems in modern computational biology. The most popular algorithms used to solve it are based on the Burrows-Wheeler transform and the FM-index, but these run into issues with highly mutated sequences because only a limited number of mutations is allowed. G-MAPSEQ is a novel hybrid algorithm combining two methods: alignment-free sequence comparison and ultra-fast sequence alignment. The former is a fast heuristic algorithm that uses k-mer characteristics of nucleotide sequences to find potential mapping positions. The latter is a very fast GPU implementation of sequence alignment used to verify the correctness of these mapping positions. The source code of G-MAPSEQ, along with other bioinformatic software, is available at http://gpualign.cs.put.poznan.pl.
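    The alignment-free first stage can be approximated by k-mer voting: each read k-mer found in a reference index votes for the genome position it implies, and positions with enough votes become candidates for alignment-based verification. The toy genome, read, and vote threshold below are assumptions, not G-MAPSEQ's actual data structures:

```python
from collections import defaultdict

def kmer_index(genome: str, k: int):
    """Hash every k-mer of the reference to its list of positions."""
    index = defaultdict(list)
    for i in range(len(genome) - k + 1):
        index[genome[i:i + k]].append(i)
    return index

def candidate_positions(read: str, index, k: int, min_votes: int = 2):
    """Each read k-mer votes for an implied mapping start; keep starts
    supported by at least `min_votes` k-mers (tolerates mismatches)."""
    votes = defaultdict(int)
    for offset in range(len(read) - k + 1):
        for pos in index.get(read[offset:offset + k], ()):
            votes[pos - offset] += 1
    return sorted(p for p, v in votes.items() if v >= min_votes)

genome = "ACGTACGTTAGCCGATTACAGGTTACCA"
index = kmer_index(genome, k=5)
# Read copied from position 8 with one mismatch (G -> A at read offset 2)
read = "TAACCGATTA"
print(candidate_positions(read, index, k=5))
```

A single mismatch only destroys the k-mers overlapping it, so the true position still collects enough votes, which is why this stage copes better with mutated reads than exact-matching FM-index seeds.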

  6. SEAM PUCKERING EVALUATION METHOD FOR SEWING PROCESS

    Directory of Open Access Journals (Sweden)

    BRAD Raluca

    2014-07-01

    Full Text Available The paper presents an automated method for the assessment and classification of puckering defects detected during the preproduction control stage of the sewing machine or product inspection. In this respect, we have presented the possible causes and remedies of the wrinkle nonconformities. Subjective factors related to the control environment and operators during seam evaluation can be reduced using an automated system whose operation is based on image processing. Our implementation involves spectral image analysis using the Fourier transform and an unsupervised neural network, the Kohonen map, employed to classify material specimens (the input images) into five discrete degrees of quality, from grade 5 (best) to grade 1 (worst). The puckering features presented in the learning and test images were pre-classified using the seam puckering quality standard. The network training stage consists of presenting five input vectors (derived from the down-sampled arrays) representing the puckering grades. Classification consists of providing an input vector derived from the image to be classified; a scalar product between the input vector and the weighted training images is computed, and the result is assigned to one of the five classes to which the input image belongs. Using the Kohonen network, the puckering defects were correctly classified in 71.42% of cases.
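    The classification step (scalar product against the weighted training images) reduces to nearest-prototype matching. A minimal sketch, with random vectors standing in for the Fourier-derived features of graded specimens:

```python
import numpy as np

rng = np.random.default_rng(2)

# One prototype feature vector per puckering grade (5 = best ... 1 = worst);
# in the paper these would come from Fourier spectra of graded seam images.
grades = [5, 4, 3, 2, 1]
prototypes = {g: rng.random(64) for g in grades}

def classify(features: np.ndarray) -> int:
    """Assign the grade whose prototype has the largest normalised
    scalar product (cosine similarity) with the input vector."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(grades, key=lambda g: cosine(features, prototypes[g]))

# A specimen close to the grade-3 prototype should be classified as grade 3
specimen = prototypes[3] + rng.normal(0.0, 0.05, 64)
print(classify(specimen))
```

A trained Kohonen map additionally arranges the prototypes on a topology-preserving grid, but the winner-take-all scalar-product comparison shown here is the decision rule the abstract describes.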

  7. Evaluation method of nuclear nonproliferation credibility

    International Nuclear Information System (INIS)

    Kwon, Eun-ha; Ko, Won Il

    2009-01-01

    This paper presents an integrated multicriteria analysis method for the quantitative evaluation of a state's nuclear nonproliferation credibility level. Underscoring the implications of policy on the sources of political demand for nuclear weapons rather than focusing on efforts to restrict the supply of specific weapons technology from the 'haves' to the 'have-nots', the proposed methodology considers the political, social, and cultural dimensions of nuclear proliferation. This methodology comprises three steps: (1) identifying the factors that influence credibility formation and employing them to construct a criteria tree that will illustrate the relationships among these factors; (2) defining the weight coefficients of each criterion through pairwise comparisons of the Analytical Hierarchy Process (AHP); and (3) assigning numerical scores to a state under each criterion and combining them with the weight coefficients in order to provide an overall assessment of the state. The functionality of this methodology is examined by assessing the current level of nuclear nonproliferation credibility of four countries: Japan, North Korea, South Korea, and Switzerland.
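    Step (2), deriving weight coefficients from AHP pairwise comparisons, can be sketched with the standard principal-eigenvector method; the 3x3 comparison matrix below is hypothetical, not the study's actual criteria:

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale:
# criterion A is 3x as important as B and 5x as important as C.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal right eigenvector, normalised to sum to 1, gives the weights
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency check: CR = ((lambda_max - n) / (n - 1)) / RI, with RI(3) = 0.58;
# CR below 0.1 is the usual acceptance threshold for the judgments
n = A.shape[0]
lambda_max = float(eigvals.real.max())
cr = ((lambda_max - n) / (n - 1)) / 0.58
print(np.round(weights, 3), round(cr, 3))
```

The scores assigned to each state in step (3) would then be combined with these weights as a weighted sum to produce the overall credibility level.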

  8. Identification of genomic insertion and flanking sequence of G2-EPSPS and GAT transgenes in soybean using whole genome sequencing method

    Directory of Open Access Journals (Sweden)

    Bingfu Guo

    2016-07-01

    Full Text Available Molecular characterization of sequences flanking exogenous fragment insertions is essential for safety assessment and labeling of genetically modified organisms (GMO). In this study, the T-DNA insertion sites and flanking sequences were identified in two newly developed transgenic glyphosate-tolerant soybeans, GE-J16 and ZH10-6, based on a whole genome sequencing (WGS) method. About 21 Gb of sequence data (~21× coverage) was generated for each line on the Illumina HiSeq 2500 platform. The junction reads mapping to the boundaries of the T-DNA and flanking sequences in these two events were identified by comparing all sequencing reads with the soybean reference genome and the sequence of the transgenic vector. The putative insertion loci and flanking sequences were further confirmed by PCR amplification, Sanger sequencing, and co-segregation analysis. All these analyses supported that the exogenous T-DNA fragments were integrated at positions Chr19: 50543767-50543792 and Chr17: 7980527-7980541 in these two transgenic lines. Identification of the genomic insertion sites of the G2-EPSPS and GAT transgenes will facilitate the use of their glyphosate-tolerant traits in soybean breeding programs. These results also demonstrated that WGS is a cost-effective and rapid method for identifying the sites of T-DNA insertions and their flanking sequences in soybean.
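    Junction reads can be found by looking for reads whose prefix matches the genome and whose suffix matches the vector. The deliberately naive substring-based sketch below (real pipelines use seeded alignment against both references) uses toy sequences:

```python
def junction_side_lengths(read: str, genome: str, vector: str, seed: int = 8):
    """Return (genome_prefix_len, vector_suffix_len) if the read spans a
    genome/T-DNA boundary: its start matches the genome and its end
    matches the vector (each by at least `seed` bases), else None."""
    for split in range(seed, len(read) - seed + 1):
        if read[:split] in genome and read[split:] in vector:
            return split, len(read) - split
    return None

genome = "TTGACCGTAAGGCTAACGGTTCAGATCCGA"   # toy reference
vector = "GGATCCTTAAGCGTACGCTAGCAAGGTT"     # toy transgenic vector
# A junction read: 12 genome bases followed by 12 T-DNA bases
read = genome[5:17] + vector[0:12]
print(junction_side_lengths(read, genome, vector))
```

The genome-side match position of such a read is exactly the flanking-sequence coordinate that PCR and Sanger sequencing would then confirm.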

  9. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  10. A new method to cluster genomes based on cumulative Fourier power spectrum.

    Science.gov (United States)

    Dong, Rui; Zhu, Ziyue; Yin, Changchuan; He, Rong L; Yau, Stephen S-T

    2018-06-20

    Analyzing phylogenetic relationships using mathematical methods has always been of importance in bioinformatics. Quantitative research may interpret the raw biological data in a precise way. Multiple Sequence Alignment (MSA) is used frequently to analyze biological evolution, but is very time-consuming. When the scale of data is large, alignment methods cannot finish calculation in reasonable time. Therefore, we present a new method using moments of the cumulative Fourier power spectrum for clustering DNA sequences. Each sequence is translated into a vector in Euclidean space. Distances between the vectors can reflect the relationships between sequences. The mapping between the spectra and moment vectors is one-to-one, which means that no information is lost in the power spectra during the calculation. We cluster and classify several datasets, including Influenza A, primates, and human rhinovirus (HRV) datasets, to build up the phylogenetic trees. Results show that the new proposed cumulative Fourier power spectrum is much faster and more accurate than MSA and another alignment-free method known as k-mer. The research provides new insights into the study of phylogeny, evolution, and efficient DNA comparison algorithms for large genomes. The computer programs of the cumulative Fourier power spectrum are available at GitHub (https://github.com/YaulabTsinghua/cumulative-Fourier-power-spectrum). Copyright © 2018. Published by Elsevier B.V.
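    A reduced sketch of the feature construction: per-nucleotide indicator signals, FFT power spectrum, cumulative sum, then a few moments as the Euclidean embedding. The normalisation and moment definitions here are simplified assumptions, not the paper's exact formulas:

```python
import numpy as np

def cumulative_power_moments(seq: str, n_moments: int = 3) -> np.ndarray:
    """Per-base indicator signal -> FFT power spectrum -> cumulative sum
    -> raw moments, concatenated over A, C, G, T."""
    feats = []
    for base in "ACGT":
        signal = np.array([1.0 if c == base else 0.0 for c in seq])
        power = np.abs(np.fft.fft(signal)) ** 2
        cum = np.cumsum(power[1:])               # drop the DC component
        if cum[-1] > 0:
            cum = cum / cum[-1]                  # normalise to [0, 1]
        feats.extend(np.mean(cum ** (j + 1)) for j in range(n_moments))
    return np.array(feats)

seq_a = "ATGCGTACGTTAGCATGCGT" * 3
seq_b = seq_a[:-1] + "A"                 # near-identical variant
seq_c = "AAAAATTTTTGGGGGCCCCC" * 3       # very different organisation

va, vb, vc = (cumulative_power_moments(s) for s in (seq_a, seq_b, seq_c))
d_ab = float(np.linalg.norm(va - vb))
d_ac = float(np.linalg.norm(va - vc))
print(d_ab < d_ac)
```

Because each sequence maps to a fixed-length vector, pairwise distances (and hence trees) can be computed without any alignment step, which is the source of the speed advantage over MSA.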

  11. Multidisciplinary eHealth Survey Evaluation Methods

    Science.gov (United States)

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  12. KGCAK: a K-mer based database for genome-wide phylogeny and complexity evaluation.

    Science.gov (United States)

    Wang, Dapeng; Xu, Jiayue; Yu, Jun

    2015-09-16

    The K-mer approach, treating genomic sequences as simple characters and counting the relative abundance of each string upon a fixed K, has been extensively applied to phylogeny inference for genome assembly, annotation, and comparison. To meet increasing demands for comparing large genome sequences and to promote the use of the K-mer approach, we develop a versatile database, KGCAK ( http://kgcak.big.ac.cn/KGCAK/ ), containing ~8,000 genomes that include genome sequences of diverse life forms (viruses, prokaryotes, protists, animals, and plants) and cellular organelles of eukaryotic lineages. It builds phylogeny based on genomic elements in an alignment-free fashion and provides in-depth data processing enabling users to compare the complexity of genome sequences based on K-mer distribution. We hope that KGCAK becomes a powerful tool for exploring relationship within and among groups of species in a tree of life based on genomic data.
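    The underlying representation is a k-mer relative-abundance vector per genome, with distances in that space driving the phylogeny. A minimal sketch on toy sequences (the sequences and k are illustrative assumptions):

```python
from itertools import product
import math

def kmer_profile(seq: str, k: int = 3):
    """Relative abundance of each of the 4^k possible k-mers."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    total = max(1, len(seq) - k + 1)
    return [counts[km] / total for km in kmers]

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

seq_a = "ATGGCGTACGTTAGCATGCGTACGATCGAT" * 2
seq_b = seq_a[:22] + "G" + seq_a[23:]     # a single substitution
seq_c = "TTTTAAAACCCCGGGGTTTTAAAACCCCGG" * 2

p_a, p_b, p_c = (kmer_profile(s) for s in (seq_a, seq_b, seq_c))
print(euclidean(p_a, p_b) < euclidean(p_a, p_c))
```

Closely related sequences land close together in profile space while compositionally different ones do not, which is the property a database like KGCAK exploits when it builds alignment-free trees over thousands of genomes.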

  13. Methods of marketing and advertising activity evaluation

    Directory of Open Access Journals (Sweden)

    A.I. Yakovlev

    2016-09-01

    Full Text Available The results of business entities' activities depend on the development of instruments for determining the efficiency of economic processes, including marketing activities; this determined the purpose of the article. Methodological principles in this area are developed. It is shown that the increase in sales and profit margin only partly depends on the implementation of advertising measures. Methodical approaches for the evaluation of exhibition and advertising activity and the promotion of its employees are specified. The work evaluates the value of the advertising effect on the basis of the share of the advertising impact in the increase of sales and revenue from the sale of products; the corresponding share of such impact is determined from consumer surveys. An index of trade-fair work is calculated from two components: how many times a specific company participated in such events, and how well the company was presented at the relevant trade fairs. Indices of the cost of advertising and promotion for a given product manufacturer are provided. The scientific novelty of the research is as follows: it is shown that the sales-increase effect should not be attributed to advertising alone, and the components that influence consumer preferences and their shares in the total effect are determined; the proposed index of trade-fair work results depending on the selected factors is also new. The practical importance of the research results involves more accurate calculation of the effect of the activities undertaken and, consequently, increased efficiency of business entities.

  14. 4C-ker: A Method to Reproducibly Identify Genome-Wide Interactions Captured by 4C-Seq Experiments.

    Science.gov (United States)

    Raviram, Ramya; Rocha, Pedro P; Müller, Christian L; Miraldi, Emily R; Badri, Sana; Fu, Yi; Swanzey, Emily; Proudhon, Charlotte; Snetkova, Valentina; Bonneau, Richard; Skok, Jane A

    2016-03-01

    4C-Seq has proven to be a powerful technique to identify genome-wide interactions with a single locus of interest (or "bait") that can be important for gene regulation. However, analysis of 4C-Seq data is complicated by the many biases inherent to the technique. An important consideration when dealing with 4C-Seq data is the difference in resolution of signal across the genome that results from differences in 3D distance separation from the bait. This leads to the highest signal in the region immediately surrounding the bait and increasingly lower signals in far-cis and trans. Another important aspect of 4C-Seq experiments is the resolution, which is greatly influenced by the choice of restriction enzyme and the frequency at which it cuts the genome. Thus, it is important that a 4C-Seq analysis method is flexible enough to analyze data generated using different enzymes and to identify interactions across the entire genome. Current methods for 4C-Seq analysis only identify interactions in regions near the bait or in regions located in far-cis and trans, but no method comprehensively analyzes 4C signals of different length scales. In addition, some methods also fail in experiments where chromatin fragments are generated using frequent-cutter restriction enzymes. Here, we describe 4C-ker, a Hidden Markov Model-based pipeline that identifies regions throughout the genome that interact with the 4C bait locus. In addition, we incorporate methods for the identification of differential interactions in multiple 4C-Seq datasets collected from different genotypes or experimental conditions. Adaptive window sizes are used to correct for differences in signal coverage in near-bait regions, far-cis and trans chromosomes. Using several datasets, we demonstrate that 4C-ker outperforms all existing 4C-Seq pipelines in its ability to reproducibly identify interaction domains at all genomic ranges with different resolution enzymes.

  15. Ancient genomes

    OpenAIRE

    Hoelzel, A Rus

    2005-01-01

    Ever since its invention, the polymerase chain reaction has been the method of choice for work with ancient DNA. In an application of modern genomic methods to material from the Pleistocene, a recent study has instead undertaken to clone and sequence a portion of the ancient genome of the cave bear.

  16. Comparison on genomic predictions using GBLUP models and two single-step blending methods with different relationship matrices in the Nordic Holstein population

    DEFF Research Database (Denmark)

    Gao, Hongding; Christensen, Ole Fredslund; Madsen, Per

    2012-01-01

    Background A single-step blending approach allows genomic prediction using information from genotyped and non-genotyped animals simultaneously. However, the combined relationship matrix in a single-step method may need to be adjusted because marker-based and pedigree-based relationship matrices may not be on the same scale. The same may apply when a GBLUP model includes both genomic breeding values and residual polygenic effects. The objective of this study was to compare single-step blending methods and GBLUP methods with and without adjustment of the genomic relationship matrix for genomic prediction of 16 traits in the Nordic Holstein population. Five methods were compared: 1) a simple GBLUP method, 2) a GBLUP method with a polygenic effect, 3) an adjusted GBLUP method with a polygenic effect, 4) a single-step blending method, and 5) an adjusted single-step blending method. In the adjusted GBLUP and single-step methods, the genomic relationship matrix was adjusted...
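    The genomic relationship matrix at the heart of these methods is commonly built with VanRaden's first method, G = ZZ'/(2Σp(1-p)); the rescaling shown below is a simplified stand-in for the study's adjustment against the pedigree relationship matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
n_animals, n_markers = 20, 500
M = rng.integers(0, 3, size=(n_animals, n_markers)).astype(float)

# VanRaden's first method: Z = M - 2p,  G = ZZ' / (2 * sum p(1-p))
p = M.mean(axis=0) / 2.0
Z = M - 2.0 * p
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))

# Simple rescaling so the average diagonal equals 1 (the pedigree-based
# expectation for non-inbred animals); real single-step adjustments solve
# for an offset and a scale against the A-matrix instead.
G_adj = G * (n_animals / np.trace(G))
print(round(float(G.diagonal().mean()), 3),
      round(float(G_adj.diagonal().mean()), 3))
```

Bringing the diagonal (and mean) of G onto the scale of the pedigree matrix is precisely the adjustment the abstract refers to, since the single-step H-matrix mixes blocks of both.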

  17. A New Method for the Evaluation of Vaccine Safety Based on Comprehensive Gene Expression Analysis

    Directory of Open Access Journals (Sweden)

    Haruka Momose

    2010-01-01

    Full Text Available For the past 50 years, quality control and safety tests have been used to evaluate vaccine safety. However, conventional animal safety tests need to be improved in several aspects; for example, the number of test animals used needs to be reduced and the test period shortened. It is, therefore, necessary to develop a new vaccine evaluation system. In this review, we show that gene expression patterns are well correlated with biological responses in vaccinated rats. Our findings and methods, drawing on experimental biology and genome science, provide an important means of assessing vaccine toxicity.

  18. Genome analysis methods: Arabidopsis lyrata [PGDBj Registered plant list, Marker list, QTL list, Plant DB link and Genome analysis methods[Archive

    Lifescience Database Archive (English)

    Full Text Available Arabidopsis lyrata | Draft | ... | 8.3x | Arachne | 1,309 | Fgenesh package of ab initio and homology-based gene predictors, EuGene, and GeneID (http://genome.imim.es/software/geneid/), applying dicot- and A. thaliana-specific matrices | 32,670 genes (v1.0) | JGI; http://www.phytozome.net/alyrata | v1.0 | v1.0 | DOI 10.1038/ng.807 | PMID 21478890

  19. Genome analysis methods: Sorghum bicolor [PGDBj Registered plant list, Marker list, QTL list, Plant DB link and Genome analysis methods[Archive

    Lifescience Database Archive (English)

    Full Text Available Sorghum bicolor | Finished | 2n=20 | 760 Mb | 2009 | Sanger (clone-based) | 10,717,203 reads | 7...30 Mb | 8.5x | Arachne2 v.20060705 | 3,304 | 12,873 | BLAST, GenomeScan | 34,496 genes (Sbi1.4) | JGI; http://www.phytozome.net/sorghum | Sbi1 | Sbi1.4 | DOI 10.1038/nature07723 | PMID 19189423

  20. Genome analysis methods: Lotus japonicus [PGDBj Registered plant list, Marker list, QTL list, Plant DB link and Genome analysis methods] [Archive]

    Lifescience Database Archive (English)

    Full Text Available Lotus japonicus Draft 2n=12 472 Mb 2008 Sanger (Clone-based) ... 315.1 Mb 3-5x Parace...l Genome Assembler 954 110,940 Kazusa Annotation PipelinE for Lotus japonicus (KAPSEL) 37,971 (v2.5) KDRI; http://www.kazusa.or.jp/lotus/ v2.5 v2.5 10.1093/dnares/dsn008 18511435 ...

  1. Comparison of manual methods of extracting genomic DNA from dried blood spots collected on different cards: implications for clinical practice.

    Science.gov (United States)

    Molteni, C G; Terranova, L; Zampiero, A; Galeone, C; Principi, N; Esposito, S

    2013-01-01

    Isolating genomic DNA from blood samples is essential when studying the associations between genetic variants and susceptibility to a given clinical condition, or its severity. This study of three extraction techniques and two types of commercially available cards involved 219 children attending our outpatient pediatric clinic for follow-up laboratory tests after they had been hospitalised. An aliquot of venous blood was drawn into plastic tubes without additives and, after several inversions, 80 µL were put on circles of common paper cards and Whatman FTA-treated cards. Three extraction methods were compared: the Qiagen Investigator, Gensolve, and Masterpure. The best method in terms of final DNA yield was Masterpure, which led to a significantly higher yield regardless of the type of card (p < 0.001), followed by Qiagen Investigator and Gensolve. Masterpure was also the best in terms of price, seemed to be simple and reliable, and required less hands-on time than other techniques. These conclusions support the use of Masterpure in studies that evaluate the associations between genetic variants and the severity or prevalence of infectious diseases.

  2. Research on psychological evaluation method for nuclear power plant operators

    International Nuclear Information System (INIS)

    Fang Xiang; He Xuhong; Zhao Bingquan

    2007-01-01

    The qualitative and quantitative psychological evaluation methods for nuclear power plant operators were analyzed and discussed in this paper. The scope and results of application were compared between the outline-figure-fitting method and the fuzzy synthetic evaluation method. The research results can serve as a reference for the evaluation of nuclear power plant operators. (authors)

  3. Platform comparison for evaluation of ALK protein immunohistochemical expression, genomic copy number and hotspot mutation status in neuroblastomas.

    Directory of Open Access Journals (Sweden)

    Benedict Yan

    Full Text Available ALK is an established causative oncogenic driver in neuroblastoma, and is likely to emerge as a routine biomarker in neuroblastoma diagnostics. At present, the optimal strategy for clinical diagnostic evaluation of ALK protein, genomic and hotspot mutation status is not well-studied. We evaluated ALK immunohistochemical (IHC) protein expression using three different antibodies (ALK1, 5A4 and D5F3 clones), ALK genomic status using single-color chromogenic in situ hybridization (CISH), and ALK hotspot mutation status using conventional Sanger sequencing and a next-generation sequencing platform (Ion Torrent Personal Genome Machine, IT-PGM), in archival formalin-fixed, paraffin-embedded neuroblastoma samples. We found a significant difference in IHC results using the three different antibodies, with the highest percentage of positive cases seen on D5F3 immunohistochemistry. Correlation with ALK genomic and hotspot mutational status revealed that the majority of D5F3 ALK-positive cases did not possess either ALK genomic amplification or hotspot mutations. Comparison of sequencing platforms showed a perfect correlation between conventional Sanger and IT-PGM sequencing. Our findings suggest that D5F3 immunohistochemistry, single-color CISH and IT-PGM sequencing are suitable assays for evaluation of ALK status in future neuroblastoma clinical trials.

  4. Analysis of IAV Replication and Co-infection Dynamics by a Versatile RNA Viral Genome Labeling Method

    Directory of Open Access Journals (Sweden)

    Dan Dou

    2017-07-01

    Full Text Available Genome delivery to the proper cellular compartment for transcription and replication is a primary goal of viruses. However, methods for analyzing viral genome localization and differentiating genomes with high identity are lacking, making it difficult to investigate entry-related processes and co-examine heterogeneous RNA viral populations. Here, we present an RNA labeling approach for single-cell analysis of RNA viral replication and co-infection dynamics in situ, which uses the versatility of padlock probes. We applied this method to identify influenza A virus (IAV) infections in cells and lung tissue with single-nucleotide specificity and to classify entry and replication stages by gene segment localization. Extending the classification strategy to co-infections of IAVs with single-nucleotide variations, we found that the dependence on intracellular trafficking places a time restriction on secondary co-infections necessary for genome reassortment. Altogether, these data demonstrate how RNA viral genome labeling can help dissect entry and co-infections.

  5. A comparison of genomic selection models across time in interior spruce (Picea engelmannii × glauca) using unordered SNP imputation methods.

    Science.gov (United States)

    Ratcliffe, B; El-Dien, O G; Klápště, J; Porth, I; Chen, C; Jaquish, B; El-Kassaby, Y A

    2015-12-01

    Genomic selection (GS) potentially offers an unparalleled advantage over traditional pedigree-based selection (TS) methods by reducing the time commitment required to carry out a single cycle of tree improvement. This quality is particularly appealing to tree breeders, where lengthy improvement cycles are the norm. We explored the prospect of implementing GS for interior spruce (Picea engelmannii × glauca) utilizing a genotyped population of 769 trees belonging to 25 open-pollinated families. A series of repeated tree height measurements through ages 3-40 years permitted the testing of GS methods temporally. The genotyping-by-sequencing (GBS) platform was used for single nucleotide polymorphism (SNP) discovery in conjunction with three unordered imputation methods applied to a data set with 60% missing information. Further, three diverse GS models were evaluated based on predictive accuracy (PA), and their marker effects. Moderate levels of PA (0.31-0.55) were observed and were of sufficient capacity to deliver improved selection response over TS. Additionally, PA varied substantially through time, in accordance with spatial competition among trees. As expected, temporal PA was well correlated with age-age genetic correlation (r=0.99), and decreased substantially with increasing difference in age between the training and validation populations (0.04-0.47). Moreover, our imputation comparisons indicate that k-nearest neighbor and singular value decomposition yielded a greater number of SNPs and gave higher predictive accuracies than imputing with the mean. Furthermore, the ridge regression (rrBLUP) and BayesCπ (BCπ) models yielded equal PA to each other, and better PA than the generalized ridge regression heteroscedastic effect model, for the traits evaluated.
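The unordered imputation methods compared in this abstract (mean imputation versus k-nearest-neighbour imputation of a missing-genotype matrix) can be sketched as follows. This is an illustrative toy, not the study's pipeline; the function names and the distance metric are assumptions.

```python
import numpy as np

def impute_mean(X):
    """Replace NaNs in a (samples x SNPs) matrix with the per-SNP mean."""
    X = X.copy()
    col_mean = np.nanmean(X, axis=0)
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = col_mean[cols]
    return X

def impute_knn(X, k=3):
    """Fill each sample's missing SNPs with the average of its k nearest
    neighbours (Euclidean distance over a mean-filled copy of the matrix)."""
    filled = impute_mean(X)            # distances computed on mean-filled copy
    out = X.copy()
    for i in range(X.shape[0]):
        miss = np.isnan(X[i])
        if not miss.any():
            continue
        d = np.linalg.norm(filled - filled[i], axis=1)
        d[i] = np.inf                  # a sample is not its own neighbour
        nn = np.argsort(d)[:k]
        out[i, miss] = filled[nn][:, miss].mean(axis=0)
    return out
```

For genotype matrices coded 0/1/2 with missing calls as NaN, both functions return a complete matrix; the kNN variant exploits between-sample similarity, which is why it tends to outperform the mean in relatedness-structured populations.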

  6. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or to examine the full gradient of genes that cause an effect and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, untreated control and treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems level analysis is sought. In summary, our normalization approaches and experimental design to perform validation experiments are likely to apply to those high-throughput screening systems attempting to identify genes for systems level analysis. PMID:18753689
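Two of the normalization methods named above, background subtraction and quantile normalization, can be sketched in a few lines of numpy. This is a generic sketch of the standard techniques, not the authors' cellHTS2-based pipeline; treating each column as one plate replicate is an assumption.

```python
import numpy as np

def background_subtract(plate, background):
    """Subtract a per-well background estimate (e.g. from control plates)."""
    return plate - background

def quantile_normalize(plate):
    """Force every column (e.g. plate replicate) to share one distribution:
    rank the values within each column, then substitute each value with the
    mean of the values holding that rank across all columns."""
    order = np.argsort(plate, axis=0)
    ranks = np.argsort(order, axis=0)            # rank of each well per column
    ref = np.sort(plate, axis=0).mean(axis=1)    # mean value at each rank
    return ref[ranks]
```

After quantile normalization every column contains exactly the same set of values (the rank-wise means), so between-replicate distributional differences vanish while within-replicate ordering is preserved.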

  7. Whole-genome regression and prediction methods applied to plant and animal breeding

    NARCIS (Netherlands)

    Los Campos, De G.; Hickey, J.M.; Pong-Wong, R.; Daetwyler, H.D.; Calus, M.P.L.

    2013-01-01

    Genomic-enabled prediction is becoming increasingly important in animal and plant breeding, and is also receiving attention in human genetics. Deriving accurate predictions of complex traits requires implementing whole-genome regression (WGR) models where phenotypes are regressed on thousands of

  8. A simple, rapid and efficient method for the extraction of genomic ...

    African Journals Online (AJOL)

    The isolation of intact, high-molecular-mass genomic DNA is essential for many molecular biology applications including long range PCR, endonuclease restriction digestion, Southern blot analysis, and genomic library construction. Many protocols are available for the extraction of DNA from plant material, but obtaining it is ...

  9. Evaluating Methods for Evaluating Instruction: The Case of Higher Education

    OpenAIRE

    Bruce A. Weinberg; Belton M. Fleisher; Masanori Hashimoto

    2007-01-01

    This paper develops an original measure of learning in higher education, based on grades in subsequent courses. Using this measure of learning, this paper shows that student evaluations are positively related to current grades but unrelated to learning once current grades are controlled. It offers evidence that the weak relationship between learning and student evaluations arises, in part, because students are unaware of how much they have learned in a course. The paper concludes with a discu...

  10. Evaluation of an Optimal Epidemiological Typing Scheme for Legionella pneumophila with Whole-Genome Sequence Data Using Validation Guidelines.

    Science.gov (United States)

    David, Sophia; Mentasti, Massimo; Tewolde, Rediat; Aslett, Martin; Harris, Simon R; Afshar, Baharak; Underwood, Anthony; Fry, Norman K; Parkhill, Julian; Harrison, Timothy G

    2016-08-01

    Sequence-based typing (SBT), analogous to multilocus sequence typing (MLST), is the current "gold standard" typing method for investigation of legionellosis outbreaks caused by Legionella pneumophila. However, as common sequence types (STs) cause many infections, some investigations remain unresolved. In this study, various whole-genome sequencing (WGS)-based methods were evaluated according to published guidelines, including (i) a single nucleotide polymorphism (SNP)-based method, (ii) extended MLST using different numbers of genes, (iii) determination of gene presence or absence, and (iv) a kmer-based method. L. pneumophila serogroup 1 isolates (n = 106) from the standard "typing panel," previously used by the European Society for Clinical Microbiology Study Group on Legionella Infections (ESGLI), were tested together with another 229 isolates. Over 98% of isolates were considered typeable using the SNP- and kmer-based methods. Percentages of isolates with complete extended MLST profiles ranged from 99.1% (50 genes) to 86.8% (1,455 genes), while only 41.5% produced a full profile with the gene presence/absence scheme. Replicates demonstrated that all methods offer 100% reproducibility. Indices of discrimination range from 0.972 (ribosomal MLST) to 0.999 (SNP based), and all values were higher than that achieved with SBT (0.940). Epidemiological concordance is generally inversely related to discriminatory power. We propose that an extended MLST scheme with ∼50 genes provides optimal epidemiological concordance while substantially improving the discrimination offered by SBT and can be used as part of a hierarchical typing scheme that should maintain backwards compatibility and increase discrimination where necessary. This analysis will be useful for the ESGLI to design a scheme that has the potential to become the new gold standard typing method for L. pneumophila. Copyright © 2016 David et al.
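Indices of discrimination of the kind quoted above (0.940 for SBT, up to 0.999 for the SNP-based method) are conventionally computed as the Hunter-Gaston (Simpson's) discriminatory index: the probability that two isolates drawn at random belong to different types. A minimal sketch, assuming the conventional formula rather than anything stated in this abstract:

```python
from collections import Counter

def discrimination_index(type_assignments):
    """Hunter-Gaston discriminatory index: the probability that two isolates
    sampled at random without replacement fall into different types.
    type_assignments: one type label (e.g. an ST) per isolate."""
    n = len(type_assignments)
    counts = Counter(type_assignments).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))
```

A scheme that assigns every isolate its own type scores 1.0; one that lumps all isolates into a single type scores 0.0, which is why higher values mean greater discriminatory power.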

  11. Genomic Testing

    Science.gov (United States)

    ... this database. Evaluation of Genomic Applications in Practice and Prevention (EGAPP™) In 2004, the Centers for Disease Control and Prevention launched the EGAPP initiative to establish and test a ... and other applications of genomic technology that are in transition from ...

  12. Gamma cameras - a method of evaluation

    International Nuclear Information System (INIS)

    Oates, L.; Bibbo, G.

    2000-01-01

    Full text: With the sophistication and longevity of the modern gamma camera it is not often that the need arises to evaluate a gamma camera for purchase. We have recently been placed in the position of retiring our two single headed cameras of some vintage and replacing them with a state of the art dual head variable angle gamma camera. The process used for the evaluation consisted of five parts: (1) Evaluation of the technical specification as expressed in the tender document; (2) A questionnaire adapted from the British Society of Nuclear Medicine; (3) Site visits to assess gantry configuration, movement, patient access and occupational health, welfare and safety considerations; (4) Evaluation of the processing systems offered; (5) Whole of life costing based on equally configured systems. The results of each part of the evaluation were expressed using a weighted matrix analysis with each of the criteria assessed being weighted in accordance with their importance to the provision of an effective nuclear medicine service for our centre and the particular importance to paediatric nuclear medicine. This analysis provided an objective assessment of each gamma camera system from which a purchase recommendation was made. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc
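The weighted matrix analysis described above can be sketched generically: score each system against each criterion, multiply by the criterion's importance weight, and rank by weighted total. The criteria, weights, and camera names below are hypothetical, not the centre's actual scoring.

```python
def weighted_matrix(scores, weights):
    """Weighted-sum decision matrix.
    scores:  {option: {criterion: score}} with scores on a common scale
    weights: {criterion: weight}, weights summing to 1
    Returns (option, weighted_total) pairs ranked best-first."""
    totals = {
        option: sum(weights[c] * s for c, s in criteria.items())
        for option, criteria in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical example: two candidate gamma camera systems
weights = {"specs": 0.5, "cost": 0.3, "service": 0.2}
scores = {
    "Camera A": {"specs": 9, "cost": 5, "service": 7},
    "Camera B": {"specs": 6, "cost": 9, "service": 7},
}
```

The weights encode the evaluation's priorities (here, technical specification dominating), which is exactly how the abstract says criteria were weighted "in accordance with their importance" to the service.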

  13. Cytogenetic method of determining effect of threshold values of anthropogenic factors on the plant and animal genome

    International Nuclear Information System (INIS)

    Arkhipchuk, V.V.; Romanenko, V.D.; Arkhipchuk, M.V.; Kipnis, L.S.

    1993-01-01

    The use of nucleolar characteristics to assess the action of physical and chemical factors on living objects is a promising trend in the creation of new and highly sensitive biological tests. The advantage of this approach is that the effect of threshold values of anthropogenic factors is recorded as a change in the functional activity of the cell genome and not as a restructuring of the karyotype. The aim of this research was to test a cytogenetic method, based on analysis of quantitative characteristics of the nucleoli, for determining the modifying action of various factors on the plant and animal genome, and to extend its use to different groups of organisms

  14. Evaluation of multiple approaches to identify genome-wide polymorphisms in closely related genotypes of sweet cherry (Prunus avium L.)

    Directory of Open Access Journals (Sweden)

    Seanna Hewitt

    Full Text Available Identification of genetic polymorphisms and subsequent development of molecular markers is important for marker assisted breeding of superior cultivars of economically important species. Sweet cherry (Prunus avium L.) is an economically important non-climacteric tree fruit crop in the Rosaceae family and has undergone a genetic bottleneck due to breeding, resulting in limited genetic diversity in the germplasm that is utilized for breeding new cultivars. Therefore, it is critical to recognize the best platforms for identifying genome-wide polymorphisms that can help identify, and consequently preserve, the diversity in a genetically constrained species. A gel-based approach (TRAP), reduced representation sequencing (TRAPseq), a 6k cherry SNParray, and whole genome sequencing (WGS) were evaluated for the identification of genome-wide polymorphisms in five closely related genotypes of sweet cherry. All platforms facilitated detection of polymorphisms among the genotypes with variable efficiency. In assessing multiple SNP detection platforms, this study has demonstrated that a combination of appropriate approaches is necessary for efficient polymorphism identification, especially between closely related cultivars of a species. The information generated in this study provides a valuable resource for future genetic and genomic studies in sweet cherry, and the insights gained from the evaluation of multiple approaches can be utilized for other closely related species with limited genetic diversity in the breeding germplasm. Keywords: Polymorphisms, Prunus avium, Next-generation sequencing, Target region amplification polymorphism (TRAP), Genetic diversity, SNParray, Reduced representation sequencing, Whole genome sequencing (WGS)

  15. A Rapid and Efficient Method for Purifying High Quality Total RNA from Peaches (Prunus persica) for Functional Genomics Analyses

    Directory of Open Access Journals (Sweden)

    LEE MEISEL

    2005-01-01

    Full Text Available Prunus persica has been proposed as a genomic model for deciduous trees and the Rosaceae family. Optimized protocols for RNA isolation are necessary to further advance studies in this model species such that functional genomics analyses may be performed. Here we present an optimized protocol to rapidly and efficiently purify high quality total RNA from peach fruits (Prunus persica). Isolating high-quality RNA from fruit tissue is often difficult due to the large quantities of polysaccharides and polyphenolic compounds that accumulate in this tissue and co-purify with the RNA. Here we demonstrate that a modified version of the method used to isolate RNA from pine trees and the woody plant Cinnamomum tenuipilum is ideal for isolating high quality RNA from the fruits of Prunus persica. This RNA may be used for many functional genomics experiments such as RT-PCR and the construction of large-insert cDNA libraries.

  16. Partial digestion with restriction enzymes of ultraviolet-irradiated human genomic DNA: a method for identifying restriction site polymorphisms

    International Nuclear Information System (INIS)

    Nobile, C.; Romeo, G.

    1988-01-01

    A method for partial digestion of total human DNA with restriction enzymes has been developed on the basis of a principle already utilized by P.A. Whittaker and E. Southern for the analysis of phage lambda recombinants. Total human DNA irradiated with UV light of 254 nm is partially digested by restriction enzymes that recognize sequences containing adjacent thymidines because of TT dimer formation. The products resulting from partial digestion of specific genomic regions are detected in Southern blots by genomic-unique DNA probes with high reproducibility. This procedure is rapid and simple to perform because the same conditions of UV irradiation are used for different enzymes and probes. It is shown that restriction site polymorphisms occurring in the genomic regions analyzed are recognized by the allelic partial digest patterns they determine
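The principle here, that UV-induced TT dimers block cleavage only at recognition sites containing adjacent thymidines on either strand (TT, or AA on the given strand, implying TT on the complement), can be illustrated with a toy scanner. The sequence and the site-flagging rule are illustrative assumptions, not the authors' protocol.

```python
def find_sites(seq, recognition):
    """Return (position, uv_blockable) for each occurrence of a restriction
    recognition sequence. A site is flagged UV-blockable when the recognition
    sequence contains adjacent thymidines on either strand (TT, or AA on the
    given strand, which implies TT on the complementary strand)."""
    seq = seq.upper()
    recognition = recognition.upper()
    blockable = "TT" in recognition or "AA" in recognition
    return [
        (i, blockable)
        for i in range(len(seq) - len(recognition) + 1)
        if seq[i:i + len(recognition)] == recognition
    ]
```

For example, HindIII (AAGCTT) and EcoRI (GAATTC) sites both contain adjacent thymidines and so are candidates for the partial-digest patterns the method exploits, whereas a site such as SmaI's CCCGGG would not be affected.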

  17. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts for design of structures, but the problems of structural engineering are better known through them. Some of the main methods for the estimation of the probability of failure are the exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo Simulation is used in this paper, because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes have been demonstrated in this paper.
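The relationship between the probability of failure and the index of reliability that this abstract discusses can be sketched with a plain Monte Carlo estimator: sample resistance R and load S, count failures (R < S), and convert pf to the reliability index beta = -Phi^{-1}(pf). This is a generic sketch of the technique, not the paper's bridge-pier model; the distributions below are assumptions.

```python
import math
import random

def mc_reliability(resistance, load, n=200_000, seed=1):
    """Monte Carlo estimate of the failure probability pf = P(R - S < 0)
    and the reliability index beta = -Phi^{-1}(pf).
    resistance, load: callables taking an RNG and returning one draw."""
    rng = random.Random(seed)
    failures = sum(resistance(rng) < load(rng) for _ in range(n))
    pf = failures / n
    # invert the standard normal CDF by bisection (avoids a SciPy dependency)
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0))) < pf:
            lo = mid
        else:
            hi = mid
    return pf, -(lo + hi) / 2.0
```

With R ~ N(5, 1) and S ~ N(3, 1), the exact answer is beta = 2/sqrt(2) ≈ 1.41, so the estimator should land near that value for a large n; in a real assessment the partial factor method of the Eurocodes would be checked against such an estimate.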

  18. Evaluation of Ponseti method in neglected clubfoot

    Directory of Open Access Journals (Sweden)

    Abhinav Sinha

    2016-01-01

    Conclusions: Painless, supple, plantigrade, and cosmetically acceptable feet were achieved in neglected clubfeet without any extensive surgery. A fair trial of conservative Ponseti method should be tried before resorting to extensive soft tissue procedure.

  19. Quantitative Methods for Software Selection and Evaluation

    National Research Council Canada - National Science Library

    Bandor, Michael S

    2006-01-01

    ... (the ability of the product to meet the need) and the cost. The method used for the analysis and selection activities can range from the use of basic intuition to counting the number of requirements fulfilled, or something...

  20. Mitochondrial genome of the Komodo dragon: efficient sequencing method with reptile-oriented primers and novel gene rearrangements.

    Science.gov (United States)

    Kumazawa, Yoshinori; Endo, Hideki

    2004-04-30

    The mitochondrial genome of the Komodo dragon (Varanus komodoensis) was nearly completely sequenced, except for two highly repetitive noncoding regions. An efficient sequencing method for squamate mitochondrial genomes was established by combining the long polymerase chain reaction (PCR) technology and a set of reptile-oriented primers designed for nested PCR amplifications. It was found that the mitochondrial genome had novel gene arrangements in which genes from NADH dehydrogenase subunit 6 to proline tRNA were extensively shuffled with duplicate control regions. These control regions had 99% sequence similarity over 700 bp. Although snake mitochondrial genomes are also known to possess duplicate control regions with nearly identical sequences, the location of the second control region suggested independent occurrence of the duplication on lineages leading to snakes and the Komodo dragon. Another feature of the mitochondrial genome of the Komodo dragon was the considerable number of tandem repeats, including sequences with a strong secondary structure, as a possible site for the slipped-strand mispairing in replication. These observations are consistent with hypotheses that tandem duplications via the slipped-strand mispairing may induce mitochondrial gene rearrangements and may serve to maintain similar copies of the control region.
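Tandem repeats of the kind discussed above, as possible substrates for slipped-strand mispairing, can be located with a naive perfect-repeat scan. This is a toy detector on a hypothetical sequence, not the authors' analysis method, and it ignores imperfect repeats and secondary structure.

```python
def tandem_repeats(seq, unit_min=2, unit_max=6, min_copies=3):
    """Naive scan for perfect tandem repeats: report (start, unit, copies)
    for every run of a repeat unit occurring at least min_copies times."""
    seq = seq.upper()
    hits = []
    for unit_len in range(unit_min, unit_max + 1):
        i = 0
        while i + unit_len * min_copies <= len(seq):
            unit = seq[i:i + unit_len]
            copies = 1
            while seq[i + copies * unit_len: i + (copies + 1) * unit_len] == unit:
                copies += 1
            if copies >= min_copies:
                hits.append((i, unit, copies))
                i += copies * unit_len   # skip past the reported run
            else:
                i += 1
    return hits
```

Real repeat annotation would additionally merge runs found at several unit lengths and tolerate mismatches, but even this sketch shows how repeat-rich control regions are identified computationally.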

  1. Methods of evaluating market transformation programmes: experience in Sweden

    International Nuclear Information System (INIS)

    Neij, L.

    2001-01-01

    The evaluation of market transformation programmes requires the development of new methods, relative to methods used for the evaluation of traditional energy efficiency programmes. In this paper, a model for the evaluation of market transformation programmes is proposed, based in part on evaluation methods discussed in the literature. The proposed model entails an extensive evaluation process, including the evaluation of market transformation effects, the impact of these effects, and the evaluation of the outline of the programme. Furthermore, evaluations of Swedish market transformation programmes have been analysed in relation to the proposed model. The analysis shows that not all of the evaluations have been focused on market transformation, and those that have, are only partly consistent with the evaluation model proposed here. It is concluded that future evaluations of Swedish market transformation programmes should be extended and improved in accordance with the proposed model. (author)

  2. Evaluation of blind signal separation methods

    NARCIS (Netherlands)

    Schobben, D.W.E.; Torkkola, K.; Smaragdis, P.

    1999-01-01

    Recently, many new Blind Signal Separation (BSS) algorithms have been introduced. Authors evaluate the performance of their algorithms in various ways; among these are speech recognition rates, plots of separated signals, plots of cascaded mixing/unmixing impulse responses, and signal-to-noise ratios. Clearly ...

  3. A Ranking Method for Evaluating Constructed Responses

    Science.gov (United States)

    Attali, Yigal

    2014-01-01

    This article presents a comparative judgment approach for holistically scored constructed response tasks. In this approach, the grader rank orders (rather than rate) the quality of a small set of responses. A prior automated evaluation of responses guides both set formation and scaling of rankings. Sets are formed to have similar prior scores and…

  4. Improved methods to evaluate realised energy savings

    NARCIS (Netherlands)

    Boonekamp, P.G.M.

    2005-01-01

    This thesis regards the calculation of realised energy savings at national and sectoral level, and the policy contribution to total savings. It is observed that the results of monitoring and evaluation studies on realised energy savings are hardly applied in energy saving policy. Causes are the lack

  5. Use of Genomic Data in Risk Assessment Case Study: II. Evaluation of the Dibutyl Phthalate Toxicogenomic Dataset

    Science.gov (United States)

    An evaluation of the toxicogenomic data set for dibutyl phthalate (DBP) and male reproductive developmental effects was performed as part of a larger case study to test an approach for incorporating genomic data in risk assessment. The DBP toxicogenomic data set is composed of ni...

  6. Comprehensive evaluation of SNP identification with the Restriction Enzyme-based Reduced Representation Library (RRL) method

    Directory of Open Access Journals (Sweden)

    Du Ye

    2012-02-01

    Full Text Available Abstract Background The Restriction Enzyme-based Reduced Representation Library (RRL) method represents a relatively feasible and flexible strategy for Single Nucleotide Polymorphism (SNP) identification in different species. It has the remarkable advantage of reducing the complexity of the genome by orders of magnitude. However, a comprehensive evaluation of the actual efficacy of SNP identification by this method is still unavailable. Results In order to evaluate the efficacy of the Restriction Enzyme-based RRL method, we selected the Tsp45I enzyme, which covers a 266 Mb flanking region of the enzyme recognition site according to in silico simulation on the human reference genome. We then sequenced the YH RRL after Tsp45I treatment and obtained reads of which 80.8% were mapped to the target region with a 20-fold average coverage; about 96.8% of the target region was covered by at least one read, and 257 K SNPs were identified in the region using SOAPsnp software. Compared with whole genome resequencing data, we observed a false discovery rate (FDR) of 13.95% and a false negative rate (FNR) of 25.90%. The concordance rate of homozygous loci was over 99.8%, but that of heterozygous loci was only 92.56%. Repeat sequences and base quality were shown to have a great effect on the accuracy of SNP calling; SNPs in recognition sites contributed evidently to the high FNR and the low concordance rate of heterozygous loci. Our results indicated that repeat masking and highly stringent filter criteria could significantly decrease both FDR and FNR. Conclusions This study demonstrates that the Restriction Enzyme-based RRL method is effective for SNP identification. The results highlight the biases and method-derived defects of this approach and emphasize the points that deserve special attention.
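The FDR and FNR figures quoted above follow from a straightforward comparison of the RRL call set against a truth set (here, the whole-genome resequencing calls). A minimal sketch, with the definitions as assumptions (FDR as the fraction of calls absent from the truth set, FNR as the fraction of true SNPs missed):

```python
def fdr_fnr(called, truth):
    """called, truth: sets of SNP identifiers, e.g. (chrom, pos, allele).
    FDR = fraction of calls not present in the truth set.
    FNR = fraction of truth-set SNPs that were not called."""
    false_positives = len(called - truth)
    false_negatives = len(truth - called)
    return false_positives / len(called), false_negatives / len(truth)
```

Applying stricter filters shrinks the call set, which typically lowers FDR at the cost of raising FNR; the abstract's observation that both decreased after repeat masking indicates the removed calls were concentrated among errors.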

  7. A comprehensive evaluation of rodent malaria parasite genomes and gene expression

    KAUST Repository

    Otto, Thomas D

    2014-10-30

    Background: Rodent malaria parasites (RMP) are used extensively as models of human malaria. Draft RMP genomes have been published for Plasmodium yoelii, P. berghei ANKA (PbA) and P. chabaudi AS (PcAS). Although availability of these genomes made a significant impact on recent malaria research, these genomes were highly fragmented and were annotated with little manual curation. The fragmented nature of the genomes has hampered genome-wide analysis of Plasmodium gene regulation and function. Results: We have greatly improved the genome assemblies of PbA and PcAS, newly sequenced the virulent parasite P. yoelii YM genome, sequenced additional RMP isolates/lines and have characterized genotypic diversity within RMP species. We have produced RNA-seq data and utilized it to improve gene-model prediction and to provide quantitative, genome-wide, data on gene expression. Comparison of the RMP genomes with the genome of the human malaria parasite P. falciparum and RNA-seq mapping permitted gene annotation at base-pair resolution. Full-length chromosomal annotation permitted a comprehensive classification of all subtelomeric multigene families including the 'Plasmodium interspersed repeat genes' (pir). Phylogenetic classification of the pir family, combined with pir expression patterns, indicates functional diversification within this family. Conclusions: Complete RMP genomes, RNA-seq and genotypic diversity data are excellent and important resources for gene-function and post-genomic analyses and to better interrogate Plasmodium biology. Genotypic diversity between P. chabaudi isolates makes this species an excellent parasite to study genotype-phenotype relationships. The improved classification of multigene families will enhance studies on the role of (variant) exported proteins in virulence and immune evasion/modulation.

  8. Evaluation of local corrosion life by statistical method

    International Nuclear Information System (INIS)

    Kato, Shunji; Kurosawa, Tatsuo; Takaku, Hiroshi; Kusanagi, Hideo; Hirano, Hideo; Kimura, Hideo; Hide, Koichiro; Kawasaki, Masayuki

    1987-01-01

    In this paper, with the aim of achieving life extension of light water reactors, we examined the evaluation of local corrosion by statistical methods and its application to nuclear power plant components. There are many examples of evaluating the maximum cracking depth of local corrosion by the doubly exponential distribution, and this evaluation method has been established. However, evaluating the service life of construction materials by statistical methods has not yet been established. In order to establish service life evaluation by statistical methods, we must strive to collect local corrosion data and pursue analytical research on them. (author)

  9. Land management planning: a method of evaluating alternatives

    Science.gov (United States)

    Andres Weintraub; Richard Adams; Linda Yellin

    1982-01-01

    A method is described for developing and evaluating alternatives in land management planning. A structured set of 15 steps provides a framework for such an evaluation when multiple objectives and uncertainty must be considered in the planning process. The method is consistent with other processes used in organizational evaluation, and allows for the interaction of...

  10. Marker list - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

  11. QTL list - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

  12. Plant DB link - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

  13. Capturing the target genes of BldD in Saccharopolyspora erythraea using improved genomic SELEX method.

    Science.gov (United States)

    Wu, Hang; Mao, Yongrong; Chen, Meng; Pan, Hui; Huang, Xunduan; Ren, Min; Wu, Hao; Li, Jiali; Xu, Zhongdong; Yuan, Hualing; Geng, Ming; Weaver, David T; Zhang, Lixin; Zhang, Buchang

    2015-03-01

    BldD (SACE_2077), a key developmental regulator in actinomycetes, is the first identified transcriptional factor in Saccharopolyspora erythraea positively regulating erythromycin production and morphological differentiation. Although the BldD of S. erythraea binds to the promoters of erythromycin biosynthetic genes, the interaction affinities are relatively low, implying the existence of other target genes in S. erythraea. Through the genomic systematic evolution of ligands by exponential enrichment (SELEX) method that we herein improved, four DNA sequences of S. erythraea A226, corresponding to the promoter regions of SACE_0306 (beta-galactosidase), SACE_0811 (50S ribosomal protein L25), SACE_3410 (fumarylacetoacetate hydrolase), and SACE_6014 (aldehyde dehydrogenase), were captured at all three BldD concentrations of 0.5, 1, and 2 μM, while the previously identified intergenic regions of eryBIV-eryAI and ermE-eryCI, plus the promoter region of SACE_7115, the amfC homolog for aerial mycelium formation, could be captured only when the BldD concentration reached 2 μM. Electrophoretic mobility shift assay (EMSA) analysis indicated that BldD specifically bound to the above seven DNA sequences, and quantitative real-time PCR (qRT-PCR) assay showed that the transcriptional levels of the abovementioned target genes decreased when bldD was disrupted in A226. Furthermore, SACE_7115 and SACE_0306 in A226 were individually inactivated, showing that SACE_7115 was predominantly involved in aerial mycelium formation, while SACE_0306 mainly controlled erythromycin production. This study provides valuable information for better understanding of the pleiotropic regulator BldD in S. erythraea, and the improved method may be useful for uncovering regulatory networks of other transcriptional factors.

  14. Multiple-Trait Genomic Selection Methods Increase Genetic Value Prediction Accuracy

    Science.gov (United States)

    Jia, Yi; Jannink, Jean-Luc

    2012-01-01

    Genetic correlations between quantitative traits measured in many breeding programs are pervasive. These correlations indicate that measurements of one trait carry information on other traits. Current single-trait (univariate) genomic selection does not take advantage of this information. Multivariate genomic selection on multiple traits could accomplish this but has been little explored and tested in practical breeding programs. In this study, three multivariate linear models (i.e., GBLUP, BayesA, and BayesCπ) were presented and compared to univariate models using simulated and real quantitative traits controlled by different genetic architectures. We also extended BayesA with fixed hyperparameters to a full hierarchical model that estimated hyperparameters and BayesCπ to impute missing phenotypes. We found that optimal marker-effect variance priors depended on the genetic architecture of the trait so that estimating them was beneficial. We showed that the prediction accuracy for a low-heritability trait could be significantly increased by multivariate genomic selection when a correlated high-heritability trait was available. Further, multiple-trait genomic selection had higher prediction accuracy than single-trait genomic selection when phenotypes are not available on all individuals and traits. Additional factors affecting the performance of multiple-trait genomic selection were explored. PMID:23086217
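
The core claim above, that a correlated high-heritability trait carries usable information for predicting a low-heritability one, can be illustrated with a small simulation. The ridge model below that simply adds the second phenotype as an extra predictor is a crude stand-in for the paper's multivariate GBLUP/Bayesian models, and all sizes, heritabilities, and the shared-QTL setup are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 400, 500
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)
X -= X.mean(axis=0)

# One set of marker effects drives both traits, so they are genetically correlated.
g = X @ rng.normal(size=p)
g = (g - g.mean()) / g.std()

h2_low, h2_high = 0.1, 0.8
y_low = g + rng.normal(scale=np.sqrt((1 - h2_low) / h2_low), size=n)
y_high = g + rng.normal(scale=np.sqrt((1 - h2_high) / h2_high), size=n)

# Univariate baseline: ridge (SNP-BLUP-like) prediction from markers alone.
lam = p * (1 - h2_low) / h2_low
beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y_low)
acc_uni = np.corrcoef(X @ beta, g)[0, 1]

# Crude stand-in for a multi-trait model: also use the correlated
# high-heritability phenotype as an (unpenalized) extra predictor.
Z = np.column_stack([X, y_high])
penalty = np.append(np.full(p, lam), 0.0)
beta2 = np.linalg.solve(Z.T @ Z + np.diag(penalty), Z.T @ y_low)
acc_multi = np.corrcoef(Z @ beta2, g)[0, 1]
```

On data like this, `acc_multi` exceeds `acc_uni` because the high-heritability phenotype is itself a good proxy for the shared genetic value.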

  15. Evaluation bases for calculation methods in radioecology

    International Nuclear Information System (INIS)

    Bleck-Neuhaus, J.; Boikat, U.; Franke, B.; Hinrichsen, K.; Hoepfner, U.; Ratka, R.; Steinhilber-Schwab, B.; Teufel, D.; Urbach, M.

    1982-03-01

    The seven contributions in this book deal with the state and problems of radioecology. In particular, they analyse: the propagation of radioactive materials in the atmosphere, the transfer of radioactive substances from the soil into plants and from animal feed into meat, the exposure pathways for, and high-risk groups of, the population, the uncertainties and the bandwidth of the ingestion factor, as well as the treatment of questions of radioecology in practice. The calculation model is assessed, and the difficulty of laying down data in the general calculation basis is evaluated. (DG) [de

  16. Calculation of 3D genome structures for comparison of chromosome conformation capture experiments with microscopy: An evaluation of single-cell Hi-C protocols.

    Science.gov (United States)

    Lando, David; Stevens, Tim J; Basu, Srinjan; Laue, Ernest D

    2018-01-01

    Single-cell chromosome conformation capture approaches are revealing the extent of cell-to-cell variability in the organization and packaging of genomes. These single-cell methods, unlike their multi-cell counterparts, allow straightforward computation of realistic chromosome conformations that may be compared and combined with other, independent, techniques to study 3D structure. Here we discuss how single-cell Hi-C and subsequent 3D genome structure determination allows comparison with data from microscopy. We then carry out a systematic evaluation of recently published single-cell Hi-C datasets to establish a computational approach for the evaluation of single-cell Hi-C protocols. We show that the calculation of genome structures provides a useful tool for assessing the quality of single-cell Hi-C data because it requires a self-consistent network of interactions, relating to the underlying 3D conformation, with few errors, as well as sufficient longer-range cis- and trans-chromosomal contacts.

  17. Evaluations of Three Methods for Remote Training

    Science.gov (United States)

    Woolford, B.; Chmielewski, C.; Pandya, A.; Adolf, J.; Whitmore, M.; Berman, A.; Maida, J.

    1999-01-01

    Long duration space missions require a change in training methods and technologies. For Shuttle missions, crew members could train for all the planned procedures, and carry documentation of planned procedures for a variety of contingencies. As International Space Station (ISS) missions of three months or longer are carried out, many more tasks will need to be performed for which little or no training was received prior to launch. Eventually, exploration missions will last several years, and communications with Earth will have long time delays or be impossible at times. This series of three studies was performed to identify the advantages and disadvantages of three types of training for self-instruction: video-conferencing, multimedia, and virtual reality. These studies each compared two types of training methods on two different types of tasks. In two of the studies, the subjects were in an isolated, confined environment analogous to space flight; the third study was performed in a laboratory.

  18. Systematic evaluation of nondestructive testing methods

    International Nuclear Information System (INIS)

    Segal, Y.; Notea, A.; Segal, E.

    1977-01-01

    The main task of an NDT engineer is to select the best method, considering the cost-benefit value of different available systems and taking into account the special existing constraints. The aim of the paper is to suggest a tool that will enable characterization of measuring systems. The derivation of the characterization parameters and functions has to be general, i.e., suitable for all possible measuring methods, independent of their principle of operation. Quite often the properties measured during the NDT procedure are not the wanted ones, but there must be a correlation between the measured property and the performance of the product. One has to bear in mind that the ultimate choice between systems is not, in practice, just based on the mathematical optimization approach that is presented. Factors like cost-benefit, availability of trained manpower, service, real-time information, weight, volume, etc., may be crucial problems, and they may well dictate the final selection

  19. [Corneal transparency: anatomical basis and evaluation methods].

    Science.gov (United States)

    Avetisov, S E; Narbut, M N

    Being just a relatively small part of the fibrous tunic of the eyeball, the cornea is, nevertheless, unique in terms of the variety of its functions. Because the cornea differs significantly from other protective frames in its structure, it provides the possibility of light transmission and strong refraction that largely contributes to the total refraction of the eye. The development of ophthalmology is impossible without improving methods of adequate anatomical and functional assessment of the eye not only as a whole, but also as a collection of interacting structures.In this regard, examination methods of the cornea have undergone significant advances in recent years. So far, the level of corneal transparency has been judged by biomicroscopy findings or indirect characteristics (thickness, structure, etc.). Confocal microscopy of the cornea and wave-based examinations involving one of the available laser interferometers (OCT or HRT) are also used. However, the data obtained with these methods resembles that of layer-specific reflectometry, i.e. the magnitude of directed reflection of the light beam from corneal corpuscles, which does not completely agree with the classical idea of transparency.

  20. Evaluation of toothbrush disinfection via different methods

    Directory of Open Access Journals (Sweden)

    Adil BASMAN

    2016-01-01

    Full Text Available The aim of this study was to compare the efficacy of using a dishwasher or different chemical agents, including 0.12% chlorhexidine gluconate, 2% sodium hypochlorite (NaOCl), a mouthrinse containing essential oils and alcohol, and 50% white vinegar, for toothbrush disinfection. Sixty volunteers were divided into five experimental groups and one control group (n = 10). Participants brushed their teeth using toothbrushes with standard bristles, and they disinfected the toothbrushes according to instructed methods. Bacterial contamination of the toothbrushes was compared between the experimental groups and the control group. Data were analyzed by Kruskal–Wallis and Duncan's multiple range tests, with 95% confidence intervals for multiple comparisons. Bacterial contamination of toothbrushes from individuals in the experimental groups differed from those in the control group (p < 0.05). The most effective method for elimination of all tested bacterial species was 50% white vinegar, followed in order by 2% NaOCl, mouthrinse containing essential oils and alcohol, 0.12% chlorhexidine gluconate, dishwasher use, and tap water (control). The results of this study show that the most effective method for disinfecting toothbrushes was submersion in 50% white vinegar, which is cost-effective, easy to access, and appropriate for household use.
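
A minimal sketch of the Kruskal-Wallis comparison the study reports, run on hypothetical surviving-colony counts for three of the disinfection groups; the group means and the Poisson model are invented for illustration and do not reproduce the study's data.

```python
import numpy as np
from scipy.stats import kruskal

# Hypothetical surviving colony counts per toothbrush (10 brushes per group).
rng = np.random.default_rng(2)
vinegar_50 = rng.poisson(5, size=10)    # 50% white vinegar
naocl_2 = rng.poisson(12, size=10)      # 2% NaOCl
tap_water = rng.poisson(60, size=10)    # control

# Kruskal-Wallis H-test across the three independent groups.
stat, p = kruskal(vinegar_50, naocl_2, tap_water)
```

A significant p-value here would justify follow-up pairwise comparisons, analogous to the Duncan's multiple range tests used in the study.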

  1. Performance Evaluation Methods for Assistive Robotic Technology

    Science.gov (United States)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  2. A novel data mining method to identify assay-specific signatures in functional genomic studies

    Directory of Open Access Journals (Sweden)

    Guidarelli Jack W

    2006-08-01

    Full Text Available Abstract Background: The highly dimensional data produced by functional genomic (FG) studies makes it difficult to visualize relationships between gene products and experimental conditions (i.e., assays). Although dimensionality reduction methods such as principal component analysis (PCA) have been very useful, their application to identify assay-specific signatures has been limited by the lack of appropriate methodologies. This article proposes a new and powerful PCA-based method for the identification of assay-specific gene signatures in FG studies. Results: The proposed method (PM) is unique for several reasons. First, it is the only one, to our knowledge, that uses gene contribution, a product of the loading and expression level, to obtain assay signatures. The PM develops and exploits two types of assay-specific contribution plots, which are new to the application of PCA in the FG area. The first type plots the assay-specific gene contribution against the given order of the genes and reveals variations in distribution between assay-specific gene signatures as well as outliers within assay groups indicating the degree of importance of the most dominant genes. The second type plots the contribution of each gene in ascending or descending order against a constantly increasing index. This type of plot reveals assay-specific gene signatures defined by the inflection points in the curve. In addition, sharp regions within the signature define the genes that contribute the most to the signature. We proposed and used the curvature as an appropriate metric to characterize these sharp regions, thus identifying the subset of genes contributing the most to the signature. Finally, the PM uses the full dataset to determine the final gene signature, thus eliminating the chance of gene exclusion by poor screening in earlier steps. The strengths of the PM are demonstrated using a simulation study, and two studies of real DNA microarray data – a study of
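
The gene-contribution idea described above (loading times expression level, sorted so that a sharp head region exposes the signature genes) can be sketched as follows. This is only an approximation of the proposed method: the 30-gene planted signature, matrix sizes, and use of PC1 only are assumptions made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_genes, n_assays = 1000, 12
expr = rng.normal(size=(n_genes, n_assays))
expr[:30, 0] += 5.0            # planted signature: 30 genes high in assay 0

# PCA on the assay-by-gene matrix via SVD (genes are the variables).
X = expr.T                      # assays x genes
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1_loadings = Vt[0]            # one PC1 loading per gene

# Gene contribution = loading x expression level, for the assay of interest.
contrib = pc1_loadings * X[0]

# Sorting |contribution| in descending order exposes the signature genes
# as the sharp region at the head of the curve (the inflection point the
# abstract characterizes via curvature).
order = np.argsort(np.abs(contrib))[::-1]
top30 = order[:30]
```

With the planted signature, the 30 largest absolute contributions recover almost exactly the 30 spiked genes.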

  3. Evaluation of Three Automated Genome Annotations for Halorhabdus utahensis

    DEFF Research Database (Denmark)

    Bakke, Peter; Carney, Nick; DeLoache, Will

    2009-01-01

    in databases such as NCBI and used to validate subsequent annotation errors. We submitted the genome sequence of halophilic archaeon Halorhabdus utahensis to be analyzed by three genome annotation services. We have examined the output from each service in a variety of ways in order to compare the methodology...

  4. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The mission of credit institution supervision by state authorities is mostly assimilated with systemic risk prevention. At present, the mission is oriented towards analyzing the risk profile of the credit institutions and the existing mechanisms and systems that, as management tools, provide the proper instruments to avoid and control specific bank risks. Rating systems are sophisticated measurement instruments capable of assuring the above objectives, and thus success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations which can induce subjectivism and heterogeneity into the rating. The problem can be solved by using, complementarily, quantitative techniques such as DEA (Data Envelopment Analysis).

  5. [Evaluation of Wits appraisal with superimposition method].

    Science.gov (United States)

    Xu, T; Ahn, J; Baumrind, S

    1999-07-01

    To compare the conventional Wits appraisal with the superimposed Wits appraisal in evaluating the change in sagittal jaw relationship between pre- and post-orthodontic treatment. The sample consisted of pre- and post-treatment lateral head films from 48 cases. Computerized digitizing was used to obtain the cephalometric landmarks and to measure the conventional Wits value, the superimposed Wits value, and the ANB angle. Correlation analysis among these three measures was done with the SAS statistical package. The change in ANB angle has a higher correlation with the change in the superimposed Wits than with that in the conventional Wits; the r-value is as high as 0.849 (P < 0.001). The superimposed Wits appraisal reflects the change in sagittal jaw relationship more objectively than the conventional one.
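
A small sketch of the correlation comparison described above, run on simulated treatment changes; the 1.8 scaling and the two noise levels are illustrative assumptions standing in for the lower measurement error of the superimposed appraisal, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 48  # cases, matching the study's sample size

# Hypothetical changes (post minus pre): both Wits variants track the ANB
# change, but the superimposed Wits with less measurement noise.
anb_change = rng.normal(0.0, 1.5, n)
wits_superimposed = 1.8 * anb_change + rng.normal(0.0, 1.0, n)
wits_conventional = 1.8 * anb_change + rng.normal(0.0, 3.0, n)

r_super = np.corrcoef(anb_change, wits_superimposed)[0, 1]
r_conv = np.corrcoef(anb_change, wits_conventional)[0, 1]
```

Under these assumptions the superimposed measure shows the higher correlation with the ANB change, mirroring the study's r = 0.849 finding qualitatively.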

  6. Auditing as method of QA programme evaluation

    International Nuclear Information System (INIS)

    Wilhelm, H.

    1980-01-01

    The status and adequacy of a quality assurance programme should be regularly reviewed by the cognizant management. The programme audit is an independent review to determine the compliance with respective quality assurance requirements and to determine the effectiveness of that programme. This lecture gives an introduction of the method to perform audits under the following topics: 1. Definition and purpose of quality audits. 2. Organization of the quality audit function. 3. Unique requirements for auditors. 4. Audit preparation and planning. 5. Conduct of the audit. 6. Reporting the audit results. 7. Follow-up activities. (RW)

  7. CEPIAH, a method for Evaluation and Design of Pedagogical Hypermedia

    OpenAIRE

    Trigano , Philippe; Pacurar Giacomini , Ecaterina

    2004-01-01

    We are working on a method called CEPIAH. We propose a web-based system used to help teachers design multimedia documents and evaluate their prototypes. Our tool integrates two modules: EMPI (Evaluation of Multimedia Pedagogical and Interactive software) and SP/UL/FC, a method for designing pedagogical hypermedia. The EMPI module is used to evaluate multimedia software used in an educational context. We structured a knowledge base composed of a list of evaluation criteria, grouped...

  8. Long-term response to genomic selection: effects of estimation method and reference population structure for different genetic architectures.

    Science.gov (United States)

    Bastiaansen, John W M; Coster, Albart; Calus, Mario P L; van Arendonk, Johan A M; Bovenhuis, Henk

    2012-01-24

    Genomic selection has become an important tool in the genetic improvement of animals and plants. The objective of this study was to investigate the impacts of breeding value estimation method, reference population structure, and trait genetic architecture, on long-term response to genomic selection without updating marker effects. Three methods were used to estimate genomic breeding values: a BLUP method with relationships estimated from genome-wide markers (GBLUP), a Bayesian method (BM), and a partial least squares regression method (PLSR). A shallow (individuals from one generation) or deep reference population (individuals from five generations) was used with each method. The effects of the different selection approaches were compared under four different genetic architectures for the trait under selection. Selection was based on one of the three genomic breeding values, on pedigree BLUP breeding values, or performed at random. Selection continued for ten generations. Differences in long-term selection response were small. For a genetic architecture with a very small number of three to four quantitative trait loci (QTL), the Bayesian method achieved a response that was 0.05 to 0.1 genetic standard deviation higher than other methods in generation 10. For genetic architectures with approximately 30 to 300 QTL, PLSR (shallow reference) or GBLUP (deep reference) had an average advantage of 0.2 genetic standard deviation over the Bayesian method in generation 10. GBLUP resulted in 0.6% and 0.9% less inbreeding than PLSR and BM, respectively, and on average a one-third smaller reduction of genetic variance. Responses in early generations were greater with the shallow reference population while long-term response was not affected by reference population structure. The ranking of estimation methods differed with selection compared to without selection. Under selection, applying GBLUP led to lower inbreeding and a smaller reduction of genetic variance while a similar response to selection was
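
A compact sketch of GBLUP prediction with a marker-derived (VanRaden-style) genomic relationship matrix, on simulated data. The population size, QTL count, and heritability are assumptions, and this single-generation, in-sample fit is far simpler than the study's ten-generation selection scheme; it only illustrates the G-matrix mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 300, 1000
freq = rng.uniform(0.05, 0.5, p)
M = rng.binomial(2, freq, size=(n, p)).astype(float)

# VanRaden genomic relationship matrix from centered genotypes.
Z = M - 2 * freq
G = Z @ Z.T / (2 * np.sum(freq * (1 - freq)))

# Simulate a trait with h2 = 0.4 from 100 QTLs among the markers.
qtl = rng.choice(p, size=100, replace=False)
u = Z[:, qtl] @ rng.normal(size=100)
u = (u - u.mean()) / u.std()
h2 = 0.4
y = u + rng.normal(scale=np.sqrt((1 - h2) / h2), size=n)

# GBLUP: u_hat = G (G + lambda I)^{-1} (y - mean), lambda = (1 - h2) / h2.
lam = (1 - h2) / h2
u_hat = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())
acc = np.corrcoef(u_hat, u)[0, 1]
```

Replacing G with a pedigree-based relationship matrix in the same equations gives pedigree BLUP, which is what the study compares GBLUP against.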

  9. A new method used to evaluate organic working fluids

    International Nuclear Information System (INIS)

    Zhang, Xinxin; He, Maogang; Wang, Jingfu

    2014-01-01

    In this paper, we propose a method named “Weight Classification-Hasse Dominance” to evaluate organic working fluids. This new method combines the advantages of both the method of weight determination and the Hasse Diagram Technique (HDT). It can be used to evaluate the thermodynamic performance, environmental protection indicator, and safety requirement of organic working fluid simultaneously. This evaluation method can offer good reference for working fluid selection. Using this method, the organic working fluids which have been phased out and will be phased out by the Montreal Protocol including CFCs (chlorofluorocarbons), HCFCs (hydrochlorofluorocarbons), and HFCs (hydrofluorocarbons) were evaluated. Moreover, HCs (hydrocarbons) can be considered as a completely different kind of organic working fluid from CFCs, HCFCs, and HFCs according to the comparison based on this new evaluation method. - Highlights: • We propose a new method used to evaluate organic working fluids. • This evaluation method can offer good reference for working fluid selection. • CFC, HCFC, and HFC working fluids were evaluated using this evaluation method. • HC can be considered as a totally different working fluid from CFC, HCFC, and HFC

  10. Impact of QTL minor allele frequency on genomic evaluation using real genotype data and simulated phenotypes in Japanese Black cattle.

    Science.gov (United States)

    Uemoto, Yoshinobu; Sasaki, Shinji; Kojima, Takatoshi; Sugimoto, Yoshikazu; Watanabe, Toshio

    2015-11-19

    Genetic variance that is not captured by single nucleotide polymorphisms (SNPs) is due to imperfect linkage disequilibrium (LD) between SNPs and quantitative trait loci (QTLs), and the extent of LD between SNPs and QTLs depends on the difference in minor allele frequency (MAF) between them. To evaluate the impact of the MAF of QTLs on genomic evaluation, we performed a simulation study using real cattle genotype data. In total, 1368 Japanese Black cattle and 592,034 SNPs (Illumina BovineHD BeadChip) were used. We simulated phenotypes using real genotypes under different scenarios, varying the MAF categories, QTL heritability, number of QTLs, and distribution of QTL effects. After generating true breeding values and phenotypes, QTL heritability was estimated and the prediction accuracy of genomic estimated breeding values (GEBV) was assessed under different SNP densities, prediction models, and population sizes by a reference-test validation design. The extent of LD between SNPs and QTLs in this population was higher for QTLs with high MAF than for those with low MAF. The effect of the MAF of QTLs on genomic evaluation depended on the genetic architecture, evaluation strategy, and population size. In terms of genetic architecture, genomic evaluation was affected by the MAF of QTLs combined with the QTL heritability and the distribution of QTL effects; the number of QTLs did not affect genomic evaluation when there were more than 50 QTLs. In terms of evaluation strategy, we showed that different SNP densities and prediction models affect heritability estimation and genomic prediction, and that this depends on the MAF of QTLs. In addition, accurate QTL heritability and GEBV were obtained using denser SNP information and a prediction model that accounted for SNPs with both low and high MAFs. Regarding population size, a large sample size is needed to increase the accuracy of GEBV. The MAF of QTLs had an impact on heritability estimation and prediction accuracy. Most genetic variance can be captured
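
The dependence of SNP-QTL linkage disequilibrium on matching allele frequencies can be made concrete with the classical upper bound on r² when |D'| = 1. The frequencies below are illustrative, not from the study.

```python
def max_r2(p_qtl, p_snp):
    """Upper bound on r^2 between two biallelic loci given their allele
    frequencies, attained when D is at its bound (|D'| = 1); mismatched
    frequencies cap the LD a tag SNP can have with a QTL."""
    d_max = min(p_qtl * (1 - p_snp), (1 - p_qtl) * p_snp)
    return d_max ** 2 / (p_qtl * (1 - p_qtl) * p_snp * (1 - p_snp))

# A common tag SNP can capture a matched-MAF QTL perfectly, but only a
# fraction of the variance of a rare QTL.
r2_match = max_r2(0.30, 0.30)      # = 1.0
r2_mismatch = max_r2(0.05, 0.30)   # ~ 0.12
```

This is why, as the abstract reports, genetic variance at low-MAF QTLs is poorly captured by arrays dominated by common SNPs, and why denser panels that include low-MAF markers help.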

  11. Household batteries: Evaluation of collection methods

    Energy Technology Data Exchange (ETDEWEB)

    Seeberger, D.A.

    1992-12-31

    While it is difficult to prove that a specific material is causing contamination in a landfill, tests have been conducted at waste-to-energy facilities that indicate that household batteries contribute significant amounts of heavy metals to both air emissions and ash residue. Hennepin County, MN, used a dual approach for developing and implementing a special household battery collection. Alternative collection methods were examined; test collections were conducted. The second phase examined operating and disposal policy issues. This report describes the results of the grant project, moving from a broad examination of the construction and content of batteries, to a description of the pilot collection programs, and ending with a discussion of variables affecting the cost and operation of a comprehensive battery collection program. Three out-of-state companies (PA, NY) were found that accept spent batteries; difficulties in reclaiming household batteries are discussed.

  13. RS-SNP: a random-set method for genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Mukherjee Sayan

    2011-03-01

    Full Text Available Abstract Background The typical objective of genome-wide association (GWA) studies is to identify single-nucleotide polymorphisms (SNPs) and corresponding genes with the strongest evidence of association (the 'most-significant SNPs/genes' approach). Borrowing ideas from micro-array data analysis, we propose a new method, named RS-SNP, for detecting sets of genes enriched in SNPs moderately associated to the phenotype. RS-SNP assesses whether the number of significant SNPs, with p-value P ≤ α, belonging to a given SNP set is statistically significant. The rationale of the proposed method is that two kinds of null hypotheses are taken into account simultaneously. In the first null model the genotype and the phenotype are assumed to be independent random variables and the null distribution is the probability that the number of significant SNPs in the set is greater than observed by chance. The second null model assumes the number of significant SNPs in the set depends on the size of the set and not on the identity of the SNPs in it. Statistical significance is assessed using non-parametric permutation tests. Results We applied RS-SNP to the Crohn's disease (CD) data set collected by the Wellcome Trust Case Control Consortium (WTCCC) and compared the results with GENGEN, an approach recently proposed in the literature. The enrichment analysis using RS-SNP and the set of pathways contained in the MSigDB C2 CP pathway collection highlighted 86 pathways rich in SNPs weakly associated to CD. Of these, 47 were also indicated to be significant by GENGEN. Similar results were obtained using the MSigDB C5 pathway collection. Many of the pathways found to be enriched by RS-SNP have a well-known connection to CD and often with inflammatory diseases. Conclusions The proposed method is a valuable alternative to other techniques for enrichment analysis of SNP sets. It is well founded from a theoretical and statistical perspective. Moreover, the experimental comparison with GENGEN highlights that it is
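
The second null model described above (the count of significant SNPs in a set should depend only on the set's size) can be sketched as a simple permutation test on simulated p-values. The set sizes, alpha, and permutation count are assumptions; this is an illustration of the idea, not the published RS-SNP implementation.

```python
import numpy as np

def rs_snp_enrichment(pvals, set_idx, alpha=0.05, n_perm=2000, seed=0):
    """Permutation p-value for the second null model: is the number of
    significant SNPs in the set larger than in random sets of equal size?"""
    rng = np.random.default_rng(seed)
    observed = np.sum(pvals[set_idx] <= alpha)
    null = np.array([
        np.sum(pvals[rng.choice(len(pvals), size=len(set_idx),
                                replace=False)] <= alpha)
        for _ in range(n_perm)
    ])
    return (1 + np.sum(null >= observed)) / (1 + n_perm)

rng = np.random.default_rng(7)
pvals = rng.uniform(size=5000)
enriched = np.arange(50)
pvals[enriched] = rng.uniform(0, 0.05, size=50)  # moderately associated set

p_enriched = rs_snp_enrichment(pvals, enriched)
p_random = rs_snp_enrichment(pvals, np.arange(100, 150))
```

The enriched set yields a small permutation p-value while an arbitrary set of the same size does not, which is the behavior the method exploits.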

  14. Study on comprehensive evaluation methods for nuclear fuel cycle

    International Nuclear Information System (INIS)

    Arie, Kazuo

    1999-03-01

    This investigation of comprehensive evaluation methods for the nuclear fuel cycle was performed through an open-literature search. As a result, no proper comprehensive evaluation method was found that integrates the several factors to be considered into a single factor. In the evaluation of future advanced nuclear energy systems, it is required to evaluate from the viewpoints of both natural resources and the natural environment, in addition to other factors such as safety, economy, and proliferation resistance. It is recommended that clarification of the specific items or targets to be evaluated is the most important first step. Second, the methodology for the evaluation should be discussed. (author)

  15. Evaluation method of offensive odor. Shuki no hyoka hoho

    Energy Technology Data Exchange (ETDEWEB)

    Iwasaki, Y [The Tokyo Metropolitan Research Institute for Environmental Protection, Tokyo (Japan)

    1991-06-01

    As evaluation methods for offensive odor, two kinds of methods were outlined: a concentration measurement method for compounds emitting offensive odors and a sensory method based on the human olfactory organ. Regarding the former, the method for separately measuring the odors of twelve compounds controlled by the regulation act, such as ammonia and hydrogen sulfide, was outlined, as well as the method for measuring odors of compound groups such as total reduced sulfur and total hydrocarbon. Regarding the latter, the evaluation scale for essential properties of odor such as quality, intensity, acceptability and pervasiveness was discussed. As typical sensory techniques, the scentometer, syringe method, odorless chamber method and olfactometer were outlined, and the triangle odor bag method, widely used for evaluation at present, was described in detail; it was developed to reduce drawbacks of the syringe method such as adsorption of odorants on the syringe surface. 24 refs., 6 figs., 3 tabs.

  16. 4C-ker: A Method to Reproducibly Identify Genome-Wide Interactions Captured by 4C-Seq Experiments.

    Directory of Open Access Journals (Sweden)

    Ramya Raviram

    2016-03-01

    Full Text Available 4C-Seq has proven to be a powerful technique to identify genome-wide interactions with a single locus of interest (or "bait" that can be important for gene regulation. However, analysis of 4C-Seq data is complicated by the many biases inherent to the technique. An important consideration when dealing with 4C-Seq data is the differences in resolution of signal across the genome that result from differences in 3D distance separation from the bait. This leads to the highest signal in the region immediately surrounding the bait and increasingly lower signals in far-cis and trans. Another important aspect of 4C-Seq experiments is the resolution, which is greatly influenced by the choice of restriction enzyme and the frequency at which it can cut the genome. Thus, it is important that a 4C-Seq analysis method is flexible enough to analyze data generated using different enzymes and to identify interactions across the entire genome. Current methods for 4C-Seq analysis only identify interactions in regions near the bait or in regions located in far-cis and trans, but no method comprehensively analyzes 4C signals of different length scales. In addition, some methods also fail in experiments where chromatin fragments are generated using frequent cutter restriction enzymes. Here, we describe 4C-ker, a Hidden-Markov Model based pipeline that identifies regions throughout the genome that interact with the 4C bait locus. In addition, we incorporate methods for the identification of differential interactions in multiple 4C-seq datasets collected from different genotypes or experimental conditions. Adaptive window sizes are used to correct for differences in signal coverage in near-bait regions, far-cis and trans chromosomes. Using several datasets, we demonstrate that 4C-ker outperforms all existing 4C-Seq pipelines in its ability to reproducibly identify interaction domains at all genomic ranges with different resolution enzymes.

  17. A simple method for encapsulating single cells in alginate microspheres allows for direct PCR and whole genome amplification.

    Directory of Open Access Journals (Sweden)

    Saharnaz Bigdeli

    Full Text Available Microdroplets are an effective platform for segregating individual cells and amplifying DNA. However, a key challenge is to recover the contents of individual droplets for downstream analysis. This paper offers a method for embedding cells in alginate microspheres and performing multiple serial operations on the isolated cells. Rhodobacter sphaeroides cells were diluted in alginate polymer and sprayed into microdroplets using a fingertip aerosol sprayer. The encapsulated cells were lysed and subjected either to conventional PCR, or whole genome amplification using either multiple displacement amplification (MDA or a two-step PCR protocol. Microscopic examination after PCR showed that the lumen of the occupied microspheres contained fluorescently stained DNA product, but multiple displacement amplification with phi29 produced only a small number of polymerase colonies. The 2-step WGA protocol was successful in generating fluorescent material, and quantitative PCR from DNA extracted from aliquots of microspheres suggested that the copy number inside the microspheres was amplified up to 3 orders of magnitude. Microspheres containing fluorescent material were sorted by a dilution series and screened with a fluorescent plate reader to identify single microspheres. The DNA was extracted from individual isolates, re-amplified with full-length sequencing adapters, and then a single isolate was sequenced using the Illumina MiSeq platform. After filtering the reads, the only sequences that collectively matched a genome in the NCBI nucleotide database belonged to R. sphaeroides. This demonstrated that sequencing-ready DNA could be generated from the contents of a single microsphere without culturing. However, the 2-step WGA strategy showed limitations in terms of low genome coverage and an uneven frequency distribution of reads across the genome. 

  18. Investment Portfolio Formation Using Multi-criteria evaluation Method MULTIMOORA

    Directory of Open Access Journals (Sweden)

    Vilius Vaišvilas

    2017-06-01

    Full Text Available The information that investors must analyze is complex and can be interpreted differently by different people, which is why choosing what to add to an investment portfolio is a complicated task. The complexity grows substantially when there are more alternatives to choose from. Multi-criteria evaluation methods can be used to choose the best alternatives. The multi-criteria evaluation method MULTIMOORA is not subjective because there is no need to decide the weight of any evaluated variable. MULTIMOORA consists of the formation of a ratio system, the application of the multi-criteria evaluation method, and investment evaluation and ranking. The purpose of this article is to apply the multi-criteria evaluation method MULTIMOORA to the formation and management of an investment portfolio from stocks of Baltic stock market companies. Methods used in the analysis for the article: analysis of scientific literature, statistical analysis, organization and comparison of data, idealization, and MULTIMOORA calculations.
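The ratio-system step of MULTIMOORA can be sketched as follows: each criterion column is normalized by its Euclidean norm, and an alternative's score is the sum of its normalized benefit criteria minus its normalized cost criteria. The sketch below covers only the ratio system, not the full MULTIMOORA aggregation, and the stock figures are invented for illustration.

```python
import math

def moora_ratio_system(matrix, benefit):
    """MOORA ratio system: rows are alternatives, columns are criteria;
    benefit[j] is True if a larger value of criterion j is better."""
    ncols = len(matrix[0])
    # Euclidean norm of each criterion column
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    scores = []
    for row in matrix:
        s = 0.0
        for j, x in enumerate(row):
            r = x / norms[j]          # dimensionless ratio
            s += r if benefit[j] else -r
        scores.append(s)
    return scores

# toy example: 3 stocks scored on return (benefit) and volatility (cost)
matrix = [[0.12, 0.30],
          [0.10, 0.10],
          [0.05, 0.20]]
scores = moora_ratio_system(matrix, [True, False])
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

Here the second stock ranks first: its return is nearly as high as the first stock's, at a third of the volatility.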

  19. Evaluation of splenic autotransplants by radionuclide methods

    International Nuclear Information System (INIS)

    Nawaz, K.; Nema, T.A.; Al-Mohannadi, S.; Abdel-Dayem, H.M.

    1986-01-01

    The viability of omental autotransplantation of splenic tissue after splenectomy has been disputed. The authors followed up splenic implants by imaging with either Tc-99m tin colloid or heat-damaged RBCs to determine how early implants can be visualized and whether a difference exists between patients who underwent emergency splenectomy for trauma (nine patients) and those who underwent elective splenectomy (seven patients). In the latter group, splenectomy was performed for portal hypertension in six patients and for a hematologic disorder (Wiskott-Aldrich syndrome) in one. All patients were imaged 2-4 weeks and 6 months after surgery. In the first group, seven implants were seen at 2-4 weeks and all nine were seen by 6 months. In the second group, only two implants were seen at 2-4 weeks and four were seen at 6 months; two implants were not visualized even at 6 months. The implant of the patient with the hematologic disorder was not seen before 6 months. The authors conclude that splenic implants can be visualized by scintigraphic methods as early as 2-4 weeks after surgery, and that by 6 months all implants from normal spleen are viable. By contrast, spleen implants placed for portal hypertension or hematologic disorders may fail.

  20. An efficient genotyping method for genome-modified animals and human cells generated with CRISPR/Cas9 system.

    Science.gov (United States)

    Zhu, Xiaoxiao; Xu, Yajie; Yu, Shanshan; Lu, Lu; Ding, Mingqin; Cheng, Jing; Song, Guoxu; Gao, Xing; Yao, Liangming; Fan, Dongdong; Meng, Shu; Zhang, Xuewen; Hu, Shengdi; Tian, Yong

    2014-09-19

    The rapid generation of various species and strains of laboratory animals using CRISPR/Cas9 technology has dramatically accelerated the interrogation of gene function in vivo. So far, the dominant approach for genotyping of genome-modified animals has been the T7E1 endonuclease cleavage assay. Here, we present a polyacrylamide gel electrophoresis (PAGE)-based method to genotype mice harboring different types of indel mutations. We developed 6 strains of genome-modified mice using the CRISPR/Cas9 system, and utilized this approach to genotype mice from the F0 to F2 generations, which included single and multiplexed genome-modified mice. We also determined the maximal detection sensitivity of the PAGE-based assay for mosaic DNA to be 0.5%. We further applied the PAGE-based genotyping approach to detect CRISPR/Cas9-mediated on- and off-target effects in human 293T and induced pluripotent stem cells (iPSCs). Thus, the PAGE-based genotyping approach meets the rapidly increasing demand for genotyping of the fast-growing number of genome-modified animals and human cell lines created using the CRISPR/Cas9 system or other nuclease systems such as TALEN or ZFN.

  1. Systematic drug safety evaluation based on public genomic expression (Connectivity Map) data: myocardial and infectious adverse reactions as application cases.

    Science.gov (United States)

    Wang, Kejian; Weng, Zuquan; Sun, Liya; Sun, Jiazhi; Zhou, Shu-Feng; He, Lin

    2015-02-13

    Adverse drug reaction (ADR) is of great importance to both regulatory agencies and the pharmaceutical industry. Various techniques, such as quantitative structure-activity relationship (QSAR) and animal toxicology, are widely used to identify potential risks during the preclinical stage of drug development. Despite these efforts, drugs with safety liabilities can still pass through safety checkpoints and enter the market. This situation raises the concern that conventional chemical structure analysis and phenotypic screening are not sufficient to avoid all clinical adverse events. Genomic expression data following in vitro drug treatments characterize drug actions and thus have become widely used in drug repositioning. In the present study, we explored prediction of ADRs based on the drug-induced gene-expression profiles from cultured human cells in the Connectivity Map (CMap) database. The results showed that drugs inducing comparable ADRs generally lead to similar CMap expression profiles. Based on such ADR-gene expression association, we established prediction models for various ADRs, including severe myocardial and infectious events. Drugs with FDA boxed warnings of safety liability were effectively identified. We therefore suggest that drug-induced gene expression change, in combination with effective computational methods, may provide a new dimension of information to facilitate systematic drug safety evaluation. Copyright © 2015 Elsevier Inc. All rights reserved.
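The premise that drugs inducing comparable ADRs yield similar expression profiles can be illustrated with a signature-similarity sketch. The fold-change vectors and drug names below are invented toy values, not CMap data, and cosine similarity is one common choice of profile-similarity measure, not necessarily the one used in the study.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two gene-expression signatures
    (vectors of per-gene fold changes)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# toy signatures over 4 genes; drug_c mimics drug_a, drug_b is roughly opposite
drug_a = [2.0, -1.5, 0.5, 1.0]
drug_b = [-1.8, 1.2, -0.3, -1.1]
drug_c = [1.9, -1.4, 0.6, 0.9]

sim_ac = cosine_similarity(drug_a, drug_c)
sim_ab = cosine_similarity(drug_a, drug_b)
```

Under the ADR-gene expression association described above, a drug whose profile is close to known cardiotoxic drugs (high similarity) would be flagged for the same risk class.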

  2. Accounting for discovery bias in genomic prediction

    Science.gov (United States)

    Our objective was to evaluate an approach to mitigating discovery bias in genomic prediction. Accuracy may be improved by placing greater emphasis on regions of the genome expected to be more influential on a trait. Methods emphasizing regions result in a phenomenon known as “discovery bias” if info...

  3. Prevalence of Evaluation Method Courses in Education Leader Doctoral Preparation

    Science.gov (United States)

    Shepperson, Tara L.

    2013-01-01

    This exploratory study investigated the prevalence of single evaluation methods courses in doctoral education leadership programs. Analysis of websites of 132 leading U.S. university programs found 62 evaluation methods courses in 54 programs. Content analysis of 49 course catalog descriptions resulted in five categories: survey, planning and…

  4. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. An evaluation method for equipment reliability configuration management determines the reliability management capabilities of an equipment development company. Reliability is achieved not only through design but also through management. This paper evaluates reliability management capabilities using a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  5. Bridging the Gulf: Mixed Methods and Library Service Evaluation

    Science.gov (United States)

    Haynes, Abby

    2004-01-01

    This paper explores library evaluation in Australia and proposes a return to research fundamentals in which evaluators are asked to consider the centrality of philosophical issues and the role of different research methods. A critique of current evaluation examples demonstrates a system-centred, quantitative, input/output focus which fails to…

  6. An evaluation of multiple annealing and looping based genome amplification using a synthetic bacterial community

    KAUST Repository

    Wang, Yong; Gao, Zhaoming; Xu, Ying; Li, Guangyu; He, Lisheng; Qian, Peiyuan

    2016-01-01

    -generation-sequencing technology. Using a synthetic bacterial community, we evaluated the amplification efficiency of the Multiple Annealing and Looping Based Amplification Cycles (MALBAC) kit, which was originally developed to amplify the single-cell genomic DNA of mammalian organisms.

  7. Assessing Student Understanding of the "New Biology": Development and Evaluation of a Criterion-Referenced Genomics and Bioinformatics Assessment

    Science.gov (United States)

    Campbell, Chad Edward

    Over the past decade, hundreds of studies have introduced genomics and bioinformatics (GB) curricula and laboratory activities at the undergraduate level. While these publications have facilitated the teaching and learning of cutting-edge content, there has yet to be an evaluation of these assessment tools to determine if they are meeting the quality control benchmarks set forth by the educational research community. An analysis of these assessment tools indicated that they do not support valid and reliable inferences about student learning. To remedy this situation, the development of a robust GB assessment aligned with the quality control benchmarks was undertaken in order to ensure evidence-based evaluation of student learning outcomes. Content validity is a central piece of construct validity, and it must be used to guide instrument and item development. This study reports on: (1) the correspondence of content validity evidence gathered from independent sources; (2) the process of item development using this evidence; (3) the results from a pilot administration of the assessment; (4) the subsequent modification of the assessment based on the pilot administration results; and (5) the results from the second administration of the assessment. Twenty-nine subtopics within GB (Appendix B: Genomics and Bioinformatics Expert Survey) were developed based on preliminary GB textbook analyses. These subtopics were analyzed using two methods designed to gather content validity evidence: (1) a survey of GB experts (n=61) and (2) detailed content analyses of GB textbooks (n=6). By including only the subtopics that were shown to have robust support across these sources, 22 GB subtopics were established for inclusion in the assessment. An expert panel subsequently developed, evaluated, and revised two multiple-choice items to align with each of the 22 subtopics, producing a final item pool of 44 items. These items were piloted with student samples of varying content exposure levels.

  8. EVALUATION METHODS USED FOR TANGIBLE ASSETS BY ECONOMIC ENTITIES

    Directory of Open Access Journals (Sweden)

    Csongor CSŐSZ

    2014-06-01

    Full Text Available At many entities the net asset value is influenced by the evaluation methods applied to tangible assets, because the value of intangible assets and financial assets is small in most cases. The objective of this paper is to analyze the differences between the evaluation procedures and methods applied to tangible assets by micro and small entities on the one hand and medium and large entities on the other in Romania and Hungary. Furthermore, we analyze the differences between micro and small entities in Romania and Hungary, and likewise between medium and large entities in the two countries, regarding evaluation methods for tangible assets. For this empirical study a questionnaire was used as the research technique, and to test for significant differences between the evaluation methods we used the Kolmogorov-Smirnov Z test.
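The two-sample Kolmogorov-Smirnov statistic behind the test mentioned above is simply the maximum gap between the two empirical cumulative distribution functions. The sketch below computes it directly in pure Python; the toy response lists stand in for questionnaire data from the two entity groups.

```python
def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of samples x and y."""
    xs, ys = sorted(x), sorted(y)
    d = 0.0
    for v in sorted(set(xs + ys)):          # CDFs only change at sample points
        cdf_x = sum(1 for t in xs if t <= v) / len(xs)
        cdf_y = sum(1 for t in ys if t <= v) / len(ys)
        d = max(d, abs(cdf_x - cdf_y))
    return d

# toy questionnaire scores for two groups of entities
micro_small = [2, 3, 3, 4]
medium_large = [4, 5, 6, 6]
d = ks_statistic(micro_small, medium_large)
```

Identical samples give a statistic of 0 and fully separated samples give 1; the significance of an observed value is then judged against the KS distribution (the "Z" form scales the statistic by the sample sizes).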

  9. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and the prior conditions of each study is an important approach for researchers.

  10. New approach to equipment quality evaluation method with distinct functions

    Directory of Open Access Journals (Sweden)

    Milisavljević Vladimir M.

    2016-01-01

    Full Text Available The paper presents a new approach for improving a method for the quality evaluation and selection of equipment (devices and machinery) by applying distinct functions. Quality evaluation and selection of devices and machinery is a multi-criteria problem which involves the consideration of numerous parameters of various origins. The original selection method with distinct functions is based on technical parameters with arbitrary evaluation of each parameter's importance (weighting). The improvement of this method presented in this paper addresses the weighting of parameters by using the Delphi method. Finally, two case studies are provided, covering the quality evaluation of standard heating boilers and of load-haul-dump (LHD) machines, to demonstrate the applicability of this approach. The Analytic Hierarchy Process (AHP) is used as a control method.

  11. Demographically-Based Evaluation of Genomic Regions under Selection in Domestic Dogs.

    Directory of Open Access Journals (Sweden)

    Adam H Freedman

    2016-03-01

    Full Text Available Controlling for background demographic effects is important for accurately identifying loci that have recently undergone positive selection. To date, the effects of demography have not been explicitly considered when identifying loci under selection during dog domestication. To investigate positive selection on the dog lineage early in domestication, we examined patterns of polymorphism in six canid genomes that were previously used to infer a demographic model of dog domestication. Using the inferred demographic model, we computed false discovery rates (FDRs) and identified 349 outlier regions consistent with positive selection at a low FDR. The signals in the top 100 regions were frequently centered on candidate genes related to brain function and behavior, including LHFPL3, CADM2, GRIK3, SH3GL2, MBP, PDE7B, NTAN1, and GLRA1. These regions contained significant enrichments in behavioral ontology categories. The third top hit, CCRN4L, plays a major role in lipid metabolism, a role supported by additional metabolism-related candidates revealed in our scan, including SCP2D1 and PDXC1. Comparing our method to an empirical outlier approach that does not directly account for demography, we found only modest overlap between the two methods, with 60% of empirical outliers having no overlap with our demography-based outlier detection approach; demography-aware approaches have lower rates of false discovery. Our top candidates for selection, in addition to expanding the set of neurobehavioral candidate genes, include genes related to lipid metabolism, suggesting a dietary target of selection that was important during the period when proto-dogs hunted and fed alongside hunter-gatherers.
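The demography-aware idea can be sketched as an empirical FDR: compare how many regions exceed a selection-statistic cutoff in the observed scan versus in scans simulated under the inferred demographic model. The function and toy statistics below are illustrative, not the study's pipeline or data.

```python
def empirical_fdr(observed, null_sims, threshold):
    """Empirical FDR at a cutoff: the mean number of regions exceeding the
    cutoff in null (demographically simulated) scans, divided by the
    number of observed regions exceeding it."""
    n_obs = sum(1 for s in observed if s >= threshold)
    if n_obs == 0:
        return 0.0
    mean_null = sum(
        sum(1 for s in sim if s >= threshold) for sim in null_sims
    ) / len(null_sims)
    return min(1.0, mean_null / n_obs)

# toy per-region selection statistics: one observed scan, two null simulations
observed = [5.0, 4.2, 3.9, 1.0, 0.5]
null_sims = [[1.2, 0.8, 0.3, 0.2, 0.1],
             [0.9, 0.7, 0.4, 0.3, 0.2]]
fdr_strict = empirical_fdr(observed, null_sims, 3.0)
```

A cutoff the null scans never reach yields an FDR of zero, while a permissive cutoff admits many null exceedances and inflates the FDR; outlier regions are then reported at a cutoff where the FDR is acceptably low.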

  12. Restriction site extension PCR: a novel method for high-throughput characterization of tagged DNA fragments and genome walking.

    Directory of Open Access Journals (Sweden)

    Jiabing Ji

    Full Text Available BACKGROUND: Insertion mutant isolation and characterization are extremely valuable for linking genes to physiological function. Once an insertion mutant phenotype is identified, the challenge is to isolate the responsible gene. Multiple strategies have been employed to isolate the unknown genomic DNA that flanks mutagenic insertions; however, all these methods suffer from limitations due to inefficient ligation steps, the inclusion of restriction sites within the target DNA, and non-specific product generation. These limitations become close to insurmountable when the goal is to identify insertion sites in a high-throughput manner. METHODOLOGY/PRINCIPAL FINDINGS: We designed a novel strategy called Restriction Site Extension PCR (RSE-PCR) to efficiently conduct large-scale isolation of unknown genomic DNA fragments linked to DNA insertions. The strategy is a modified adaptor-mediated PCR without ligation. An adapter, with complementarity to the 3' overhang of DNA fragments restricted with an endonuclease (KpnI, NsiI, PstI, or SacI), extends the 3' end of the DNA fragments in the first cycle of the primary RSE-PCR. During subsequent PCR cycles and a second semi-nested PCR (secondary RSE-PCR), touchdown and two-step PCR are combined to increase the amplification specificity of target fragments. The efficiency and specificity were demonstrated in our characterization of 37 tex mutants of Arabidopsis. All the steps of RSE-PCR can be executed in a 96-well PCR plate. Finally, RSE-PCR serves as a successful alternative to Genome Walker, as demonstrated by gene isolation from maize, a plant with a more complex genome than Arabidopsis. CONCLUSIONS/SIGNIFICANCE: RSE-PCR has high potential application in identifying tagged (T-DNA or transposon) sequence or walking from known DNA toward unknown regions in large-genome plants, with likely application in other organisms as well.

  13. Evaluation of SNP Data from the Malus Infinium Array Identifies Challenges for Genetic Analysis of Complex Genomes of Polyploid Origin.

    Directory of Open Access Journals (Sweden)

    Michela Troggio

    Full Text Available High-throughput arrays for the simultaneous genotyping of thousands of single-nucleotide polymorphisms (SNPs) have made the rapid genetic characterisation of plant genomes and the development of saturated linkage maps a realistic prospect for many plant species of agronomic importance. However, the correct calling of SNP genotypes in divergent polyploid genomes using array technology can be problematic due to paralogy, and to divergence in probe sequences causing changes in probe binding efficiencies. An Illumina Infinium II whole-genome genotyping array was recently developed for the cultivated apple and used to develop a molecular linkage map for an apple rootstock progeny (M432), but a large proportion of segregating SNPs were not mapped in the progeny due to unexpected genotype clustering patterns. To investigate the causes of this unexpected clustering, we performed BLAST analysis of all probe sequences against the 'Golden Delicious' genome sequence and discovered evidence for paralogous annealing sites and probe sequence divergence for a high proportion of the probes contained on the array. Following visual re-evaluation of the genotyping data generated for 8,788 SNPs for the M432 progeny, we manually re-scored genotypes at 818 loci and mapped a further 797 markers to the M432 linkage map. The newly mapped markers included the majority of those that could not be mapped previously, as well as loci that were previously scored as monomorphic but which segregated due to divergence leading to heterozygosity in probe annealing sites. An evaluation of the 8,788 probes in a diverse collection of Malus germplasm showed that more than half the probes returned genotype clustering patterns that were difficult or impossible to interpret reliably, highlighting implications for the use of the array in genome-wide association studies.

  14. A Unified and Comprehensible View of Parametric and Kernel Methods for Genomic Prediction with Application to Rice.

    Science.gov (United States)

    Jacquin, Laval; Cao, Tuong-Vi; Ahmadi, Nourollah

    2016-01-01

    One objective of this study was to provide readers with a clear and unified understanding of parametric statistical and kernel methods used for genomic prediction, and to compare some of these in the context of rice breeding for quantitative traits. A further objective was to provide a simple and user-friendly R package, named KRMM, which allows users to perform RKHS regression with several kernels. After introducing the concept of regularized empirical risk minimization, the connections between well-known parametric and kernel methods such as ridge regression [i.e., the genomic best linear unbiased predictor (GBLUP)] and reproducing kernel Hilbert space (RKHS) regression were reviewed. Ridge regression was then reformulated so as to show and emphasize the advantage of the kernel "trick", exploited by kernel methods in the context of epistatic genetic architectures, over the parametric frameworks used by conventional methods. Some parametric and kernel methods, namely the least absolute shrinkage and selection operator (LASSO), GBLUP, support vector machine regression (SVR) and RKHS regression, were thereupon compared for their genomic predictive ability in the context of rice breeding using three real data sets. Among the compared methods, RKHS regression and SVR were often the most accurate for prediction, followed by GBLUP and LASSO. An R function which allows users to perform RR-BLUP of marker effects, GBLUP and RKHS regression, with a Gaussian, Laplacian, polynomial or ANOVA kernel, in a reasonable computation time has been developed. Moreover, a modified version of this function, which allows users to tune kernels for RKHS regression, has also been developed and parallelized for HPC Linux clusters. The corresponding KRMM package and all scripts have been made publicly available.
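As a rough illustration of the kernel formulation reviewed above, the sketch below fits kernel ridge regression in closed form; with the marker-based linear kernel K = XX'/m, the fitted values coincide with a GBLUP-style fit. This is written in Python/NumPy rather than the authors' R package KRMM, and the simulated marker data are illustrative only.

```python
import numpy as np

def kernel_ridge_fit_predict(K_train, y, K_test_train, lam=1.0):
    """Kernel ridge regression via the kernel trick:
    alpha = (K + lam*I)^-1 y, predictions = K_* @ alpha.
    Only kernel evaluations are needed, never explicit marker effects."""
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + lam * np.eye(n), y)
    return K_test_train @ alpha

# simulate 50 lines genotyped at 200 markers with additive effects
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
beta = rng.standard_normal(200) * 0.1
y = X @ beta + rng.standard_normal(50) * 0.1

K = X @ X.T / X.shape[1]     # linear (genomic-relationship-style) kernel
yhat = kernel_ridge_fit_predict(K, y, K, lam=0.1)
```

Swapping K for a Gaussian or ANOVA kernel changes only the kernel matrix, not the solver, which is the practical advantage of the kernel trick for capturing epistatic (non-additive) architectures.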

  15. Omni-PolyA: a method and tool for accurate recognition of Poly(A) signals in human genomic DNA

    KAUST Repository

    Magana-Mora, Arturo

    2017-08-15

    Background: Polyadenylation is a critical stage of RNA processing during the formation of mature mRNA, and is present in most of the known eukaryote protein-coding transcripts and many long non-coding RNAs. The correct identification of poly(A) signals (PAS) not only helps to elucidate the 3′-end genomic boundaries of a transcribed DNA region and gene regulatory mechanisms but also gives insight into the multiple transcript isoforms resulting from alternative PAS. Although progress has been made in the in-silico prediction of genomic signals, the recognition of PAS in DNA genomic sequences remains a challenge. Results: In this study, we analyzed human genomic DNA sequences for the 12 most common PAS variants. Our analysis has identified a set of features that helps in the recognition of true PAS, which may be involved in the regulation of the polyadenylation process. The proposed features, in combination with a recognition model, resulted in a novel method and tool, Omni-PolyA. Omni-PolyA combines several machine learning techniques such as different classifiers in a tree-like decision structure and genetic algorithms for deriving a robust classification model. We performed a comparison between results obtained by state-of-the-art methods, deep neural networks, and Omni-PolyA. Results show that Omni-PolyA significantly reduced the average classification error rate by 35.37% in the prediction of the 12 considered PAS variants relative to the state-of-the-art results. Conclusions: The results of our study demonstrate that Omni-PolyA is currently the most accurate model for the prediction of PAS in human and can serve as a useful complement to other PAS recognition methods. Omni-PolyA is publicly available as an online tool accessible at www.cbrc.kaust.edu.sa/omnipolya/.
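Independent of the classifier, the first step any PAS recognizer shares is locating candidate hexamers in genomic sequence. The sketch below scans for common PAS variants; the variant list is an assumption drawn from the polyadenylation literature (it may differ from the 12 used by Omni-PolyA), and find_pas_candidates is an illustrative helper, not part of the tool.

```python
# Commonly cited human PAS hexamer variants (assumed list, for illustration)
PAS_VARIANTS = ["AATAAA", "ATTAAA", "TATAAA", "AGTAAA", "AAGAAA", "AATATA",
                "AATACA", "CATAAA", "GATAAA", "AATGAA", "TTTAAA", "ACTAAA"]

def find_pas_candidates(seq):
    """Scan a DNA sequence for candidate PAS hexamers. Every match is only
    a candidate; a recognition model then accepts or rejects it based on
    features of the surrounding sequence."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - 5):
        hexamer = seq[i:i + 6]
        if hexamer in PAS_VARIANTS:
            hits.append((i, hexamer))
    return hits

hits = find_pas_candidates("ccgtAATAAAggctgattaaac")
```

The hard part, which motivates tools like Omni-PolyA, is that genomic DNA is full of such hexamers by chance, so the candidate list must be filtered by a trained classifier.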

  16. A method for evaluating discoverability and navigability of recommendation algorithms.

    Science.gov (United States)

    Lamprecht, Daniel; Strohmaier, Markus; Helic, Denis

    2017-01-01

    Recommendations are increasingly used to support and enable discovery, browsing, and exploration of items. This is especially true for entertainment platforms such as Netflix or YouTube, where frequently, no clear categorization of items exists. Yet, the suitability of a recommendation algorithm to support these use cases cannot be comprehensively evaluated by any recommendation evaluation measures proposed so far. In this paper, we propose a method to expand the repertoire of existing recommendation evaluation techniques with a method to evaluate the discoverability and navigability of recommendation algorithms. The proposed method tackles this by means of first evaluating the discoverability of recommendation algorithms by investigating structural properties of the resulting recommender systems in terms of bow tie structure, and path lengths. Second, the method evaluates navigability by simulating three different models of information seeking scenarios and measuring the success rates. We show the feasibility of our method by applying it to four non-personalized recommendation algorithms on three data sets and also illustrate its applicability to personalized algorithms. Our work expands the arsenal of evaluation techniques for recommendation algorithms, extends from a one-click-based evaluation towards multi-click analysis, and presents a general, comprehensive method to evaluating navigability of arbitrary recommendation algorithms.
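One way to picture the navigability evaluation described above: treat recommendations as a directed graph and measure how many target items a user could reach from a start item within a click budget. The breadth-first search below gives an upper bound on the success rate of any information-seeking model; the tiny graph is invented, and this is a sketch of the idea rather than the paper's simulation models.

```python
from collections import deque

def navigation_success_rate(recs, targets, start, max_clicks):
    """Fraction of target items reachable from `start` within `max_clicks`
    hops when only recommendation links are followed (BFS distances)."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if dist[node] == max_clicks:
            continue                      # click budget exhausted on this path
        for nxt in recs.get(node, []):
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    reached = sum(1 for t in targets if dist.get(t, max_clicks + 1) <= max_clicks)
    return reached / len(targets)

# toy recommendation graph: each item links to its recommended items
recs = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
rate = navigation_success_rate(recs, ["b", "c", "d"], "a", max_clicks=1)
```

With a one-click budget only "b" and "c" are reachable from "a", so the success rate is 2/3; raising the budget to 2 makes all three targets reachable, which is exactly the kind of multi-click behavior the proposed method quantifies.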

  17. A hybrid method for evaluating enterprise architecture implementation.

    Science.gov (United States)

    Nikpay, Fatemeh; Ahmad, Rodina; Yin Kia, Chiam

    2017-02-01

    Enterprise Architecture (EA) implementation evaluation provides a set of methods and practices for evaluating the EA implementation artefacts within an EA implementation project. There are insufficient practices in existing EA evaluation models in terms of considering all EA functions and processes, using structured methods in developing EA implementation, employing matured practices, and using appropriate metrics to achieve proper evaluation. The aim of this research is to develop a hybrid evaluation method that supports achieving the objectives of EA implementation. To attain this aim, the first step is to identify EA implementation evaluation practices. To this end, a Systematic Literature Review (SLR) was conducted. Second, the proposed hybrid method was developed based on the foundation and information extracted from the SLR, semi-structured interviews with EA practitioners, program theory evaluation and Information Systems (ISs) evaluation. Finally, the proposed method was validated by means of a case study and expert reviews. This research provides a suitable foundation for researchers who wish to extend and continue this research topic with further analysis and exploration, and for practitioners who would like to employ an effective and lightweight evaluation method for EA projects. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Evaluating the effectiveness of methods for capturing meetings

    OpenAIRE

    Hall, Mark John; Bermell-Garcia, Pablo; McMahon, Chris A.; Johansson, Anders; Gonzalez-Franco, Mar

    2015-01-01

    The purpose of this paper is to evaluate the effectiveness of commonly used methods to capture synchronous meetings for information and knowledge retrieval. Four methods of capture are evaluated in the form of a case study whereby a technical design meeting was captured by; (i) transcription; (ii) diagrammatic argumentation; (iii) meeting minutes; and (iv) video. The paper describes an experiment where participants undertook an information retrieval task and provided feedback on the methods. ...

  19. INTEGRATED SENSOR EVALUATION CIRCUIT AND METHOD FOR OPERATING SAID CIRCUIT

    OpenAIRE

    Krüger, Jens; Gausa, Dominik

    2015-01-01

    WO15090426A1 Sensor evaluation device and method for operating said device Integrated sensor evaluation circuit for evaluating a sensor signal (14) received from a sensor (12), having a first connection (28a) for connection to the sensor and a second connection (28b) for connection to the sensor. The integrated sensor evaluation circuit comprises a configuration data memory (16) for storing configuration data which describe signal properties of a plurality of sensor control signals (26a-c). T...

  20. Evaluation and Validation of Assembling Corrected PacBio Long Reads for Microbial Genome Completion via Hybrid Approaches.

    Science.gov (United States)

    Lin, Hsin-Hung; Liao, Yu-Chieh

    2015-01-01

    Despite the ever-increasing output of next-generation sequencing data along with developing assemblers, dozens to hundreds of gaps still exist in de novo microbial assemblies due to uneven coverage and large genomic repeats. Third-generation single-molecule, real-time (SMRT) sequencing technology avoids amplification artifacts and generates kilobase-long reads with the potential to complete microbial genome assembly. However, due to the low accuracy (~85%) of third-generation sequences, a considerable amount of long-read coverage (>50X) is required for self-correction and for subsequent de novo assembly. Recently developed hybrid approaches, using next-generation sequencing data and as few as 5X long reads, have been proposed to improve the completeness of microbial assembly. In this study we have evaluated the contemporary hybrid approaches and demonstrated that assembling corrected long reads (by runCA) produced the best assembly compared to long-read scaffolding (e.g., AHA, Cerulean and SSPACE-LongRead) and gap-filling (SPAdes). For generating corrected long reads, we further examined long-read correction tools, such as ECTools, LSC, LoRDEC, the PBcR pipeline and proovread. We have demonstrated that three microbial genomes, including Escherichia coli K12 MG1655, Meiothermus ruber DSM1279 and Pedobacter heparinus DSM2366, were successfully hybrid assembled by runCA into near-perfect assemblies using ECTools-corrected long reads. In addition, we developed a tool, Patch, which implements corrected long reads and pre-assembled contigs as inputs, to enhance microbial genome assemblies. With the additional 20X long reads, short reads of S. cerevisiae W303 were hybrid assembled into 115 contigs using the verified strategy, ECTools + runCA. Patch was subsequently applied to upgrade the assembly to a 35-contig draft genome. Our evaluation of the hybrid approaches shows that assembling the ECTools-corrected long reads via runCA generates near complete microbial genomes, suggesting

  1. Bacterial whole genome-based phylogeny: construction of a new benchmarking dataset and assessment of some existing methods.

    Science.gov (United States)

    Ahrenfeldt, Johanne; Skaarup, Carina; Hasman, Henrik; Pedersen, Anders Gorm; Aarestrup, Frank Møller; Lund, Ole

    2017-01-05

    Whole genome sequencing (WGS) is increasingly used in diagnostics and surveillance of infectious diseases. A major application for WGS is to use the data for identifying outbreak clusters, and there is therefore a need for methods that can accurately and efficiently infer phylogenies from sequencing reads. In the present study we describe a new dataset that we have created for the purpose of benchmarking such WGS-based methods for epidemiological data, and also present an analysis where we use the data to compare the performance of some current methods. Our aim was to create a benchmark data set that mimics sequencing data of the sort that might be collected during an outbreak of an infectious disease. This was achieved by letting an E. coli hypermutator strain grow in the lab for 8 consecutive days, each day splitting the culture in two while also collecting samples for sequencing. The result is a data set consisting of 101 whole genome sequences with known phylogenetic relationships. Among the sequenced samples, 51 correspond to internal nodes in the phylogeny because they are ancestral, while the remaining 50 correspond to leaves. We also used the newly created data set to compare three different online available methods that infer phylogenies from whole-genome sequencing reads: NDtree, CSI Phylogeny and REALPHY. One complication when comparing the output of these methods with the known phylogeny is that phylogenetic methods typically build trees where all observed sequences are placed as leaves, even though some of them are in fact ancestral. We therefore devised a method for post-processing the inferred trees by collapsing short branches (thus relocating some leaves to internal nodes), and also present two new measures of tree similarity that take into account the identity of both internal and leaf nodes. Based on this analysis we find that, among the investigated methods, CSI Phylogeny had the best performance, correctly identifying 73% of all branches in the
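
    The post-processing step described here (collapsing short branches so that some leaves are relocated to internal nodes) can be sketched on a toy tree encoding; the nested-dict representation and the threshold value are illustrative assumptions, not the authors' code.

```python
def collapse_short_branches(node, eps):
    """Recursively collapse child edges shorter than eps: the child's own
    children are promoted to the current node, so a sample hanging on a
    near-zero branch is effectively relocated to an internal node."""
    new_children = []
    for child, length in node.get("children", []):
        collapse_short_branches(child, eps)
        if length < eps:
            new_children.extend(child.get("children", []))  # promote grandchildren
            node.setdefault("merged", []).append(child["name"])
        else:
            new_children.append((child, length))
    node["children"] = new_children

# Toy tree: "anc" sits on a near-zero branch and should collapse into the root
tree = {
    "name": "root",
    "children": [
        ({"name": "anc", "children": [({"name": "leaf1", "children": []}, 0.4)]}, 0.001),
        ({"name": "leaf2", "children": []}, 0.3),
    ],
}
collapse_short_branches(tree, eps=0.01)
# "anc" is absorbed into the root; leaf1 and leaf2 remain as its children
```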

  2. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Sandra Strigūnaitė

    2013-01-01

    Full Text Available The purpose of the article is to present a method for virtual team performance evaluation based on intelligent analysis of team member collaboration data. The motivation for the research is the need for an evaluation method that approximates inherently ambiguous expert evaluations. The concept of the hierarchical fuzzy-rule-based method is to evaluate data from virtual team interaction instances related to the implementation of project tasks. The suggested method is designed for project managers or virtual team leaders to help in virtual teamwork evaluation based on captured data analysis. The core of the method is the ability to reproduce human thinking and the expert valuation process for data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows numerical values of evaluation criteria to be transformed into linguistic terms and used in constructing fuzzy rules. Using a fuzzy signature makes it possible to construct a hierarchical criteria structure, which helps to contain the exponential increase of fuzzy rules as more input variables are included. The suggested method is intended to be applied in virtual collaboration software as a real-time teamwork evaluation tool. The research shows that by applying fuzzy logic to team collaboration data analysis it is possible to obtain evaluations equal to expert insights. The method includes analysis of virtual team, project task and team collaboration data. The advantage of the suggested method is the possibility to use variables gained from virtual collaboration systems as fuzzy rule inputs. The evidence gathered on fuzzy-logic-based evaluation of virtual teamwork collaboration can be investigated further in future work, and the method can be seen as the next step in the development of virtual collaboration software.
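
    A minimal sketch of the fuzzy-set machinery such a method relies on: triangular membership functions turn numeric criteria into linguistic terms, which are then aggregated into a score. The term boundaries, representative values, and criteria names below are invented for illustration; they are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(value):
    """Map a numeric criterion score in [0, 10] to linguistic terms."""
    return {
        "low": tri(value, -0.1, 0.0, 5.0),
        "medium": tri(value, 0.0, 5.0, 10.0),
        "high": tri(value, 5.0, 10.0, 10.1),
    }

TERM_VALUE = {"low": 2.0, "medium": 5.0, "high": 8.0}  # representative scores

def evaluate(criteria):
    """Fuzzify each criterion, take the membership-weighted average of the
    representative term values, then average across criteria."""
    scores = []
    for value in criteria.values():
        memb = fuzzify(value)
        total = sum(memb.values())
        scores.append(sum(memb[t] * TERM_VALUE[t] for t in memb) / total)
    return sum(scores) / len(scores)

# Invented criteria gathered from a collaboration platform, scale 0-10
print(round(evaluate({"task_progress": 8.0, "messages": 6.0, "responsiveness": 9.0}), 2))
```

    A real fuzzy-signature system would arrange such criteria hierarchically rather than flat-averaging them, which is exactly the rule-explosion problem the abstract mentions.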

  3. A unified and comprehensible view of parametric and kernel methods for genomic prediction with application to rice

    Directory of Open Access Journals (Sweden)

    Laval Jacquin

    2016-08-01

    Full Text Available One objective of this study was to provide readers with a clear and unified understanding of parametric statistical and kernel methods, used for genomic prediction, and to compare some of these in the context of rice breeding for quantitative traits. Furthermore, another objective was to provide a simple and user-friendly R package, named KRMM, which allows users to perform RKHS regression with several kernels. After introducing the concept of regularized empirical risk minimization, the connections between well-known parametric and kernel methods such as Ridge regression (i.e. genomic best linear unbiased predictor (GBLUP)) and reproducing kernel Hilbert space (RKHS) regression were reviewed. Ridge regression was then reformulated so as to show and emphasize the advantage of the kernel trick concept, exploited by kernel methods in the context of epistatic genetic architectures, over parametric frameworks used by conventional methods. Several parametric and kernel methods (least absolute shrinkage and selection operator (LASSO), GBLUP, support vector machine regression (SVR) and RKHS regression) were thereupon compared for their genomic predictive ability in the context of rice breeding using three real data sets. Among the compared methods, RKHS regression and SVR were often the most accurate methods for prediction, followed by GBLUP and LASSO. An R function which allows users to perform RR-BLUP of marker effects, GBLUP and RKHS regression, with a Gaussian, Laplacian, polynomial or ANOVA kernel, in a reasonable computation time has been developed. Moreover, a modified version of this function, which allows users to tune kernels for RKHS regression, has also been developed and parallelized for HPC Linux clusters. The corresponding KRMM package and all scripts have been made publicly available.

  4. Evaluating methods to improve safeguards training courses of ISCN

    International Nuclear Information System (INIS)

    Okumura, Yukiko; Nakamura, Yo; Kawata, Norio

    2014-01-01

    Although questionnaires were used to collect feedback from participants at the end of each training course, the Integrated Support Center for Nuclear Nonproliferation and Nuclear Security (ISCN) of the Japan Atomic Energy Agency (JAEA) had not established a structured evaluation method. To this end, ISCN started studying methods to accurately evaluate its courses in April and introduced an evaluation method on trial, based on Donald Kirkpatrick's four-level training evaluation model, so as to develop and conduct more effective courses. This paper focuses on how ISCN modified Kirkpatrick's four levels to adapt them to its safeguards training courses. This is followed by two cases of how the evaluation method functioned for the Additional Protocol training courses held in Malaysia in 2014, and the feedback received to improve future training courses. (author)

  5. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated for their support of synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  6. A Preliminary Rubric Design to Evaluate Mixed Methods Research

    Science.gov (United States)

    Burrows, Timothy J.

    2013-01-01

    With the increase in the frequency of use of mixed methods, both in research publications and in externally funded grants, there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…

  7. Evaluation of binding energies by using quantum mechanical methods

    International Nuclear Information System (INIS)

    Postolache, Cristian; Matei, Lidia; Postolache, Carmen

    2002-01-01

    Evaluation of binding energies (BE) in molecular structures is needed for modelling chemical and radiochemical processes by quantum-chemical methods. An important field of application is the evaluation of the radiolysis and autoradiolysis stability of organic and inorganic compounds as well as macromolecular structures. Current methods of calculation do not allow direct determination of BEs but only of total binding energies (TBE) and enthalpies. BEs were therefore evaluated indirectly by determining the homolytic dissociation energies. The molecular structures were built and geometrically optimized by the molecular mechanics methods MM+ and AMBER. The energy minimizations were refined by semi-empirical methods. Depending on the chosen molecular structure, the CNDO, INDO, PM3 and AM1 methods were used. To reach a high confidence level, the minimizations were done for gradients lower than 10^-3 RMS. The differences between the TBEs of the fragments (or of the transition states) and those of the initial molecular structures were associated with the homolytic fragmentation energies and the BEs, respectively. In order to evaluate the method's accuracy and to establish its fields of application, the obtained BE values were compared with experimental data taken from the literature. To this end, 74 organic and inorganic compounds (alkanes, alkenes, alkynes, halogenated derivatives, alcohols, aldehydes, ketones, carboxylic acids, nitrogen and sulfur compounds, water, hydrogen peroxide, ammonia, hydrazine, etc.) were built, geometrically optimized by semi-empirical methods, and their BEs evaluated. (authors)
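
    The indirect evaluation described above (BE from differences of total binding energies) amounts to simple arithmetic once the TBEs are computed; a minimal sketch with invented TBE values, not data from the study:

```python
def homolytic_bond_energy(tbe_molecule, tbe_fragments):
    """BE estimated indirectly as the difference between the summed total
    binding energies (TBE) of the radical fragments and the TBE of the
    parent molecule; all values in the same units, TBEs negative."""
    return sum(tbe_fragments) - tbe_molecule

# Illustrative (invented) TBE values for CH4 -> CH3* + H*, in kcal/mol
print(homolytic_bond_energy(-420.0, [-305.0, -10.0]))  # 105.0
```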

  8. Evaluation of Residual Stresses using Ring Core Method

    Directory of Open Access Journals (Sweden)

    Holý S.

    2010-06-01

    Full Text Available A method for measuring residual stresses using the ring-core technique is described. Basic relations are given for residual stress measurement along the specimen depth, and a simplified method is described for estimating the average residual stress in the drilled layer when the principal stress directions are known. The estimation of the calculation coefficients using FEM is described, and the sensitivity of the method is compared with that of the hole-drilling method. The device for applying the method is described and an example experiment is introduced. The accuracy of the method is discussed, and the influence of strain gauge rosette misalignment on the evaluated residual stresses is assessed using FEM.

  9. Benchmark calculations for evaluation methods of gas volumetric leakage rate

    International Nuclear Information System (INIS)

    Asano, R.; Aritomi, M.; Matsuzaki, M.

    1998-01-01

    A containment function of radioactive material transport casks is essential for safe transportation to prevent the radioactive materials from being released into the environment. Regulations such as the IAEA standard determine the limit of radioactivity to be released. Since it is not practical for leakage tests to measure directly the radioactivity released from a package, gas volumetric leakage rates are proposed in the ANSI N14.5 and ISO standards. In our previous works, gas volumetric leakage rates for several kinds of gas from various leaks were measured, and two evaluation methods, 'a simple evaluation method' and 'a strict evaluation method', were proposed based on the results. The simple evaluation method considers the friction loss of laminar flow with an expansion effect. The strict evaluation method considers an exit loss in addition to the friction loss. In this study, four worked examples were completed for an assumed large spent fuel transport cask (Type B package) with a wet or dry cavity and at three transport conditions: normal transport with intact fuels or failed fuels, and an accident in transport. The standard leakage rates and criteria for two kinds of leak test were calculated for each example by each evaluation method. The following observations are made based upon the calculations and evaluations: the choked flow model of the ANSI method greatly overestimates the criteria for tests; the laminar flow models of both the ANSI and ISO methods slightly overestimate the criteria for tests; the above two results are within the design margin for ordinary transport conditions, and all methods are useful for the evaluation; for severe conditions such as failed fuel transportation, attention should be paid when applying the choked flow model of the ANSI method. (authors)

  10. Advanced evaluation method of SG TSP BEC hole blockage rate

    International Nuclear Information System (INIS)

    Izumida, Hiroyuki; Nagata, Yasuyuki; Harada, Yutaka; Murakami, Ryuji

    2003-01-01

    Despite control of the water chemistry of the SG secondary feed-water in PWR steam generators, the SG TSP BEC holes, which are the flow paths of the secondary water, often become clogged. In the past, trending of the BEC hole blockage rate was conducted by evaluating original ECT signals and by visual inspections. However, because the original ECT signals of deposits are diverse, it has become difficult to analyze them with the existing evaluation method. In this regard, we have developed a secondary-side visual inspection system, which enables high-accuracy evaluation of the BEC hole blockage rate, and a new ECT signal evaluation method. (author)

  11. An evaluation of multiple annealing and looping based genome amplification using a synthetic bacterial community

    KAUST Repository

    Wang, Yong

    2016-02-23

    The low biomass in environmental samples is a major challenge for microbial metagenomic studies. Amplification of genomic DNA is therefore frequently applied to meet the minimum DNA requirement for high-throughput next-generation sequencing. Using a synthetic bacterial community, we examined the amplification efficiency of the Multiple Annealing and Looping Based Amplification Cycles (MALBAC) kit, originally developed to amplify single-cell genomic DNA of mammalian organisms. A DNA template of 10 pg per reaction in the MALBAC amplification may generate enough DNA for Illumina sequencing. Using 10 pg and 100 pg templates per reaction set, the MALBAC kit shows a stable and homogeneous amplification, as indicated by the highly consistent coverage of the reads from the two amplified samples on the contigs assembled from the original unamplified sample. Although the GenomePlex whole genome amplification kit allows one to generate enough DNA using 100 pg of template per reaction, the minority species of the mixed bacterial community are not linearly amplified. For both kits, the GC-rich regions of the genomic DNA are not efficiently amplified, as suggested by the low coverage of the contigs with high GC content. These results support the high efficiency of the MALBAC kit for the amplification of environmental microbial DNA samples, while also raising concerns about its application to bacterial species with high GC content.

  12. Evaluation of different sources of DNA for use in genome wide studies and forensic application.

    Science.gov (United States)

    Al Safar, Habiba S; Abidi, Fatima H; Khazanehdari, Kamal A; Dadour, Ian R; Tay, Guan K

    2011-02-01

    In the field of epidemiology, Genome-Wide Association Studies (GWAS) are commonly used to identify genetic predispositions to many human diseases. Large repositories housing biological specimens for clinical and genetic investigations have been established to store material and data for these studies. The logistics of specimen collection and sample storage can be onerous, and new strategies have to be explored. This study examines three different DNA sources (namely, degraded genomic DNA, amplified degraded genomic DNA, and amplified DNA extracted from FTA cards) for GWAS using the Illumina platform. No significant difference in call rate was detected between amplified degraded genomic DNA extracted from whole blood and amplified DNA retrieved from FTA™ cards. However, using unamplified degraded genomic DNA reduced the call rate to a mean of 42.6%, compared to amplified DNA extracted from FTA cards (mean of 96.6%). This study establishes the utility of FTA™ cards as a viable storage matrix for cells from which DNA can be extracted to perform GWAS analysis.
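
    The call rate compared in this study is simply the fraction of SNP genotyping assays that yield a call for a sample; a minimal sketch (the '--' no-call sentinel and the toy genotype vector are illustrative assumptions, not the study's data):

```python
def call_rate(genotypes, no_call="--"):
    """Fraction of SNP genotype calls that succeeded; failed calls are
    assumed to be recorded with a sentinel such as '--'."""
    called = sum(1 for g in genotypes if g != no_call)
    return called / len(genotypes)

# Toy genotype vector for one sample across 8 SNPs
sample = ["AA", "AG", "--", "GG", "AA", "--", "AG", "GG"]
print(call_rate(sample))  # 6 of 8 calls succeeded -> 0.75
```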

  13. An online credit evaluation method based on AHP and SPA

    Science.gov (United States)

    Xu, Yingtao; Zhang, Ying

    2009-07-01

    Online credit evaluation is the foundation for the establishment of trust and for the management of risk between buyers and sellers in e-commerce. In this paper, a new credit evaluation method based on the analytic hierarchy process (AHP) and set pair analysis (SPA) is presented to determine the credibility of electronic commerce participants. It addresses some of the drawbacks found in classical credit evaluation methods and broadens the scope of current approaches. Both qualitative and quantitative indicators are considered in the proposed method, and an overall credit score is then obtained from an optimization perspective. Finally, a case analysis of China Garment Network is provided for illustrative purposes.
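
    The AHP step of such a method derives indicator weights from a pairwise comparison matrix; the usual approximation of the principal eigenvector (normalize each column, then average across rows) can be sketched as follows. The comparison values and indicator names are hypothetical, and the SPA step is not shown.

```python
def ahp_weights(matrix):
    """Approximate the principal eigenvector of an AHP pairwise comparison
    matrix: normalize each column to sum to 1, then average across rows."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical pairwise judgements for three credit indicators, on the
# usual 1-9 AHP scale (matrix[i][j] = importance of i relative to j).
comparisons = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_weights(comparisons)
print([round(w, 3) for w in weights])  # approximately [0.633, 0.26, 0.106]
```

    The weights sum to one and preserve the judged ordering of the indicators; a full AHP analysis would also check the consistency ratio of the matrix.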

  14. Accuracy of Genomic Evaluations of Juvenile Growth Rate in Common Carp (Cyprinus carpio) Using Genotyping by Sequencing

    Directory of Open Access Journals (Sweden)

    Christos Palaiokostas

    2018-03-01

    Full Text Available Cyprinids are the most important group of farmed fish globally in terms of production volume, with common carp (Cyprinus carpio) being one of the most valuable species of the group. The use of modern selective breeding methods in carp is at a formative stage, implying a large scope for genetic improvement of key production traits. In the current study, a population of 1,425 carp juveniles, originating from a partial factorial cross between 40 sires and 20 dams, was used for investigating the potential of genomic selection (GS) for juvenile growth, an exemplar polygenic production trait. RAD sequencing was used to identify and genotype SNP markers for subsequent parentage assignment, construction of a medium density genetic map (12,311 SNPs), genome-wide association study (GWAS), and testing of GS. A moderate heritability of 0.33 (s.e. 0.05) was estimated for body length of carp at 120 days (as a proxy of juvenile growth). No genome-wide significant QTL was identified using a single marker GWAS approach. Genomic prediction of breeding values outperformed pedigree-based prediction, resulting in an 18% improvement in prediction accuracy. The impact of reduced SNP densities on prediction accuracy was tested by varying minor allele frequency (MAF) thresholds, with no drop in prediction accuracy until the MAF threshold is set <0.3 (2,744 SNPs). These results point to the potential for GS to improve economically important traits in common carp breeding programs.
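
    The MAF-threshold filtering used in the density-reduction experiment can be sketched as follows, assuming the common 0/1/2 allele-count genotype coding; the toy genotype matrix is invented, not the carp data.

```python
def minor_allele_freq(genotypes):
    """MAF from genotypes coded as 0/1/2 copies of the alternate allele."""
    p = sum(genotypes) / (2 * len(genotypes))  # alternate allele frequency
    return min(p, 1 - p)

def filter_by_maf(snp_matrix, threshold):
    """Indices of SNP columns whose MAF meets the threshold."""
    return [j for j in range(len(snp_matrix[0]))
            if minor_allele_freq([row[j] for row in snp_matrix]) >= threshold]

# Toy matrix: 4 individuals x 3 SNPs in 0/1/2 coding (invented)
geno = [
    [0, 2, 1],
    [1, 2, 1],
    [0, 2, 2],
    [0, 1, 1],
]
print(filter_by_maf(geno, 0.3))  # only the third SNP passes -> [2]
```

    Raising the threshold keeps fewer, more informative SNPs, which is the trade-off against prediction accuracy that the study quantifies.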

  15. Novel degenerate PCR method for whole genome amplification applied to Peru Margin (ODP Leg 201) subsurface samples

    Directory of Open Access Journals (Sweden)

    Amanda eMartino

    2012-01-01

    Full Text Available A degenerate PCR-based method of whole-genome amplification, designed to work fluidly with 454 sequencing technology, was developed and tested for use on deep marine subsurface DNA samples. The method, which we have called Random Amplification Metagenomic PCR (RAMP), involves the use of specific primers from Roche 454 amplicon sequencing, modified by the addition of a degenerate region at the 3’ end. It utilizes a PCR reaction which resulted in no amplification from blanks, even after 50 cycles of PCR. After efforts to optimize experimental conditions, the method was tested with DNA extracted from cultured E. coli cells, and genome coverage was estimated after sequencing on three different occasions. Coverage did not vary greatly with the different experimental conditions tested, and was around 62% with a sequencing effort equivalent to a theoretical genome coverage of 14.10X. The GC content of the sequenced amplification product was within 2% of the predicted values for this strain of E. coli. The method was also applied to DNA extracted from marine subsurface samples from ODP Leg 201 site 1229 (Peru Margin), and results of a taxonomic analysis revealed microbial communities dominated by Proteobacteria, Chloroflexi, Firmicutes, Euryarchaeota, and Crenarchaeota, among others. These results were similar to those obtained previously for those samples; however, variations in the proportions of taxa show that community analysis can be sensitive to both the amplification technique used and the method of assigning sequences to taxonomic groups. Overall, we find that RAMP represents a valid methodology for amplifying metagenomes from low-biomass samples.

  16. Novel Degenerate PCR Method for Whole-Genome Amplification Applied to Peru Margin (ODP Leg 201) Subsurface Samples

    Science.gov (United States)

    Martino, Amanda J.; Rhodes, Matthew E.; Biddle, Jennifer F.; Brandt, Leah D.; Tomsho, Lynn P.; House, Christopher H.

    2011-01-01

    A degenerate polymerase chain reaction (PCR)-based method of whole-genome amplification, designed to work fluidly with 454 sequencing technology, was developed and tested for use on deep marine subsurface DNA samples. While optimized here for use with Roche 454 technology, the general framework presented may be applicable to other next generation sequencing systems as well (e.g., Illumina, Ion Torrent). The method, which we have called random amplification metagenomic PCR (RAMP), involves the use of specific primers from Roche 454 amplicon sequencing, modified by the addition of a degenerate region at the 3′ end. It utilizes a PCR reaction, which resulted in no amplification from blanks, even after 50 cycles of PCR. After efforts to optimize experimental conditions, the method was tested with DNA extracted from cultured E. coli cells, and genome coverage was estimated after sequencing on three different occasions. Coverage did not vary greatly with the different experimental conditions tested, and was around 62% with a sequencing effort equivalent to a theoretical genome coverage of 14.10×. The GC content of the sequenced amplification product was within 2% of the predicted values for this strain of E. coli. The method was also applied to DNA extracted from marine subsurface samples from ODP Leg 201 site 1229 (Peru Margin), and results of a taxonomic analysis revealed microbial communities dominated by Proteobacteria, Chloroflexi, Firmicutes, Euryarchaeota, and Crenarchaeota, among others. These results were similar to those obtained previously for those samples; however, variations in the proportions of taxa identified illustrates well the generally accepted view that community analysis is sensitive to both the amplification technique used and the method of assigning sequences to taxonomic groups. Overall, we find that RAMP represents a valid methodology for amplifying metagenomes from low-biomass samples. PMID:22319519
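
    The GC-content check reported above (amplified product within 2% of the predicted value) is straightforward to reproduce in principle; a minimal sketch, with an invented example read rather than data from the study:

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence (case-insensitive)."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Invented toy read; E. coli K-12 genomes are roughly 50% GC overall
read = "ATGCGCGTATAGCCGGAT"
print(round(gc_content(read), 3))  # 10 of 18 bases are G/C -> 0.556
```

    Comparing the GC content of amplified reads against the known genome value is a quick sanity check for amplification bias of the kind discussed in records 11 and 15-16.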

  17. A Critical Review of Concepts and Methods Used in Classical Genome Analysis

    DEFF Research Database (Denmark)

    Seberg, Ole; Petersen, Gitte

    1998-01-01

    A short account of the development of classical genome analysis, the analysis of chromosome behaviour in metaphase I of meiosis, primarily in interspecific hybrids, is given. The application of the concept of homology to describe chromosome pairing between the respective chromosomes of a pair...... breeding but it has no place in systematics. With an increased knowledge and understanding of the mechanism behind meiosis, data useful in a systematic context may eventually be produced....

  18. Combining genomic sequencing methods to explore viral diversity and reveal potential virus-host interactions

    Directory of Open Access Journals (Sweden)

    Cheryl-Emiliane Tien Chow

    2015-04-01

    Full Text Available Viral diversity and virus-host interactions in oxygen-starved regions of the ocean, also known as oxygen minimum zones (OMZs), remain relatively unexplored. Microbial community metabolism in OMZs alters nutrient and energy flow through marine food webs, resulting in biological nitrogen loss and greenhouse gas production. Thus, viruses infecting OMZ microbes have the potential to modulate community metabolism, with resulting feedback on ecosystem function. Here, we describe viral communities inhabiting oxic surface (10 m) and oxygen-starved basin (200 m) waters of Saanich Inlet, a seasonally anoxic fjord on the coast of Vancouver Island, British Columbia, using viral metagenomics and complete viral fosmid sequencing on samples collected between April 2007 and April 2010. Of 6459 open reading frames (ORFs) predicted across all 34 viral fosmids, 77.6% (n=5010) had no homology to reference viral genomes. These fosmids recruited a higher proportion of viral metagenomic sequences from Saanich Inlet than from nearby northeastern subarctic Pacific Ocean (Line P) waters, indicating differences in the viral communities between coastal and open ocean locations. While functional annotations of fosmid ORFs were limited, recruitment to NCBI’s non-redundant ‘nr’ database and publicly available single-cell genomes identified putative viruses infecting marine thaumarchaeal and SUP05 proteobacteria, providing potential host linkages with relevance to coupled biogeochemical cycling processes in OMZ waters. Taken together, these results highlight the power of coupled analyses of multiple sequence data types, such as viral metagenomic and fosmid sequence data with prokaryotic single-cell genomes, to chart viral diversity, elucidate genomic and ecological contexts for previously unclassifiable viral sequences, and identify novel host interactions in natural and engineered ecosystems.

  19. Novel method to ascertain chromatin accessibility at specific genomic loci from frozen brain homogenates and laser capture microdissected defined cells.

    Science.gov (United States)

    Delvaux, Elaine; Mastroeni, Diego; Nolz, Jennifer; Coleman, Paul D

    2016-06-01

    We describe a novel method for assessing the "open" or "closed" state of chromatin at selected locations within the genome. This method combines the use of Benzonase, which can digest DNA in the presence of actin, with qPCR to define digested regions. We demonstrate the application of this method in brain homogenates and laser captured cells. We also demonstrate application to selected sites within more than one gene and multiple sites within one gene. We demonstrate the validity of the method by treating cells with valproate, known to render chromatin more permissive, and by comparison with classical digestion with DNase I in an in vitro preparation. Although we demonstrate the use of this method in brain tissue we also recognize its applicability to other tissue types.

  20. Novel method to ascertain chromatin accessibility at specific genomic loci from frozen brain homogenates and laser capture microdissected defined cells

    Directory of Open Access Journals (Sweden)

    Elaine Delvaux

    2016-06-01

    Full Text Available We describe a novel method for assessing the “open” or “closed” state of chromatin at selected locations within the genome. This method combines the use of Benzonase, which can digest DNA in the presence of actin, with quantitative polymerase chain reaction to define digested regions. We demonstrate the application of this method in brain homogenates and laser captured cells. We also demonstrate application to selected sites within more than 1 gene and multiple sites within 1 gene. We demonstrate the validity of the method by treating cells with valproate, known to render chromatin more permissive, and by comparison with classical digestion with DNase I in an in vitro preparation. Although we demonstrate the use of this method in brain tissue, we also recognize its applicability to other tissue types.

  1. A new method for detecting signal regions in ordered sequences of real numbers, and application to viral genomic data.

    Science.gov (United States)

    Gog, Julia R; Lever, Andrew M L; Skittrall, Jordan P

    2018-01-01

    We present a fast, robust and parsimonious approach to detecting signals in an ordered sequence of numbers. Our motivation is in seeking a suitable method to take a sequence of scores corresponding to properties of positions in virus genomes, and find outlying regions of low scores. Suitable statistical methods without using complex models or making many assumptions are surprisingly lacking. We resolve this by developing a method that detects regions of low score within sequences of real numbers. The method makes no assumptions a priori about the length of such a region; it gives the explicit location of the region and scores it statistically. It does not use detailed mechanistic models so the method is fast and will be useful in a wide range of applications. We present our approach in detail, and test it on simulated sequences. We show that it is robust to a wide range of signal morphologies, and that it is able to capture multiple signals in the same sequence. Finally we apply it to viral genomic data to identify regions of evolutionary conservation within influenza and rotavirus.
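The core task — locating an outlying low-score region without assuming its length, then scoring it statistically — can be illustrated with a much simpler stand-in for the authors' method: a minimum-sum contiguous-region search on mean-centred scores, assessed with a permutation test. All names and values below are our own illustrative assumptions, not the paper's algorithm:

```python
import random

def lowest_scoring_region(scores):
    """Return (start, end, total) of the minimum-sum contiguous region of
    mean-centred scores (a minimum-subarray variant of Kadane's algorithm)."""
    mean = sum(scores) / len(scores)
    centred = [s - mean for s in scores]
    best_total, best_span = float("inf"), (0, 1)
    running, start = 0.0, 0
    for i, v in enumerate(centred):
        if running > 0:          # a positive prefix can only hurt; restart here
            running, start = 0.0, i
        running += v
        if running < best_total:
            best_total, best_span = running, (start, i + 1)
    return best_span[0], best_span[1], best_total

def permutation_p(scores, n_perm=2000, seed=0):
    """P-value: fraction of shuffled sequences whose lowest region scores
    at least as low as the observed one."""
    rng = random.Random(seed)
    observed = lowest_scoring_region(scores)[2]
    shuffled = list(scores)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if lowest_scoring_region(shuffled)[2] <= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# A flat sequence of scores with a conserved-looking dip at positions 10-14.
seq = [1.0] * 10 + [-2.0] * 5 + [1.0] * 10
start, end, _ = lowest_scoring_region(seq)   # half-open region [10, 15)
```

The search makes no assumption about region length, and the permutation test asks how often a shuffled sequence produces a region that extreme by chance.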

  2. A Pattern-Oriented Approach to a Methodical Evaluation of Modeling Methods

    Directory of Open Access Journals (Sweden)

    Michael Amberg

    1996-11-01

Full Text Available The paper describes a pattern-oriented approach to evaluate modeling methods and to compare various methods with each other from a methodical viewpoint. A specific set of principles (the patterns) is defined by investigating the notations and the documentation of comparable modeling methods. Each principle helps to examine some parts of the methods from a specific point of view. All principles together lead to an overall picture of the method under examination. First the core ("method-neutral") meaning of each principle is described. Then the methods are examined regarding the principle. Afterwards the method specific interpretations are compared with each other and with the core meaning of the principle. By this procedure, the strengths and weaknesses of modeling methods regarding methodical aspects are identified. The principles are described uniformly using a principle description template according to descriptions of object oriented design patterns. The approach is demonstrated by evaluating a business process modeling method.

  3. Enriching the gene set analysis of genome-wide data by incorporating directionality of gene expression and combining statistical hypotheses and methods

    Science.gov (United States)

    Väremo, Leif; Nielsen, Jens; Nookaew, Intawat

    2013-01-01

    Gene set analysis (GSA) is used to elucidate genome-wide data, in particular transcriptome data. A multitude of methods have been proposed for this step of the analysis, and many of them have been compared and evaluated. Unfortunately, there is no consolidated opinion regarding what methods should be preferred, and the variety of available GSA software and implementations pose a difficulty for the end-user who wants to try out different methods. To address this, we have developed the R package Piano that collects a range of GSA methods into the same system, for the benefit of the end-user. Further on we refine the GSA workflow by using modifications of the gene-level statistics. This enables us to divide the resulting gene set P-values into three classes, describing different aspects of gene expression directionality at gene set level. We use our fully implemented workflow to investigate the impact of the individual components of GSA by using microarray and RNA-seq data. The results show that the evaluated methods are globally similar and the major separation correlates well with our defined directionality classes. As a consequence of this, we suggest to use a consensus scoring approach, based on multiple GSA runs. In combination with the directionality classes, this constitutes a more thorough basis for an enriched biological interpretation. PMID:23444143
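The consensus-scoring idea suggested above can be sketched in miniature (this is our toy illustration, not Piano's implementation): rank gene sets within each GSA run, then combine the per-method ranks, for example by their median:

```python
from statistics import median

def consensus_ranks(pvalues_by_method):
    """pvalues_by_method: {method: {gene_set: p}} -> {gene_set: median rank}.

    Each method contributes a ranking (lowest p = rank 1); the consensus
    is the median rank across methods.
    """
    ranks = {}
    for method, pvals in pvalues_by_method.items():
        ordered = sorted(pvals, key=pvals.get)  # most significant first
        for rank, gene_set in enumerate(ordered, start=1):
            ranks.setdefault(gene_set, []).append(rank)
    return {gs: median(r) for gs, r in ranks.items()}

# Hypothetical p-values from three GSA runs (names invented for illustration).
runs = {
    "methodA": {"glycolysis": 0.001, "apoptosis": 0.20, "ribosome": 0.04},
    "methodB": {"glycolysis": 0.005, "apoptosis": 0.60, "ribosome": 0.01},
    "methodC": {"glycolysis": 0.002, "ribosome": 0.25, "apoptosis": 0.30},
}
consensus = consensus_ranks(runs)
```

A gene set that every method ranks highly keeps a low consensus rank even if no single method's p-value is trusted on its own.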

  4. Entrepreneur environment management behavior evaluation method derived from environmental economy.

    Science.gov (United States)

    Zhang, Lili; Hou, Xilin; Xi, Fengru

    2013-12-01

Evaluation systems can encourage and guide entrepreneurs and impel them to perform well in environment management. An evaluation method based on advantage structure is established and used to analyze entrepreneur environment management behavior in China. An entrepreneur environment management behavior evaluation index system is constructed on the basis of empirical research. An evaluation method for entrepreneurs is then put forward from the standpoint of objective programming theory, which takes the minimized objective function as the comprehensive evaluation result and identifies the disadvantage structure pattern, alerting the entrepreneurs concerned to pay attention to it. Application research shows that the overall environment management behavior of Chinese entrepreneurs is good; specifically, environmental strategic behavior is best, environmental management behavior is second, and cultural behavior ranks last. The application results show the efficiency and feasibility of this method. Copyright © 2013 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.

  5. Method of evaluation of diagnostics reference levels in computerized tomography

    International Nuclear Information System (INIS)

    Vega, Walter Flores

    1999-04-01

Computerized tomography is a complex technique with several selectable exposure parameters that delivers high doses to the patient. In this work, a simple methodology was developed to evaluate diagnostic reference levels in computerized tomography, using the concept of Multiple Scan Average Dose (MSAD), recently adopted by the Health Ministry. For evaluation of the MSAD, a dose distribution was obtained through a measured dose profile on the axial axis of a water phantom with thermoluminescent dosemeters, TLD-100, for different examination techniques. The MSAD was evaluated through two distinct methods: first, by integration of the dose profile of a single slice, and second, by integration on the central slice of the profile of several slices. The latter is in accordance with the ionization chamber method, suggesting it to be the most practical dose evaluation method to apply in the routine assessment of diagnostic reference levels for CT using TLDs. (author)

  6. Selection of Suitable DNA Extraction Methods for Genetically Modified Maize 3272, and Development and Evaluation of an Event-Specific Quantitative PCR Method for 3272.

    Science.gov (United States)

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (the ABI 7900) and the Applied Biosystems 7500 (the ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for detection and quantification of 3272.
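The role of the conversion factor (Cf) in turning a measured copy-number ratio into a GMO percentage can be sketched as follows (our illustrative sketch; the copy numbers are hypothetical, and only the Cf value of 0.60 comes from the abstract):

```python
# Generic Cf-based GMO quantification as commonly used in event-specific
# real-time PCR methods (illustrative; copy numbers below are invented).

def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO amount (%) = (event / endogenous copy ratio) / Cf * 100.

    Cf is the event-to-endogenous ratio measured in 100% GM material,
    so dividing by it converts a raw copy ratio into a GM percentage.
    """
    return (event_copies / endogenous_copies) / cf * 100.0

# With Cf = 0.60 (the abstract's ABI 7900 value), a measured ratio of
# 60 event copies per 10,000 endogenous copies corresponds to 1% GM content.
level = gmo_percent(60.0, 10000.0, 0.60)
```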

  7. Methods, measures and indicators for evaluating benefits of transportation research

    CSIR Research Space (South Africa)

    Du Plessis, Louw

    2016-04-01

    Full Text Available The purpose of this article is to provide updated information by identifying and discussing methods, measures and indicators for evaluating benefits appropriate for transportation-related research facilities/programmes. The information has been...

  8. EVALUATION OF ANALYTICAL METHODS FOR DETERMINING PESTICIDES IN BABY FOOD

    Science.gov (United States)

    Three extraction methods and two detection techniques for determining pesticides in baby food were evaluated. The extraction techniques examined were supercritical fluid extraction (SFE), enhanced solvent extraction (ESE), and solid phase extraction (SPE). The detection techni...

  9. ASTM test methods for composite characterization and evaluation

    Science.gov (United States)

    Masters, John E.

    1994-01-01

    A discussion of the American Society for Testing and Materials is given. Under the topic of composite materials characterization and evaluation, general industry practice and test methods for textile composites are presented.

  10. Equitable Financial Evaluation Method for Public-Private Partnership Projects

    Institute of Scientific and Technical Information of China (English)

    KE Yongjian; LIU Xinping; WANG Shouqing

    2008-01-01

The feasibility study of a public-private partnership (PPP) project is regarded as one of the critical factors for successful implementation, but unfortunately the common financial evaluation methods currently used only represent the benefits of the private sector. There is, therefore, an urgent need to develop an equitable financial evaluation method for PPP projects. This paper presents a comprehensive literature review that examines international practices. An equitable financial evaluation method was then developed taking into account the inherent characteristics of PPP projects, using six separate indicators and Monte Carlo simulations. The result for a bridge project in Romania shows that the method combines the viewpoints of all the relevant stakeholders to achieve an equitable financial evaluation of PPP projects.
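The Monte Carlo component of such a financial evaluation can be sketched generically (our illustration of the technique only; the cash-flow figures and discount rate are invented and this is not the paper's six-indicator model):

```python
import random

# Monte Carlo net present value (NPV): sample uncertain annual cash flows
# and estimate both the expected NPV and a downside-risk percentile.

def npv(rate, cashflows):
    """Net present value, where cashflows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(n_runs=10000, rate=0.06, seed=1):
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        # Hypothetical project: year-0 construction cost of 100, then
        # 10 years of uncertain revenue (mean 15, s.d. 3 per year).
        flows = [-100.0] + [rng.gauss(15.0, 3.0) for _ in range(10)]
        results.append(npv(rate, flows))
    results.sort()
    mean = sum(results) / len(results)
    p5 = results[int(0.05 * len(results))]  # 5th percentile (downside risk)
    return mean, p5

mean_npv, downside = simulate_npv()
```

Reporting the downside percentile alongside the mean is what lets each stakeholder judge the deal from its own risk position.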

  11. Summary Report for Evaluation of Compost Sample Drying Methods

    National Research Council Canada - National Science Library

    Frye, Russell

    1994-01-01

    The U.S. Army Environmental Center (USAEC), formerly the U.S. Army Toxic and Hazardous Materials Agency, has evaluated composting methods for treatment of explosive-contaminated soils and sediments at Army installations...

  12. Multivariate Methods Based Soft Measurement for Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Shen Yin

    2014-01-01

    a decision. However, since the physicochemical indexes of wine can to some extent reflect the quality of wine, the multivariate statistical methods based soft measure can help the oenologist in wine evaluation.

  13. Toward an Improved Method of HSI Evaluation in Defense Acquisition

    National Research Council Canada - National Science Library

    Simpson, Matthew

    2006-01-01

    Each of the domains of HSI is of itself a discipline with vast amounts of research, analytic techniques, educational programs, and methods for evaluating the effectiveness of the system with respect...

  14. Radiochemistry methods in DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Fadeff, S.K.; Goheen, S.C.

    1994-08-01

Current standard sources of radiochemistry methods are often inappropriate for use in evaluating US Department of Energy environmental and waste management (DOE/EM) samples. Examples of current sources include EPA, ASTM, Standard Methods for the Examination of Water and Wastewater, and HASL-300. Applicability of these methods is limited to specific matrices (usually water), radiation levels (usually environmental levels), and analytes (limited number). Radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) attempt to fill the applicability gap that exists between standard methods and those needed for DOE/EM activities. The Radiochemistry chapter in DOE Methods includes an ''analysis and reporting'' guidance section as well as radiochemistry methods. A basis for identifying the DOE/EM radiochemistry needs is discussed. Within this needs framework, the applicability of standard methods and targeted new methods is identified. Sources of new methods (consolidated methods from DOE laboratories and submissions from individuals) and the methods review process will be discussed. The processes involved in generating consolidated methods and editing individually submitted methods will be compared. DOE Methods is a living document and continues to expand by adding various kinds of methods. Radiochemistry methods are highlighted in this paper. DOE Methods is intended to be a resource for methods applicable to DOE/EM problems. Although it is intended to support DOE, the guidance and methods are not necessarily exclusive to DOE. The document is available at no cost through the Laboratory Management Division of DOE, Office of Technology Development

  15. Evaluation of full-scope simulator testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Feher, M P; Moray, N; Senders, J W; Biron, K [Human Factors North Inc., Toronto, ON (Canada)

    1995-03-01

    This report discusses the use of full scope nuclear power plant simulators in licensing examinations for Unit First Operators of CANDU reactors. The existing literature is reviewed, and an annotated bibliography of the more important sources provided. Since existing methods are judged inadequate, conceptual bases for designing a system for licensing are discussed, and a method proposed which would make use of objective scoring methods based on data collection in full-scope simulators. A field trial of such a method is described. The practicality of such a method is critically discussed and possible advantages of subjective methods of evaluation considered. (author). 32 refs., 1 tab., 4 figs.

  16. Evaluation of full-scope simulator testing methods

    International Nuclear Information System (INIS)

    Feher, M.P.; Moray, N.; Senders, J.W.; Biron, K.

    1995-03-01

    This report discusses the use of full scope nuclear power plant simulators in licensing examinations for Unit First Operators of CANDU reactors. The existing literature is reviewed, and an annotated bibliography of the more important sources provided. Since existing methods are judged inadequate, conceptual bases for designing a system for licensing are discussed, and a method proposed which would make use of objective scoring methods based on data collection in full-scope simulators. A field trial of such a method is described. The practicality of such a method is critically discussed and possible advantages of subjective methods of evaluation considered. (author). 32 refs., 1 tab., 4 figs

  17. A Method of Evaluating Operation of Electric Energy Meter

    Science.gov (United States)

    Chen, Xiangqun; Li, Tianyang; Cao, Fei; Chu, Pengfei; Zhao, Xinwang; Huang, Rui; Liu, Liping; Zhang, Chenglin

    2018-05-01

The existing rotation maintenance strategy for electric energy meters checks meters and evaluates their state at fixed intervals. It considers only time factors and neglects other influences, which makes the evaluation inaccurate and wastes resources. In order to evaluate the running state of the electric energy meter in time, a method for evaluating the operation of electric energy meters is proposed. The method extracts the data that affect meter state from the existing data acquisition system, marketing business system and metrology production scheduling platform, and classifies the influencing factors into error stability, operational reliability, potential risks and other factors, yielding basic test, inspection, monitoring and family-defect-detection scores. An evaluation model then combines these scores to evaluate the operating state of the electric energy meter, and finally the corresponding rotation maintenance strategy is put forward.
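A weighted-score model of the kind described can be sketched generically (the component names and weights below are hypothetical, not taken from the paper):

```python
# Generic weighted scoring of an asset's state from component scores,
# each on a 0-100 scale (illustrative names and weights).

def meter_state_score(scores, weights):
    """Weighted sum of component scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

scores = {"error_stability": 90, "reliability": 80, "risk": 70}
weights = {"error_stability": 0.5, "reliability": 0.3, "risk": 0.2}
overall = meter_state_score(scores, weights)
```

The overall score can then be thresholded to decide which meters need maintenance ahead of the fixed rotation schedule.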

  18. CHARACTERISTICS OF MIRR METHOD IN EVALUATION OF INVESTMENT PROJECTS' EFFECTIVENESS

    Directory of Open Access Journals (Sweden)

    P. Kukhta

    2014-09-01

Full Text Available The characteristics of the Modified Internal Rate of Return (MIRR) method in the evaluation of investment projects were analyzed, along with restrictions on its application and its advantages and disadvantages compared with the original Internal Rate of Return and Net Present Value indicators for projects with certain baseline characteristics. Opportunities to adapt the MIRR method to alternative computational approaches for evaluating project cash flows were identified.
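For reference, the textbook MIRR calculation that the abstract discusses adapting can be sketched as follows (the cash flows and rates are our illustrative assumptions):

```python
# Standard MIRR: negative flows are discounted to time 0 at the finance
# rate; positive flows are compounded to the final period at the
# reinvestment rate (illustrative sketch, not the paper's adaptation).

def mirr(cashflows, finance_rate, reinvest_rate):
    """Modified Internal Rate of Return of a cash-flow sequence."""
    n = len(cashflows) - 1  # number of periods after time 0
    pv_neg = sum(cf / (1 + finance_rate) ** t
                 for t, cf in enumerate(cashflows) if cf < 0)
    fv_pos = sum(cf * (1 + reinvest_rate) ** (n - t)
                 for t, cf in enumerate(cashflows) if cf > 0)
    return (fv_pos / -pv_neg) ** (1.0 / n) - 1

# Hypothetical project: outlay of 1000, then three annual inflows of 400,
# financed at 8% with proceeds reinvested at 5%.
rate = mirr([-1000, 400, 400, 400], finance_rate=0.08, reinvest_rate=0.05)
```

Unlike IRR, MIRR makes the reinvestment assumption explicit, which is precisely the lever the abstract's adaptations work on.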

  19. CHARACTERISTICS OF MIRR METHOD IN EVALUATION OF INVESTMENT PROJECTS' EFFECTIVENESS

    OpenAIRE

    P. Kukhta

    2014-01-01

    There were analyzed characteristics of the Modified Internal Rate of Return method in the evaluation of investment projects, restrictions connected with its application, advantages and disadvantages compared with indicators of the original Internal Rate of Return and Net Present Value for projects with certain baseline characteristics. It was determined opportunities to adapt the method of Modified Internal Rate of Return to alternative computational approaches of the project cash flows evalu...

  20. Evaluation of man-machine systems - methods and problems

    International Nuclear Information System (INIS)

    1985-01-01

The symposium gives a survey of the methods of evaluation which permit as quantitative an assessment as possible of the collaboration between men and machines. This complex of problems is of great current significance in many areas of application. The systems to be evaluated are aircraft, land vehicles and watercraft as well as process control systems. (orig./GL) [de

  1. How to Reach Evidence-Based Usability Evaluation Methods

    NARCIS (Netherlands)

    Marcilly, Romaric; Peute, Linda

    2017-01-01

    This paper discusses how and why to build evidence-based knowledge on usability evaluation methods. At each step of building evidence, requisites and difficulties to achieve it are highlighted. Specifically, the paper presents how usability evaluation studies should be designed to allow capitalizing

  2. Comparative study of heuristic evaluation and usability testing methods.

    Science.gov (United States)

    Thyvalikakath, Thankam Paul; Monaco, Valerie; Thambuganipalle, Himabindu; Schleyer, Titus

    2009-01-01

    Usability methods, such as heuristic evaluation, cognitive walk-throughs and user testing, are increasingly used to evaluate and improve the design of clinical software applications. There is still some uncertainty, however, as to how those methods can be used to support the development process and evaluation in the most meaningful manner. In this study, we compared the results of a heuristic evaluation with those of formal user tests in order to determine which usability problems were detected by both methods. We conducted heuristic evaluation and usability testing on four major commercial dental computer-based patient records (CPRs), which together cover 80% of the market for chairside computer systems among general dentists. Both methods yielded strong evidence that the dental CPRs have significant usability problems. An average of 50% of empirically-determined usability problems were identified by the preceding heuristic evaluation. Some statements of heuristic violations were specific enough to precisely identify the actual usability problem that study participants encountered. Other violations were less specific, but still manifested themselves in usability problems and poor task outcomes. In this study, heuristic evaluation identified a significant portion of problems found during usability testing. While we make no assumptions about the generalizability of the results to other domains and software systems, heuristic evaluation may, under certain circumstances, be a useful tool to determine design problems early in the development cycle.

  3. Method of safety evaluation in nuclear power plants

    International Nuclear Information System (INIS)

    Kuraszkiewicz, P.; Zahn, P.

    1988-01-01

    A novel quantitative technique for evaluating safety of subsystems of nuclear power plants based on expert estimations is presented. It includes methods of mathematical psychology recognizing the effect of subjective factors in the expert estimates and, consequently, contributes to further objectification of evaluation. It may be applied to complementing probabilistic safety assessment. As a result of such evaluations a characteristic 'safety of nuclear power plants' is obtained. (author)

  4. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    Science.gov (United States)

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  5. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  6. Comparative evaluation of the genomes of three common Drosophila-associated bacteria

    Directory of Open Access Journals (Sweden)

    Kristina Petkau

    2016-09-01

    Full Text Available Drosophila melanogaster is an excellent model to explore the molecular exchanges that occur between an animal intestine and associated microbes. Previous studies in Drosophila uncovered a sophisticated web of host responses to intestinal bacteria. The outcomes of these responses define critical events in the host, such as the establishment of immune responses, access to nutrients, and the rate of larval development. Despite our steady march towards illuminating the host machinery that responds to bacterial presence in the gut, there are significant gaps in our understanding of the microbial products that influence bacterial association with a fly host. We sequenced and characterized the genomes of three common Drosophila-associated microbes: Lactobacillus plantarum, Lactobacillus brevis and Acetobacter pasteurianus. For each species, we compared the genomes of Drosophila-associated strains to the genomes of strains isolated from alternative sources. We found that environmental Lactobacillus strains readily associated with adult Drosophila and were similar to fly isolates in terms of genome organization. In contrast, we identified a strain of A. pasteurianus that apparently fails to associate with adult Drosophila due to an inability to grow on fly nutrient food. Comparisons between association competent and incompetent A. pasteurianus strains identified a short list of candidate genes that may contribute to survival on fly medium. Many of the gene products unique to fly-associated strains have established roles in the stabilization of host-microbe interactions. These data add to a growing body of literature that examines the microbial perspective of host-microbe relationships.

  7. Genomic selection strategies in a small dairy cattle population evaluated for genetic gain and profit

    DEFF Research Database (Denmark)

    Thomasen, Jørn Rind; Egger-Danner, C; Willam, A

    2014-01-01

    progeny testing. Strong positive interaction effects between increased reliability of genomic predictions and more intensive use of young bulls exist. From an economic perspective a juvenile scheme is always advantageous. The main future focus area for the smaller dairy cattle breeds is to join forces...

  8. SSFI and SSOMI new method of evaluating design

    International Nuclear Information System (INIS)

    Tolson, G.M.

    1992-01-01

The NRC has developed a new inspection method which has proven its effectiveness in evaluating design organizations. The new method is used in two types of NRC inspections, the Safety System Functional Inspection (SSFI) and the Safety System Outage Modification Inspection (SSOMI). The SSFI/SSOMI audits were developed following an event which brought a nuclear power plant close to a core meltdown. That event was caused by a series of problems which would not have been found using conventional methods. The SSFI and SSOMI audits involve intense technical evaluation of a nuclear system to determine whether the system will function as designed. The SSFI/SSOMI method normally uses eight to fifteen engineers with different fields of expertise to evaluate a system, or a change to a system in the case of a SSOMI. The effectiveness of each engineer's input is amplified in a series of open, questioning, free-wheeling, brainstorming-type team meetings. During the team meetings, all aspects of the audit are controlled by a consensus of the team members. The findings from these new methods are surprisingly consistent, regardless of which organization is audited or which organization performs the audit. This consistency implies a widespread generic weakness in the manner design is being performed. This paper addresses generic findings and recommends increased use of these new methods to evaluate design organizations. These audit methods can be readily used to evaluate any process or system. (orig.)

  9. Systematic drug safety evaluation based on public genomic expression (Connectivity Map) data: Myocardial and infectious adverse reactions as application cases

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Kejian, E-mail: kejian.wang.bio@gmail.com [Bio-X Institutes, Key Laboratory for the Genetics of Developmental and Neuropsychiatric Disorders, Shanghai Jiao Tong University, Shanghai (China); Weng, Zuquan [Japan National Institute of Occupational Safety and Health, Kawasaki (Japan); Sun, Liya [Bio-X Institutes, Key Laboratory for the Genetics of Developmental and Neuropsychiatric Disorders, Shanghai Jiao Tong University, Shanghai (China); Sun, Jiazhi; Zhou, Shu-Feng [Department of Pharmaceutical Sciences, College of Pharmacy, University of South Florida, Tampa, FL (United States); He, Lin, E-mail: helin@Bio-X.com [Bio-X Institutes, Key Laboratory for the Genetics of Developmental and Neuropsychiatric Disorders, Shanghai Jiao Tong University, Shanghai (China)

    2015-02-13

    Adverse drug reaction (ADR) is of great importance to both regulatory agencies and the pharmaceutical industry. Various techniques, such as quantitative structure–activity relationship (QSAR) and animal toxicology, are widely used to identify potential risks during the preclinical stage of drug development. Despite these efforts, drugs with safety liabilities can still pass through safety checkpoints and enter the market. This situation raises the concern that conventional chemical structure analysis and phenotypic screening are not sufficient to avoid all clinical adverse events. Genomic expression data following in vitro drug treatments characterize drug actions and thus have become widely used in drug repositioning. In the present study, we explored prediction of ADRs based on the drug-induced gene-expression profiles from cultured human cells in the Connectivity Map (CMap) database. The results showed that drugs inducing comparable ADRs generally lead to similar CMap expression profiles. Based on such ADR-gene expression association, we established prediction models for various ADRs, including severe myocardial and infectious events. Drugs with FDA boxed warnings of safety liability were effectively identified. We therefore suggest that drug-induced gene expression change, in combination with effective computational methods, may provide a new dimension of information to facilitate systematic drug safety evaluation. - Highlights: • Drugs causing common toxicity lead to similar in vitro gene expression changes. • We built a model to predict drug toxicity with drug-specific expression profiles. • Drugs with FDA black box warnings were effectively identified by our model. • In vitro assay can detect severe toxicity in the early stage of drug development.
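The underlying idea — drugs whose induced expression profiles are similar are candidates for sharing an adverse reaction — can be sketched with a toy similarity computation (our illustration; the drug names and profiles are invented, not CMap data):

```python
from math import sqrt

# Compare drug-induced expression signatures (log fold-changes over a
# shared gene panel) by cosine similarity; high similarity to a drug with
# a known ADR flags the candidate for closer safety review.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Hypothetical log fold-change profiles over five genes.
profiles = {
    "drugA": [2.1, -1.3, 0.4, 0.0, -2.0],   # known to cause a myocardial ADR
    "drugB": [1.9, -1.0, 0.6, 0.1, -1.8],   # candidate with a similar signature
    "drugC": [-1.5, 2.0, -0.3, 1.1, 0.9],   # unrelated mechanism
}
sim_ab = cosine(profiles["drugA"], profiles["drugB"])
sim_ac = cosine(profiles["drugA"], profiles["drugC"])
```

Real pipelines work over thousands of genes and use rank-based statistics, but the flagging logic is the same: `sim_ab` is high, so drugB inherits drugA's risk hypothesis.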

  10. Systematic drug safety evaluation based on public genomic expression (Connectivity Map) data: Myocardial and infectious adverse reactions as application cases

    International Nuclear Information System (INIS)

    Wang, Kejian; Weng, Zuquan; Sun, Liya; Sun, Jiazhi; Zhou, Shu-Feng; He, Lin

    2015-01-01

    Adverse drug reaction (ADR) is of great importance to both regulatory agencies and the pharmaceutical industry. Various techniques, such as quantitative structure–activity relationship (QSAR) and animal toxicology, are widely used to identify potential risks during the preclinical stage of drug development. Despite these efforts, drugs with safety liabilities can still pass through safety checkpoints and enter the market. This situation raises the concern that conventional chemical structure analysis and phenotypic screening are not sufficient to avoid all clinical adverse events. Genomic expression data following in vitro drug treatments characterize drug actions and thus have become widely used in drug repositioning. In the present study, we explored prediction of ADRs based on the drug-induced gene-expression profiles from cultured human cells in the Connectivity Map (CMap) database. The results showed that drugs inducing comparable ADRs generally lead to similar CMap expression profiles. Based on such ADR-gene expression association, we established prediction models for various ADRs, including severe myocardial and infectious events. Drugs with FDA boxed warnings of safety liability were effectively identified. We therefore suggest that drug-induced gene expression change, in combination with effective computational methods, may provide a new dimension of information to facilitate systematic drug safety evaluation. - Highlights: • Drugs causing common toxicity lead to similar in vitro gene expression changes. • We built a model to predict drug toxicity with drug-specific expression profiles. • Drugs with FDA black box warnings were effectively identified by our model. • In vitro assay can detect severe toxicity in the early stage of drug development

  11. Oral cancer prognosis based on clinicopathologic and genomic markers using a hybrid of feature selection and machine learning methods

    Science.gov (United States)

    2013-01-01

    Background Machine learning techniques are becoming useful as an alternative approach to conventional medical diagnosis or prognosis, as they handle noisy and incomplete data well, and significant results can be attained despite a small sample size. Traditionally, clinicians make prognostic decisions based on clinicopathologic markers. However, even the most skilful clinician cannot easily arrive at an accurate prognosis using these markers alone. Thus, there is a need to use genomic markers to improve the accuracy of prognosis. The main aim of this research is to apply a hybrid of feature selection and machine learning methods to oral cancer prognosis based on the correlation of clinicopathologic and genomic markers. Results In the first stage of this research, five feature selection methods were proposed and tested on the oral cancer prognosis dataset. In the second stage, models built from the features selected by each method were tested on the proposed classifiers. Four types of classifiers were chosen, namely ANFIS, artificial neural network, support vector machine and logistic regression. k-fold cross-validation was implemented for all classifiers due to the small sample size. The hybrid model ReliefF-GA-ANFIS with the 3 input features drink, invasion and p63 achieved the best accuracy (accuracy = 93.81%; AUC = 0.90) for oral cancer prognosis. Conclusions The results revealed that prognosis is superior when both clinicopathologic and genomic markers are present. The selected features can be investigated further to validate their potential as a significant prognostic signature in oral cancer studies. PMID:23725313
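The two-stage pattern described here (feature selection, then a classifier scored by k-fold cross-validation) can be sketched as follows. ReliefF-GA and ANFIS are not reimplemented; a simple mean-difference filter and a nearest-centroid classifier stand in for them, and the data are synthetic.

```python
# Hedged sketch of the feature-selection + classifier + k-fold CV pipeline.
# Note the selection is redone inside each training fold, avoiding leakage.

def select_features(X, y, k):
    """Rank features by absolute difference of class means; keep top k."""
    def score(j):
        a = [row[j] for row, lab in zip(X, y) if lab == 0]
        b = [row[j] for row, lab in zip(X, y) if lab == 1]
        return abs(sum(a) / len(a) - sum(b) / len(b))
    return sorted(range(len(X[0])), key=score, reverse=True)[:k]

def centroid_predict(X_tr, y_tr, x):
    """Nearest-centroid prediction for a single sample."""
    dists = {}
    for lab in set(y_tr):
        rows = [r for r, l in zip(X_tr, y_tr) if l == lab]
        cent = [sum(c) / len(rows) for c in zip(*rows)]
        dists[lab] = sum((a - b) ** 2 for a, b in zip(x, cent))
    return min(dists, key=dists.get)

def kfold_accuracy(X, y, k=3):
    """Plain k-fold cross-validated accuracy of the two-stage pipeline."""
    folds = [list(range(i, len(X), k)) for i in range(k)]
    correct = 0
    for test_idx in folds:
        tr = [i for i in range(len(X)) if i not in test_idx]
        X_tr, y_tr = [X[i] for i in tr], [y[i] for i in tr]
        feats = select_features(X_tr, y_tr, 2)
        X_sel = [[r[j] for j in feats] for r in X_tr]
        for i in test_idx:
            x = [X[i][j] for j in feats]
            if centroid_predict(X_sel, y_tr, x) == y[i]:
                correct += 1
    return correct / len(X)
```

Running selection inside each fold rather than once on the whole dataset is the design choice that keeps the cross-validated accuracy honest on small samples.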

  12. Genome-wide prediction methods in highly diverse and heterozygous species: proof-of-concept through simulation in grapevine.

    Directory of Open Access Journals (Sweden)

    Agota Fodor

    Full Text Available Nowadays, genome-wide association studies (GWAS) and genomic selection (GS) methods, which use genome-wide marker data for phenotype prediction, are of much potential interest in plant breeding. However, to our knowledge, no studies have yet been performed on the predictive ability of these methods for structured traits when using training populations with high levels of genetic diversity. Grapevine is such an example of a highly heterozygous, perennial species. The present study compares the accuracy of models based on GWAS or GS alone, or in combination, for predicting simple or complex traits, linked or not with population structure. In order to explore the relevance of these methods in this context, we performed simulations using approximately 90,000 SNPs on a population of 3,000 individuals structured into three groups and corresponding to published grapevine diversity data. To estimate the parameters of the prediction models, we defined four training populations of 1,000 individuals, corresponding to these three groups and a core collection. Finally, to estimate the accuracy of the models, we also simulated four breeding populations of 200 individuals. Although prediction accuracy was low when breeding populations were too distant from the training populations, high accuracy levels were obtained using only the core collection as the training population. The highest prediction accuracy (up to 0.9) was obtained using the combined GWAS-GS model. We thus recommend using the combined prediction model and a core collection as the training population for grapevine breeding or for other economically important crops with the same characteristics.
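The genomic prediction step can be illustrated with a minimal ridge-regression (RR-BLUP-style) sketch: estimate marker effects on a training population, then predict breeding values for new genotypes. This assumes a generic ridge model rather than the specific GWAS-GS combination used in the study; the marker matrix, phenotypes and shrinkage value are illustrative.

```python
# Minimal genomic-prediction sketch (RR-BLUP-like ridge regression).
import numpy as np

def ridge_gebv(X_train, y_train, X_new, lam=1.0):
    """Estimate marker effects by ridge regression, then predict
    genomic estimated breeding values (GEBVs) for new genotypes."""
    X = np.asarray(X_train, dtype=float)
    y = np.asarray(y_train, dtype=float)
    p = X.shape[1]
    # Solve (X'X + lambda I) beta = X'y for the shrunken marker effects.
    beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    return np.asarray(X_new, dtype=float) @ beta
```

In practice X would hold tens of thousands of SNP genotypes per individual and lam would be tuned (e.g. from heritability); prediction accuracy is then the correlation between predicted and true values in the breeding population.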

  13. Improved methods and resources for paramecium genomics: transcription units, gene annotation and gene expression.

    Science.gov (United States)

    Arnaiz, Olivier; Van Dijk, Erwin; Bétermier, Mireille; Lhuillier-Akakpo, Maoussi; de Vanssay, Augustin; Duharcourt, Sandra; Sallet, Erika; Gouzy, Jérôme; Sperling, Linda

    2017-06-26

    The 15 sibling species of the Paramecium aurelia cryptic species complex emerged after a whole genome duplication that occurred tens of millions of years ago. Given extensive knowledge of the genetics and epigenetics of Paramecium acquired over the last century, this species complex offers a uniquely powerful system to investigate the consequences of whole genome duplication in a unicellular eukaryote as well as the genetic and epigenetic mechanisms that drive speciation. High quality Paramecium gene models are important for research using this system. The major aim of the work reported here was to build an improved gene annotation pipeline for the Paramecium lineage. We generated oriented RNA-Seq transcriptome data across the sexual process of autogamy for the model species Paramecium tetraurelia. We determined, for the first time in a ciliate, candidate P. tetraurelia transcription start sites using an adapted Cap-Seq protocol. We developed TrUC, multi-threaded Perl software that in conjunction with TopHat mapping of RNA-Seq data to a reference genome, predicts transcription units for the annotation pipeline. We used EuGene software to combine annotation evidence. The high quality gene structural annotations obtained for P. tetraurelia were used as evidence to improve published annotations for 3 other Paramecium species. The RNA-Seq data were also used for differential gene expression analysis, providing a gene expression atlas that is more sensitive than the previously established microarray resource. We have developed a gene annotation pipeline tailored for the compact genomes and tiny introns of Paramecium species. A novel component of this pipeline, TrUC, predicts transcription units using Cap-Seq and oriented RNA-Seq data. TrUC could prove useful beyond Paramecium, especially in the case of high gene density. Accurate predictions of 3' and 5' UTR will be particularly valuable for studies of gene expression (e.g. nucleosome positioning, identification of cis

  14. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    International Nuclear Information System (INIS)

    Chady, T.

    2004-01-01

    In this paper the magnetic leakage flux and eddy current methods were used to evaluate changes in material properties caused by stress. Seven samples made of ferromagnetic material with different levels of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with a GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between the results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and utilized to evaluate the level of applied stress. A strong correspondence between the amount of applied stress and the maximum amplitude of the derivative was confirmed

  15. Computational methods for planning and evaluating geothermal energy projects

    International Nuclear Information System (INIS)

    Goumas, M.G.; Lygerou, V.A.; Papayannakis, L.E.

    1999-01-01

    In planning, designing and evaluating a geothermal energy project, a number of technical, economic, social and environmental parameters should be considered. The use of computational methods provides a rigorous analysis improving the decision-making process. This article demonstrates the application of decision-making methods developed in operational research for the optimum exploitation of geothermal resources. Two characteristic problems are considered: (1) the economic evaluation of a geothermal energy project under uncertain conditions using a stochastic analysis approach and (2) the evaluation of alternative exploitation schemes for optimum development of a low enthalpy geothermal field using a multicriteria decision-making procedure. (Author)

  16. Application of least-squares method to decay heat evaluation

    International Nuclear Information System (INIS)

    Schmittroth, F.; Schenter, R.E.

    1976-01-01

    Generalized least-squares methods are applied to decay-heat experiments and summation calculations to arrive at evaluated values and uncertainties for the fission-product decay heat from the thermal fission of 235U. Emphasis is placed on a proper treatment of both statistical and correlated uncertainties in the least-squares method
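The generalized least-squares combination of correlated measurements can be illustrated for a single quantity: the estimator x = (1' C^-1 1)^-1 1' C^-1 y, where C is the covariance matrix of the measurements, is the standard GLS form. The numbers below are invented for illustration, not the 235U data.

```python
# Illustrative GLS combination of correlated measurements of one quantity
# (e.g. decay heat at a single cooling time from several experiments).
import numpy as np

def gls_combine(values, cov):
    """Return the GLS estimate and its variance for repeated
    measurements of one quantity with covariance matrix cov."""
    y = np.asarray(values, dtype=float)
    Cinv = np.linalg.inv(np.asarray(cov, dtype=float))
    ones = np.ones(len(y))
    var = 1.0 / (ones @ Cinv @ ones)   # (1' C^-1 1)^-1
    est = var * (ones @ Cinv @ y)      # weighted combination
    return est, var
```

With uncorrelated, equal-variance measurements this reduces to the plain mean; correlated uncertainties change both the weights and the combined variance, which is why their proper treatment is emphasized.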

  17. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    Science.gov (United States)

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  18. New knowledge network evaluation method for design rationale management

    Science.gov (United States)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives, namely, the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. The comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that can provide objective metrics and a selection basis for DR knowledge reuse during the product design process. In addition, the method proves to give more effective guidance and support for the application and management of DR knowledge.

  19. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) are challenging to use for estimating the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes, is proposed. The method is clarified with the help of several numerical examples.

  20. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1993-03-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others

  1. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  2. Ethnographic methods for process evaluations of complex health behaviour interventions.

    Science.gov (United States)

    Morgan-Trimmer, Sarah; Wood, Fiona

    2016-05-04

    This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.

  3. An alternative method for cDNA cloning from surrogate eukaryotic cells transfected with the corresponding genomic DNA.

    Science.gov (United States)

    Hu, Lin-Yong; Cui, Chen-Chen; Song, Yu-Jie; Wang, Xiang-Guo; Jin, Ya-Ping; Wang, Ai-Hua; Zhang, Yong

    2012-07-01

    cDNA is widely used in gene function elucidation and/or transgenics research but often suitable tissues or cells from which to isolate mRNA for reverse transcription are unavailable. Here, an alternative method for cDNA cloning is described and tested by cloning the cDNA of human LALBA (human alpha-lactalbumin) from genomic DNA. First, genomic DNA containing all of the coding exons was cloned from human peripheral blood and inserted into a eukaryotic expression vector. Next, by delivering the plasmids into either 293T or fibroblast cells, surrogate cells were constructed. Finally, the total RNA was extracted from the surrogate cells and cDNA was obtained by RT-PCR. The human LALBA cDNA that was obtained was compared with the corresponding mRNA published in GenBank. The comparison showed that the two sequences were identical. The novel method for cDNA cloning from surrogate eukaryotic cells described here uses well-established techniques that are feasible and simple to use. We anticipate that this alternative method will have widespread applications.

  4. Genome profiling (GP method based classification of insects: congruence with that of classical phenotype-based one.

    Directory of Open Access Journals (Sweden)

    Shamim Ahmed

    Full Text Available Ribosomal RNAs have been widely used for identification and classification of species, and have produced data giving new insights into phylogenetic relationships. Recently, multilocus genotyping and even whole genome sequencing-based technologies have been adopted in ambitious comparative biology studies. However, such technologies are still far from routine use in species classification studies due to their high costs in terms of labor, equipment and consumables. Here, we describe a simple and powerful approach for species classification called genome profiling (GP). The GP method, composed of random PCR, temperature gradient gel electrophoresis (TGGE) and computer-aided gel image processing, is highly informative and less laborious. For demonstration, we classified 26 species of insects using GP and 18S rDNA-sequencing approaches. Employing a congruence value, the GP method was found to correspond better to the classical phenotype-based approach than did 18S rDNA sequencing. To our surprise, use of a single probe in GP was sufficient to identify the relationships between the insect species, making this approach more straightforward. The data gathered here, together with those of previous studies, show that GP is a simple and powerful method that can be applied virtually universally for identifying and classifying species. The current success supports our previous proposal that a GP-based web database is constructible and effective for the global identification/classification of species.

  5. Comprehensive evaluation method in application of nuclear DCS product design

    International Nuclear Information System (INIS)

    Wang Weixin; Zhao Zhemin; Shi Yingbin

    2014-01-01

    In order to select the best design proposal in a short time, the TOPSIS comprehensive evaluation method was introduced into nuclear power plant DCS product design. It can intuitively show, through data, whether different design proposals are good, and it shortens the time needed to optimize a design proposal. A design proposal selected by this method is more reasonable and has good comprehensive performance indexes. The TOPSIS comprehensive evaluation method achieved good results in the optimization of one nuclear power plant DCS cabinet design proposal. (authors)
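The TOPSIS procedure itself is standard and compact: vector-normalize the decision matrix, weight it, and rank alternatives by relative closeness to the ideal solution. The sketch below is not CRIEPI's implementation; the criteria weights and scores in the test are invented.

```python
# Compact TOPSIS sketch for ranking design proposals.
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if larger is better for criterion j.
    Returns each alternative's closeness to the ideal solution (0..1)."""
    ncols = len(weights)
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    V = [[w * row[j] / n for j, (w, n) in enumerate(zip(weights, norms))]
         for row in matrix]
    # Ideal (best) and anti-ideal (worst) points per criterion.
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*V))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*V))]
    scores = []
    for row in V:
        d_pos = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ideal)))
        d_neg = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

The proposal with the highest closeness score is the one to select; ties and zero-variance criteria would need extra handling in a production implementation.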

  6. Study on evaluation methods for Rayleigh wave dispersion characteristic

    Science.gov (United States)

    Shi, L.; Tao, X.; Kayen, R.; Shi, H.; Yan, S.

    2005-01-01

    The evaluation of Rayleigh wave dispersion characteristics is the key step in detecting S-wave velocity structure. By comparing dispersion curves directly with those from the spectral analysis of surface waves (SASW) method, rather than comparing S-wave velocity structures, the validity and precision of the microtremor-array method (MAM) can be evaluated more objectively. The results from the China - US joint surface wave investigation at 26 sites in Tangshan, China, show that the MAM has the same precision as the SASW method at 83% of the 26 sites. The MAM is valid for testing Rayleigh wave dispersion characteristics and has great application potential for detecting site S-wave velocity structure.

  7. Simplified method evaluation for piping elastic follow-up

    International Nuclear Information System (INIS)

    Severud, L.K.

    1983-05-01

    A proposed simplified method for evaluating elastic follow-up effects in high temperature pipelines is presented. The method was evaluated by comparing the simplified analysis results with those obtained from detailed inelastic solutions. Nine different pipelines typical of a nuclear breeder reactor power plant were analyzed; the simplified method is attractive because it appears to give fairly accurate and conservative results. It is easy to apply and inexpensive since it employs iterative elastic solutions for the pipeline coupled with the readily available isochronous stress-strain data provided in the ASME Code

  8. Investigation of evaluation methods for human factors education effectiveness

    International Nuclear Information System (INIS)

    Yoshimura, Seiichi; Fujimoto, Junzo; Sasou Kunihide; Hasegawa, Naoko

    2004-01-01

    Education effectiveness commensurate with investment is required in the current stream of electric power deregulation. Therefore, evaluation methods for human factors education effectiveness, capable of observing how a human factors culture pervades an organization, were investigated through research on education effectiveness in universities and actual in-house education in industrial companies. As a result, the content to be evaluated was found to be the change in attitudes toward human factors and improvement proposals in workplaces, in view of the purpose of human factors education. A questionnaire was found to be a suitable style of evaluation. In addition, evaluation is desirable both just after education and after some period back in the workplace. Hereafter, data will be collected using these two kinds of questionnaires in human factors education courses at CRIEPI and in some education courses at utilities. In this way, an education effectiveness evaluation method suitable for human factors will be established. (author)

  9. What is the proper evaluation method: Some basic considerations

    International Nuclear Information System (INIS)

    Leeb, Helmut; Schnabel, Georg; Srdinko, Thomas

    2014-01-01

    Recent developments and applications demand an extension of the energy range and the inclusion of reliable uncertainty information in nuclear data libraries. Due to the scarcity of neutron-induced reaction data beyond 20 MeV, extending the energy range up to at least 150 MeV is not trivial, because the corresponding nuclear data evaluations depend heavily on nuclear models and proper evaluation methods are still under discussion. Restricting ourselves to evaluation techniques based on Bayesian statistics, the influence of the a priori knowledge on the final result of the evaluation is considered. The study clearly indicates the need to properly account for the deficiencies of the nuclear model. Concerning the covariance matrices, it is argued that they depend not only on the model but also on the method of generation, and an additional consensus is required for the comparison of different evaluations of the same data sets. (authors)

  10. Methods of Identification and Evaluation of Brownfield Sites

    Directory of Open Access Journals (Sweden)

    Safet Kurtović

    2014-04-01

    Full Text Available The basic objective of this paper was to determine the importance and restoration potential of brownfield sites for the economic prosperity of a particular region or country. In addition, in a theoretical sense, this paper presents the methods used for the identification of brownfield sites, such as the Smart Growth Network model and the Thomas GIS model, and methods for the evaluation of brownfield sites: the indexing method, cost-benefit analysis and multivariate analysis.

  11. Evaluation of verification methods for input-accountability measurements

    International Nuclear Information System (INIS)

    Maeck, W.J.

    1980-01-01

    As part of TASTEX-related programs, two independent methods have been evaluated for verifying the amount of Pu charged to the head-end of a nuclear fuel processing plant. The first is the Pu/U (gravimetric) method, TASTEX Task-L, and the second is the tracer method, designated Task-M. Summaries of the basic technology, results of various studies under actual plant conditions, and future requirements are given for each task

  12. NCC-AUC: an AUC optimization method to identify multi-biomarker panel for cancer prognosis from genomic and clinical data.

    Science.gov (United States)

    Zou, Meng; Liu, Zhaoqi; Zhang, Xiang-Sun; Wang, Yong

    2015-10-15

    In prognosis and survival studies, an important goal is to identify multi-biomarker panels with predictive power using molecular characteristics or clinical observations. Such analysis is often challenged by censored, small-sample-size, but high-dimensional genomic profiles or clinical data. Therefore, sophisticated models and algorithms are in pressing need. In this study, we propose a novel Area Under Curve (AUC) optimization method for multi-biomarker panel identification named Nearest Centroid Classifier for AUC optimization (NCC-AUC). Our method is motivated by the connection between the AUC score for classification accuracy evaluation and Harrell's concordance index in survival analysis. This connection allows us to convert the survival time regression problem to a binary classification problem. Then an optimization model is formulated to directly maximize AUC and meanwhile minimize the number of selected features to construct a predictor in the nearest centroid classifier framework. NCC-AUC shows its great performance by validating both in genomic data of breast cancer and clinical data of stage IB Non-Small-Cell Lung Cancer (NSCLC). For the genomic data, NCC-AUC outperforms Support Vector Machine (SVM) and Support Vector Machine-based Recursive Feature Elimination (SVM-RFE) in classification accuracy. It tends to select a multi-biomarker panel with low average redundancy and enriched biological meanings. Also NCC-AUC is more significant in separation of low and high risk cohorts than the widely used Cox model (Cox proportional-hazards regression model) and L1-Cox model (L1 penalized Cox model). These performance gains of NCC-AUC are quite robust across 5 subtypes of breast cancer. Further, in an independent clinical dataset, NCC-AUC outperforms SVM and SVM-RFE in predictive accuracy and is consistently better than the Cox model and L1-Cox model in grouping patients into high and low risk categories. In summary, NCC-AUC provides a rigorous optimization framework to
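The two ingredients NCC-AUC couples — a rank-based AUC criterion and a nearest-centroid decision rule — can be sketched independently, as below. The joint optimization that maximizes AUC while minimizing panel size is not reproduced here, and the data in the usage example are synthetic.

```python
# Sketch of the building blocks: rank-based AUC and a centroid-based score.

def auc(scores, labels):
    """Rank-based AUC: probability that a positive sample outscores a
    negative one (ties count one half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def centroid_score(x, centroid_neg, centroid_pos):
    """Signed score: higher when x is closer to the positive centroid
    than to the negative one (squared Euclidean distances)."""
    d_neg = sum((a - b) ** 2 for a, b in zip(x, centroid_neg))
    d_pos = sum((a - b) ** 2 for a, b in zip(x, centroid_pos))
    return d_neg - d_pos
```

In the full method, feature (biomarker) subsets would be searched so that the AUC of the resulting centroid scores is maximized with as few features as possible.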

  13. Methods to evaluate fish freshness in research and industry

    DEFF Research Database (Denmark)

    Olafsdottir, G.; Martinsdóttir, E.; Oehlenschläger, J.

    1997-01-01

    Current work in a European concerted action project 'Evaluation of Fish Freshness' (AIR3 CT94-2283) focuses on harmonizing research activities in the area of fish freshness evaluation in leading fish laboratories in Europe (see Box 1). The overall aim of the concerted action project is to validate ... measurements with respect to fish freshness evaluation. In this article, the different subgroups have summarized changes that occur in fish and methods to evaluate fish freshness, as a first step towards the definition of criteria for fish freshness

  14. Durability evaluation method on rebar corrosion of reinforced concrete

    International Nuclear Information System (INIS)

    Kitsutaka, Yoshinori

    2013-01-01

    In this paper, a method for durability evaluation of nuclear power plant concrete structures was investigated. In view of the importance of evaluating the degree of deterioration of reinforced concrete structures, relationships should be formulated among the number of years elapsed, t, the amount of action of a deteriorative factor, F, the degree of material deterioration, D, and the performance of the structure, P. Evaluation by PDFt diagrams combining these relationships may be effective. A detailed procedure of durability evaluation for a reinforced concrete structure using the PDFt concept is presented for deterioration by rebar corrosion caused by neutralization and salinity penetration, with reference to recent papers. (author)

  15. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
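A hierarchical weighted average of the kind described can be sketched as a tree of weighted criteria: leaves carry ratings, and each node's score is the weighted mean of its children. The criterion names, weights and ratings below are invented for illustration, not taken from the paper.

```python
# Hedged sketch of a hierarchical weighted average over a criteria tree.

def hwa(node):
    """node is either a leaf rating (a number) or a list of
    (weight, subnode) pairs; weights at each level should sum to 1."""
    if isinstance(node, (int, float)):
        return float(node)
    return sum(w * hwa(child) for w, child in node)

# Hypothetical design alternative: 60% performance (split evenly between
# speed and capacity), 40% cost, rated on a 0-10 scale.
design_a = [
    (0.6, [(0.5, 8), (0.5, 6)]),   # performance: speed=8, capacity=6
    (0.4, 9),                      # cost rating
]
```

Scoring each alternative with the same weight tree and comparing the results is what gives the evaluation its uniformity and its documented rationale.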

  16. Survey of evaluation methods for thermal striping in FBR structures

    International Nuclear Information System (INIS)

    Miura, Naoki; Nitta, Akito; Take, Kohji

    1988-01-01

    In the upper core structures or sodium mixing tees of Fast Breeder Reactors, mixing sodium streams at different temperatures produce rapid temperature fluctuations, namely 'thermal striping', on component surfaces, and there is concern that the resulting high-cycle thermal fatigue causes crack initiation and propagation. Thermal striping is one of the factors considered in FBR component design; however, a standard evaluation method has not yet been established because of the intricacy of the mechanism, the difficulty of actual proof, the lack of data, and so on. This report surveys the details and present status of evaluation methods for crack initiation and propagation due to thermal striping, and studies an appropriate method that can be used to rationalize design. It is ascertained that a method using quantitative prediction of crack propagation is optimal for evaluating the thermal striping phenomenon. (author)

  17. A structured expert evaluation method for the evaluation of children's computer games

    NARCIS (Netherlands)

    Baauw, E.; Bekker, M.M.; Barendregt, W.; Costabile, M.F.; Paternò, F.

    2005-01-01

    Inspection-based evaluation methods predicting usability problems can be applied for evaluating products without involving users. A new method (named SEEM), inspired by Norman’s theory-of-action model [18] and Malone’s concepts of fun [15], is described for predicting usability and fun problems in

  18. Evaluation of GRCh38 and de novo haploid genome assemblies demonstrates the enduring quality of the reference assembly.

    Science.gov (United States)

    Schneider, Valerie A; Graves-Lindsay, Tina; Howe, Kerstin; Bouk, Nathan; Chen, Hsiu-Chuan; Kitts, Paul A; Murphy, Terence D; Pruitt, Kim D; Thibaud-Nissen, Françoise; Albracht, Derek; Fulton, Robert S; Kremitzki, Milinn; Magrini, Vincent; Markovic, Chris; McGrath, Sean; Steinberg, Karyn Meltz; Auger, Kate; Chow, William; Collins, Joanna; Harden, Glenn; Hubbard, Timothy; Pelan, Sarah; Simpson, Jared T; Threadgold, Glen; Torrance, James; Wood, Jonathan M; Clarke, Laura; Koren, Sergey; Boitano, Matthew; Peluso, Paul; Li, Heng; Chin, Chen-Shan; Phillippy, Adam M; Durbin, Richard; Wilson, Richard K; Flicek, Paul; Eichler, Evan E; Church, Deanna M

    2017-05-01

    The human reference genome assembly plays a central role in nearly all aspects of today's basic and clinical research. GRCh38 is the first coordinate-changing assembly update since 2009; it reflects the resolution of roughly 1000 issues and encompasses modifications ranging from thousands of single base changes to megabase-scale path reorganizations, gap closures, and localization of previously orphaned sequences. We developed a new approach to sequence generation for targeted base updates and used data from new genome mapping technologies and single haplotype resources to identify and resolve larger assembly issues. For the first time, the reference assembly contains sequence-based representations for the centromeres. We also expanded the number of alternate loci to create a reference that provides a more robust representation of human population variation. We demonstrate that the updates render the reference an improved annotation substrate, alter read alignments in unchanged regions, and impact variant interpretation at clinically relevant loci. We additionally evaluated a collection of new de novo long-read haploid assemblies and conclude that although the new assemblies compare favorably to the reference with respect to continuity, error rate, and gene completeness, the reference still provides the best representation for complex genomic regions and coding sequences. We assert that the collected updates in GRCh38 make the newer assembly a more robust substrate for comprehensive analyses that will promote our understanding of human biology and advance our efforts to improve health. © 2017 Schneider et al.; Published by Cold Spring Harbor Laboratory Press.

  19. bNEAT: a Bayesian network method for detecting epistatic interactions in genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Chen Xue-wen

    2011-07-01

    Full Text Available Abstract Background Detecting epistatic interactions plays a significant role in improving our understanding of the pathogenesis, prevention, diagnosis and treatment of complex human diseases. A recent study on the automatic detection of epistatic interactions showed that Markov Blanket-based methods are capable of finding genetic variants strongly associated with common diseases and of reducing false positives when the number of instances is large. Unfortunately, a typical dataset from genome-wide association studies consists of a very limited number of examples, where current methods, including Markov Blanket-based ones, may perform poorly. Results To address the small-sample problem, we propose a Bayesian network-based approach (bNEAT) to detect epistatic interactions. The proposed method also employs a Branch-and-Bound technique for learning. We apply the proposed method to simulated datasets based on four disease models and to a real dataset. Experimental results show that our method outperforms Markov Blanket-based methods and other commonly used methods, especially when the number of samples is small. Conclusions Our results show that bNEAT achieves strong power regardless of the number of samples and is especially suitable for detecting epistatic interactions with slight or no marginal effects. The merits of the proposed approach lie in two aspects: a score for Bayesian network structure learning that can reflect higher-order epistatic interactions, and a heuristic Bayesian network structure learning method.
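
    The scoring idea behind such Bayesian approaches can be illustrated with a small sketch. This is not the published bNEAT implementation: a K2-style marginal likelihood scores the disease node against candidate parent SNPs, and an exhaustive pair search stands in for the Branch-and-Bound step; all names are illustrative.

```python
import math
from collections import Counter
from itertools import combinations

def k2_log_score(disease, snp_columns):
    """Log K2 marginal likelihood of a binary disease node given parent SNPs.

    disease: list of 0/1 phenotype labels.
    snp_columns: list of columns (one list of genotype codes per SNP).
    """
    r = 2  # number of disease states
    # Group samples by the joint parent (SNP) configuration.
    group_sizes = Counter()
    class_counts = Counter()
    for i, d in enumerate(disease):
        cfg = tuple(col[i] for col in snp_columns)
        group_sizes[cfg] += 1
        class_counts[(cfg, d)] += 1
    score = 0.0
    for cfg, n_j in group_sizes.items():
        # log[(r-1)! / (n_j + r - 1)!] + sum_k log(N_jk!)
        score += math.lgamma(r) - math.lgamma(n_j + r)
        for k in range(r):
            score += math.lgamma(class_counts[(cfg, k)] + 1)
    return score

def best_pair(disease, snps):
    """Exhaustively score all SNP pairs; a stand-in for Branch-and-Bound."""
    return max(combinations(range(len(snps)), 2),
               key=lambda p: k2_log_score(disease, [snps[p[0]], snps[p[1]]]))
```

    A pure XOR disease model (no marginal effects) illustrates why a joint score over parent sets can detect interactions that single-marker tests miss.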

  20. REGEN: Ancestral Genome Reconstruction for Bacteria

    Directory of Open Access Journals (Sweden)

    João C. Setubal

    2012-07-01

    Full Text Available Ancestral genome reconstruction can be understood as a phylogenetic study with more details than a traditional phylogenetic tree reconstruction. We present a new computational system called REGEN for ancestral bacterial genome reconstruction at both the gene and replicon levels. REGEN reconstructs gene content, contiguous gene runs, and replicon structure for each ancestral genome. Along each branch of the phylogenetic tree, REGEN infers evolutionary events, including gene creation and deletion and replicon fission and fusion. The reconstruction can be performed by either a maximum parsimony or a maximum likelihood method. Gene content reconstruction is based on the concept of neighboring gene pairs. REGEN was designed to be used with any set of genomes that are sufficiently related, which will usually be the case for bacteria within the same taxonomic order. We evaluated REGEN using simulated genomes and genomes in the Rhizobiales order.

  1. REGEN: Ancestral Genome Reconstruction for Bacteria.

    Science.gov (United States)

    Yang, Kuan; Heath, Lenwood S; Setubal, João C

    2012-07-18

    Ancestral genome reconstruction can be understood as a phylogenetic study with more details than a traditional phylogenetic tree reconstruction. We present a new computational system called REGEN for ancestral bacterial genome reconstruction at both the gene and replicon levels. REGEN reconstructs gene content, contiguous gene runs, and replicon structure for each ancestral genome. Along each branch of the phylogenetic tree, REGEN infers evolutionary events, including gene creation and deletion and replicon fission and fusion. The reconstruction can be performed by either a maximum parsimony or a maximum likelihood method. Gene content reconstruction is based on the concept of neighboring gene pairs. REGEN was designed to be used with any set of genomes that are sufficiently related, which will usually be the case for bacteria within the same taxonomic order. We evaluated REGEN using simulated genomes and genomes in the Rhizobiales order.
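
    The neighboring-gene-pair concept underlying the gene content reconstruction can be sketched as follows; this is a simplified Fitch-style illustration for one internal node, not REGEN's actual algorithm.

```python
def gene_pairs(genome):
    """Set of unordered neighboring gene pairs from a circular gene order."""
    n = len(genome)
    return {frozenset((genome[i], genome[(i + 1) % n])) for i in range(n)}

def ancestral_pairs(child_a, child_b):
    """Fitch-style parsimony on pair presence/absence: keep the pairs shared
    by both children, falling back to the union when the intersection is
    empty (the union case defers the decision to higher tree levels)."""
    pa, pb = gene_pairs(child_a), gene_pairs(child_b)
    return (pa & pb) or (pa | pb)
```

    For two child genomes that differ by one rearrangement, the intersection keeps exactly the adjacencies conserved on both branches.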

  2. Efficiency of boiling and four other methods for genomic DNA extraction of deteriorating spore-forming bacteria from milk

    Directory of Open Access Journals (Sweden)

    Jose Carlos Ribeiro Junior

    2016-10-01

    Full Text Available The spore-forming microbiota is mainly responsible for the deterioration of pasteurized milk with long shelf life in the United States. The identification of these microorganisms, using molecular tools, is of particular importance for the maintenance of the quality of milk. However, these molecular techniques are not only costly but also labor-intensive and time-consuming. The aim of this study was to compare the efficiency of boiling with that of four other methods for the genomic DNA extraction of sporulated bacteria with proteolytic and lipolytic potential isolated from raw milk in the states of Paraná and Maranhão, Brazil. Protocols based on cellular lysis by enzymatic digestion, phenolic extraction, microwave heating, and the use of guanidine isothiocyanate were used. This study proposes a method involving simple boiling for the extraction of genomic DNA from these microorganisms. Variations in the quality and yield of the extracted DNA among these methods were observed. However, both the cell lysis protocol by enzymatic digestion (commercial kit) and the simple boiling method proposed in this study yielded sufficient DNA for successfully carrying out the Polymerase Chain Reaction (PCR) of the rpoB and 16S rRNA genes for all 11 strains of microorganisms tested. The other protocols failed to yield DNA of sufficient quantity and quality from all microorganisms tested, as only a few strains showed positive PCR results, hindering the search for new microorganisms. Thus, the simple boiling method for DNA extraction from sporulated bacteria in spoiled milk showed the same efficacy as the commercial kit. Moreover, the method is inexpensive, easy to perform, and much less time-consuming.

  3. Nurse educators’ perceptions of OSCE as a clinical evaluation method

    Directory of Open Access Journals (Sweden)

    MM Chabeli

    2001-09-01

    Full Text Available The South African Qualifications Authority and the South African Nursing Council are in pursuit of quality nursing education to enable learners to practise as independent and autonomous practitioners. The educational programme should focus on the facilitation of critical and reflective thinking skills that help the learner make rational decisions and solve problems. A way of achieving this level of functioning is the use of assessment and evaluation methods that measure the learners’ clinical competence holistically. This article focuses on the perceptions of twenty nurse educators, purposively selected from three nursing colleges affiliated to a university in Gauteng, regarding the use of the OSCE (Objective Structured Clinical Examination) as a clinical evaluation method, within a qualitative and descriptive research strategy. Three focus group interviews were conducted in different sessions. A descriptive content analysis was used, and trustworthiness was ensured by using Lincoln and Guba’s model (1985). The results revealed both positive and negative aspects of the OSCE as a clinical evaluation method with regard to administrative aspects, evaluators, learners, procedures/instruments and evaluation. The conclusion drawn from the related findings is that the OSCE does not measure the learners’ clinical competence holistically. It is therefore recommended that the identified negative perceptions be treated as challenges faced by nurse educators and that the positive aspects be strengthened. One way of meeting these recommendations is the use of varied alternative methods for clinical assessment and evaluation that focus on the holistic measurement of the learners’ clinical competence.

  4. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency

    Directory of Open Access Journals (Sweden)

    Rodrigo Aniceto

    2015-01-01

    Full Text Available Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.

  5. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency

    Science.gov (United States)

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB. PMID:26558254

  6. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.

    Science.gov (United States)

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
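
    The write/read comparison methodology in studies like this can be sketched with a tiny timing harness. The in-memory store below is only a stand-in: reproducing the actual comparison would replace `put`/`get` with Cassandra driver calls (and their MongoDB and relational counterparts).

```python
import time

class DictStore:
    """In-memory stand-in for a column store, used here only so the
    harness is self-contained; a real benchmark would wrap a database
    client behind the same put/get interface."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

def benchmark(store, records):
    """Return (write_seconds, read_seconds) for one batch of records."""
    t0 = time.perf_counter()
    for key, value in records:
        store.put(key, value)
    t1 = time.perf_counter()
    for key, _ in records:
        store.get(key)
    t2 = time.perf_counter()
    return t1 - t0, t2 - t1

# Hypothetical genomic payload: one base call keyed by position.
records = [(f"chr1:{pos}", "A") for pos in range(10000)]
write_s, read_s = benchmark(DictStore(), records)
```

    Keeping the store behind a two-method interface is what makes the cross-database comparison fair: every backend sees the same batch and the same access pattern.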

  7. Use Case Evaluation (UCE): A Method for Early Usability Evaluation in Software Development

    DEFF Research Database (Denmark)

    Stage, Jan; Høegh, Rune Thaarup; Hornbæk, K.

    2007-01-01

    It is often argued that usability problems should be identified as early as possible during software development, but many usability evaluation methods do not fit well in early development activities. We propose a method for usability evaluation of use cases, a widely used representation of design ideas produced early in software development processes. The method proceeds by systematic inspection of use cases with reference to a set of guidelines for usable design. To validate the method, four evaluators inspected a set of use cases for a health care application.

  8. Evaluation of Quality Assessment Protocols for High Throughput Genome Resequencing Data.

    Science.gov (United States)

    Chiara, Matteo; Pavesi, Giulio

    2017-01-01

    Large-scale initiatives aiming to recover the complete sequence of thousands of human genomes are currently being undertaken worldwide, concurring to the generation of a comprehensive catalog of human genetic variation. The ultimate and most ambitious goal of human population-scale genomics is the characterization of the so-called human "variome," through the identification of causal mutations or haplotypes. Several research institutions worldwide currently use genotyping assays based on Next-Generation Sequencing (NGS) for diagnostics and clinical screenings, and the widespread application of such technologies promises major revolutions in medical science. Bioinformatic analysis of human resequencing data is one of the main factors limiting the effectiveness and general applicability of NGS for clinical studies. The requirement for multiple tools, to be combined in dedicated protocols in order to accommodate different types of data (gene panels, exomes, or whole genomes), and the high variability of the data make it difficult to establish an ultimate strategy of general use. While several studies have already compared the sensitivity and accuracy of bioinformatic pipelines for the identification of single nucleotide variants from resequencing data, little is known about the impact of quality assessment and read pre-processing strategies. In this work we discuss the major strengths and limitations of the various genome resequencing protocols currently used in molecular diagnostics and for the discovery of novel disease-causing mutations. By taking advantage of publicly available data, we devise and suggest a series of best practices for the pre-processing of the data that consistently improve the outcome of genotyping with minimal impact on computational costs.
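
    The kind of read pre-processing whose impact the authors evaluate can be sketched minimally; the quality thresholds below are illustrative defaults, not the paper's recommended values.

```python
def phred_scores(quality_string, offset=33):
    """Decode a FASTQ quality string (Phred+33 encoding by default)."""
    return [ord(c) - offset for c in quality_string]

def trim_3prime(seq, qual, threshold=20):
    """Trim low-quality bases from the 3' end, a common pre-processing step."""
    scores = phred_scores(qual)
    end = len(scores)
    while end > 0 and scores[end - 1] < threshold:
        end -= 1
    return seq[:end], qual[:end]

def passes_filter(qual, min_mean=25):
    """Keep a read only if its mean base quality reaches the cutoff."""
    scores = phred_scores(qual)
    return bool(scores) and sum(scores) / len(scores) >= min_mean
```

    Varying `threshold` and `min_mean` and re-running the downstream genotyping pipeline is exactly the kind of sensitivity analysis a quality-assessment protocol comparison performs.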

  9. MIPS plant genome information resources.

    Science.gov (United States)

    Spannagl, Manuel; Haberer, Georg; Ernst, Rebecca; Schoof, Heiko; Mayer, Klaus F X

    2007-01-01

    The Munich Information Center for Protein Sequences (MIPS) has been involved in maintaining plant genome databases since the Arabidopsis thaliana genome project. Genome databases and analysis resources have focused on individual genomes and aim to provide flexible and maintainable data sets for model plant genomes as a backbone against which experimental data, for example from high-throughput functional genomics, can be organized and evaluated. In addition, model genomes also form a scaffold for comparative genomics, and much can be learned from genome-wide evolutionary studies.

  10. PredictSNP2: A Unified Platform for Accurately Evaluating SNP Effects by Exploiting the Different Characteristics of Variants in Distinct Genomic Regions.

    Science.gov (United States)

    Bendl, Jaroslav; Musil, Miloš; Štourač, Jan; Zendulka, Jaroslav; Damborský, Jiří; Brezovský, Jan

    2016-05-01

    An important message taken from human genome sequencing projects is that the human population exhibits approximately 99.9% genetic similarity. Variations in the remaining parts of the genome determine our identity, trace our history and reveal our heritage. The precise delineation of phenotypically causal variants plays a key role in providing accurate personalized diagnosis, prognosis, and treatment of inherited diseases. Several computational methods for achieving such delineation have been reported recently. However, their ability to pinpoint potentially deleterious variants is limited by the fact that their mechanisms of prediction do not account for the existence of different categories of variants. Consequently, their output is biased towards the variant categories that are most strongly represented in the variant databases. Moreover, most such methods provide numeric scores but not binary predictions of the deleteriousness of variants or confidence scores that would be more easily understood by users. We have constructed three datasets covering different types of disease-related variants, which were divided across five categories: (i) regulatory, (ii) splicing, (iii) missense, (iv) synonymous, and (v) nonsense variants. These datasets were used to develop category-optimal decision thresholds and to evaluate six tools for variant prioritization: CADD, DANN, FATHMM, FitCons, FunSeq2 and GWAVA. This evaluation revealed some important advantages of the category-based approach. The results obtained with the five best-performing tools were then combined into a consensus score. Additional comparative analyses showed that in the case of missense variations, protein-based predictors perform better than DNA sequence-based predictors. A user-friendly web interface was developed that provides easy access to the five tools' predictions, and their consensus scores, in a user-understandable format tailored to the specific features of different categories of variations. 
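
    A category-aware consensus of the kind described can be sketched as a thresholded majority vote. The thresholds and the tool score directions below are invented for illustration (PredictSNP2's actual category-optimal thresholds are derived from its benchmark datasets).

```python
# Hypothetical per-category decision thresholds; the published platform
# derives category-optimal values from curated variant datasets.
THRESHOLDS = {
    "missense": {"CADD": 20.0, "DANN": 0.9, "FATHMM": -1.5},
    "regulatory": {"CADD": 10.0, "DANN": 0.6, "GWAVA": 0.5},
}

def consensus(category, scores):
    """Binary consensus call plus a confidence fraction: the share of
    available tools whose score crosses the category-specific threshold.
    FATHMM is treated as 'deleterious if below threshold' here, an assumed
    direction for illustration."""
    votes, total = 0, 0
    for tool, threshold in THRESHOLDS[category].items():
        if tool not in scores:
            continue
        total += 1
        if tool == "FATHMM":
            deleterious = scores[tool] <= threshold
        else:
            deleterious = scores[tool] >= threshold
        votes += deleterious
    fraction = votes / total if total else 0.0
    return ("deleterious" if fraction > 0.5 else "neutral"), fraction
```

    Returning both a binary call and the agreement fraction mirrors the abstract's point that users need confidence information, not just raw numeric scores.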

  11. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Laboratory Management Division of the DOE. Methods are prepared for entry into DOE Methods as chapter editors, together with DOE and other participants in this program, identify analytical and sampling method needs. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types: "Draft" or "Verified." "Draft" methods have been reviewed internally and show potential for eventual verification, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations

  12. Genomic variation in Salmonella enterica core genes for epidemiological typing

    DEFF Research Database (Denmark)

    Leekitcharoenphon, Pimlapas; Lukjancenko, Oksana; Rundsten, Carsten Friis

    2012-01-01

    Background: Technological advances in high throughput genome sequencing are making whole genome sequencing (WGS) available as a routine tool for bacterial typing. Standardized procedures for identification of relevant genes and of variation are needed to enable comparison between studies and over...... genomes and evaluate their value as typing targets, comparing whole genome typing and traditional methods such as 16S and MLST. A consensus tree based on variation of core genes gives much better resolution than 16S and MLST; the pan-genome family tree is similar to the consensus tree, but with higher...... that there is a positive selection towards mutations leading to amino acid changes. Conclusions: Genomic variation within the core genome is useful for investigating molecular evolution and providing candidate genes for bacterial genome typing. Identification of genes with different degrees of variation is important...

  13. Evaluation of five methods for total DNA extraction from western corn rootworm beetles.

    Directory of Open Access Journals (Sweden)

    Hong Chen

    Full Text Available BACKGROUND: DNA extraction is a routine step in many insect molecular studies. A variety of methods have been used to isolate DNA molecules from insects, and many commercial kits are available. Extraction methods need to be evaluated for their efficiency, cost, and side effects such as DNA degradation during extraction. METHODOLOGY/PRINCIPAL FINDINGS: From individual western corn rootworm beetles, Diabrotica virgifera virgifera, DNA extractions by the SDS method, CTAB method, DNAzol reagent, Puregene solutions and DNeasy column were compared in terms of DNA quantity and quality, cost of materials, and time consumed. Although all five methods resulted in acceptable DNA concentrations and absorbance ratios, the SDS and CTAB methods resulted in higher DNA yield (ng DNA vs. mg tissue at much lower cost and less degradation as revealed on agarose gels. The DNeasy kit was most time-efficient but was the costliest among the methods tested. The effects of ethanol volume, temperature and incubation time on precipitation of DNA were also investigated. The DNA samples obtained by the five methods were tested in PCR for six microsatellites located in various positions of the beetle's genome, and all samples showed successful amplifications. CONCLUSION/SIGNIFICANCE: These evaluations provide a guide for choosing methods of DNA extraction from western corn rootworm beetles based on expected DNA yield and quality, extraction time, cost, and waste control. The extraction conditions for this mid-size insect were optimized. The DNA extracted by the five methods was suitable for further molecular applications such as PCR and sequencing by synthesis.
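
    The yield and purity comparisons (ng DNA per mg tissue, absorbance ratios) rest on standard spectrophotometric conversions, which can be sketched as follows; the sample values in the test are made up.

```python
def dna_concentration(a260, dilution_factor=1.0):
    """ng/uL of double-stranded DNA: an A260 of 1.0 corresponds to
    roughly 50 ng/uL under standard conditions."""
    return a260 * 50.0 * dilution_factor

def purity_ratio(a260, a280):
    """A260/A280 ratio; values near 1.8 indicate protein-free DNA."""
    return a260 / a280

def yield_per_mg(a260, elution_volume_ul, tissue_mg, dilution_factor=1.0):
    """ng of DNA recovered per mg of input tissue, the normalized metric
    that makes extraction methods comparable across beetle sizes."""
    total_ng = dna_concentration(a260, dilution_factor) * elution_volume_ul
    return total_ng / tissue_mg
```

    Normalizing by input tissue mass is what allows the SDS and CTAB methods to be ranked above the column kit on yield even when raw concentrations differ.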

  14. Large-Scale Genomic Analysis of Codon Usage in Dengue Virus and Evaluation of Its Phylogenetic Dependence

    Directory of Open Access Journals (Sweden)

    Edgar E. Lara-Ramírez

    2014-01-01

    Full Text Available The increasing number of dengue virus (DENV) genome sequences available allows identifying the contributing factors to DENV evolution. In the present study, the codon usage in serotypes 1–4 (DENV1–4) has been explored for 3047 sequenced genomes using different statistical methods. The correlation analysis of total GC content (GC) with GC content at the three nucleotide positions of codons (GC1, GC2, and GC3), as well as the effective number of codons (ENC, ENCp) versus GC3 plots, revealed mutational bias and purifying selection pressures as the major forces influencing the codon usage, but with distinct pressure on specific nucleotide positions in the codon. The correspondence analysis (CA) and clustering analysis on relative synonymous codon usage (RSCU) within each serotype showed clustering patterns similar to the phylogenetic analysis of nucleotide sequences for DENV1–4. These clustering patterns are strongly related to the virus geographic origin. The phylogenetic dependence analysis also suggests that stabilizing selection acts on the codon usage bias. Our large-scale analysis reveals new features of DENV genomic evolution.

  15. Large-Scale Genomic Analysis of Codon Usage in Dengue Virus and Evaluation of Its Phylogenetic Dependence

    Science.gov (United States)

    Lara-Ramírez, Edgar E.; Salazar, Ma Isabel; López-López, María de Jesús; Salas-Benito, Juan Santiago; Sánchez-Varela, Alejandro

    2014-01-01

    The increasing number of dengue virus (DENV) genome sequences available allows identifying the contributing factors to DENV evolution. In the present study, the codon usage in serotypes 1–4 (DENV1–4) has been explored for 3047 sequenced genomes using different statistical methods. The correlation analysis of total GC content (GC) with GC content at the three nucleotide positions of codons (GC1, GC2, and GC3) as well as the effective number of codons (ENC, ENCp) versus GC3 plots revealed mutational bias and purifying selection pressures as the major forces influencing the codon usage, but with distinct pressure on specific nucleotide positions in the codon. The correspondence analysis (CA) and clustering analysis on relative synonymous codon usage (RSCU) within each serotype showed similar clustering patterns to the phylogenetic analysis of nucleotide sequences for DENV1–4. These clustering patterns are strongly related to the virus geographic origin. The phylogenetic dependence analysis also suggests that stabilizing selection acts on the codon usage bias. Our large-scale analysis reveals new features of DENV genomic evolution. PMID:25136631
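
    The core codon-usage statistics used in such analyses (RSCU and GC3) are straightforward to compute; the sketch below covers only three synonymous families for brevity, and the test sequence is a toy example.

```python
from collections import Counter

# Synonymous codon families for a few amino acids (standard genetic code).
FAMILIES = {
    "Leu": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "Lys": ["AAA", "AAG"],
    "Gly": ["GGT", "GGC", "GGA", "GGG"],
}

def rscu(sequence):
    """Relative synonymous codon usage: the observed count of a codon
    divided by the count expected if all synonyms were used equally."""
    codons = Counter(sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3))
    values = {}
    for family in FAMILIES.values():
        total = sum(codons[c] for c in family)
        if total == 0:
            continue  # amino acid absent from this sequence
        expected = total / len(family)
        for c in family:
            values[c] = codons[c] / expected
    return values

def gc3(sequence):
    """GC content at the third codon position."""
    thirds = sequence[2::3]
    return sum(b in "GC" for b in thirds) / len(thirds)
```

    Correlating GC3 against total GC, and clustering genomes by their RSCU vectors, are the two ingredients behind the mutational-bias and geographic-clustering results the abstract reports.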

  16. Radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples: Addressing new challenges

    International Nuclear Information System (INIS)

    Fadeff, S.K.; Goheen, S.C.; Riley, R.G.

    1994-01-01

    Radiochemistry methods in Department of Energy Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) add to the repertoire of other standard methods in support of U.S. Department of Energy environmental restoration and waste management (DOE/EM) radiochemical characterization activities. Current standard sources of radiochemistry methods are not always applicable for evaluating DOE/EM samples. Examples of current sources include those provided by the US Environmental Protection Agency, the American Society for Testing and Materials, Standard Methods for the Examination of Water and Wastewater, and Environmental Measurements Laboratory Procedures Manual (HASL-300). The applicability of these methods is generally limited to specific matrices (usually water), low-level radioactive samples, and a limited number of analytes. DOE Methods complements these current standard methods by addressing the complexities of EM characterization needs. The process for determining DOE/EM radiochemistry characterization needs is discussed. In this context of DOE/EM needs, the applicability of other sources of standard radiochemistry methods is defined, and gaps in methodology are identified. Current methods in DOE Methods and the EM characterization needs they address are discussed. Sources of new methods and the methods incorporation process are discussed. The means for individuals to participate in (1) identification of DOE/EM needs, (2) the methods incorporation process, and (3) submission of new methods are identified

  17. Development of multilateral comparative evaluation method for fuel cycle system

    International Nuclear Information System (INIS)

    Tamaki, Hitoshi; Ikushima, Takeshi; Nomura, Yasushi; Nakajima, Kiyoshi.

    1998-03-01

    In the near future, the Japanese nuclear fuel cycle system will be promoted by national nuclear energy policy, and its options, i.e. once-through, thermal cycle and fast breeder cycle, must be selected by a multilateral comparative evaluation method from various aspects such as safety, society and economy. Such a problem can therefore be recognized as a social decision-making problem and addressed with AHP (Analytic Hierarchy Process), which can evaluate the problem multilaterally and comparatively. Comparative evaluation requires much information for decision making, so two kinds of databases holding this information have been constructed. The multilateral comparative evaluation method, consisting of the two databases and AHP, for optimum selection of the fuel cycle system option has then been developed. (author)
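
    The AHP step can be sketched in a few lines; the pairwise comparison values below are invented for illustration (AHP elicits them on a 1–9 scale), and the row geometric-mean method is a standard approximation to the principal-eigenvector weights.

```python
from math import prod

def ahp_priorities(matrix):
    """Priority weights from a reciprocal pairwise comparison matrix,
    using the row geometric-mean approximation of the principal
    eigenvector."""
    n = len(matrix)
    gmeans = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical judgment: criterion 1 (e.g. safety) judged 3x as important
# as criterion 2 (e.g. economy); off-diagonal entries are reciprocal.
weights = ahp_priorities([[1.0, 3.0],
                          [1.0 / 3.0, 1.0]])
```

    With more criteria, the same routine turns a full matrix of judgments into normalized weights, which is what makes the fuel-cycle options comparable across safety, societal and economic aspects at once.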

  18. Multiple centroid method to evaluate the adaptability of alfalfa genotypes

    Directory of Open Access Journals (Sweden)

    Moysés Nascimento

    2015-02-01

    Full Text Available This study aimed to evaluate the efficiency of multiple centroids for studying the adaptability of alfalfa genotypes (Medicago sativa L.). In this method, the genotypes are compared with ideotypes defined by the bisegmented regression model, according to the researcher's interest. Thus, genotype classification is carried out as determined by the objective of the researcher and the proposed recommendation strategy. Despite the great potential of the method, it needs to be evaluated in a biological context (with real data). Accordingly, we used data on the evaluation of dry matter production of 92 alfalfa cultivars, with 20 cuttings, from an experiment in randomized blocks with two replications carried out from November 2004 to June 2006. The multiple centroid method proved efficient for classifying alfalfa genotypes. Moreover, it showed no ambiguous indications and, provided that ideotypes were defined according to the researcher's interest, facilitated data interpretation.
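
    The classification step can be sketched as nearest-ideotype assignment; the ideotype coordinates below are invented for illustration, whereas the published method derives them from the bisegmented regression model.

```python
import math

def classify_genotype(genotype, ideotypes):
    """Assign a genotype (a vector of trait means, e.g. dry matter yields)
    to its nearest ideotype centroid by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(ideotypes, key=lambda name: dist(genotype, ideotypes[name]))

# Hypothetical ideotypes on two axes (yield in favorable vs unfavorable
# environments), defined according to the researcher's interest.
ideotypes = {
    "general adaptability": (5.0, 5.0),
    "favorable environments": (5.0, 1.0),
    "unfavorable environments": (1.0, 5.0),
}
```

    Because each genotype maps to exactly one nearest centroid, the classification is unambiguous, which is the property the abstract highlights.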

  19. Comparison of two heuristic evaluation methods for evaluating the usability of health information systems.

    Science.gov (United States)

    Khajouei, Reza; Hajesmaeel Gohari, Sadrieh; Mirzaee, Moghaddameh

    2018-04-01

    In addition to following the usual Heuristic Evaluation (HE) method, the usability of health information systems can also be evaluated using a checklist. The objective of this study is to compare the performance of these two methods in identifying usability problems of health information systems. Eight evaluators independently evaluated different parts of a Medical Records Information System using two methods of HE (usual and with a checklist). The two methods were compared in terms of the number of problems identified, problem type, and the severity of identified problems. In all, 192 usability problems were identified by two methods in the Medical Records Information System. This was significantly higher than the number of usability problems identified by the checklist and usual method (148 and 92, respectively) (p information systems. The results demonstrated that the checklist method had significantly better performance in terms of the number of identified usability problems; however, the performance of the usual method for identifying problems of higher severity was significantly better. Although the checklist method can be more efficient for less experienced evaluators, wherever usability is critical, the checklist should be used with caution in usability evaluations. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. User Experience Evaluation Methods in Product Development (UXEM'09)

    Science.gov (United States)

    Roto, Virpi; Väänänen-Vainio-Mattila, Kaisa; Law, Effie; Vermeeren, Arnold

    High quality user experience (UX) has become a central competitive factor of product development in mature consumer markets [1]. Although the term UX originated from industry and is a widely used term also in academia, the tools for managing UX in product development are still inadequate. A prerequisite for designing delightful UX in an industrial setting is to understand both the requirements tied to the pragmatic level of functionality and interaction and the requirements pertaining to the hedonic level of personal human needs, which motivate product use [2]. Understanding these requirements helps managers set UX targets for product development. The next phase in a good user-centered design process is to iteratively design and evaluate prototypes [3]. Evaluation is critical for systematically improving UX. In many approaches to UX, evaluation basically needs to be postponed until the product is fully or at least almost fully functional. However, in an industrial setting, it is very expensive to find the UX failures only at this phase of product development. Thus, product development managers and developers have a strong need to conduct UX evaluation as early as possible, well before all the parts affecting the holistic experience are available. Different types of products require evaluation on different granularity and maturity levels of a prototype. For example, due to its multi-user characteristic, a community service or an enterprise resource planning system requires a broader scope of UX evaluation than a microwave oven or a word processor that is meant for a single user at a time. Before systematic UX evaluation can be taken into practice, practical, lightweight UX evaluation methods suitable for different types of products and different phases of product readiness are needed. A considerable amount of UX research is still about the conceptual frameworks and models for user experience [4]. Besides, applying existing usability evaluation methods (UEMs) without

  1. Intercomparison of measuring and evaluation methods for environmental radiations

    International Nuclear Information System (INIS)

    Saito, Kimiaki; Nagaoka, Toshi; Sakamoto, Ryuichi; Tsutsumi, Masahiro; Moriuchi, Shigeru; Okano, Masaharu.

    1990-11-01

    This report presents the results of cooperative research by IPCR and JAERI. In order to investigate the accuracy and characteristics of the measurement and evaluation methods for environmental radiations developed independently by IPCR and by JAERI, field experiments were performed using both methods under the same conditions at the same time. The experiments were performed at fixed points on the ground, in a moving car and in a flying helicopter, and the results from the two methods were compared. The exposure rates from the IPCR method agreed within 10% with those from the JAERI method; however, the former were systematically higher than the latter. Since the evaluated exposure rate components due to the ²³⁸U series, the ²³²Th series and ⁴⁰K showed apparent differences between the IPCR method and the JAERI method, further investigation is necessary into the positions and widths of the applied energy bands and the coefficients in the equations used to evaluate the components. In measurements in a moving vehicle, sequential data should be gathered over sufficiently short distances to resolve local changes in the radiation level where it varies greatly with location. The aerial data showed that the exposure rates from terrestrial gamma rays in the region west of the Abe river are apparently higher than those in the Kanto district. (author)

  2. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

    Full Text Available Abstract Background We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource comes from the fact that it is very difficult, if not impossible, for an end-user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology. Results Our benchmark consists of different classes of images and ground truth data, ranging in scale from subcellular and cellular to tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion This online benchmark will facilitate integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.
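    Benchmarks of this kind score a candidate segmentation against ground truth with overlap metrics. As a generic illustration (the metric definitions are standard, but this is not code from the benchmark itself), the Dice coefficient and Jaccard index between boolean masks can be computed as:

```python
import numpy as np

def dice(pred, truth):
    """Dice coefficient between two boolean masks (1.0 = perfect overlap)."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    inter = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 1.0 if total == 0 else 2.0 * inter / total

def jaccard(pred, truth):
    """Jaccard index (intersection over union) between two boolean masks."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    union = np.logical_or(pred, truth).sum()
    return 1.0 if union == 0 else np.logical_and(pred, truth).sum() / union

# Toy 2x3 masks: 2 pixels agree, 3 predicted, 3 true
pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 0, 0], [0, 1, 1]])
print(dice(pred, truth))     # 2*2/(3+3) = 0.666...
print(jaccard(pred, truth))  # 2/4 = 0.5
```

    Averaging such per-image scores over a dataset class is the usual way to compare methods on a common benchmark.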

  3. Technology transfer - insider protection workshop (Safeguards Evaluation Method - Insider Threat)

    International Nuclear Information System (INIS)

    Strait, R.S.; Renis, T.A.

    1986-01-01

    The Safeguards Evaluation Method - Insider Threat, developed by Lawrence Livermore National Laboratory, is a field-applicable tool to evaluate facility safeguards against theft or diversion of special nuclear material (SNM) by nonviolent insiders. To ensure successful transfer of this technology from the laboratory to DOE field offices and contractors, LLNL developed a three-part package. The package includes a workbook, user-friendly microcomputer software, and a three-day training program. The workbook guides an evaluation team through the Safeguards Evaluation Method and provides forms for gathering data. The microcomputer software assists in the evaluation of safeguards effectiveness. The software is designed for safeguards analysts with no previous computer experience. It runs on an IBM Personal Computer or any compatible machine. The three-day training program is called the Insider Protection Workshop. The workshop students learn how to use the workbook and the computer software to assess insider vulnerabilities and to evaluate the benefits and costs of potential improvements. These activities increase the students' appreciation of the insider threat. The workshop format is informal and interactive, employing four different instruction modes: classroom presentations, small-group sessions, a practical exercise, and "hands-on" analysis using microcomputers. This approach to technology transfer has been successful: over 100 safeguards planners and analysts have been trained in the method, and it is being used at facilities throughout the DOE complex.

  4. Genomic prediction using subsampling

    OpenAIRE

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-01-01

    Background Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each rou...

  5. Use of Thermoanalytic Methods in the Evaluation of Combusted Materials

    Directory of Open Access Journals (Sweden)

    František Krepelka

    2006-12-01

    Full Text Available The paper describes possibilities of using thermoanalytic methods for the evaluation and comparison of materials intended for direct combustion. Differential thermal analysis (DTA) and thermogravimetric analysis (TGA) were both used in the evaluation. The paper includes a description of methods for processing the data from these analyses so that the materials can be compared in terms of their heating values. The following materials were analysed in the experiments: wooden coal of objectionable grain size, fly ash from heating plant exhaust funnels, and dendromass waste: spruce sawdust, micro-briquettes of spruce sawdust and fly ash combined.

  6. A method for evaluating dynamical friction in linear ball bearings.

    Science.gov (United States)

    Fujii, Yusaku; Maru, Koichi; Jin, Tao; Yupapin, Preecha P; Mitatha, Somsak

    2010-01-01

    A method is proposed for evaluating the dynamical friction of linear bearings, whose motion is not perfectly linear due to some play in its internal mechanism. In this method, the moving part of a linear bearing is made to move freely, and the force acting on the moving part is measured as the inertial force given by the product of its mass and the acceleration of its centre of gravity. To evaluate the acceleration of its centre of gravity, the acceleration of two different points on it is measured using a dual-axis optical interferometer.
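    The core computation described above, taking the dynamical friction force as the inertial force m·a of the freely moving part, with the centre-of-gravity acceleration derived from two measurement points, can be sketched as follows. The midpoint averaging of the two accelerations and all numerical values are illustrative assumptions, not data from the paper:

```python
import numpy as np

def friction_force(mass_kg, accel_point1, accel_point2):
    """Inertial force on a freely moving part: F = m * a_cg.
    a_cg is approximated as the mean of the accelerations measured at two
    points assumed symmetric about the centre of gravity."""
    a_cg = (np.asarray(accel_point1) + np.asarray(accel_point2)) / 2.0
    return mass_kg * a_cg

# Illustrative: 0.2 kg moving part, two interferometer channels (m/s^2)
a1 = np.array([-0.40, -0.42, -0.39])
a2 = np.array([-0.38, -0.40, -0.41])
force = friction_force(0.2, a1, a2)  # deceleration implies a retarding force
print(force)
```

    During free motion no external drive acts on the part, so the measured inertial force is attributed entirely to friction, which is the premise of the method.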

  7. Facile mutant identification via a single parental backcross method and application of whole genome sequencing based mapping pipelines

    Directory of Open Access Journals (Sweden)

    Robert Silas Allen

    2013-09-01

    Full Text Available Forward genetic screens have identified numerous genes involved in development and metabolism, and remain a cornerstone of biological research. However, to locate a causal mutation, the practice of crossing to a polymorphic background to generate a mapping population can be problematic if the mutant phenotype is difficult to recognise in the hybrid F2 progeny, or dependent on parental specific traits. Here, in a screen for leaf hyponasty mutants, we have performed a single backcross of an Ethane Methyl Sulphonate (EMS)-generated hyponastic mutant to its parent. Whole genome deep sequencing of a bulked homozygous F2 population and analysis via the Next Generation EMS mutation mapping pipeline (NGM) unambiguously determined the causal mutation to be a single nucleotide polymorphism (SNP) residing in HASTY, a previously characterised gene involved in microRNA biogenesis. We have evaluated the feasibility of this backcross approach using three additional SNP mapping pipelines: SHOREmap, the GATK pipeline, and the samtools pipeline. Although there was variance in the identification of EMS SNPs, all returned the same outcome in clearly identifying the causal mutation in HASTY. The simplicity of performing a single parental backcross and genome sequencing a small pool of segregating mutants has great promise for identifying mutations that may be difficult to map using conventional approaches.

  8. Comparison of the dose evaluation methods for criticality accident

    International Nuclear Information System (INIS)

    Shimizu, Yoshio; Oka, Tsutomu

    2004-01-01

    The improvement of the dose evaluation method for criticality accidents is important to rationalize the design of nuclear fuel cycle facilities. The source spectra of neutrons and gamma rays in a criticality accident depend on the condition of the source: its materials, moderation, density and so on. A comparison of dose evaluation methods for a criticality accident is made, and several methods combining a criticality calculation with a shielding calculation are proposed. Prompt neutron and gamma-ray doses from nuclear criticality of some uranium systems have been evaluated in the Nuclear Criticality Slide Rule. The uranium metal source (unmoderated system) and the uranyl nitrate solution source (moderated system) in the rule are evaluated by several calculation methods, i.e., combinations of code and cross-section library, as follows: (a) SAS1X (ENDF/B-IV), (b) MCNP4C (ENDF/B-VI)-ANISN (DLC23E or JSD120), (c) MCNP4C-MCNP4C (ENDF/B-VI). Each consists of a criticality calculation and a shielding calculation. These calculation methods are compared with respect to the tissue absorbed dose and the spectra at 2 m from the source. (author)

  9. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    Science.gov (United States)

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    information and, therefore, more reliable estimated breeding values were obtained. The proposed unified method integrated and blended several sources of information well into a genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. The unified method can also be extended to other types of situations such as single-step genomic or multi-trait evaluations, combining information across different traits.

  10. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I&C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes a reliability evaluation method based on the V&V defect density characteristics of each stage of the software development life cycle, by which the operating reliability level of the software can be predicted before its delivery, helping to improve the reliability of NPP safety-important software. (authors)

  11. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  12. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.]

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  13. Methods of evaluating SPECT images. The usefulness of Matsuda's method by the Patlak plot method in children

    Energy Technology Data Exchange (ETDEWEB)

    Takaishi, Yasuko [Nippon Medical School, Tokyo (Japan)]; Hashimoto, Kiyoshi; Fujino, Osamu [and others]

    1998-11-01

    Single photon emission computed tomography (SPECT) is a tool to study cerebral blood flow (CBF) kinetics. There are three methods of evaluating SPECT images: visual, semi-quantitative (evaluation of the radioactivity ratio of a cerebral region to the cerebellum (R/CE) or to the thalamus (R/TH)) and quantitative (Matsuda's method by the Patlak plot method using ⁹⁹ᵐTc-hexamethylpropylene amine oxime radionuclide angiography). We evaluated SPECT images by the quantitative method in 14 patients with neurological disorders and examined the correlation of the results with those obtained by the semi-quantitative method. There was no significant correlation between the R/CE or R/TH ratio and regional CBF except in two regions. Evaluation by the semi-quantitative method may have been inappropriate, probably because the cerebellar or thalamic blood flow was not constant in each case. Evaluation by the quantitative method, on the other hand, seemed to be useful not only for the comparison of CBF among normal subjects, but also in demonstrating progressive changes of CBF in the same case. Matsuda's method by the Patlak plot method is suitable for the examination of children, since it does not require aortic blood sampling. (author)
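    For context, the Patlak graphical analysis underlying Matsuda's method fits a straight line to tissue activity normalized by the arterial input, plotted against the time-integral of the input normalized the same way; the slope is the influx constant. A generic sketch on synthetic data follows (the variable names and numbers are illustrative assumptions, not the clinical protocol):

```python
import numpy as np

def patlak_slope(t, tissue, plasma):
    """Slope (influx constant Ki) of the Patlak plot:
    tissue(t)/plasma(t) vs. integral_0^t plasma dt / plasma(t)."""
    # Cumulative trapezoidal integral of the input function
    cumint = np.concatenate(
        ([0.0], np.cumsum(np.diff(t) * (plasma[1:] + plasma[:-1]) / 2))
    )
    x = cumint / plasma
    y = tissue / plasma
    slope, intercept = np.polyfit(x, y, 1)  # least-squares line fit
    return slope

# Synthetic example: tissue curve built with Ki = 0.05 and distribution
# volume 0.3, so the fit should recover the slope exactly.
t = np.linspace(0.1, 10, 50)
plasma = np.exp(-0.1 * t) + 0.2
cumint = np.concatenate(([0.0], np.cumsum(np.diff(t) * (plasma[1:] + plasma[:-1]) / 2)))
tissue = 0.05 * cumint + 0.3 * plasma
print(patlak_slope(t, tissue, plasma))  # ~0.05
```

    In practice the late, linear portion of the plot is selected before fitting; the sketch fits all points because the synthetic data are exactly linear.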

  14. Fundamental characteristics and simplified evaluation method of dynamic earth pressure

    International Nuclear Information System (INIS)

    Nukui, Y.; Inagaki, Y.; Ohmiya, Y.

    1989-01-01

    In Japan, a common method is used to evaluate the dynamic earth pressure acting on the underground walls of a deeply embedded nuclear reactor building. However, since this method was developed on the basis of the limit state of soil supported by retaining walls, the behavior of the dynamic earth pressure acting on the embedded part of a nuclear reactor building may differ from that estimated by this method. This paper examines the fundamental characteristics of dynamic earth pressure through dynamic soil-structure interaction analysis. A simplified method to evaluate dynamic earth pressure for the design of the underground walls of a nuclear reactor building is described. The dynamic earth pressure considered is the fluctuating earth pressure during an earthquake.

  15. Double-strand breaks in genome-sized DNA caused by mechanical stress under mixing: Quantitative evaluation through single-molecule observation

    Science.gov (United States)

    Kikuchi, Hayato; Nose, Keiji; Yoshikawa, Yuko; Yoshikawa, Kenichi

    2018-06-01

    It is becoming increasingly apparent that changes in the higher-order structure of genome-sized DNA molecules of more than several tens of kbp play important roles in the self-control of genome activity in living cells. Unfortunately, it has been rather difficult to prepare genome-sized DNA molecules without damage or fragmentation. Here, we evaluated the degree of double-strand breaks (DSBs) caused by mechanical mixing through single-molecule observation with fluorescence microscopy. The results show that DNA breakage is most significant during the first second after the initiation of mechanical agitation. Based on these observations, we propose a novel mixing procedure that significantly decreases DSBs.

  16. Combination of Three Methods of Photo Voltaic Panels Damage Evaluation

    Directory of Open Access Journals (Sweden)

    Olšan T.

    2017-06-01

    Full Text Available In broken photovoltaic (PV) cells, the flow of electric current can be reduced in some places, which results in lowered efficiency. In the present study, the damage of PV cells and panels was evaluated using three methods: electroluminescence, infrared camera imaging, and visual examination. The damage is detectable by all three methods, which are presented and compared from the viewpoint of resolution, difficulty, and accuracy of monitoring PV panel damage.

  17. Nondestructive methods for quality evaluation of livestock products.

    Science.gov (United States)

    Narsaiah, K; Jha, Shyam N

    2012-06-01

    The muscles derived from livestock are highly perishable. Rapid and nondestructive methods are essential for the quality assurance of such products. Potential nondestructive methods, which can supplement or replace many traditional time-consuming destructive methods, include colour and computer image analysis, NIR spectroscopy, NMRI, electronic nose, ultrasound, X-ray imaging and biosensors. These methods are briefly described, and the research work applying them to products derived from livestock is reviewed. These methods will be helpful in the rapid screening of large numbers of samples, in monitoring distribution networks, in quick product recall, and in enhancing traceability in the value chain of livestock products. With new developments in the basic science underlying these methods, colour and image processing, NIR spectroscopy, biosensors and ultrasonic analysis are expected to become widespread and cost-effective for large-scale meat quality evaluation in the near future.

  18. Evaluation of left ventricular function by invasive and noninvasive methods

    Energy Technology Data Exchange (ETDEWEB)

    Kusukawa, R [Yamaguchi Univ., Ube (Japan). School of Medicine

    1982-06-01

    Noninvasive methods in cardiology have progressed very rapidly in recent years. Cardiac catheterization and angiocardiography are the standard methods for evaluating cardiac performance; however, they need expensive apparatus and are time-consuming, arduous procedures that cannot be repeated frequently and are sometimes risky. In this article, the indices of pump and muscle function of the heart obtained by invasive methods were compared with those obtained by noninvasive methods, and the correlation between the two groups, as well as their usefulness and limitations, was discussed. Systolic time intervals are convenient and repeatable measures of left ventricular performance in clinical cardiology. There are significant correlations of PEP/LVET with stroke volume, ejection fraction and mean circumferential shortening velocity. Although some limitations are present in the application of this method to certain diseases, these measures are useful in the evaluation of left ventricular performance. Echocardiography has opened an era of noninvasive cardiology. Left ventricular volume, ejection fraction, mean circumferential shortening velocity and PSP/ESVI are accurately calculated by echocardiographic measurement. Nuclear cardiology is also an accurate noninvasive method for evaluating cardiac performance. With the tremendous growth in this field, it will open the next era of noninvasive cardiology.

  19. Investigation of the fuel temperature evaluation method at BOL

    International Nuclear Information System (INIS)

    Ishii, Tetsuya; Asaga, Takeo; Nemoto, Junichi

    1999-06-01

    Evaluating the fuel temperature at beginning of life (BOL) as precisely as possible is one of the major subjects in improving the design method for determining the thermal conditions of solid-type mixed-oxide (MOX) fuels in FBRs. We therefore planned to modify the fuel temperature evaluation method 'FEVER', developed by JNC in 1988, as part of the effort to establish a precise fuel temperature evaluation method, and to use the modified version, named 'FEVER-M', to estimate the irradiation conditions of the PTM test in Joyo, called the 'B10 test', planned for 2000. In this work, the following results were obtained: 1) As a result of the modification, the uncertainty in the fuel temperature evaluation of 'FEVER-M' is reduced to about ±60 K. 2) Estimating the irradiation conditions of the 'B10' test using 'FEVER-M', the appropriate maximum linear heat rate for the test is found to be 620 W/cm; the detailed plans of the 'B10' test were also determined based on these results. 3) Based on the results of this work, one effective way to improve the accuracy of the fuel temperature evaluation method appears to be to calculate the fuel temperature taking the pellet relocation phenomenon into account. Although many matters concerning this phenomenon remain to be discussed, the future design method for the thermal conditions of MOX fuels in FBRs should take pellet relocation into account. (author)

  20. Evaluation of genomic selection for replacement strategies using selection index theory.

    Science.gov (United States)

    Calus, M P L; Bijma, P; Veerkamp, R F

    2015-09-01

    Our objective was to investigate the economic effect of prioritizing heifers for replacement at the herd level based on genomic estimated breeding values, and to compute break-even genotyping costs across a wide range of scenarios. Specifically, we aimed to determine the optimal proportion of preselection based on parent average information for all scenarios considered. Considered replacement strategies include a range of different selection intensities by considering different numbers of heifers available for replacement (15-45 in a herd with 100 dairy cows) as well as different replacement rates (15-40%). Use of conventional versus sexed semen was considered, where the latter resulted in having twice as many heifers available for replacement. The baseline scenario relies on prioritization of replacement heifers based on parent average. The first alternative scenario involved genomic selection of heifers, considering that all heifers were genotyped. The benefits of genomic selection in this scenario were computed using a simple formula that only requires the number of lactating animals, the difference in accuracy between parent average and genomic selection (GS), and the selection intensity as input. When all heifers were genotyped, using GS for replacement of heifers was beneficial in most scenarios for current genotyping prices, provided some room exists for selection, in the sense that at least 2 more heifers are available than needed for replacement. In those scenarios, minimum break-even genotyping costs were equal to half the economic value of a standard deviation of the breeding goal. The second alternative scenario involved a preselection based on parent average, followed by GS among all the preselected heifers. It was in almost all cases beneficial to genotype all heifers when conventional semen was used (i.e., to do no preselection). The optimal proportion of preselection based on parent average was at least 0.63 when sexed semen was used. Use of sexed
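    The abstract's "simple formula" is not reproduced here, but the economics it describes can be sketched under stated assumptions: the benefit per selected heifer is the accuracy gain of genomic over parent-average selection, times the selection intensity, times the economic value of one standard deviation of the breeding goal, and the break-even genotyping cost spreads that benefit over all genotyped candidates. Every number and the exact formula below are illustrative assumptions, not values from the study:

```python
from math import exp, pi, sqrt
from statistics import NormalDist

def selection_intensity(p):
    """Selection intensity i for truncation selection keeping a proportion p."""
    x = NormalDist().inv_cdf(1 - p)        # truncation point on the normal
    z = exp(-x * x / 2) / sqrt(2 * pi)     # normal density at the truncation point
    return z / p

def break_even_cost(n_candidates, n_selected, acc_gs, acc_pa, sd_value):
    """Genotyping cost per candidate at which genomic selection breaks even.
    Benefit per selected heifer = (r_GS - r_PA) * i * value of 1 SD (assumption)."""
    p = n_selected / n_candidates
    benefit_per_selected = (acc_gs - acc_pa) * selection_intensity(p) * sd_value
    # Total benefit over selected animals, spread over all genotyped candidates
    return benefit_per_selected * n_selected / n_candidates

# Illustrative: 30 heifer candidates, 25 needed, GS accuracy 0.60 vs PA 0.45,
# one SD of the breeding goal worth 400 currency units
print(break_even_cost(30, 25, 0.60, 0.45, 400))
```

    With little room for selection (p close to 1) the selection intensity collapses toward zero, which mirrors the abstract's finding that genotyping only pays when more heifers are available than are needed for replacement.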

  1. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in a Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of a subsampling bootstrap Markov chain in genomic prediction. This method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, the losses in predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling yields an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces the computational burden associated with model fitting, and it may slightly enhance prediction properties.
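    The algorithm described, whole-genome regression fitted by Gibbs sampling with a fresh bootstrap subsample of observations in each MCMC round, can be sketched for a ridge-type model. This is a minimal illustration with fixed variance components and toy data, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def sbmc_ridge(X, y, n_iter=500, burn=100, frac=0.5, lam=10.0, sigma2=1.0):
    """Single-site Gibbs sampler for ridge-type marker effects, updating on a
    random bootstrap subsample of observations each round (subsampling MCMC).
    Variance components (lam, sigma2) are fixed here for simplicity."""
    n, p = X.shape
    beta = np.zeros(p)
    samples = []
    for it in range(n_iter):
        # ~50% subsample with replacement, as the abstract reports is optimal
        idx = rng.choice(n, size=int(frac * n), replace=True)
        Xs, ys = X[idx], y[idx]
        e = ys - Xs @ beta
        for j in range(p):
            xj = Xs[:, j]
            e += xj * beta[j]                     # remove effect j from residual
            cj = xj @ xj + lam                    # conditional posterior precision
            mj = (xj @ e) / cj                    # conditional posterior mean
            beta[j] = rng.normal(mj, np.sqrt(sigma2 / cj))
            e -= xj * beta[j]                     # restore residual with new draw
        if it >= burn:
            samples.append(beta.copy())
    return np.mean(samples, axis=0)               # posterior-mean marker effects

# Toy data: 200 individuals, 50 markers (0/1/2 genotypes), 5 causal loci
n, p = 200, 50
X = rng.binomial(2, 0.5, size=(n, p)).astype(float)
b_true = np.zeros(p)
b_true[:5] = [0.5, -0.4, 0.3, 0.4, -0.5]
y = X @ b_true + rng.normal(0, 1.0, n)
b_hat = sbmc_ridge(X, y)
print(np.corrcoef(X @ b_hat, X @ b_true)[0, 1])  # should be high on this easy problem
```

    Because each round conditions on a different subsample, the chain doubles as a bagging-style ensemble, which is the "interesting ensemble algorithm" the abstract alludes to.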

  2. Investigation of Evaluation method of chemical runaway reaction

    International Nuclear Information System (INIS)

    Sato, Yoshihiko; Sasaya, Shinji; Kurakata, Koichiro; Nojiri, Ichiro

    2002-02-01

    The safety study 'Study of evaluation of abnormal occurrences for chemical substances in nuclear fuel facilities' will be carried out from 2001 to 2005. In this study, methods for predicting the thermal hazards of chemical substances will be investigated and prepared, and a hazard prediction scheme for chemical substances will be constructed from the results. Therefore, the hazard prediction methods applied in chemical engineering, where chemical substances with fire and explosion hazards are often handled, were investigated. CHETAH (The ASTM Computer Program for Chemical Thermodynamic and Energy Release Evaluation), developed by ASTM (American Society for Testing and Materials), and TSS (Thermal Safety Software), developed by CISP (ChemInform St. Petersburg), were introduced, and the fire and explosion hazards of chemical substances and reactions in the reprocessing process were evaluated. From these results, CHETAH could estimate the heat of reaction to within about 10%, suggesting that it is useful as a screening tool for the fire and explosion hazards of new chemical substances. TSS could rapidly calculate the reaction rate and the reaction behavior from data measured by various calorimeters, suggesting that it is useful for evaluating the fire and explosion hazards of new chemical reactions. (author)

  3. Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation

    Directory of Open Access Journals (Sweden)

    Marlen Promann

    2015-03-01

    Full Text Available Libraries are implementing discovery layers to offer better user experiences. While usability tests have been helpful in evaluating the success or failure of implementing discovery layers in the library context, the focus has remained on their relative interface benefits over traditional federated search. Informal, site- and context-specific usability tests have offered little to test the rigor of discovery layers against the user goals, motivations and workflows they have been designed to support. This study proposes hierarchical task analysis (HTA) as an important complementary evaluation method to usability testing of discovery layers. Relevant literature is reviewed for discovery layers and the HTA method. As no previous application of HTA to the evaluation of discovery layers was found, this paper presents the application of HTA as an expert-based and workflow-centered (e.g., retrieving a relevant book or a journal article) method for evaluating discovery layers. Purdue University's Primo by Ex Libris was used to map eleven use cases as HTA charts. Nielsen's Goal Composition theory was used as an analytical framework to evaluate the goal charts from two perspectives: (a) users' physical interactions (i.e., clicks), and (b) users' cognitive steps (i.e., decision points for what to do next). A brief comparison of HTA and usability test findings is offered by way of conclusion.

  4. Comparison of genome-wide association methods in analyses of admixed populations with complex familial relationships.

    Directory of Open Access Journals (Sweden)

    Naveen K Kadri

    Full Text Available Population structure is known to cause false-positive detection in association studies. We compared the power, precision, and type-I error rates of various association models in analyses of a simulated dataset with structure at the population level (P; admixture from two populations) and the family level (K). We also compared type-I error rates among models in analyses of publicly available human and dog datasets. The models corrected for none, one, or both structure levels. Correction for K was performed with linear mixed models incorporating familial relationships estimated from pedigrees or genetic markers. Linear models that ignored K were also tested. Correction for P was performed using principal component or structured association analysis. In analyses of simulated and real data, linear mixed models that corrected for K were able to control the type-I error rate, regardless of whether they also corrected for P. In contrast, correction for P alone in linear models was insufficient. The power and precision of linear mixed models with and without correction for P were similar. Furthermore, power, precision, and type-I error rate were comparable in linear mixed models incorporating pedigree and genomic relationships. In summary, in association studies using samples with both P and K, ancestries estimated using principal components or structured assignment were not sufficient to control type-I errors. In such cases, type-I errors may be controlled by use of linear mixed models with relationships derived either from pedigree or from genetic markers.
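    The marker-derived relationships this record compares against pedigree are commonly computed as a genomic relationship matrix (GRM), which then serves as the covariance structure of the random polygenic effect in the linear mixed model. The sketch below is a minimal illustration of the widely used VanRaden-type estimator, not the exact procedure of the study; the function name and toy genotypes are assumptions.

```python
import numpy as np

def genomic_relationship_matrix(genotypes):
    """VanRaden-style genomic relationship matrix (GRM).

    genotypes: (n_individuals, n_markers) array of 0/1/2 allele counts.
    """
    p = genotypes.mean(axis=0) / 2.0          # estimated allele frequencies
    Z = genotypes - 2.0 * p                   # centre by expected allele count
    denom = 2.0 * np.sum(p * (1.0 - p))       # VanRaden's scaling factor
    return Z @ Z.T / denom

rng = np.random.default_rng(0)
G = rng.binomial(2, 0.5, size=(20, 500))      # toy genotypes in Hardy-Weinberg proportions
K = genomic_relationship_matrix(G)
print(K.shape)                                # (20, 20); diagonal close to 1
```

In a mixed-model association analysis, K plays the same role as the pedigree-based numerator relationship matrix: it defines the covariance of the random genetic effects used to absorb family structure.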

  5. Evaluation of different methods to overcome in vitro seed dormancy ...

    African Journals Online (AJOL)

    SAM

    2014-09-03

    Sep 3, 2014 ... Seeds from yellow passion fruit (Passiflora edulis Sims) present dormancy imposed by the seed-coat. The present study aimed to evaluate some methods to overcome dormancy of seeds from P. edulis grown under in vitro conditions. The experimental design was completely randomized in factorial scheme ...

  6. An Evaluation of the Efficiency of Different Hygienisation Methods

    Science.gov (United States)

    Zrubková, M.

    2017-10-01

    The aim of this study is to evaluate the efficiency of hygienisation by pasteurisation, temperature-phased anaerobic digestion and sludge liming. A summary of the legislation concerning sludge treatment, disposal and recycling is included. The hygienisation methods are compared not only in terms of hygienisation efficiency but a comparison of other criteria is also included.

  7. An evaluation of solutions to moment method of biochemical oxygen ...

    African Journals Online (AJOL)

    This paper evaluated selected solutions of the moment method with respect to Biochemical Oxygen Demand (BOD) kinetics, with the aim of ascertaining an error-free solution. Domestic-institutional wastewaters were collected two-weekly for three months from waste-stabilization ponds in Obafemi Awolowo University, Ile-Ife.

  8. Evaluation methods used for phosphate-solubilizing bacteria ...

    African Journals Online (AJOL)

    This work aimed to evaluate the different selection methods and select inorganic phosphorus-solubilizing bacteria as potential plant-growth promoters. Bacterial isolates obtained from sugarcane roots and soil were tested using solid growth media containing bicalcium phosphate and Irecê Apatite ground rock phosphate as ...

  9. Bacterial population in traditional sourdough evaluated by molecular methods

    NARCIS (Netherlands)

    Randazzo, C.L.; Heilig, G.H.J.; Restuccia, C.; Giudici, P.; Caggia, C.

    2005-01-01

    Aims: To study the microbial communities in artisanal sourdoughs, manufactured by traditional procedure in different areas of Sicily, and to evaluate the lactic acid bacteria (LAB) population by classical and culture-independent approaches. Methods and Results: Forty-five LAB isolates were

  10. Data-driven performance evaluation method for CMS RPC trigger ...

    Indian Academy of Sciences (India)

    level triggers, to handle the large stream of data produced in collision. The information transmitted from the three muon subsystems (DT, CSC and RPC) are collected by the Global Muon Trigger (GMT) Board and merged. A method for evaluating ...

  11. Conceptual Crossroads: Methods and Ethics in Ethnographic Evaluation.

    Science.gov (United States)

    Fetterman, David M.

    1986-01-01

    The delicacy required of ethnographic work is nowhere more evident and more necessary than at the conceptual crossroads where methods and ethical decision making intersect. This paper addresses moments of decision and puts them into perspective by locating them within the lifecycle of ethnographic evaluation. (LMO)

  12. Sensitivity evaluation of dynamic speckle activity measurements using clustering methods

    International Nuclear Information System (INIS)

    Etchepareborda, Pablo; Federico, Alejandro; Kaufmann, Guillermo H.

    2010-01-01

    We evaluate and compare the use of competitive neural networks, self-organizing maps, the expectation-maximization algorithm, K-means, and fuzzy C-means techniques as partitional clustering methods, when the sensitivity of the activity measurement of dynamic speckle images needs to be improved. The temporal history of the acquired intensity generated by each pixel is analyzed in a wavelet decomposition framework, and it is shown that the mean energy of its corresponding wavelet coefficients provides a suitable feature space for clustering purposes. The sensitivity obtained by using the evaluated clustering techniques is also compared with the well-known methods of Konishi-Fujii, weighted generalized differences, and wavelet entropy. The performance of the partitional clustering approach is evaluated using simulated dynamic speckle patterns and also experimental data.
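    As a toy illustration of the partitional-clustering idea in this record, the sketch below runs a minimal Lloyd's k-means over per-pixel "mean wavelet energy" features. The data, the two-cluster restriction, the initialisation and the function name are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def kmeans(features, iters=50):
    """Minimal two-cluster Lloyd's k-means; centres start at the first/last point."""
    centres = features[[0, len(features) - 1]].astype(float)
    for _ in range(iters):
        # assign each feature vector to its nearest centre
        dist = np.linalg.norm(features[:, None, :] - centres[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # recompute each centre as the mean of its assigned points
        for j in range(2):
            if np.any(labels == j):
                centres[j] = features[labels == j].mean(axis=0)
    return labels

# toy feature space: low-activity vs high-activity pixels
rng = np.random.default_rng(1)
energies = np.concatenate([np.full((50, 1), 0.1), np.full((50, 1), 5.0)])
energies += rng.normal(0.0, 0.05, energies.shape)
labels = kmeans(energies)
print(labels[:3], labels[-3:])   # two well-separated activity classes
```

With well-separated energy values like these, the two clusters recover the low- and high-activity pixel groups exactly.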

  13. Statistical methods for evaluating the attainment of cleanup standards

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.
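    A typical decision rule from this family of attainment tests compares an upper one-sided confidence limit on the mean contaminant concentration with the cleanup standard. The sketch below is a hedged illustration of that rule, not a procedure copied from the EPA volumes; the data, function name and critical value are assumptions (t ≈ 1.833 for 9 degrees of freedom at alpha = 0.05).

```python
import math
import statistics

def attains_standard(samples, standard, t_crit):
    """One-sided test: is the mean concentration credibly below the standard?

    Attainment is concluded when the upper one-sided confidence limit (UCL)
    on the mean falls below the cleanup standard.
    t_crit: Student-t critical value for the chosen alpha and n-1 df.
    """
    n = len(samples)
    mean = statistics.fmean(samples)
    se = statistics.stdev(samples) / math.sqrt(n)
    ucl = mean + t_crit * se          # upper confidence limit on the mean
    return ucl < standard

measured = [4.1, 3.8, 4.5, 3.9, 4.2, 4.0, 3.7, 4.3, 4.1, 3.9]   # e.g. mg/kg
print(attains_standard(measured, standard=5.0, t_crit=1.833))    # True
```

Here the UCL is about 4.19 mg/kg, comfortably below a standard of 5.0, so attainment would be concluded; against a standard of 4.0 the same data would fail.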

  14. Methods for the Evaluation of Waste Treatment Processes

    Directory of Open Access Journals (Sweden)

    Hans-Joachim Gehrmann

    2017-01-01

    Full Text Available Decision makers for waste management are confronted with the problem of selecting the most economic, environmental, and socially acceptable waste treatment process. This paper elucidates evaluation methods for waste treatment processes, such as material flow analysis, statistical entropy analysis, energetic and exergetic assessment, cumulative energy demand, and life cycle assessment, for the comparison of ecological and economic aspects. The work is based on the VDI guideline 3925. A comparison of two thermal waste treatment plants with different process designs and energy recovery systems was performed with the described evaluation methods. The results are mainly influenced by the type of energy recovery, where the waste-to-energy plant providing district heat and process steam emerged as beneficial in most aspects. Material recovery options from waste incineration were evaluated according to sustainability targets, such as saving of resources and environmental protection.

  15. Study on Mechanism Experiments and Evaluation Methods for Water Eutrophication

    Directory of Open Access Journals (Sweden)

    Jiabin Yu

    2017-01-01

    Full Text Available The process of water eutrophication involves the interaction of external factors, nutrients, microorganisms, and other factors; it is complex and has not yet been effectively studied. To examine the formation process of water eutrophication, a set of orthogonal experiments with three factors and four levels is designed to analyze the key factors. At the same time, with the help of a large amount of monitoring data, the principal component analysis method is used to extract the main components of water eutrophication and determine effective evaluation indicators of eutrophication. Finally, Bayesian uncertainty theory is applied to the evaluation of the eutrophication process, and the sample data are evaluated. The simulation results demonstrate the validity of the research method.
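    The principal-component step described here can be sketched with a plain eigendecomposition of the covariance matrix. The code below is an illustrative toy in which simulated "water-quality indicators" driven by one latent factor stand in for real monitoring data; all names and data are assumptions.

```python
import numpy as np

def principal_components(data, n_components=2):
    """PCA via eigendecomposition of the sample covariance matrix."""
    X = data - data.mean(axis=0)                 # centre each indicator
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)             # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    explained = vals[order] / vals.sum()         # variance explained per PC
    return X @ vecs[:, order], explained

rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))                 # one latent driver
# four toy indicators that mostly track the same latent factor
data = np.hstack([base + 0.1 * rng.normal(size=(100, 1)) for _ in range(4)])
scores, explained = principal_components(data, 2)
print(explained)    # first component dominates
```

When a single latent factor drives the indicators, the first component absorbs nearly all the variance, which is exactly the dimensionality reduction exploited when selecting "effective evaluation indicators".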

  16. Evaluation method for change of concentration of nuclear fuel material

    International Nuclear Information System (INIS)

    Kiyono, Takeshi; Ando, Ryohei.

    1997-01-01

    The present invention provides a method of evaluating the change in concentration of the constituent nuclides of nuclear fuel materials loaded in a reactor under neutron irradiation, based on analytic calculation rather than integration over time. Namely, the method evaluates the changing concentration of nuclear fuel materials due to nuclear fission, neutron capture and radioactive decay during neutron irradiation. An optional nuclide on the nuclear conversion chain is chosen as a standard nuclide; when the main fuel material is Pu-239, it is chosen as the standard nuclide. The ratio of the concentration of the standard nuclide to that of the nuclide under evaluation can be expressed through the ratio of their neutron reaction cross-sections. Accordingly, the concentration of the nuclide under evaluation can be expressed by an analytic formula in terms of the concentration of the standard nuclide and the cross-section ratio. As a result, by supplying an optional concentration of the standard nuclide to the analytic formula, the concentration of each of the other nuclides can be determined analytically. (I.S.)

  17. Detecting loci under recent positive selection in dairy and beef cattle by combining different genome-wide scan methods.

    Directory of Open Access Journals (Sweden)

    Yuri Tani Utsunomiya

    Full Text Available As the methodologies available for the detection of positive selection from genomic data vary in terms of assumptions and execution, weak correlations are expected among them. However, if a given signal is consistently supported across different methodologies, it is strong evidence that the locus has been under past selection. In this paper, a straightforward frequentist approach based on the Stouffer method to combine P-values across different tests for evidence of recent positive selection in common variations, as well as strategies for extracting biological information from the detected signals, is described and applied to high density single nucleotide polymorphism (SNP) data generated from dairy and beef cattle (taurine and indicine). The ancestral Bovinae allele state of over 440,000 SNPs is also reported. Using this combination of methods, highly significant (P < 3.17×10⁻⁷) population-specific sweeps pointing to candidate genes and pathways that may be involved in beef and dairy production were identified. The most significant signal was found in the Cornichon homolog 3 gene (CNIH3) in Brown Swiss (P = 3.82×10⁻¹²), and may be involved in the regulation of the pre-ovulatory luteinizing hormone surge. Other putative pathways under selection are glycolysis/gluconeogenesis, transcription machinery and chemokine/cytokine activity in Angus; the calpain-calpastatin system and ribosome biogenesis in Brown Swiss; and ganglioside deposition in milk fat globules in Gyr. The composite method, combined with the strategies applied to retrieve functional information, may be a useful tool for surveying genome-wide selective sweeps and providing insights into the source of selection.
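    The Stouffer combination used in this record converts each one-sided P-value to a Z-score, sums the Z-scores with a sqrt(k) scaling, and transforms back to a combined P-value. A minimal sketch of the standard (unweighted) formula; the function name and example values are assumptions:

```python
from math import sqrt
from statistics import NormalDist

def stouffer(p_values):
    """Combine one-sided P-values with the unweighted Stouffer method."""
    nd = NormalDist()
    z = [nd.inv_cdf(1.0 - p) for p in p_values]   # per-test Z-scores
    z_comb = sum(z) / sqrt(len(z))                # scaled sum of Z-scores
    return 1.0 - nd.cdf(z_comb)                   # combined P-value

print(stouffer([0.01, 0.02, 0.05]))   # small combined P-value
```

Because concordant moderate signals reinforce each other, three individually modest P-values combine here to a value far below any of the inputs, which is the behaviour the composite-test approach relies on.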

  18. Deterministic and fuzzy-based methods to evaluate community resilience

    Science.gov (United States)

    Kammouh, Omar; Noori, Ali Zamani; Taurino, Veronica; Mahin, Stephen A.; Cimellaro, Gian Paolo

    2018-04-01

    Community resilience is becoming a growing concern for authorities and decision makers. This paper introduces two indicator-based methods to evaluate the resilience of communities based on the PEOPLES framework. PEOPLES is a multi-layered framework that defines community resilience using seven dimensions. Each of the dimensions is described through a set of resilience indicators collected from literature and they are linked to a measure allowing the analytical computation of the indicator's performance. The first method proposed in this paper requires data on previous disasters as an input and returns as output a performance function for each indicator and a performance function for the whole community. The second method exploits a knowledge-based fuzzy modeling for its implementation. This method allows a quantitative evaluation of the PEOPLES indicators using descriptive knowledge rather than deterministic data including the uncertainty involved in the analysis. The output of the fuzzy-based method is a resilience index for each indicator as well as a resilience index for the community. The paper also introduces an open source online tool in which the first method is implemented. A case study illustrating the application of the first method and the usage of the tool is also provided in the paper.

  19. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

    In order to evaluate the complexity of a color image more effectively and find the connection between image complexity and image information, this paper presents a method to compute image complexity based on color information. The theoretical analysis first divides complexity at the subjective level into three classes: low, medium and high complexity. Image features are then extracted, and finally a function is established between the complexity value and the color-feature model. The experimental results show that this evaluation method can objectively reconstruct the complexity of the image from the image features, and that the results agree well with human visual perception of complexity; color-image complexity therefore has a certain reference value.
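    One simple stand-in for a colour-based complexity measure of the kind this record describes is the Shannon entropy of the colour histogram: a flat, single-colour image scores zero, while a colour-rich image scores high. The sketch below is an assumption-laden illustration, not the authors' model:

```python
import numpy as np

def color_complexity(image, bins=32):
    """Mean per-channel Shannon entropy of the colour distribution.

    image: (H, W, 3) uint8 array. Higher entropy -> more varied colours.
    """
    score = 0.0
    for c in range(3):
        hist, _ = np.histogram(image[..., c], bins=bins, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]                       # drop empty bins (log of 0)
        score += -(p * np.log2(p)).sum()   # Shannon entropy of channel c
    return score / 3.0

flat = np.full((64, 64, 3), 128, dtype=np.uint8)    # one uniform colour
noisy = np.random.default_rng(0).integers(0, 256, (64, 64, 3), dtype=np.uint8)
print(color_complexity(flat), color_complexity(noisy))
```

The flat image yields entropy 0, while random colour noise approaches log2(bins) ≈ 5, matching the intuitive low/medium/high complexity classes.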

  20. On Some Methods in Safety Evaluation in Geotechnics

    Directory of Open Access Journals (Sweden)

    Puła Wojciech

    2015-06-01

    Full Text Available The paper demonstrates how the reliability methods can be utilised in order to evaluate safety in geotechnics. Special attention is paid to the so-called reliability based design that can play a useful and complementary role to Eurocode 7. In the first part, a brief review of first- and second-order reliability methods is given. Next, two examples of reliability-based design are demonstrated. The first one is focussed on bearing capacity calculation and is dedicated to comparison with EC7 requirements. The second one analyses a rigid pile subjected to lateral load and is oriented towards working stress design method. In the second part, applications of random field to safety evaluations in geotechnics are addressed. After a short review of the theory a Random Finite Element algorithm to reliability based design of shallow strip foundation is given. Finally, two illustrative examples for cohesive and cohesionless soils are demonstrated.

  2. The Bolmen tunnel project - evaluation of geophysical site investigation methods

    International Nuclear Information System (INIS)

    Stanfors, R.

    1987-12-01

    The report presents geophysical measurements along and adjacent to the tunnel and an evaluation of the ability of the various methods to permit prediction of rock mass parameters of significance to stability and water-bearing ability. The evaluation shows that, using airborne electro-magnetic surveys, it was possible to indicate about 80% of all the zones of weakness more than 50 m wide in the tunnel. Airborne magnetic surveys located about 90% of all dolerite dykes more than 10 m wide. Ground-level VLF and Slingram methods of electro-magnetic measurement indicated 75% and 85% respectively of all zones of weakness more than 50 m wide. Resistivity methods were successfully used to locate clay-filled and water-bearing fracture zones. About 75% of the length of tunnel over which resistivity values below 500 ohm m were measured required shotcrete support and pre-grouting. (orig./DG)

  3. Evaluation method for the drying performance of enzyme containing formulations

    DEFF Research Database (Denmark)

    Sloth, Jakob; Bach, P.; Jensen, Anker Degn

    2008-01-01

    A method is presented for fast and cheap evaluation of the performance of enzyme containing formulations in terms of preserving the highest enzyme activity during spray drying. The method is based on modeling the kinetics of the thermal inactivation reaction which occurs during the drying process. Relevant kinetic parameters are determined from differential scanning calorimeter (DSC) experiments and the model is used to simulate the severity of the inactivation reaction for temperatures and moisture levels relevant for spray drying. After conducting experiments and subsequent simulations for a number of different formulations it may be deduced which formulation performs best. This is illustrated by a formulation design study where 4 different enzyme containing formulations are evaluated. The method is validated by comparison to pilot scale spray dryer experiments.

  4. Methods for evaluation of mechanical stress condition of materials

    Directory of Open Access Journals (Sweden)

    Mirchev Yordan

    2018-01-01

    Full Text Available Primary attention is given to the following methods: the method of drilling cylindrical holes (drill method) and an integrated ultrasonic method using volume (longitudinal and transverse), surface, and sub-surface waves. The drill method allows determination of residual mechanical stress at shallow depth beneath material surfaces, assessing the type, size, and orientation of principal stresses. For the first time, parallel studies of mechanical stress in materials are carried out using the electroacoustic effect of volume, surface and sub-surface waves on the one hand, and effective mechanical stresses on the other. The experimental results present electroacoustic coefficients for different types of waves in the material of a gas pipeline tube of 243 mm diameter and 14 mm thickness. These are used to evaluate mechanical stresses in pipelines, according to active GOST standards.

  5. Holistic Evaluation of Lightweight Operating Systems using the PERCU Method

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; He, Yun (Helen); Carter, Jonathan; Glenski, Joseph; Rippe, Lynn; Cardo, Nicholas

    2008-05-01

    The scale of Leadership Class Systems presents unique challenges to the features and performance of operating system services. This paper reports results of comprehensive evaluations of two Light Weight Operating Systems (LWOS), Cray's Catamount Virtual Node (CVN) and the Cray Linux Environment (CLE), on the exact same large-scale hardware. The evaluation was carried out over a 5-month period on NERSC's 19,480 core Cray XT-4, Franklin, using a comprehensive evaluation method that spans Performance, Effectiveness, Reliability, Consistency and Usability criteria for all major subsystems and features. The paper presents the results of the comparison between CVN and CLE, evaluates their relative strengths, and reports observations regarding the world's largest Cray XT-4 as well.

  6. Benchmarking of methods for identification of antimicrobial resistance genes in bacterial whole genome data

    DEFF Research Database (Denmark)

    Clausen, Philip T. L. C.; Zankari, Ea; Aarestrup, Frank Møller

    2016-01-01

    to two different methods in current use for identification of antibiotic resistance genes in bacterial WGS data. A novel method, KmerResistance, which examines the co-occurrence of k-mers between the WGS data and a database of resistance genes, was developed. The performance of this method was compared with two previously described methods, ResFinder and SRST2, which use an assembly/BLAST method and BWA, respectively, using two datasets with a total of 339 isolates, covering five species, originating from the Oxford University Hospitals NHS Trust and Danish pig farms. The predicted resistance was compared with the observed phenotypes for all isolates. To further challenge the sensitivity of the in silico methods, the datasets were also down-sampled to 1% of the reads and reanalysed. The best results were obtained by identification of resistance genes by mapping directly against the raw reads.
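    The k-mer co-occurrence idea behind KmerResistance can be illustrated with plain Python sets: slice both the reads and a resistance gene into k-mers and measure what fraction of the gene's k-mers appear in the reads. The sketch below is a deliberate simplification with made-up sequences, not the tool's actual scoring:

```python
def kmers(seq, k=16):
    """All overlapping k-mers of a sequence, as a set."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def kmer_coverage(reads, gene, k=16):
    """Fraction of a resistance gene's k-mers observed in the raw reads."""
    gene_kmers = kmers(gene, k)
    read_kmers = set()
    for r in reads:
        read_kmers |= kmers(r, k)
    return len(gene_kmers & read_kmers) / len(gene_kmers)

# toy gene and overlapping short reads tiled across it
gene = "ATGCGTACGTTAGCATCGGATCCGTAAGCTTGCATGCAT"
reads = [gene[i:i + 24] for i in range(0, len(gene) - 24 + 1, 5)]
print(kmer_coverage(reads, gene, k=16))   # 1.0: the reads cover every gene k-mer
```

Working directly on read k-mers, as in the record's best-performing strategy, avoids the information loss of assembling the reads before searching for the gene.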

  7. A phylogenetic method to perform genome-wide association studies in microbes that accounts for population structure and recombination.

    Directory of Open Access Journals (Sweden)

    Caitlin Collins

    2018-02-01

    Full Text Available Genome-Wide Association Studies (GWAS in microbial organisms have the potential to vastly improve the way we understand, manage, and treat infectious diseases. Yet, microbial GWAS methods established thus far remain insufficiently able to capitalise on the growing wealth of bacterial and viral genetic sequence data. Facing clonal population structure and homologous recombination, existing GWAS methods struggle to achieve both the precision necessary to reject spurious findings and the power required to detect associations in microbes. In this paper, we introduce a novel phylogenetic approach that has been tailor-made for microbial GWAS, which is applicable to organisms ranging from purely clonal to frequently recombining, and to both binary and continuous phenotypes. Our approach is robust to the confounding effects of both population structure and recombination, while maintaining high statistical power to detect associations. Thorough testing via application to simulated data provides strong support for the power and specificity of our approach and demonstrates the advantages offered over alternative cluster-based and dimension-reduction methods. Two applications to Neisseria meningitidis illustrate the versatility and potential of our method, confirming previously-identified penicillin resistance loci and resulting in the identification of both well-characterised and novel drivers of invasive disease. Our method is implemented as an open-source R package called treeWAS which is freely available at https://github.com/caitiecollins/treeWAS.

  8. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    Science.gov (United States)

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  9. Simultaneous gene finding in multiple genomes.

    Science.gov (United States)

    König, Stefanie; Romoth, Lars W; Gerischer, Lizzy; Stanke, Mario

    2016-11-15

    As the tree of life is populated with sequenced genomes ever more densely, the new challenge is the accurate and consistent annotation of entire clades of genomes. We address this problem with a new approach to comparative gene finding that takes a multiple genome alignment of closely related species and simultaneously predicts the location and structure of protein-coding genes in all input genomes, thereby exploiting negative selection and sequence conservation. The model prefers potential gene structures in the different genomes that are in agreement with each other, or, if not, where the exon gains and losses are plausible given the species tree. We formulate the multi-species gene finding problem as a binary labeling problem on a graph. The resulting optimization problem is NP-hard, but can be efficiently approximated using a subgradient-based dual decomposition approach. The proposed method was tested on whole-genome alignments of 12 vertebrate and 12 Drosophila species. The accuracy was evaluated for human, mouse and Drosophila melanogaster and compared to competing methods. Results suggest that our method is well-suited for annotation of (a large number of) genomes of closely related species within a clade, in particular, when RNA-Seq data are available for many of the genomes. The transfer of existing annotations from one genome to another via the genome alignment is more accurate than previous approaches that are based on protein-spliced alignments, when the genomes are at close to medium distances. The method is implemented in C++ as part of Augustus and available open source at http://bioinf.uni-greifswald.de/augustus/. CONTACT: stefaniekoenig@ymail.com or mario.stanke@uni-greifswald.de. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Participatory Training Evaluation Method (PATEM) as a Collaborative Evaluation Capacity Building Strategy

    Science.gov (United States)

    Kuzmin, Alexey

    2012-01-01

    This article describes the Participatory Training Evaluation Method (PATEM) of measuring participants' reaction to training. PATEM provides rich information; allows evaluation findings to be documented; becomes an organic part of the training, helping participants process their experience individually and as a group; makes sense to participants; is an…

  11. Economic evaluation in patient safety: a literature review of methods.

    Science.gov (United States)

    de Rezende, Bruna Alves; Or, Zeynep; Com-Ruelle, Laure; Michel, Philippe

    2012-06-01

    Patient safety practices, targeting organisational changes for improving patient safety, are implemented worldwide but their costs are rarely evaluated. This paper provides a review of the methods used in economic evaluation of such practices. International medical and economics databases were searched for peer-reviewed publications on economic evaluations of patient safety between 2000 and 2010 in English and French. This was complemented by a manual search of the reference lists of relevant papers. Grey literature was excluded. Studies were described using a standardised template and assessed independently by two researchers according to six quality criteria. 33 articles were reviewed that were representative of different patient safety domains, data types and evaluation methods. 18 estimated the economic burden of adverse events, 3 measured the costs of patient safety practices and 12 provided complete economic evaluations. Healthcare-associated infections were the most common subject of evaluation, followed by medication-related errors and all types of adverse events. Of these, 10 were selected that had adequately fulfilled one or several key quality criteria for illustration. This review shows that full cost-benefit/utility evaluations are rarely completed as they are resource intensive and often require unavailable data; some overcome these difficulties by performing stochastic modelling and by using secondary sources. Low methodological transparency can be a problem for building evidence from available economic evaluations. Investing in the economic design and reporting of studies with more emphasis on defining study perspectives, data collection and methodological choices could be helpful for strengthening our knowledge base on practices for improving patient safety.

  12. Evaluating a physician leadership development program - a mixed methods approach.

    Science.gov (United States)

    Throgmorton, Cheryl; Mitchell, Trey; Morley, Tom; Snyder, Marijo

    2016-05-16

    Purpose - With the extent of change in healthcare today, organizations need strong physician leaders. To compensate for the lack of physician leadership education, many organizations are sending physicians to external leadership programs or developing in-house leadership programs targeted specifically to physicians. The purpose of this paper is to outline the evaluation strategy and outcomes of the inaugural year of a Physician Leadership Academy (PLA) developed and implemented at a Michigan-based regional healthcare system. Design/methodology/approach - The authors applied the theoretical framework of Kirkpatrick's four levels of evaluation and used surveys, observations, activity tracking, and interviews to evaluate the program outcomes. The authors applied grounded theory techniques to the interview data. Findings - The program met targeted outcomes across all four levels of evaluation. Interview themes focused on the significance of increasing self-awareness, building relationships, applying new skills, and building confidence. Research limitations/implications - While only one example, this study illustrates the importance of developing the evaluation strategy as part of the program design. Qualitative research methods, often lacking from learning evaluation design, uncover rich themes of impact. The study supports how a PLA program can enhance physician learning, engagement, and relationship building throughout and after the program. Physician leaders' partnership with organization development and learning professionals yield results with impact to individuals, groups, and the organization. Originality/value - Few studies provide an in-depth review of evaluation methods and outcomes of physician leadership development programs. Healthcare organizations seeking to develop similar in-house programs may benefit applying the evaluation strategy outlined in this study.

  13. Are three methods better than one? A comparative assessment of usability evaluation methods in an EHR.

    Science.gov (United States)

    Walji, Muhammad F; Kalenderian, Elsbeth; Piotrowski, Mark; Tran, Duong; Kookal, Krishna K; Tokede, Oluwabunmi; White, Joel M; Vaderhobli, Ram; Ramoni, Rachel; Stark, Paul C; Kimmes, Nicole S; Lagerweij, Maxim; Patel, Vimla L

    2014-05-01

    To comparatively evaluate the effectiveness of three different methods involving end-users for detecting usability problems in an EHR: user testing, semi-structured interviews and surveys. Data were collected at two major urban dental schools from faculty, residents and dental students to assess the usability of a dental EHR for developing a treatment plan. These included user testing (N=32), semi-structured interviews (N=36), and surveys (N=35). The three methods together identified a total of 187 usability violations: 54% via user testing, 28% via the semi-structured interview and 18% from the survey method, with modest overlap. These usability problems were classified into 24 problem themes in 3 broad categories. User testing covered the broadest range of themes (83%), followed by the interview (63%) and survey (29%) methods. Multiple evaluation methods provide a comprehensive approach to identifying EHR usability challenges and specific problems. The three methods were found to be complementary, and thus each can provide unique insights for software enhancement. Interview and survey methods were found not to be sufficient by themselves, but when used in conjunction with the user testing method, they provided a comprehensive evaluation of the EHR. We recommend using a multi-method approach when testing the usability of health information technology because it provides a more comprehensive picture of usability challenges. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Evaluation of two streamlined life cycle assessment methods

    International Nuclear Information System (INIS)

    Hochschomer, Elisabeth; Finnveden, Goeran; Johansson, Jessica

    2002-02-01

Two different methods for streamlined life cycle assessment (LCA) are described: the MECO method and SLCA. Both methods are tested on a previously completed case study of cars fuelled with petrol or ethanol, and of electric cars using electricity produced from hydro power or coal. The report also contains some background information on LCA and streamlined LCA, and a description of the case study used. The evaluation of the MECO and SLCA methods is based on a comparison of the results from the case study as well as on practical aspects. One conclusion is that the SLCA method has some limitations: it does not cover the whole life cycle, it requires a considerable amount of information, and it leaves room for arbitrariness. It is not very flexible, and it is difficult to develop further. We therefore do not recommend the SLCA method. The MECO method, in comparison, shows several attractive features. It is also interesting to note that the MECO method produces information that is complementary to a more traditional quantitative LCA. We suggest that the MECO method needs some further development and adjustment to Swedish conditions.

  15. A universal genomic coordinate translator for comparative genomics.

    Science.gov (United States)

    Zamani, Neda; Sundström, Görel; Meadows, Jennifer R S; Höppner, Marc P; Dainat, Jacques; Lantz, Henrik; Haas, Brian J; Grabherr, Manfred G

    2014-06-30

Genomic duplications constitute major events in the evolution of species, allowing paralogous copies of genes to take on fine-tuned biological roles. The orthology relationship between copies across multiple genomes can be unambiguously resolved by synteny, i.e. the conserved order of genomic sequences. However, a comprehensive analysis of duplication events and their contributions to evolution would require all-to-all genome alignments, which grow as N² with the number of available genomes, N. Here, we introduce Kraken, software that omits the all-to-all requirement by recursively traversing a graph of pairwise alignments and dynamically re-computing orthology. Kraken scales linearly with the number of targeted genomes, N, which allows for including large numbers of genomes in analyses. We first evaluated the method on the set of 12 Drosophila genomes, finding that orthologous correspondence computed indirectly through a graph of multiple synteny maps comes at minimal cost in terms of sensitivity, but reduces overall computational runtime by an order of magnitude. We then used the method on three well-annotated mammalian genomes, human, mouse, and rat, and show that up to 93% of protein coding transcripts have unambiguous pairwise orthologous relationships across the genomes. On a nucleotide level, 70 to 83% of exons match exactly at both splice junctions, and up to 97% on at least one junction. We last applied Kraken to an RNA-sequencing dataset from multiple vertebrates and diverse tissues, where we confirmed that brain-specific gene family members, i.e. one-to-many or many-to-many homologs, are more highly correlated across species than single-copy (i.e. one-to-one homologous) genes.
Not limited to protein coding genes, Kraken also identifies thousands of newly identified transcribed loci, likely non-coding RNAs that are consistently transcribed in human, chimpanzee and gorilla, and maintain significant correlation of expression levels across
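The core idea, composing pairwise synteny maps along a path in an alignment graph rather than computing all-to-all alignments, can be illustrated with a deliberately simplified sketch. Real synteny maps are interval-based and orientation-aware; here each pairwise map is reduced to a single hypothetical coordinate offset, and the genome names and numbers are invented, not taken from Kraken:

```python
from collections import deque

# Toy pairwise coordinate maps between genomes, modeled as plain offsets.
# Only two pairwise maps exist; human->rat must be composed through mouse.
pairwise = {
    ("human", "mouse"): 100,   # human pos + 100 -> mouse pos (hypothetical)
    ("mouse", "rat"): -30,     # mouse pos - 30  -> rat pos   (hypothetical)
}

def neighbors(genome):
    """Yield (other_genome, offset) for every pairwise map touching genome."""
    for (a, b), off in pairwise.items():
        if a == genome:
            yield b, off
        elif b == genome:
            yield a, -off

def translate(pos, src, dst):
    """BFS through the graph of pairwise maps, composing offsets along the
    path; no direct src->dst alignment is required."""
    queue = deque([(src, pos)])
    seen = {src}
    while queue:
        genome, p = queue.popleft()
        if genome == dst:
            return p
        for nxt, off in neighbors(genome):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, p + off))
    return None  # no chain of alignments connects src and dst

print(translate(500, "human", "rat"))  # 500 + 100 - 30 = 570
```

With N genomes, a connected graph of N-1 such pairwise maps suffices, which is where the linear rather than quadratic scaling comes from.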

  16. Evaluation method of radon preventing effect in underground construction

    International Nuclear Information System (INIS)

    Luo Shaodong; Deng Yuequan; Dong Faqin; Qu Ruixue; Xie Zhonglei

    2014-01-01

Background: It is difficult to evaluate radon prevention measures in underground construction because measuring instruments can operate only briefly under the high-humidity conditions found there. Purpose: A new rapid method to evaluate the radon prevention efficiency of underground construction is introduced. Methods: The radon concentrations before and after the shielding operation were determined, and the shielding rate was calculated according to the regularity of radon decay. Results: The results showed that the radon shielding rate in underground construction remains generally stable over time, with an actual relative standard deviation of 3.95%. Rapid determination and evaluation of the radon prevention effect under the special conditions of underground construction can therefore be realized by taking the shielding rate measured over a short time as the final shielding rate. The results were similar to those obtained by the local static method in a ground laboratory. Conclusion: This paper provides a prompt, accurate and practicable way to evaluate radon prevention in underground construction, and has reference value. (authors)
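The shielding-rate calculation can be sketched under one plausible interpretation of "the regularity of radon decay": the concentration measured shortly after shielding is decay-corrected using the Rn-222 decay constant before being compared with the pre-shielding baseline. The correction model and all numbers below are illustrative assumptions, not values from the study:

```python
import math

RADON_HALF_LIFE_H = 91.8          # Rn-222 half-life, ~3.82 days, in hours
LAMBDA = math.log(2) / RADON_HALF_LIFE_H

def shielding_rate(c_before, c_after, elapsed_h):
    """Hypothetical decay-corrected shielding rate: the post-shielding
    reading is corrected back to the start of the measurement using the
    Rn-222 decay constant, then compared with the baseline."""
    c_corrected = c_after * math.exp(LAMBDA * elapsed_h)
    return 1.0 - c_corrected / c_before

# E.g. 800 Bq/m^3 before shielding, 200 Bq/m^3 measured 2 h afterwards:
rate = shielding_rate(800.0, 200.0, 2.0)
print(round(rate, 4))
```

Because the correction factor over a couple of hours is only about 1.5%, a short-time measurement changes the computed shielding rate very little, which is consistent with the paper's claim that the short-time shielding rate can stand in for the final one.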

  17. Methodological proposal for environmental impact evaluation since different specific methods

    International Nuclear Information System (INIS)

    Leon Pelaez, Juan Diego; Lopera Arango Gabriel Jaime

    1999-01-01

Some conceptual and practical elements of environmental impact evaluation are described in relation to the preparation of technical reports (environmental impact studies and environmental management plans) to be presented to environmental authorities when seeking environmental permits for development projects. The first part of the document summarizes the main normative aspects that underpin environmental impact studies in Colombia. We then propose a scheme for approaching and elaborating the environmental impact evaluation, which begins with a description of the project and of the environmental conditions in its area, proceeds to identify the impacts through a matrix method, and continues with their quantitative evaluation, for which we propose the method developed by Arboleda (1994). We also propose rating the activities of the project and the components of the environment by their relative importance, by means of a method here termed agglomerate evaluation, which identifies the most impacting activities and the most impacted components. Lastly, some models are presented for the elaboration and presentation of environmental management plans, monitoring programs and environmental supervision programs.

  18. Method for evaluation of radiative properties of glass samples

    Energy Technology Data Exchange (ETDEWEB)

    Mohelnikova, Jitka [Faculty of Civil Engineering, Brno University of Technology, Veveri 95, 602 00 Brno (Czech Republic)], E-mail: mohelnikova.j@fce.vutbr.cz

    2008-04-15

The paper presents a simple calculation method for evaluating the radiative properties of window glasses. The method is based on a computer simulation model of the energy balance of a thermally insulated box containing selected glass samples. A temperature profile of the air inside the box with a glass sample exposed to incident radiation was determined for defined boundary conditions. The spectral range of the radiation was considered in the interval between 280 and 2500 nm, which corresponds to the spectral range of solar radiation affecting windows in building facades. The air temperature rise within the box was determined in response to the incident radiation, from the beginning of the radiation exposure until steady-state thermal conditions were reached. The steady-state temperature inside the insulated box is used to evaluate the energy balance of the box and to determine the radiative properties of the glass sample. These properties are represented by mean values of transmittance, reflectance and absorptance calculated for the defined spectral range. The computer simulations were compared to experimental measurements on a real model of the insulated box, and the results of the calculations and measurements are in good agreement. The method is recommended for preliminary evaluation of window glass radiative properties, which serve as data for the energy evaluation of buildings.
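The band-averaged characteristics mentioned above (mean transmittance, reflectance and absorptance over 280-2500 nm) can be sketched as a simple trapezoidal average over spectral curves. The data points below are invented placeholders; a real evaluation would use the measured or simulated spectra of the glass sample:

```python
# Hypothetical spectral data for a glass sample: wavelength (nm) vs.
# transmittance and reflectance over the 280-2500 nm band (made-up values).
wavelengths   = [280, 500, 1000, 1500, 2000, 2500]
transmittance = [0.05, 0.85, 0.80, 0.70, 0.55, 0.40]
reflectance   = [0.08, 0.07, 0.07, 0.08, 0.10, 0.12]

def band_mean(values, wl):
    """Trapezoidal mean of a spectral quantity over the wavelength band."""
    total = 0.0
    for i in range(len(wl) - 1):
        total += 0.5 * (values[i] + values[i + 1]) * (wl[i + 1] - wl[i])
    return total / (wl[-1] - wl[0])

t_mean = band_mean(transmittance, wavelengths)
r_mean = band_mean(reflectance, wavelengths)
a_mean = 1.0 - t_mean - r_mean   # energy balance: tau + rho + alpha = 1
print(round(t_mean, 3), round(r_mean, 3), round(a_mean, 3))
```

A more faithful version would weight the average by the solar spectral irradiance rather than treating all wavelengths equally.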

  19. Dogmas in the assessment of usability evaluation methods

    DEFF Research Database (Denmark)

    Hornbæk, Kasper

    2010-01-01

Usability evaluation methods (UEMs) are widely recognised as an essential part of systems development. Assessments of the performance of UEMs, however, have been criticised for low validity and limited reliability. The present study extends this critique by describing seven dogmas in recent work on UEMs. The dogmas include using inadequate procedures and measures for assessment, focusing on win-lose outcomes, holding simplistic models of how usability evaluators work, concentrating on evaluation rather than on design, and working from the assumption that usability problems are real. We discuss research approaches that may help move beyond the dogmas. In particular, we emphasise detailed studies of evaluation processes, assessments of the impact of UEMs on design carried out in real-world systems development, and analyses of how UEMs may be combined.

  20. Operator performance evaluation using multi criteria decision making methods

    Science.gov (United States)

    Rani, Ruzanita Mat; Ismail, Wan Rosmanira; Razali, Siti Fatihah

    2014-06-01

Operator performance evaluation is a very important operation in labor-intensive manufacturing industry because the company's productivity depends on the performance of its operators. The aims of operator performance evaluation are to give operators feedback on their performance, to increase company productivity and to identify the strengths and weaknesses of each operator. In this paper, six multi-criteria decision-making methods: Analytical Hierarchy Process (AHP), fuzzy AHP (FAHP), ELECTRE, PROMETHEE II, Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) are used to evaluate and rank the operators. The performance evaluation is based on six main criteria: competency, experience and skill, teamwork and time punctuality, personal characteristics, capability and outcome. The study was conducted at one of the SME food manufacturing companies in Selangor. The study found that AHP and FAHP identified the "outcome" criterion as the most important. The results of the operator performance evaluation showed that the same operator is ranked first by all six methods.
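Of the six methods, TOPSIS is the most compact to illustrate. The sketch below ranks three hypothetical operators on three of the six criteria; the ratings and weights are invented for illustration, not taken from the study:

```python
import math

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS sketch: rank alternatives by relative closeness to
    the ideal solution. `matrix` rows are alternatives, columns criteria;
    `benefit[j]` is True when higher is better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the criterion weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti  = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, anti)    # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical operators scored on competency, teamwork, outcome:
scores = topsis([[7, 8, 9], [9, 6, 7], [6, 9, 8]],
                weights=[0.3, 0.3, 0.4],
                benefit=[True, True, True])
best = max(range(len(scores)), key=scores.__getitem__)
print(best)
```

Each of the other five methods would produce its own ranking over the same decision matrix, which is exactly the kind of cross-method comparison the study performs.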

  1. A new method to evaluate human-robot system performance

    Science.gov (United States)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  2. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

This research evaluated the currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis; and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. By this proposed method, analysts can evaluate various software hazard identification combinations for specific purposes. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results may change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)
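The index-based comparison can be sketched as a simple weighted scoring matrix. The ratings and weights below are illustrative assumptions chosen to echo the qualitative conclusions, not the study's actual figures:

```python
# Hypothetical 1-5 ratings of two technique combinations against the
# indexes named above; weights reflect one possible project's priorities.
indexes = ["dynamic capability", "completeness", "achievability",
           "detail", "signal/noise ratio", "complexity", "implementation cost"]
weights = [0.20, 0.15, 0.15, 0.15, 0.15, 0.10, 0.10]
# For the cost-type indexes (complexity, implementation cost) a higher
# rating here means "less burdensome", so a plain weighted sum applies.
ratings = {
    "PHA+FMEA+FTA+Markov": [2, 4, 2, 4, 3, 3, 2],
    "DFM+simulation":      [5, 3, 4, 4, 4, 2, 2],
}

def weighted_score(r):
    """Weighted sum of index ratings for one technique combination."""
    return sum(w * x for w, x in zip(weights, r))

ranked = sorted(ratings, key=lambda k: weighted_score(ratings[k]), reverse=True)
print(ranked[0])
```

In practice the weights would be negotiated among the stakeholders, which is what makes the scoring a platform for consensus rather than a fixed verdict.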

  3. Field evaluation of personal sampling methods for multiple bioaerosols.

    Science.gov (United States)

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  4. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

Full Text Available Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  5. Evaluation of whole genome amplified DNA to decrease material expenditure and increase quality

    Directory of Open Access Journals (Sweden)

    Marie Bækvad-Hansen

    2017-06-01

Discussion: Whole genome amplified DNA samples from dried blood spots are well suited for array genotyping and produce robust and reliable genotype data. However, the amplification process introduces additional noise into the data, making detection of structural variants such as copy number variants difficult. With this study, we explore ways of optimizing the amplification protocol in order to reduce noise and increase data quality. We found that the amplification process was very robust, and that changes in amplification time or temperature did not alter the genotyping calls or the quality of the array data. Adding additional replicates of each sample also led to insignificant changes in the array data. Thus, the amount of noise introduced by the amplification process was consistent regardless of changes made to the amplification protocol. We also explored ways of decreasing material expenditure by reducing the spot size or the amplification reaction volume. The reduction did not affect the quality of the genotyping data.

  6. SCPS: a fast implementation of a spectral method for detecting protein families on a genome-wide scale

    Directory of Open Access Journals (Sweden)

    Paccanaro Alberto

    2010-03-01

Full Text Available Abstract Background An important problem in genomics is the automatic inference of groups of homologous proteins from pairwise sequence similarities. Several approaches have been proposed for this task which are "local" in the sense that they assign a protein to a cluster based only on the distances between that protein and the other proteins in the set. It was shown recently that global methods such as spectral clustering have better performance on a wide variety of datasets. However, currently available implementations of spectral clustering methods mostly consist of a few loosely coupled Matlab scripts that assume a fair amount of familiarity with Matlab programming and hence they are inaccessible for large parts of the research community. Results SCPS (Spectral Clustering of Protein Sequences) is an efficient and user-friendly implementation of a spectral method for inferring protein families. The method uses only pairwise sequence similarities, and is therefore practical when only sequence information is available. SCPS was tested on difficult sets of proteins whose relationships were extracted from the SCOP database, and its results were extensively compared with those obtained using other popular protein clustering algorithms such as TribeMCL, hierarchical clustering and connected component analysis. We show that SCPS is able to identify many of the family/superfamily relationships correctly and that the quality of the obtained clusters as indicated by their F-scores is consistently better than all the other methods we compared it with. We also demonstrate the scalability of SCPS by clustering the entire SCOP database (14,183 sequences) and the complete genome of the yeast Saccharomyces cerevisiae (6,690 sequences). Conclusions Besides the spectral method, SCPS also implements connected component analysis and hierarchical clustering, it integrates TribeMCL, it provides different cluster quality tools, it can extract human-readable protein

  7. SCPS: a fast implementation of a spectral method for detecting protein families on a genome-wide scale.

    Science.gov (United States)

    Nepusz, Tamás; Sasidharan, Rajkumar; Paccanaro, Alberto

    2010-03-09

    An important problem in genomics is the automatic inference of groups of homologous proteins from pairwise sequence similarities. Several approaches have been proposed for this task which are "local" in the sense that they assign a protein to a cluster based only on the distances between that protein and the other proteins in the set. It was shown recently that global methods such as spectral clustering have better performance on a wide variety of datasets. However, currently available implementations of spectral clustering methods mostly consist of a few loosely coupled Matlab scripts that assume a fair amount of familiarity with Matlab programming and hence they are inaccessible for large parts of the research community. SCPS (Spectral Clustering of Protein Sequences) is an efficient and user-friendly implementation of a spectral method for inferring protein families. The method uses only pairwise sequence similarities, and is therefore practical when only sequence information is available. SCPS was tested on difficult sets of proteins whose relationships were extracted from the SCOP database, and its results were extensively compared with those obtained using other popular protein clustering algorithms such as TribeMCL, hierarchical clustering and connected component analysis. We show that SCPS is able to identify many of the family/superfamily relationships correctly and that the quality of the obtained clusters as indicated by their F-scores is consistently better than all the other methods we compared it with. We also demonstrate the scalability of SCPS by clustering the entire SCOP database (14,183 sequences) and the complete genome of the yeast Saccharomyces cerevisiae (6,690 sequences). 
Besides the spectral method, SCPS also implements connected component analysis and hierarchical clustering, it integrates TribeMCL, it provides different cluster quality tools, it can extract human-readable protein descriptions using GI numbers from NCBI, it interfaces with
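The essence of spectral clustering from pairwise similarities, as distinct from SCPS's actual implementation, can be sketched in a few lines: build the graph Laplacian of the similarity matrix and split the sequences by the sign of its Fiedler (second-smallest) eigenvector, here obtained by deflated power iteration. The similarity values are invented:

```python
# Tiny pairwise-similarity matrix for five hypothetical sequences: the
# first three and the last two form two families (values are invented).
W = [
    [0.00, 0.90, 0.80, 0.10, 0.00],
    [0.90, 0.00, 0.85, 0.00, 0.10],
    [0.80, 0.85, 0.00, 0.05, 0.00],
    [0.10, 0.00, 0.05, 0.00, 0.90],
    [0.00, 0.10, 0.00, 0.90, 0.00],
]
n = len(W)
deg = [sum(row) for row in W]
# unnormalized graph Laplacian L = D - W
L = [[(deg[i] if i == j else 0.0) - W[i][j] for j in range(n)] for i in range(n)]

def fiedler_vector(lap, iters=2000):
    """Second-smallest eigenvector of the Laplacian via power iteration
    on (c*I - L), deflating the constant (all-ones) eigenvector."""
    m = len(lap)
    c = 2.0 * max(lap[i][i] for i in range(m))   # shift keeps c*I - L PSD
    v = [(-1.0) ** i for i in range(m)]          # deterministic start
    for _ in range(iters):
        mean = sum(v) / m                        # project out all-ones mode
        v = [x - mean for x in v]
        w = [c * v[i] - sum(lap[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

f = fiedler_vector(L)
clusters = [0 if x < 0.0 else 1 for x in f]   # sign split = 2-way clustering
print(clusters)
```

A full spectral-clustering pipeline would use several eigenvectors plus k-means to recover more than two families; the sign split shown here is the minimal two-cluster case.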

  8. Test Methods for Evaluating Solid Waste, Physical/Chemical Methods. First Update. (3rd edition)

    International Nuclear Information System (INIS)

    Friedman; Sellers.

    1988-01-01

The proposed Update is for Test Methods for Evaluating Solid Waste, Physical/Chemical Methods, SW-846, Third Edition. Attached to the report is a list of methods included in the proposed update, indicating whether each method is a new method, a partially revised method, or a totally revised method. Do not discard or replace any of the current pages in the SW-846 manual until the proposed Update I package is promulgated. Until promulgation of the update package, the methods in it are not officially part of the SW-846 manual and thus do not carry the status of EPA-approved methods. In addition to the proposed Update, six finalized methods are included for immediate inclusion in the Third Edition of SW-846. Four methods, originally proposed October 1, 1984, will be finalized in a soon-to-be-released rulemaking; they are, however, being submitted to subscribers for the first time in this update. These methods are 7211, 7381, 7461, and 7951. Two other methods were finalized in the 2nd Edition of SW-846; they were inadvertently omitted from the 3rd Edition and are not being proposed as new. These methods are 7081 and 7761.

  9. Evaluation of cost reduction method for manufacturing ODS ferritic claddings

    International Nuclear Information System (INIS)

    Fujiwara, Masayuki; Mizuta, Shunji; Ukai, Shigeharu

    2000-04-01

For evaluating fast reactor system technology, it is important to assess the practical feasibility of ODS ferritic claddings, the most promising material for attaining the goals of high coolant temperature and burnup above 150 GWd/t. Based on the results of their technology development, a mass-production process with high economic benefit was outlined and a preliminary manufacturing cost estimate for ODS ferritic claddings was made. From the viewpoint of future utility scale, the cost of manufacturing mother tubes is the dominant factor in the total manufacturing cost. Methods to reduce the cost of mother tube manufacturing were also preliminarily investigated. (author)

  10. A new method to evaluate neutron spectra for bnct

    International Nuclear Information System (INIS)

    Martin Hernandez, Guido

    2001-01-01

This paper deals with the development of a method to evaluate neutron spectra for BNCT. Physical dose deposition calculations were performed for different neutron energies, ranging from thermal to fast. A matrix containing the dose for each energy and each position on the beam center line was obtained, using MCNP 4B and Snyder's head model. A simple computer code containing the matrix calculates the dose at each point on the beam center line for an input energy spectrum to be evaluated. The output of this program is the dose distribution in the brain and the dose gain, that is, the ratio between the dose to the tumor and the maximum dose to healthy tissue.
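The matrix-based evaluation described above can be sketched as a spectrum-weighted sum over energy groups, with the dose gain taken as the ratio of tumor dose to the maximum healthy-tissue dose. The dose matrix, spectrum, enhancement factor and tumor depth below are all invented placeholders standing in for the MCNP-computed values:

```python
# Hypothetical dose-deposition matrix D[e][x]: dose per unit fluence at
# depth x along the beam center line for each source-energy group e.
depths_cm = [0, 2, 4, 6, 8]
D = [
    [1.0, 0.6, 0.3, 0.15, 0.05],   # thermal group   (made-up values)
    [0.7, 0.9, 0.6, 0.30, 0.10],   # epithermal group (made-up values)
    [0.9, 0.7, 0.4, 0.20, 0.08],   # fast group      (made-up values)
]
spectrum = [0.1, 0.8, 0.1]         # candidate spectrum to evaluate

# dose at each depth = spectrum-weighted sum over the energy groups
dose = [sum(s * D[e][x] for e, s in enumerate(spectrum))
        for x in range(len(depths_cm))]

boron_factor = 3.0                 # assumed tumor dose enhancement from B-10
tumor_depth_index = 2              # tumor placed at 4 cm (illustrative)
dose_gain = boron_factor * dose[tumor_depth_index] / max(dose)
print([round(d, 3) for d in dose], round(dose_gain, 3))
```

Evaluating many candidate spectra then reduces to repeating this matrix-vector product, which is why precomputing the dose matrix once with MCNP makes spectrum screening cheap.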

  11. International intercomparison of neutron spectra evaluating methods using activation detectors

    International Nuclear Information System (INIS)

    Fischer, A.

    1975-06-01

    The international intercomparison of neutron spectrum evaluation methods using activation detectors was organized by the IAEA in 1971 - 1972. All of the contributions and the results of a critical evaluation are presented here. The spectra of different contributors are compared to a reference spectrum by means of different integrals and weighting functions. Different cross section sets, foil numbers, energy point systems, guess spectra used by the contributors cause differences in the resulting spectra. The possible ways of separating these effects are also investigated. Suggestions are made for the organization of a new intercomparison on the basis of more uniform input data. (orig.) [de

  12. Performance evaluation methods and instrumentation for mine ventilation fans

    Institute of Scientific and Technical Information of China (English)

    LI Man; WANG Xue-rong

    2009-01-01

Ventilation fans are among the most important pieces of equipment in coal mines. Their performance plays an important role in the safety of staff and production. Given the actual requirements of coal mine production, we instituted a research project on measurement methods for key performance parameters such as wind pressure, air volume and power. Finally, a virtual instrument for evaluating mine ventilation fan performance was developed using a USB interface. The practical performance and analytical results of our experiments show that the proposed instrumentation is feasible, reliable and effective for mine ventilation performance evaluation.

  13. Genome-wide evaluation of histone methylation changes associated with leaf senescence in Arabidopsis.

    Directory of Open Access Journals (Sweden)

    Judy A Brusslan

    Full Text Available Leaf senescence is the orderly dismantling of older tissue that allows recycling of nutrients to developing portions of the plant and is accompanied by major changes in gene expression. Histone modifications correlate to levels of gene expression, and this study utilizes ChIP-seq to classify activating H3K4me3 and silencing H3K27me3 marks on a genome-wide scale for soil-grown mature and naturally senescent Arabidopsis leaves. ChIPnorm was used to normalize data sets and identify genomic regions with significant differences in the two histone methylation patterns, and the differences were correlated to changes in gene expression. Genes that showed an increase in the H3K4me3 mark in older leaves were senescence up-regulated, while genes that showed a decrease in the H3K4me3 mark in the older leaves were senescence down-regulated. For the H3K27me3 modification, genes that lost the H3K27me3 mark in older tissue were senescence up-regulated. Only a small number of genes gained the H3K27me3 mark, and these were senescence down-regulated. Approximately 50% of senescence up-regulated genes lacked the H3K4me3 mark in both mature and senescent leaf tissue. Two of these genes, SAG12 and At1g73220, display strong senescence up-regulation without the activating H3K4me3 histone modification. This study provides an initial epigenetic framework for the developmental transition into senescence.

  14. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting a digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages; if two or more techniques can be incorporated in a complementary way, the resulting SSA combination is more acceptable. With proper evaluation criteria available, the analyst can choose an appropriate combination of techniques on the basis of available resources. This research evaluated the software safety analysis techniques applicable today, namely Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics: dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive because of the difficulty of obtaining acceptable software failure rates, although the systematic architecture of FTA and Markov chain modeling remains valuable for characterizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are completeness and complexity.
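Selecting an SSA combination against weighted criteria, as the abstract proposes, is a multi-criteria scoring exercise. A minimal sketch in which the ratings and weights are invented purely for illustration (the paper derives its indexes from the techniques' actual characteristics):

```python
# Hypothetical weighted-sum comparison of SSA technique combinations.
# Criteria names follow the abstract; ratings (1-5) and weights are assumed.

CRITERIA = ["dynamic capability", "completeness", "achievability",
            "detail", "signal/noise ratio", "complexity", "cost"]
WEIGHTS = [0.2, 0.2, 0.15, 0.15, 0.1, 0.1, 0.1]  # assumed, sums to 1

def score(ratings: list[float]) -> float:
    """Weighted sum of per-criterion ratings (higher is better)."""
    return sum(w * r for w, r in zip(WEIGHTS, ratings))

combos = {
    "PHA+FMEA+FTA+Markov": [2, 4, 1, 3, 2, 2, 2],  # weak achievability
    "DFM":                 [5, 3, 4, 4, 4, 2, 3],
    "Simulation-based":    [5, 3, 4, 5, 4, 2, 2],
}
best = max(combos, key=lambda name: score(combos[name]))
print(best, round(score(combos[best]), 2))
```

The weighted-sum form keeps the trade-off explicit: a combination with poor achievability (e.g. one that needs unobtainable software failure rates) is penalized even if it scores well elsewhere.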

  15. Statistical power and utility of meta-analysis methods for cross-phenotype genome-wide association studies.

    Science.gov (United States)

    Zhu, Zhaozhong; Anttila, Verneri; Smoller, Jordan W; Lee, Phil H

    2018-01-01

    Advances in recent genome-wide association studies (GWAS) suggest that pleiotropic effects on human complex traits are widespread. A number of classic and recent meta-analysis methods have been used to identify genetic loci with pleiotropic effects, but the overall performance of these methods is not well understood. In this work, we use extensive simulations and case studies of GWAS datasets to investigate the power and type-I error rates of ten meta-analysis methods. We specifically focus on three conditions commonly encountered in studies of multiple traits: (1) extensive heterogeneity of genetic effects; (2) characterization of trait-specific associations; and (3) inflated correlation of GWAS due to overlapping samples. Although statistical power is highly variable under distinct study conditions, several methods showed superior power under diverse heterogeneity. In particular, the classic fixed-effects model performed surprisingly well when a variant is associated with more than half of the study traits. As the number of traits with null effects increases, ASSET performed best, with competitive specificity and sensitivity. With opposite directional effects, CPASSOC showed first-rate power; however, caution is advised when using CPASSOC to study genetically correlated traits with overlapping samples. We conclude with a discussion of unresolved issues and directions for future research.
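The classic fixed-effects model the abstract highlights combines per-trait effect estimates for one variant by inverse-variance weighting. A hedged sketch with synthetic summary statistics (real cross-phenotype methods such as ASSET and CPASSOC add heterogeneity handling and sample-overlap corrections that this omits):

```python
# Inverse-variance-weighted fixed-effects meta-analysis of GWAS summary
# statistics for a single variant across traits. Input numbers are synthetic.
import math

def fixed_effects_meta(betas, ses):
    """Return the pooled estimate, its standard error, and the z-score."""
    weights = [1.0 / se**2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se, beta / se

# One variant tested against three traits (synthetic effect sizes and SEs)
betas = [0.10, 0.08, 0.12]
ses = [0.03, 0.04, 0.03]
b, se, z = fixed_effects_meta(betas, ses)
print(f"beta={b:.3f} se={se:.3f} z={z:.2f}")
```

Because the weights are 1/SE^2, precisely estimated traits dominate the pooled estimate, which is why the model loses power when many traits have null or opposite-direction effects.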

  16. Systematic evaluation of observational methods assessing biomechanical exposures at work

    DEFF Research Database (Denmark)

    Takala, Esa-Pekka; Irmeli, Pehkonen; Forsman, Mikael

    2009-01-01

    Numerous methods have been developed to assess physical workload (biomechanical exposures) in order to identify hazards leading to musculoskeletal disorders, to monitor the effects of ergonomic changes, and for research; no individual method covers all needs, so this evaluation sorts the methods according to the several items evaluated. The methods compared include the NIOSH Lifting Equation, Arbouw, the ACGIH Lifting TLV, MAC, ManTRA, the New Zealand Code for Manual Handling, the Washington state ergonomic rule, and BackEST, rated on correspondence with a valid reference, repeatability between observers, and potential users (observers, researchers, workers). MAC (UK), ManTRA (Australia), and the New Zealand code are widely used for the assessment of risks in manual materials handling, but we did not find formal studies on the validity of these methods. The inter-observer repeatability of MAC and the Washington state model has been found to be moderate.

  17. Performance evaluation of sea surface simulation methods for target detection

    Science.gov (United States)

    Xia, Renjie; Wu, Xin; Yang, Chen; Han, Yiping; Zhang, Jianqi

    2017-11-01

    With the fast development of sea surface target detection by optoelectronic sensors, machine learning has been adopted to improve detection performance. Many features can be learned automatically from training images; however, field images of sea surface targets are not sufficient as training data. 3D scene simulation is a promising way to address this problem, and for ocean scene simulation, sea surface height field generation is the key to achieving high fidelity. In this paper, two spectrum-based height field generation methods are evaluated: the linear superposition method and the linear filter method are compared quantitatively with a statistical model. 3D ocean scene simulation results show the differing characteristics of the two methods, which can serve as a reference for synthesizing sea surface target images under different ocean conditions.
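The linear superposition method the abstract compares builds a height field as a sum of wave components whose amplitudes follow a prescribed spectrum. A 1-D toy sketch under assumed parameters (the Phillips-type spectrum, mode count, and physical constants below are illustrative, not the paper's setup):

```python
# 1-D sea surface height field by linear superposition: cosine components
# with spectrum-scaled amplitudes and random phases. All parameters assumed.
import math
import random

def phillips_like(k: float, alpha: float = 0.0081, g: float = 9.81,
                  wind: float = 10.0) -> float:
    """Toy 1-D wave spectrum: Phillips-type k^-3 tail with a low-k cutoff."""
    if k <= 0:
        return 0.0
    kp = g / wind**2  # peak wavenumber scale
    return alpha / k**3 * math.exp(-(kp / k) ** 2)

def height_field(n_points: int, length: float, n_modes: int = 64,
                 seed: int = 0) -> list[float]:
    """Superpose n_modes cosines; amplitude of mode k is sqrt(2*S(k)*dk)."""
    rng = random.Random(seed)
    dk = 2 * math.pi / length
    modes = []
    for m in range(1, n_modes + 1):
        k = m * dk
        amp = math.sqrt(2 * phillips_like(k) * dk)
        modes.append((k, amp, rng.uniform(0, 2 * math.pi)))
    xs = [i * length / n_points for i in range(n_points)]
    return [sum(a * math.cos(k * x + ph) for k, a, ph in modes) for x in xs]

h = height_field(256, 200.0)
print(f"height range: {min(h):.2f} .. {max(h):.2f} m")
```

The linear filter alternative would instead shape white noise in the frequency domain with sqrt(S(k)) and inverse-transform it; both target the same spectral statistics.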

  18. Laboratory-scale evaluations of alternative plutonium precipitation methods

    International Nuclear Information System (INIS)

    Martella, L.L.; Saba, M.T.; Campbell, G.K.

    1984-01-01

    Plutonium(III), (IV), and (VI) carbonate; plutonium(III) fluoride; plutonium(III) and (IV) oxalate; and plutonium(IV) and (VI) hydroxide precipitation methods were evaluated for conversion of plutonium nitrate anion-exchange eluate to a solid, and compared with the current plutonium peroxide precipitation method used at Rocky Flats. Plutonium(III) and (IV) oxalate, plutonium(III) fluoride, and plutonium(IV) hydroxide precipitations were the most effective of the alternative conversion methods tested because of the larger particle-size formation, faster filtration rates, and the low plutonium loss to the filtrate. These were found to be as efficient as, and in some cases more efficient than, the peroxide method. 18 references, 14 figures, 3 tables

  19. Comparison of evaluation results of piping thermal fatigue evaluation method based on equivalent stress amplitude

    International Nuclear Information System (INIS)

    Suzuki, Takafumi; Kasahara, Naoto

    2012-01-01

    In recent years, reports of failures caused by high cycle thermal fatigue have increased at both light water reactors and fast breeder reactors. One cause is turbulent mixing at a Tee-junction of a coolant system, where hot and cold fluids mix. To prevent thermal fatigue failures at Tee-junctions, the Japan Society of Mechanical Engineers published a guideline providing an evaluation method for high cycle thermal fatigue damage in nuclear piping. In order to rationalize the safety margin and make the guideline's procedure concise, this paper proposes a new evaluation method for thermal fatigue damage based on an 'equivalent stress amplitude.' Because the new method makes the evaluation procedure clear and concise, it will contribute to improving the guideline for thermal fatigue evaluation. (author)
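One common way to define an equivalent stress amplitude (not necessarily the paper's definition, which is not given in this record) is the single amplitude that, applied for the total cycle count, yields the same Miner's-rule damage as the mixed cycle history under a power-law fatigue curve N = C / S^m. A sketch with an assumed exponent and synthetic history:

```python
# Equivalent stress amplitude under Miner's rule with a power-law S-N curve.
# The exponent m and the stress history are illustrative assumptions.

def equivalent_amplitude(cycles: list[tuple[float, int]], m: float = 3.0) -> float:
    """S_eq such that n_total cycles at S_eq give the same cumulative
    damage as the mixed history, assuming N(S) = C / S**m (C cancels)."""
    n_total = sum(n for _, n in cycles)
    return (sum(n * s**m for s, n in cycles) / n_total) ** (1.0 / m)

# Synthetic thermal stress history: (amplitude in MPa, cycle count)
history = [(120.0, 1_000), (80.0, 10_000), (40.0, 100_000)]
print(round(equivalent_amplitude(history), 1), "MPa")
```

Collapsing a fluctuating stress history to one amplitude in this way is what lets a guideline procedure compare a Tee-junction mixing case directly against a design fatigue curve.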

  20. A simple and rapid lysis method for preparation of genomic DNA ...

    African Journals Online (AJOL)

    Yomi

    2011-12-07
