Sample records for splice-site analysis tools

  1. Systematic Analysis of Splice-Site-Creating Mutations in Cancer

    Directory of Open Access Journals (Sweden)

    Reyka G. Jayasinghe


    Full Text Available Summary: For the past decade, cancer genomic studies have focused on mutations leading to splice-site disruption, overlooking those with splice-site-creating potential. Here, we applied a bioinformatic tool, MiSplice, for the large-scale discovery of splice-site-creating mutations (SCMs) across 8,656 TCGA tumors. We report 1,964 originally mis-annotated mutations with clear evidence of creating alternative splice junctions. TP53 and GATA3 have 26 and 18 SCMs, respectively, and ATRX has 5 from lower-grade gliomas. Mutations in 11 genes, including PARP1, BRCA1, and BAP1, were experimentally validated for splice-site-creating function. Notably, we found that neoantigens induced by SCMs are likely several-fold more immunogenic than those from missense mutations, exemplified by the recurrent GATA3 SCM. Further, high expression of PD-1 and PD-L1 was observed in tumors with SCMs, suggesting candidates for immune blockade therapy. Our work highlights the importance of integrating DNA and RNA data for understanding the functional and clinical implications of mutations in human diseases. : Jayasinghe et al. identify nearly 2,000 splice-site-creating mutations (SCMs) from over 8,000 tumor samples across 33 cancer types. They provide a more accurate interpretation of previously mis-annotated mutations, highlighting the importance of integrating data types to understand the functional and clinical implications of splicing mutations in human disease. Keywords: splicing, RNA, mutations of clinical relevance

  2. Analysis and recognition of 5' UTR intron splice sites in human pre-mRNA

    DEFF Research Database (Denmark)

    Eden, E.; Brunak, Søren


    Prediction of splice sites in non-coding regions of genes is one of the most challenging aspects of gene structure recognition. We perform a rigorous analysis of such splice sites embedded in human 5' untranslated regions (UTRs), and investigate correlations between this class of splice sites...... and other features found in the adjacent exons and introns. By restricting the training of neural network algorithms to 'pure' UTRs (not extending partially into protein coding regions), we for the first time investigate the predictive power of the splicing signal proper, in contrast to conventional splice...... in the synaptic weights of the neural networks trained to identify UTR donor sites. Conventional splice site prediction methods perform poorly in UTRs because the reading frame pattern is absent. The NetUTR method presented here performs 2-3-fold better than NetGene2 and GenScan in 5' UTRs. We also...

  3. Analysis of 30 putative BRCA1 splicing mutations in hereditary breast and ovarian cancer families identifies exonic splice site mutations that escape in silico prediction.

    Directory of Open Access Journals (Sweden)

    Barbara Wappenschmidt

    Full Text Available Screening for pathogenic mutations in breast and ovarian cancer genes such as BRCA1/2, CHEK2 and RAD51C is common practice for individuals from high-risk families. However, test results may be ambiguous due to the presence of unclassified variants (UCV) in the concurrent absence of clearly cancer-predisposing mutations. In particular, the presence of intronic or exonic variants within these genes that possibly affect proper pre-mRNA processing poses a challenge, as their functional implications are not immediately apparent. Therefore, it appears necessary to characterize potential splicing UCV and to develop appropriate classification tools. We investigated 30 distinct BRCA1 variants, both intronic and exonic, regarding their spliceogenic potential using commonly used in silico prediction algorithms (HSF, MaxEntScan) along with in vitro transcript analyses. A total of 25 variants were identified as spliceogenic, causing or enhancing exon skipping, activation of cryptic splice sites, or both. Apart from a single intronic variant causing minor effects on BRCA1 pre-mRNA processing in our analyses, 23 out of 24 intronic variants were correctly predicted by MaxEntScan, while HSF was less accurate in this cohort. Among the 6 exonic variants analyzed, 4 severely impair correct pre-mRNA processing, while the remaining two have partial effects. In contrast to the intronic alterations investigated, only half of the spliceogenic exonic variants were correctly predicted by HSF and/or MaxEntScan. These data support the idea that exonic splicing mutations are commonly disease-causing and concurrently prone to escape in silico prediction, hence necessitating experimental in vitro splicing analysis.

  4. Splice Site Mutations in the ATP7A Gene

    DEFF Research Database (Denmark)

    Skjørringe, Tina; Tümer, Zeynep; Møller, Lisbeth Birk


    Menkes disease (MD) is caused by mutations in the ATP7A gene. We describe 33 novel splice site mutations detected in patients with MD or the milder phenotypic form, Occipital Horn Syndrome. We review these 33 mutations together with 28 previously published splice site mutations. We investigate 12 mutations for their effect on the mRNA transcript in vivo. Transcriptional data from another 16 mutations were collected from the literature. The theoretical consequences of splice site mutations, predicted with the bioinformatics tool Human Splice Finder, were investigated and evaluated in relation to in vivo results. Ninety-six percent of the mutations identified in 45 patients with classical MD were predicted to have a significant effect on splicing, which concurs with the absence of any detectable wild-type transcript in all 19 patients investigated in vivo. Sixty-seven percent of the mutations...

  5. Sequence Analysis of In Vivo-Expressed HIV-1 Spliced RNAs Reveals the Usage of New and Unusual Splice Sites by Viruses of Different Subtypes. (United States)

    Vega, Yolanda; Delgado, Elena; de la Barrera, Jorge; Carrera, Cristina; Zaballos, Ángel; Cuesta, Isabel; Mariño, Ana; Ocampo, Antonio; Miralles, Celia; Pérez-Castro, Sonia; Álvarez, Hortensia; López-Miragaya, Isabel; García-Bodas, Elena; Díez-Fuertes, Francisco; Thomson, Michael M


    HIV-1 RNAs are generated through a complex splicing mechanism, resulting in a great diversity of transcripts, which are classified into three major categories: unspliced, singly spliced (SS), and doubly spliced (DS). Knowledge of HIV-1 RNA splicing in vivo and by non-subtype B viruses is scarce. Here we analyze HIV-1 RNA splice site usage in CD4+CD25+ lymphocytes from HIV-1-infected individuals through pyrosequencing. HIV-1 DS and SS RNAs were amplified by RT-PCR in 19 and 12 samples, respectively. A total of 13,108 sequences from HIV-1 spliced RNAs, derived from viruses of five subtypes (A, B, C, F, G), were identified. In four samples, three of non-B subtypes, five 3' splice sites (3'ss) mapping to unreported positions in the HIV-1 genome were identified. Two, designated A4i and A4j, were used in 22% and 25% of rev RNAs in two viruses of subtypes B and A, respectively. Given their close proximity (one or two nucleotides) to A4c and A4d, respectively, they could be viewed as variants of these sites. Three 3'ss, designated A7g, A7h, and A7i, located 20, 32, and 18 nucleotides downstream of A7, respectively, were identified in a subtype C virus (A7g, A7h) and a subtype G virus (A7i), each in around 2% of nef RNAs. The new splice sites or variants of splice sites were associated with the usual sequence features of 3'ss. Usage of the unusual 3'ss A4d, A4e, A5a, A7a, and A7b was also detected. A4f, previously identified in two subtype C viruses, was preferentially used by rev RNAs of a subtype C virus. These results highlight the great diversity of in vivo splice site usage by HIV-1 RNAs. The fact that four of the five newly identified splice sites or variants of splice sites were detected in non-subtype B viruses suggests an even greater diversity of HIV-1 splice site usage than currently known.

  6. Splice site mutations in the ATP7A gene.

    Directory of Open Access Journals (Sweden)

    Tina Skjørringe

    Full Text Available Menkes disease (MD) is caused by mutations in the ATP7A gene. We describe 33 novel splice site mutations detected in patients with MD or the milder phenotypic form, Occipital Horn Syndrome. We review these 33 mutations together with 28 previously published splice site mutations. We investigate 12 mutations for their effect on the mRNA transcript in vivo. Transcriptional data from another 16 mutations were collected from the literature. The theoretical consequences of splice site mutations, predicted with the bioinformatics tool Human Splice Finder, were investigated and evaluated in relation to in vivo results. Ninety-six percent of the mutations identified in 45 patients with classical MD were predicted to have a significant effect on splicing, which concurs with the absence of any detectable wild-type transcript in all 19 patients investigated in vivo. Sixty-seven percent of the mutations identified in 12 patients with milder phenotypes were predicted to have no significant effect on splicing, which concurs with the presence of wild-type transcript in 7 out of 9 patients investigated in vivo. Both the in silico predictions and the in vivo results support the hypothesis previously suggested by us and others, that the presence of some wild-type transcript is correlated with a milder phenotype.

  7. The emergence of alternative 3' and 5' splice site exons from constitutive exons.

    Directory of Open Access Journals (Sweden)

    Eli Koren


    Full Text Available Alternative 3' and 5' splice site (ss) events constitute a significant part of all alternative splicing events. These events have also been found to be related to several aberrant splicing diseases. However, only a few of the characteristics that distinguish these events from alternative cassette exons are currently known. In this study, we compared the characteristics of constitutive exons, alternative cassette exons, and alternative 3'ss and 5'ss exons. The results revealed that alternative 3'ss and 5'ss exons are an intermediate state between constitutive and alternative cassette exons, where the constitutive side resembles constitutive exons and the alternative side resembles alternative cassette exons. The results also show that alternative 3'ss and 5'ss exons exhibit low levels of symmetry (frame preservation), similar to constitutive exons, whereas the sequence between the two alternative splice sites shows high symmetry levels, similar to alternative cassette exons. In addition, flanking intronic conservation analysis revealed that exons whose alternative splice sites are at least nine nucleotides apart show a high conservation level, indicating intronic participation in the regulation of their splicing, whereas exons whose alternative splice sites are fewer than nine nucleotides apart show a low conservation level. Further examination of these exons, spanning seven vertebrate species, suggests an evolutionary model in which the alternative state is a derivative of an ancestral constitutive exon, where a mutation inside the exon or along the flanking intron resulted in the creation of a new splice site that competes with the original one, leading to alternative splice site selection. This model was validated experimentally on four exons, showing that they indeed originated from constitutive exons that acquired a new competing splice site during evolution.

  8. Method of predicting Splice Sites based on signal interactions

    Directory of Open Access Journals (Sweden)

    Deogun Jitender S


    Full Text Available Abstract Background Prediction and proper ranking of canonical splice sites (SSs) is a challenging problem in the bioinformatics and machine learning communities. Any progress in SS recognition will lead to a better understanding of the splicing mechanism. We introduce several new approaches for combining a priori knowledge for improved SS detection. First, we design a new Bayesian SS sensor based on oligonucleotide counting. To further enhance prediction quality, we applied our new de novo motif detection tool MHMMotif to intronic ends and exons. We combine the elements found with sensor information using a Naive Bayesian Network, as implemented in our new tool SpliceScan. Results According to our tests, the Bayesian sensor outperforms the contemporary Maximum Entropy sensor for 5' SS detection. We report a number of putative Exonic (ESE) and Intronic (ISE) Splicing Enhancers found by the MHMMotif tool. T-test statistics on mouse/rat intronic alignments indicate that the detected elements are on average more conserved than other oligos, which supports our assumption of their functional importance. The tool has been shown to outperform the SpliceView, GeneSplicer, NNSplice, Genio and NetUTR tools on the test set of human genes. SpliceScan outperforms all contemporary ab initio gene structure prediction tools on the set of 5' UTR gene fragments. Conclusion The designed methods have many attractive properties compared to existing approaches. The Bayesian sensor, MHMMotif program and SpliceScan tools are freely available on our web site. Reviewers This article was reviewed by Manyuan Long, Arcady Mushegian and Mikhail Gelfand.
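    The record above describes a Bayesian splice-site sensor built on oligonucleotide counting. A minimal, hypothetical sketch of that idea (this is not the SpliceScan implementation; the training sequences, k-mer length, and add-one smoothing are all illustrative assumptions): k-mer frequencies are counted separately in known splice-site windows and in background sequence, and a candidate window is scored by its naive-Bayes log-odds.

```python
from collections import Counter
from math import log

def train_kmer_model(seqs, k=3):
    """Estimate smoothed k-mer probabilities from a set of training sequences."""
    counts = Counter()
    total = 0
    for s in seqs:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
            total += 1
    vocab = 4 ** k  # add-one smoothing over all possible DNA k-mers
    return lambda kmer: (counts[kmer] + 1) / (total + vocab)

def log_odds(seq, true_model, bg_model, k=3):
    """Naive-Bayes log-odds that `seq` comes from the splice-site class."""
    return sum(log(true_model(seq[i:i + k]) / bg_model(seq[i:i + k]))
               for i in range(len(seq) - k + 1))

# Toy training data: 9-nt windows around donor sites vs. random background.
true_sites = ["CAGGTAAGT", "AAGGTGAGT", "CAGGTGAGA"]
background = ["ACGTACGTA", "TTTTCCCCA", "GGGAGAGAG"]
true_model = train_kmer_model(true_sites)
bg_model = train_kmer_model(background)

print(log_odds("CAGGTAAGT", true_model, bg_model))  # positive: resembles a donor site
print(log_odds("TTTTCCCCA", true_model, bg_model))  # negative: resembles background
```

    A real sensor would be trained on thousands of annotated sites and combined with motif evidence, as the abstract describes; the scoring principle is the same.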

  9. Identification of a novel splice-site mutation in MIP in a Chinese congenital cataract family. (United States)

    Jiang, Jin; Jin, Chongfei; Wang, Wei; Tang, Xiajing; Shentu, Xingchao; Wu, Renyi; Wang, Yao; Xia, Kun; Yao, Ke


    To map the locus and identify the gene causing autosomal dominant congenital cataract (ADCC) with a "snail-like" phenotype in a large Chinese family. Clinical and ophthalmologic examinations were conducted on family members and documented by slit lamp photography. Linkage analysis was performed with an initial 41 microsatellite markers, then with 3 additional markers flanking the major intrinsic protein (MIP) gene. Mutations were screened by DNA sequencing and verified by restriction fragment length polymorphism (RFLP) analysis. Significant two-point LOD scores were obtained at 5 markers flanking MIP, with the highest, 3.08 (theta=0.00), at marker D12S1632. Mutation screening of MIP identified a heterozygous G>A transition at the acceptor splice site of intron 3 (IVS3 -1 G>A), abolishing a BstSF I restriction site in one allele of all the affected individuals. We identified a novel splice-site mutation (IVS3 -1 G>A) in MIP in a Chinese ADCC family. To our knowledge, this is the first report of an acceptor splice-site mutation in a human gene associated with ADCC.

  10. Intraspecific variations of Dekkera/Brettanomyces bruxellensis genome studied by capillary electrophoresis separation of the intron splice site profiles. (United States)

    Vigentini, Ileana; De Lorenzis, Gabriella; Picozzi, Claudia; Imazio, Serena; Merico, Annamaria; Galafassi, Silvia; Piškur, Jure; Foschino, Roberto


    In enology, "Brett" character refers to wine spoilage caused by the yeast Dekkera/Brettanomyces bruxellensis and its production of volatile phenolic off-flavours. However, the spoilage potential of this yeast is strain-dependent; therefore, rapid and reliable recognition at the strain level is key to avoiding serious economic losses. The present work provides an operative tool to assess genetic intraspecific variation in this species using introns as molecular targets. First, the available partial D./B. bruxellensis genome sequence was examined in order to design primers annealing to the intron 5' splice site sequence (ISS). This analysis detected a non-random vocabulary flanking the site and, exploiting this feature, allowed the creation of specific probes for strain discrimination. Second, separation of the intron splice site PCR fragments was achieved through the setup of a capillary electrophoresis protocol, giving a 94% repeatability threshold under our experimental conditions. Comparison of the results obtained with ISS-PCR/CE against those obtained by mtDNA RFLP revealed that the former protocol is more discriminating and allowed reliable identification at the strain level. Sixty D./B. bruxellensis isolates were recognised as unique strains, showing a level of similarity below 79% and confirming the high genetic polymorphism existing within the species. Two main clusters grouped at similarity levels of about 46% and 47%, respectively, showing a poor correlation with the geographic area of isolation. Moreover, from an evolutionary point of view, the proposed technique could determine the frequency of the genome rearrangements that can occur in D./B. bruxellensis populations. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. GC content around splice sites affects splicing through pre-mRNA secondary structures

    Directory of Open Access Journals (Sweden)

    Chen Liang


    Full Text Available Abstract Background Alternative splicing increases protein diversity by generating multiple transcript isoforms from a single gene through different combinations of exons or through different selections of splice sites. It has been reported that RNA secondary structures are involved in alternative splicing. Here we perform a genomic study of RNA secondary structures around splice sites in humans (Homo sapiens), mice (Mus musculus), fruit flies (Drosophila melanogaster), and nematodes (Caenorhabditis elegans) to further investigate this phenomenon. Results We observe that GC content around splice sites is closely associated with splice site usage in multiple species. RNA secondary structure is a possible explanation, because the difference in structural stability among alternative splice sites, constitutive splice sites, and skipped splice sites can be explained by the difference in GC content. Alternative splice sites tend to be GC-enriched and exhibit more stable RNA secondary structures in all of the considered species. In humans and mice, splice sites of first exons and long exons tend to be GC-enriched and hence form more stable structures, indicating a special role for RNA secondary structures in promoter-proximal splicing events and the splicing of long exons. In addition, GC-enriched exon-intron junctions tend to be overrepresented in tissue-specific alternative splice sites, indicating a functional consequence of the GC effect. Compared with regions far from splice sites and decoy splice sites, real splice sites are GC-enriched. We also found that the GC-content effect is much stronger than the nucleotide-order effect in forming stable secondary structures. Conclusion All of these results indicate that GC content is related to splice site usage and may mediate the splicing process through RNA secondary structures.
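    The GC-content analysis described above reduces to a simple windowed computation around each splice site. A minimal sketch under assumed window sizes (the function names and the 50-nt default window are illustrative, not taken from the paper):

```python
def gc_content(seq):
    """Fraction of G and C bases in a sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def flanking_gc(genome, site_pos, window=50):
    """GC content in fixed-size windows immediately upstream and downstream
    of a splice site at 0-based position `site_pos`."""
    upstream = genome[max(0, site_pos - window):site_pos]
    downstream = genome[site_pos:site_pos + window]
    return gc_content(upstream), gc_content(downstream)

# Toy example: a GC-rich exon end followed by an AT-rich intron start.
seq = "GCGCGGCCGC" + "ATATTAATAT"
up_gc, down_gc = flanking_gc(seq, site_pos=10, window=10)
print(up_gc, down_gc)  # 1.0 0.0
```

    Comparing such window statistics between alternative, constitutive, and skipped splice sites is the kind of genome-wide tabulation the study reports.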

  12. A novel AVPR2 splice site mutation leads to partial X-linked nephrogenic diabetes insipidus in two brothers. (United States)

    Schernthaner-Reiter, Marie Helene; Adams, David; Trivellin, Giampaolo; Ramnitz, Mary Scott; Raygada, Margarita; Golas, Gretchen; Faucz, Fabio R; Nilsson, Ola; Nella, Aikaterini A; Dileepan, Kavitha; Lodish, Maya; Lee, Paul; Tifft, Cynthia; Markello, Thomas; Gahl, William; Stratakis, Constantine A


    X-linked nephrogenic diabetes insipidus (NDI, OMIM#304800) is caused by mutations in the arginine vasopressin (AVP, OMIM*192340) receptor type 2 (AVPR2, OMIM*300538) gene. A 20-month-old boy and his 8-year-old brother presented with polyuria, polydipsia, and failure to thrive. Both boys demonstrated partial DDAVP (1-desamino-8-D-AVP, or desmopressin) responses; thus, the NDI diagnosis was delayed. While routine sequencing of AVPR2 showed a potential splice site variant, it was not until exome sequencing confirmed the AVPR2 splice site variant and did not reveal any more likely candidates that the patients' diagnosis was made and proper treatment was instituted. Both patients were hemizygous for two AVPR2 variants predicted in silico to affect AVPR2 messenger RNA (mRNA) splicing. A minigene assay revealed that the novel AVPR2 c.276A>G mutation creates a novel splice acceptor site leading to 5' truncation of AVPR2 exon 2 in HEK293 human kidney cells. Both patients have been treated with high-dose DDAVP with a remarkable improvement of their symptoms and accelerated linear growth and weight gain. We present here a unique case of partial X-linked NDI due to an AVPR2 splice site mutation; patients with diabetes insipidus of unknown etiology may harbor splice site mutations that are initially underestimated in their pathogenicity on sequence analysis.
    What is Known: • X-linked nephrogenic diabetes insipidus is caused by AVPR2 mutations, and disease severity can vary depending on the functional effect of the mutation.
    What is New: • We demonstrate here that a splice site mutation in AVPR2 leads to partial X-linked NDI in two brothers. • Treatment with high-dose DDAVP led to improvement of polyuria and polydipsia, weight gain, and growth.

  13. Oriented scanning is the leading mechanism underlying 5' splice site selection in mammals.

    Directory of Open Access Journals (Sweden)

    Keren Borensztajn


    Full Text Available Splice site selection is a key element of pre-mRNA splicing. Although it is known to involve specific recognition of short consensus sequences by the splicing machinery, the mechanisms by which 5' splice sites are accurately identified remain controversial and incompletely resolved. The human F7 gene contains in its seventh intron (IVS7) a 37-bp VNTR minisatellite whose first element spans the exon7-IVS7 boundary. As a consequence, the IVS7 authentic donor splice site is followed by several cryptic splice sites identical in sequence, referred to as 5' pseudo-sites, which normally remain silent. This region therefore provides a remarkable model for deciphering the mechanism underlying 5' splice site selection in mammals. We previously suggested a model for splice site selection in which, in the presence of consecutive splice consensus sequences, the most upstream 5' splice site is exclusively selected, rather than the downstream pseudo-sites being repressed. In the present study, we provide experimental support for this hypothesis using a mutational approach involving a panel of 50 mutant and wild-type F7 constructs expressed in various cell types. We demonstrate that the F7 IVS7 5' pseudo-sites are functional but do not compete with the authentic donor splice site. Moreover, we show that 5' splice site selection follows a scanning-type mechanism, precluding competition with other functional 5' pseudo-sites in the immediate sequence context downstream of the activated site. In addition, 5' pseudo-sites with complementarity to U1 snRNA of up to 91% do not compete with the identified scanning mechanism. Altogether, these findings, which unveil a cell-type-independent, 5'-to-3'-oriented scanning process for accurate recognition of the authentic 5' splice site, reconcile apparently contradictory observations by establishing a hierarchy of competitiveness among the determinants involved in 5' splice site selection.
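    The U1 snRNA complementarity figure quoted above ("up to 91%") can be approximated at the DNA level by counting matches to the 9-nt donor consensus, since that window is what base-pairs with the free 5' end of U1 snRNA. A toy proxy, not the authors' scoring method:

```python
# DNA-level consensus of the 9-nt region (positions -3 to +6) that base-pairs
# with the free 5' end of U1 snRNA; matching it is a proxy for U1 complementarity.
U1_CONSENSUS = "CAGGTAAGT"

def u1_complementarity(site):
    """Fraction of the 9 positions matching the consensus (0.0-1.0)."""
    site = site.upper()
    if len(site) != len(U1_CONSENSUS):
        raise ValueError("expected a 9-nt window spanning the 5' splice site")
    return sum(a == b for a, b in zip(site, U1_CONSENSUS)) / len(U1_CONSENSUS)

print(u1_complementarity("CAGGTAAGT"))  # 1.0 (perfect consensus)
print(u1_complementarity("CAGGTGAGA"))  # partial match, still a plausible donor site
```

    The study's point is that even pseudo-sites with high scores under such a measure lose to the most upstream site, which is why a scanning mechanism rather than pure site-strength competition is invoked.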

  14. Oriented scanning is the leading mechanism underlying 5' splice site selection in mammals

    NARCIS (Netherlands)

    Borensztajn, Keren; Sobrier, Marie-Laure; Duquesnoy, Philippe; Fischer, Anne-Marie; Tapon-Bretaudière, Jacqueline; Amselem, Serge


    Splice site selection is a key element of pre-mRNA splicing. Although it is known to involve specific recognition of short consensus sequences by the splicing machinery, the mechanisms by which 5' splice sites are accurately identified remain controversial and incompletely resolved. The human F7

  15. Prenatal diagnosis and a donor splice site mutation in fibrillin in a family with Marfan syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Godfrey, M.; Vandemark, N.; Wang, M.; Han, J.; Rao, V.H. (Univ. of Nebraska Medical Center, Omaha (United States)); Velinov, M.; Tsipouras, P. (Univ. of Connecticut Health Sciences Center, Farmington (United States)); Wargowski, D.; Becker, J.; Robertson, W.; Droste, S. (Univ. of Wisconsin, Madison (United States))


    The Marfan syndrome, an autosomal dominant connective tissue disorder, is manifested by abnormalities in the cardiovascular, skeletal, and ocular systems. Recently, fibrillin, an elastic-associated microfibrillar glycoprotein, has been linked to the Marfan syndrome, and fibrillin mutations in affected individuals have been documented. In this study, genetic linkage analysis with fibrillin-specific markers was used to establish the prenatal diagnosis in an 11-wk-gestation fetus in a four-generation Marfan kindred. At birth, skeletal changes suggestive of the Marfan syndrome were observed. Reverse transcription-PCR amplification of the fibrillin gene mRNA detected a deletion of 123 bp in one allele in affected relatives. This deletion corresponds to an exon encoding an epidermal growth factor-like motif. Examination of genomic DNA showed a G→C transversion at the +1 consensus donor splice site. 45 refs., 7 figs.

  16. Identification of a 5' splice site mutation in the RPGR gene in a family with X-linked retinitis pigmentosa (RP3)

    NARCIS (Netherlands)

    Dry, K. L.; Manson, F. D.; Lennon, A.; Bergen, A. A.; van Dorp, D. B.; Wright, A. F.


    We have identified a novel RPGR gene mutation in a large Dutch family with X-linked retinitis pigmentosa (RP3). In affected members, a G-->T transversion was found at position +1 of the 5' splice site of intron 5 of the RPGR (retinitis pigmentosa GTPase regulator) gene. Analysis of this mutation at

  17. Two novel splicing mutations in the SLC45A2 gene cause Oculocutaneous Albinism Type IV by unmasking cryptic splice sites. (United States)

    Straniero, Letizia; Rimoldi, Valeria; Soldà, Giulia; Mauri, Lucia; Manfredini, Emanuela; Andreucci, Elena; Bargiacchi, Sara; Penco, Silvana; Gesu, Giovanni P; Del Longo, Alessandra; Piozzi, Elena; Asselta, Rosanna; Primignani, Paola


    Oculocutaneous albinism (OCA) is characterized by hypopigmentation of the skin, hair and eye, and by ophthalmologic abnormalities caused by a deficiency in melanin biosynthesis. OCA type IV (OCA4) is one of the four commonly recognized forms of albinism, and is determined by mutation in the SLC45A2 gene. Here, we investigated the genetic basis of OCA4 in an Italian child. The mutational screening of the SLC45A2 gene identified two novel potentially pathogenic splicing mutations: a synonymous transition (c.888G>A) involving the last nucleotide of exon 3 and a single-nucleotide insertion (c.1156+2dupT) within the consensus sequence of the donor splice site of intron 5. As computer-assisted analysis for mutant splice-site prediction was not conclusive, we investigated the effects of these two variants on pre-mRNA splicing by using an in vitro minigene approach. Production of mutant transcripts in HeLa cells demonstrated that both mutations cause almost complete abolishment of the physiologic donor splice site, with the concomitant unmasking of cryptic donor splice sites. To our knowledge, this work represents the first in-depth molecular characterization of splicing defects in an OCA4 patient.

  18. Features generated for computational splice-site prediction correspond to functional elements

    Directory of Open Access Journals (Sweden)

    Wilbur W John


    Full Text Available Abstract Background Accurate selection of splice sites during the splicing of precursors to messenger RNA requires both relatively well-characterized signals at the splice sites and auxiliary signals in the adjacent exons and introns. We previously described a feature generation algorithm (FGA) that is capable of achieving high classification accuracy on human 3' splice sites. In this paper, we extend the splice-site prediction to 5' splice sites and explore the generated features for biologically meaningful splicing signals. Results We present examples from the observed features that correspond to known signals, both core signals (including the branch site and pyrimidine tract) and auxiliary signals (including GGG triplets and exon splicing enhancers). We present evidence that features identified by FGA include splicing signals not found by other methods. Conclusion Our generated features capture known biological signals in the expected sequence interval flanking splice sites. The method can be easily applied to other species and to similar classification problems, such as tissue-specific regulatory elements, polyadenylation sites, promoters, etc.

  19. Effect of splice-site polymorphisms of the TMPRSS4, NPHP4 and ...

    Indian Academy of Sciences (India)


    ... structural changes in mRNA transcripts as a result of splice-site polymorphisms imply that they may be of biological significance in ... structural change in an mRNA transcript, leading to the production of a ...

  20. Features of 5'-splice-site efficiency derived from disease-causing mutations and comparative genomics

    DEFF Research Database (Denmark)

    Roca, Xavier; Olson, Andrew J; Rao, Atmakuri R


    Many human diseases, including Fanconi anemia, hemophilia B, neurofibromatosis, and phenylketonuria, can be caused by 5'-splice-site (5'ss) mutations that are not predicted to disrupt splicing, according to position weight matrices. By using comparative genomics, we identify pairwise dependencies...
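    The record above refers to position weight matrices (PWMs) as the baseline for judging 5'ss strength. A minimal, illustrative PWM scorer (the four toy training sites and pseudocount are assumptions for demonstration, not the matrices used in the paper): each position's log-odds against a uniform background is summed over the 9-nt window.

```python
import math

def build_pwm(aligned_sites, pseudocount=1.0):
    """Position weight matrix (log-odds vs. a uniform background) estimated
    from a set of aligned splice-site sequences."""
    length = len(aligned_sites[0])
    pwm = []
    for pos in range(length):
        column = [s[pos] for s in aligned_sites]
        pwm.append({base: math.log(((column.count(base) + pseudocount) /
                                    (len(column) + 4 * pseudocount)) / 0.25)
                    for base in "ACGT"})
    return pwm

def pwm_score(pwm, site):
    """Sum of per-position log-odds; higher means closer to the consensus."""
    return sum(pwm[i][base] for i, base in enumerate(site))

# Toy aligned 9-mers spanning a donor site (positions -3..+6, GT at +1/+2).
donors = ["CAGGTAAGT", "AAGGTGAGT", "CAGGTAAGA", "GAGGTAAGT"]
pwm = build_pwm(donors)
print(pwm_score(pwm, "CAGGTAAGT") > pwm_score(pwm, "CAGCTAAGT"))  # True: GT disrupted
```

    Because a PWM treats each position independently, it cannot capture the pairwise dependencies between 5'ss positions that the study identifies, which is exactly why some disease-causing mutations score as harmless under PWMs.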

  1. Ab initio prediction of mutation-induced cryptic splice-site activation and exon skipping

    Czech Academy of Sciences Publication Activity Database

    Divina, Petr; Kvitkovicova, Andrea; Buratti, E.; Vorechovsky, I.


    Vol. 17, No. 6 (2009), pp. 759-765. ISSN 1018-4813. Institutional research plan: CEZ:AV0Z50520514. Keywords: mutation; cryptic splice site; exon skipping. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 3.564, year: 2009

  2. Effect of splice-site polymorphisms of the TMPRSS4, NPHP4 and ...

    Indian Academy of Sciences (India)


    Structural changes in mRNA transcripts as a result of splice-site polymorphisms imply that they may be of biological significance in certain pathological conditions. ... show the genomic structures of the normal (diagram "a") and abnormal (diagrams "b" and "c") splicing forms. Inserted and deleted sequences are indicated ...

  3. OCA2 splice site variant in German Spitz dogs with oculocutaneous albinism. (United States)

    Caduff, Madleina; Bauer, Anina; Jagannathan, Vidhya; Leeb, Tosso


    We investigated a German Spitz family where the mating of a black male to a white female had yielded three puppies with an unexpected light brown coat color, lightly pigmented lips and noses, and blue eyes. Combined linkage and homozygosity analysis based on a fully penetrant monogenic autosomal recessive mode of inheritance identified a critical interval of 15 Mb on chromosome 3. We obtained whole genome sequence data from one affected dog, three wolves, and 188 control dogs. Filtering for private variants revealed a single variant with predicted high impact in the critical interval in LOC100855460 (XM_005618224.1:c.377+2T>G; LT844587.1:c.-45+2T>G). The variant perfectly co-segregated with the phenotype in the family. We genotyped 181 control dogs with normal pigmentation from diverse breeds including 22 unrelated German Spitz dogs, which were all homozygous wildtype. Comparative sequence analyses revealed that LOC100855460 actually represents the 5'-end of the canine OCA2 gene. The CanFam 3.1 reference genome assembly is incorrect and separates the first two exons from the remaining exons of the OCA2 gene. We amplified a canine OCA2 cDNA fragment by RT-PCR and determined the correct full-length mRNA sequence (LT844587.1). Variants in the OCA2 gene cause oculocutaneous albinism type 2 (OCA2) in humans, pink-eyed dilution in mice, and similar phenotypes in corn snakes, medaka and Mexican cave tetra fish. We therefore conclude that the observed oculocutaneous albinism in German Spitz is most likely caused by the identified variant in the 5'-splice site of the first intron of the canine OCA2 gene.

  5. A novel BTK gene mutation creates a de-novo splice site in an X-linked agammaglobulinemia patient. (United States)

    Chear, Chai Teng; Ripen, Adiratna Mat; Mohamed, Sharifah Adlena Syed; Dhaliwal, Jasbir Singh


    Bruton's tyrosine kinase (BTK), encoded by the BTK gene, is a cytoplasmic protein critical in B cell development. Mutations in the BTK gene cause X-linked agammaglobulinemia (XLA), a primary immunodeficiency with characteristically low or absent B cells and antibodies. This report describes a five-year-old boy who presented with otitis externa, arthritis, reduced immunoglobulins and no B cells. Flow cytometry showed undetectable monocyte BTK expression. Sequencing revealed a novel mutation in exon 13 of the BTK gene which created a de novo splice site with a proximal 5-nucleotide loss resulting in a truncated BTK protein. The patient still suffered from ear infections despite intravenous immunoglobulin replacement therapy. In this study, mosaicism was seen only in the mother's genomic DNA. These results suggest that a combination of flow cytometry and BTK gene analysis is important for XLA diagnosis and carrier screening. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Modulation of 5' splice site selection using tailed oligonucleotides carrying splicing signals

    Directory of Open Access Journals (Sweden)

    Elela Sherif


    Full Text Available Abstract Background We previously described the use of tailed oligonucleotides as a means of reprogramming alternative pre-mRNA splicing in vitro and in vivo. The tailed oligonucleotides that were used interfere with splicing because they contain a portion complementary to sequences immediately upstream of the target 5' splice site combined with a non-hybridizing 5' tail carrying binding sites for the hnRNP A1/A2 proteins. In the present study, we have tested the inhibitory activity of RNA oligonucleotides carrying different tail structures. Results We show that an oligonucleotide with a 5' tail containing the human β-globin branch site sequence inhibits the use of the 5' splice site of Bcl-xL, albeit less efficiently than a tail containing binding sites for the hnRNP A1/A2 proteins. A branch site-containing tail positioned at the 3' end of the oligonucleotide also elicited splicing inhibition but not as efficiently as a 5' tail. The interfering activity of a 3' tail was improved by adding a 5' splice site sequence next to the branch site sequence. A 3' tail carrying a Y-shaped branch structure promoted similar splicing interference. The inclusion of branch site or 5' splice site sequences in the Y-shaped 3' tail further improved splicing inhibition. Conclusion Our in vitro results indicate that a variety of tail architectures can be used to elicit splicing interference at low nanomolar concentrations, thereby broadening the scope and the potential impact of this antisense technology.

  7. Splice site mutations in mismatch repair genes and risk of cancer in the general population

    DEFF Research Database (Denmark)

    Thomsen, Mette; Nordestgaard, Børge G; Tybjærg-Hansen, Anne


    We tested the hypothesis that splice site variations in MSH2 and MLH1 are associated with increased risk of hereditary non-polyposis colorectal cancer (HNPCC) and of cancer in general in the general population. In a cohort of 154 HNPCC patients with sequenced MSH2 and MLH1, we identified four possible splice-site mutations, which we subsequently genotyped in more than 9,000 individuals from the general population. Allele frequencies in the general population were 0% for 942+3A>T in MSH2, 0.05% for 307-19A>G, 0.005% for 1,667+(2-8)del(taaatca);ins(attt), and 4.4% for 1039-8T>A in MLH1. Odds ratios were estimated for HNPCC-related cancers and for all cancers combined in the general population. These findings are novel and important in the counseling of HNPCC patients and their relatives.

  8. A 5' splice site enhances the recruitment of basal transcription initiation factors in vivo

    DEFF Research Database (Denmark)

    Damgaard, Christian Kroun; Kahns, Søren; Lykke-Andersen, Søren


    Transcription and pre-mRNA splicing are interdependent events. Although mechanisms governing the effects of transcription on splicing are becoming increasingly clear, the means by which splicing affects transcription remain elusive. Using cell lines stably expressing HIV-1 or β-globin mRNAs, harboring wild-type or various 5′ splice site mutations, we demonstrate a strong positive correlation between splicing efficiency and transcription activity. Interestingly, a 5′ splice site can stimulate transcription even in the absence of splicing. Chromatin immunoprecipitation experiments show enhanced recruitment of basal transcription initiation factors, suggesting that a promoter-proximal 5′ splice site via its U1 snRNA interaction can feed back to stimulate transcription initiation by enhancing preinitiation complex assembly.

  9. G to A substitution in 5′ donor splice site of introns 18 and 48 of COL1A1 gene of type I collagen results in different splicing alternatives in osteogenesis imperfecta type I cell strains

    Energy Technology Data Exchange (ETDEWEB)

    Willing, M.; Deschenes, S. [Univ. of Iowa, Iowa City, IA (United States)


    We have identified a G to A substitution in the 5′ donor splice site of intron 18 of one COL1A1 allele in two unrelated families with osteogenesis imperfecta (OI) type I. A third OI type I family has a G to A substitution at the identical position in intron 48 of one COL1A1 allele. Both mutations abolish normal splicing and lead to reduced steady-state levels of mRNA from the mutant COL1A1 allele. The intron 18 mutation leads to both exon 18 skipping in the mRNA and to utilization of a single alternative splice site near the 3′ end of exon 18. The latter results in deletion of the last 8 nucleotides of exon 18 from the mRNA, a shift in the translational reading-frame, and the creation of a premature termination codon in exon 19. Of the potential alternative 5′ splice sites in exon 18 and intron 18, the one utilized has a surrounding nucleotide sequence which most closely resembles that of the natural splice site. Although a G to A mutation was detected at the identical position in intron 48 of one COL1A1 allele in another OI type I family, nine complex alternative splicing patterns were identified by sequence analysis of cDNA clones derived from fibroblast mRNA from this cell strain. All result in partial or complete skipping of exon 48, with in-frame deletions of portions of exons 47 and/or 49. The different patterns of RNA splicing were not explained by their sequence homology with naturally occurring 5′ splice sites, but rather by recombination between highly homologous exon sequences, suggesting that we may not have identified the major splicing alternative(s) in this cell strain. Both G to A mutations result in decreased production of type I collagen, the common biochemical correlate of OI type I.

  10. Computational Recognition of RNA Splice Sites by Exact Algorithms for the Quadratic Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Anja Fischer


    Full Text Available One fundamental problem of bioinformatics is the computational recognition of DNA and RNA binding sites. Given a set of short DNA or RNA sequences of equal length such as transcription factor binding sites or RNA splice sites, the task is to learn a pattern from this set that allows the recognition of similar sites in another set of DNA or RNA sequences. Permuted Markov (PM models and permuted variable length Markov (PVLM models are two powerful models for this task, but the problem of finding an optimal PM model or PVLM model is NP-hard. While the problem of finding an optimal PM model or PVLM model of order one is equivalent to the traveling salesman problem (TSP, the problem of finding an optimal PM model or PVLM model of order two is equivalent to the quadratic TSP (QTSP. Several exact algorithms exist for solving the QTSP, but it is unclear if these algorithms are capable of solving QTSP instances resulting from RNA splice sites of at least 150 base pairs in a reasonable time frame. Here, we investigate the performance of three exact algorithms for solving the QTSP for ten datasets of splice acceptor sites and splice donor sites of five different species and find that one of these algorithms is capable of solving QTSP instances of up to 200 base pairs with a running time of less than two days.
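    The flavor of model being optimized can be sketched without the permutation machinery. Below is a hedged illustration of a plain (non-permuted) first-order positional Markov model scored as a log-odds ratio against a uniform background; the paper's PM/PVLM models additionally optimize the order in which positions condition on one another, which is what reduces training to (Q)TSP instances. All names and the toy donor-site data are invented:

```python
import math
from collections import defaultdict

# Hedged sketch: a plain (non-permuted) first-order positional Markov model for
# fixed-length splice-site candidates, scored as a log-odds ratio against a
# uniform background. The paper's PM/PVLM models additionally optimize the
# conditioning order of positions; this sketch uses the natural left-to-right order.

def train(sites, alphabet="ACGT", pseudo=1.0):
    length = len(sites[0])
    counts = [defaultdict(float) for _ in range(length)]
    for s in sites:
        for i, c in enumerate(s):
            prev = s[i - 1] if i else ""
            counts[i][(prev, c)] += 1
    model = []
    for i in range(length):
        probs = {}
        prevs = {""} if i == 0 else set(alphabet)
        for p in prevs:
            total = sum(counts[i][(p, c)] for c in alphabet) + pseudo * len(alphabet)
            for c in alphabet:
                probs[(p, c)] = (counts[i][(p, c)] + pseudo) / total
        model.append(probs)
    return model

def log_odds(model, seq):
    score = 0.0
    for i, c in enumerate(seq):
        prev = seq[i - 1] if i else ""
        score += math.log(model[i][(prev, c)] / 0.25)  # 0.25 = uniform background
    return score

# Toy donor-site training set (9-mers around a GT donor); invented data
donors = ["CAGGTAAGT", "AAGGTGAGT", "CAGGTAAGA", "TGGGTAAGG"]
m = train(donors)
assert log_odds(m, "CAGGTAAGT") > log_odds(m, "TTTTTTTTT")
```

A second-order variant would condition each position on two predecessors; choosing the best position ordering for such a model is exactly the quadratic TSP the exact algorithms solve.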

  11. Whole exome sequencing identifies a novel splice-site mutation in ADAMTS17 in an Indian family with Weill-Marchesani syndrome. (United States)

    Shah, Mohd Hussain; Bhat, Vishwanath; Shetty, Jyoti S; Kumar, Arun


    Weill-Marchesani syndrome (WMS) is a rare connective tissue disorder, characterized by short stature, microspherophakic lens, and stubby hands and feet (brachydactyly). WMS is caused by mutations in the FBN1, ADAMTS10, and LTBP2 genes. Mutations in the LTBP2 and ADAMTS17 genes cause a WMS-like syndrome, in which the affected individuals show major features of WMS but do not display brachydactyly and joint stiffness. The main purpose of our study was to determine the genetic cause of WMS in an Indian family. Whole exome sequencing (WES) was used to identify the genetic cause of WMS in the family. The cosegregation of the mutation was determined with Sanger sequencing. Reverse transcription (RT)-PCR analysis was used to assess the effect of a splice-site mutation on splicing of the ADAMTS17 transcript. The WES analysis identified a homozygous novel splice-site mutation c.873+1G>T in a known WMS-like syndrome gene, ADAMTS17, in the family. RT-PCR analysis in the patient showed that exon 5 was skipped, which resulted in the deletion of 28 amino acids in the ADAMTS17 protein. The mutation in the WMS-like syndrome gene ADAMTS17 also causes WMS in an Indian family. The present study will be helpful in genetic diagnosis of this family and increases the number of mutations of this gene to six.

  12. IVS8+1 DelG, a Novel Splice Site Mutation Causing DFNA5 Deafness in a Chinese Family. (United States)

    Li-Yang, Mei-Na; Shen, Xiao-Fei; Wei, Qin-Jun; Yao, Jun; Lu, Ya-Jie; Cao, Xin; Xing, Guang-Qian


    Nonsyndromic hearing loss (NSHL) is highly heterogeneous; more than 90 causative genes have currently been identified. DFNA5 is one of the deafness genes known to cause autosomal dominant NSHL. To date, only five DFNA5 mutations have been described in eight families worldwide. In this study, we report the identification of a novel pathogenic mutation causing DFNA5 deafness in a five-generation Chinese family. After detailed clinical evaluations of this family, the genomic DNA of three affected individuals was selected for targeted exome sequencing of 101 known deafness genes, as well as mitochondrial DNA and microRNA regions. Co-segregation analysis between the hearing loss and the candidate variant was confirmed in available family members by direct polymerase chain reaction (PCR)-Sanger sequencing. Reverse transcription PCR (RT-PCR) was performed to investigate the potential effect of the pathogenic mutation on messenger RNA splicing. Clinical evaluations revealed a deafness phenotype in this family similar to that of previously reported DFNA5 families with autosomal dominant, late-onset hearing loss. Molecular analysis identified a novel splice site mutation in DFNA5 intron 8 (IVS8+1delG). The mutation segregated with the hearing loss of the family and was absent in 120 unrelated control DNA samples of Chinese origin. RT-PCR showed skipping of exon 8 in the mutant transcript. We identified a novel DFNA5 mutation, IVS8+1delG, in a Chinese family which led to skipping of exon 8. This is the sixth DFNA5 mutation related to hearing loss and the second in DFNA5 intron 8. Our findings provide further support for the hypothesis that DFNA5-associated hearing loss represents a gain-of-function mechanism.

  13. Clinical and genetic studies in a family with a new splice-site mutation in the choroideremia gene. (United States)

    Contestabile, Maria T; Piane, Maria; Cascone, Nikhil C; Pasquale, Nadia; Ciarnella, Angela; Recupero, Santi M; Chessa, Luciana


    To describe the clinical and molecular findings of an Italian family with a new mutation in the choroideremia (CHM) gene. We performed a comprehensive ophthalmologic examination, fundus photography, macular optical coherence tomography, perimetry, electroretinography, and fluorescein angiography in an Italian family. The clinical diagnosis was supported by western blot analysis of lymphoblastoid cell lines from patients with CHM and carriers, using a monoclonal antibody against the 415 C-terminal amino acids of Rab escort protein-1 (REP-1). Sequencing of the CHM gene was undertaken on genomic DNA from affected men and carriers; the RNA transcript was analyzed with reverse transcriptase-PCR. The affected men showed variability in the rate of visual change and in the degree of clinical and functional ophthalmologic involvement, mainly age-related, while the women displayed nonspecific areas of chorioretinal degeneration. Western blot did not show a detectable amount of normal REP-1 protein in affected men who were hemizygous for a novel mutation, c.819+2T>A at the donor splicing site of intron 6 of the CHM gene; the mutation was confirmed in heterozygosity in the carriers. Western blot of the REP-1 protein confirmed the clinical diagnosis, and molecular analysis showed the new in-frame mutation, c.819+2T>A, leading to loss of function of the REP-1 protein. These results emphasize the value of a diagnostic approach that correlates genetic and ophthalmologic data for identifying carriers in families with CHM. An early diagnosis might be crucial for genetic counseling of this type of progressive and still untreatable disease.

  14. Novel aberrant splicings caused by a splice site mutation (IVS1a+5g>a) in F7 gene. (United States)

    Ding, Qiulan; Wu, Wenman; Fu, Qihua; Wang, Xuefeng; Hu, Yiqun; Wang, Hongli; Wang, Zhenyi


    Low FVII coagulant activity (FVII:C 8.2%) and antigen level (FVII:Ag 34.1%) in a 46-year-old Chinese male led to a diagnosis of coagulation factor VII (FVII) deficiency. Compound heterozygous mutations were identified in his F7 gene: a G to A transition in the 5' donor splice site of intron 1a (IVS1a+5g>a) and a T to G transversion at nucleotide position 10961 in exon 8, resulting in a His to Gln substitution at amino acid residue 348. An analysis of ectopic transcripts of F7 in the leukocytes of the patient reveals that the mutation (IVS1a+5g>a) is associated with two novel aberrant patterns of splicing. The predominant alternative transcript removes exon 2, but retains intron 3, which shifts the reading frame and predicts a premature translation termination at nucleotide positions 2-4 in intron 3. The minor alternative transcript skips both exon 2 and exon 3 (FVII Delta 2,3), leading to an in-frame deletion of the propeptide and gamma-carboxylated glutamic acid (Gla) domains of mature FVII protein. In vitro expression studies of the alternative transcript FVII Delta 2,3 by transient transfection of HEK 293 cells with a pcDNA3.1(-) expression vector showed that although the mutant protein could be secreted, no pro-coagulation activity was detected. The coexistence of the two abnormal transcripts and a heterozygous mutation His348Gln explained the patient's phenotype.

  15. Weak negative and positive selection and the drift load at splice sites. (United States)

    Denisov, Stepan V; Bazykin, Georgii A; Sutormin, Roman; Favorov, Alexander V; Mironov, Andrey A; Gelfand, Mikhail S; Kondrashov, Alexey S


    Splice sites (SSs) are short sequences that are crucial for proper mRNA splicing in eukaryotic cells, and therefore can be expected to be shaped by strong selection. Nevertheless, in mammals and in other intron-rich organisms, many of the SSs often involve nonconsensus (Nc), rather than consensus (Cn), nucleotides, and beyond the two critical nucleotides, the SSs are not perfectly conserved between species. Here, we compare the SS sequences between primates, and between Drosophila fruit flies, to reveal the pattern of selection acting at SSs. Cn-to-Nc substitutions are less frequent, and Nc-to-Cn substitutions are more frequent, than neutrally expected, indicating, respectively, negative and positive selection. This selection is relatively weak. At some positions, the positive selection in favor of Nc-to-Cn substitutions is weaker than the negative selection maintaining already established Cn nucleotides; this difference is due to site-specific negative selection favoring current Nc nucleotides. In general, however, the strength of negative selection protecting the Cn alleles is similar in magnitude to the strength of positive selection favoring replacement of Nc alleles, as expected under simple nearly neutral turnover. In summary, although a fraction of the Nc nucleotides within SSs is maintained by selection, the abundance of deleterious nucleotides in this class suggests a substantial genome-wide drift load. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  16. An empirical study of ensemble-based semi-supervised learning approaches for imbalanced splice site datasets. (United States)

    Stanescu, Ana; Caragea, Doina


    Recent biochemical advances have led to inexpensive, time-efficient production of massive volumes of raw genomic data. Traditional machine learning approaches to genome annotation typically rely on large amounts of labeled data. The process of labeling data can be expensive, as it requires domain knowledge and expert involvement. Semi-supervised learning approaches that can make use of unlabeled data, in addition to small amounts of labeled data, can help reduce the costs associated with labeling. In this context, we focus on the problem of predicting splice sites in a genome using semi-supervised learning approaches. This is a challenging problem, due to the highly imbalanced distribution of the data, i.e., small number of splice sites as compared to the number of non-splice sites. To address this challenge, we propose to use ensembles of semi-supervised classifiers, specifically self-training and co-training classifiers. Our experiments on five highly imbalanced splice site datasets, with positive to negative ratios of 1-to-99, showed that the ensemble-based semi-supervised approaches represent a good choice, even when the amount of labeled data consists of less than 1% of all training data. In particular, we found that ensembles of co-training and self-training classifiers that dynamically balance the set of labeled instances during the semi-supervised iterations show improvements over the corresponding supervised ensemble baselines. In the presence of limited amounts of labeled data, ensemble-based semi-supervised approaches can successfully leverage the unlabeled data to enhance supervised ensembles learned from highly imbalanced data distributions. Given that such distributions are common for many biological sequence classification problems, our work can be seen as a stepping stone towards more sophisticated ensemble-based approaches to biological sequence annotation in a semi-supervised framework.
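    The self-training idea with class balancing can be sketched as follows. This is an illustration in the spirit of the study, not the authors' code: the nearest-centroid base learner, all function names, and the toy data are invented, and the confidence heuristic is deliberately simple:

```python
import numpy as np

# Hedged sketch of self-training with class balancing, in the spirit of (but not
# taken from) the study. A base learner is fit on the small labeled set, then
# repeatedly absorbs its most confident predictions on unlabeled data, drawing
# equally from both predicted classes to counter class imbalance.

def fit(X, y):
    return {c: X[y == c].mean(axis=0) for c in (0, 1)}  # class centroids as "model"

def predict_proba(model, X):
    d0 = np.linalg.norm(X - model[0], axis=1)
    d1 = np.linalg.norm(X - model[1], axis=1)
    p1 = d0 / (d0 + d1 + 1e-12)        # nearer centroid 1 => higher p1
    return np.column_stack([1.0 - p1, p1])

def self_train(X_lab, y_lab, X_unlab, rounds=5, per_class=5):
    for _ in range(rounds):
        if len(X_unlab) == 0:
            break
        proba = predict_proba(fit(X_lab, y_lab), X_unlab)
        pred, conf = proba.argmax(axis=1), proba.max(axis=1)
        chosen = []
        for cls in (0, 1):             # balanced draw from each predicted class
            idx = np.where(pred == cls)[0]
            chosen.extend(idx[np.argsort(-conf[idx])][:per_class].tolist())
        if not chosen:
            break
        chosen = np.array(sorted(chosen), dtype=int)
        X_lab = np.vstack([X_lab, X_unlab[chosen]])
        y_lab = np.concatenate([y_lab, pred[chosen]])
        X_unlab = np.delete(X_unlab, chosen, axis=0)
    return fit(X_lab, y_lab)

# Toy demo: 4 labeled points, 76 unlabeled, two well-separated classes
rng = np.random.default_rng(0)
X0 = rng.normal(-2.0, 1.0, size=(40, 2))
X1 = rng.normal(2.0, 1.0, size=(40, 2))
model = self_train(np.vstack([X0[:2], X1[:2]]), np.array([0, 0, 1, 1]),
                   np.vstack([X0[2:], X1[2:]]))
```

An ensemble variant, as in the paper, would train several such learners (e.g. on different feature views or resamples) and combine their votes; co-training additionally exchanges confident predictions between two learners with different views of the data.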

  17. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.


    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage performs pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage presents the statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool will be analyzed

  18. Factor IX Madrid 2: A deletion/insertion in the Factor IX gene which abolishes the sequence of the donor junction at the exon IV-intron d splice site

    Energy Technology Data Exchange (ETDEWEB)

    Solera, J. (Unidades de Genetica Molecular, Madrid (Spain)); Magallon, M.; Martin-Villar, J. (Hemofilia Hospital, Madrid (Spain)); Coloma, A. (Departamento de Bioquimica de la Facultad de Medicina de la Universidad Autonoma, Madrid (Spain))


    DNA from a patient with severe hemophilia B was evaluated by RFLP analysis, producing results which suggested the existence of a partial deletion within the factor IX gene. The deletion was further localized and characterized by PCR amplification and sequencing. The altered allele has a 4,442-bp deletion which removes both the donor splice site located at the 5′ end of intron d and the last two coding nucleotides located at the 3′ end of exon IV in the normal factor IX gene; this fragment has been inserted in inverted orientation. Two homologous sequences have been discovered at the ends of the deleted DNA fragment.

  19. Building energy analysis tool (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars


    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
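    The described pipeline (component library, baseline model, energy conservation measures, recommendations) can be sketched as follows. All class and method names are invented for illustration; the real system models buildings in far greater detail than a single annual-kWh figure:

```python
from dataclasses import dataclass

# Hedged sketch of the described architecture. All class and method names are
# invented; the real system builds detailed spatial/energy models rather than
# the single annual-kWh figure used here.

@dataclass
class BuildingModel:
    components: list          # selected components from the component library
    annual_kwh: float         # modeled annual energy use

class AnalysisEngine:
    def baseline(self, model: BuildingModel) -> float:
        return model.annual_kwh

    def apply_measure(self, model: BuildingModel, savings_fraction: float) -> BuildingModel:
        # applying an energy conservation measure yields an optimized variant
        return BuildingModel(model.components, model.annual_kwh * (1.0 - savings_fraction))

def recommend(engine: AnalysisEngine, model: BuildingModel, measures: dict) -> list:
    base = engine.baseline(model)
    # the recommendation tool keeps measures whose optimized model beats baseline
    return [name for name, frac in measures.items()
            if engine.apply_measure(model, frac).annual_kwh < base]

m = BuildingModel(["wall", "window", "HVAC"], annual_kwh=100_000.0)
assert recommend(AnalysisEngine(), m, {"LED retrofit": 0.08, "no-op": 0.0}) == ["LED retrofit"]
```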

  20. Extended Testability Analysis Tool (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher


    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  1. BAP1 missense mutation c.2054A>T (p.E685V) completely disrupts normal splicing through creation of a novel 5' splice site in a human mesothelioma cell line.

    Directory of Open Access Journals (Sweden)

    Arianne Morrison

    Full Text Available BAP1 is a tumor suppressor gene that is lost or deleted in diverse cancers, including uveal melanoma, malignant pleural mesothelioma (MPM), clear cell renal carcinoma, and cholangiocarcinoma. Recently, BAP1 germline mutations have been reported in families with combinations of these same cancers. A particular challenge for mutation screening is the classification of non-truncating BAP1 sequence variants because it is not known whether these subtle changes can affect the protein function sufficiently to predispose to cancer development. Here we report mRNA splicing analysis on a homozygous substitution mutation, BAP1 c.2054A>T (p.Glu685Val), identified in an MPM cell line derived from a mesothelioma patient. The mutation occurred at the 3rd nucleotide from the 3' end of exon 16. RT-PCR, cloning and subsequent sequencing revealed several aberrant splicing products not observed in the controls: (1) a 4 bp deletion at the end of exon 16 in all clones derived from the major splicing product; the BAP1 c.2054A>T mutation introduced a new 5' splice site (GU), which resulted in the deletion of 4 base pairs and presumably protein truncation; (2) a variety of alternative splicing products that led to retention of different introns: introns 14-16; introns 15-16; intron 14 and intron 16; (3) partial intron 14 and 15 retentions caused by activation of alternative 3' splice acceptor sites (AG) in the introns. Taken together, we were unable to detect any correctly spliced mRNA transcripts in this cell line. These results suggest that aberrant splicing caused by this mutation is quite efficient as it completely abolishes normal splicing through creation of a novel 5' splice site and activation of cryptic splice sites. These data support the conclusion that the BAP1 c.2054A>T (p.E685V) variant is a pathogenic mutation and contributes to MPM through disruption of normal splicing.
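    The splice-gain mechanism reported here (an A>T substitution creating a new GT donor dinucleotide) can be illustrated with a toy scan. This is a sketch, not the authors' pipeline; the function name and example sequences are invented:

```python
# Hedged sketch: report GT dinucleotides gained by a substitution, i.e. novel
# candidate 5' donor sites, as created by BAP1 c.2054A>T (GAG->GTG, p.E685V).
# Function name and example sequences are invented for illustration.

def gained_donor_sites(ref: str, alt: str):
    """0-based positions where the mutant sequence gains a GT dinucleotide."""
    assert len(ref) == len(alt)
    def gt_positions(seq):
        return {i for i in range(len(seq) - 1) if seq[i:i + 2] == "GT"}
    return sorted(gt_positions(alt) - gt_positions(ref))

# An A>T change after a G creates a new candidate donor site:
assert gained_donor_sites("CAGGAAAG", "CAGGTAAG") == [3]
assert gained_donor_sites("ACGT", "ACGT") == []
```

A real splice-gain caller would additionally score the surrounding sequence context and, as in this study, confirm use of the new site by RT-PCR.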

  2. A splice site mutation in laminin-α2 results in a severe muscular dystrophy and growth abnormalities in zebrafish.

    Directory of Open Access Journals (Sweden)

    Vandana A Gupta

    Full Text Available Congenital muscular dystrophy (CMD) is a clinically and genetically heterogeneous group of inherited muscle disorders. In patients, muscle weakness is usually present at or shortly after birth and is progressive in nature. Merosin deficient congenital muscular dystrophy (MDC1A) is a form of CMD caused by a defect in the laminin-α2 gene (LAMA2). Laminin-α2 is an extracellular matrix protein that interacts with the dystrophin-dystroglycan (DGC) complex in membranes, providing stability to muscle fibers. In an N-ethyl-N-nitrosourea mutagenesis screen to develop zebrafish models of neuromuscular diseases, we identified a mutant fish that exhibits severe muscular dystrophy early in development. Genetic mapping identified a splice site mutation in the lama2 gene. This splice site is highly conserved in humans and this mutation results in mis-splicing of RNA and a loss of protein function. Homozygous lama2 mutant zebrafish, designated lama2(cl501/cl501), exhibited reduced motor function and progressive degeneration of skeletal muscles and died at 8-15 days post fertilization. The skeletal muscles exhibited damaged myosepta and detachment of myofibers in the affected fish. Laminin-α2 deficiency also resulted in growth defects in the brain and eye of the mutant fish. This laminin-α2 deficient mutant fish represents a novel disease model to develop therapies for modulating splicing defects in congenital muscular dystrophies and to restore muscle function in human patients with CMD.

  3. Early-onset encephalopathy with epilepsy associated with a novel splice site mutation in SMC1A. (United States)

    Lebrun, Nicolas; Lebon, Sébastien; Jeannet, Pierre-Yves; Jacquemont, Sébastien; Billuart, Pierre; Bienvenu, Thierry


    We report on the clinical and molecular characterization of a female patient with early-onset epileptic encephalopathy, who was found to carry a de novo novel splice site mutation in SMC1A. This girl shared some morphologic and anthropometric traits described in patients with clinical diagnosis of Cornelia de Lange syndrome and with SMC1A mutation but also has severe encephalopathy with early-onset epilepsy. In addition, she had midline hand stereotypies and scoliosis leading to the misdiagnosis of a Rett overlap syndrome. Molecular studies found a novel de novo splice site mutation (c.1911 + 1G > T) in SMC1A. This novel splice mutation was associated with an aberrantly processed mRNA that included intron 11 of the gene. Moreover, quantitative approach by RT-PCR showed a severe reduction of the SMC1A transcript suggesting that this aberrant transcript may be unstable and degraded. Taken together, our data suggest that the phenotype may be due to a loss-of-function of SMC1A in this patient. Our findings suggest that loss-of-function mutations of SMC1A may be associated with early-onset encephalopathy with epilepsy. © 2015 Wiley Periodicals, Inc.

  4. Contamination Analysis Tools (United States)

    Brieda, Lubos


    This talk presents three different tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
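The abstract says only that the simulation code "uses residence time to determine if molecules stick." A common way to model this is an Arrhenius-type mean residence time, tau = tau0 · exp(Ea / (R·T)); the sketch below assumes that form and invented constants purely for illustration, not the tool's actual parameters.

```python
# Hedged sketch of a residence-time sticking test: mean surface
# residence time from an assumed Arrhenius form; a molecule counts as
# stuck when tau exceeds the simulation interval. All constants are
# illustrative, not taken from the contamination simulation code.
import math

R = 8.314  # gas constant, J/(mol*K)

def residence_time(tau0_s, ea_j_mol, temp_k):
    """Mean surface residence time, tau = tau0 * exp(Ea / (R*T))."""
    return tau0_s * math.exp(ea_j_mol / (R * temp_k))

def sticks(tau0_s, ea_j_mol, temp_k, sim_interval_s):
    """Treat a molecule as stuck if it resides longer than one interval."""
    return residence_time(tau0_s, ea_j_mol, temp_k) >= sim_interval_s

print(sticks(1e-13, 1e5, 300.0, 1.0))  # cold surface: long residence, sticks
print(sticks(1e-13, 1e5, 500.0, 1.0))  # hot surface: re-emitted
```

The strong temperature dependence of the exponential is what makes cold surfaces preferential contamination sites in such models.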

  5. Analysis and prediction of gene splice sites in four Aspergillus genomes

    DEFF Research Database (Denmark)

    Wang, Kai; Ussery, David; Brunak, Søren


    Several Aspergillus fungal genomic sequences have been published, with many more in progress. Obviously, it is essential to have high-quality, consistently annotated sets of proteins from each of the genomes, in order to make meaningful comparisons. We have developed a dedicated, publicly available...

  6. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)


    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
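The hybrid dynamic/steady-state cascading loop described above can be sketched in Python. Note that `GridSimulator` and its two methods are hypothetical placeholders standing in for the PSS/E (psspy) simulation calls DCAT actually uses; only the alternating structure of the loop reflects the abstract.

```python
# Hedged sketch of a hybrid cascading-outage loop: alternate a fast
# dynamic phase with a slower steady-state phase until no new elements
# trip. The simulator interface below is invented for illustration.
class GridSimulator:
    def __init__(self, overloaded):
        self._queue = list(overloaded)  # elements that will trip in order

    def run_dynamics(self, events):
        """Fast dynamic phase: apply faults/trips, return new trips."""
        return [self._queue.pop(0)] if self._queue else []

    def solve_steady_state(self):
        """Slower quasi-steady-state phase (e.g. thermal overload trips)."""
        return [self._queue.pop(0)] if self._queue else []

def simulate_cascade(sim, initiating_events, max_rounds=10):
    sequence, pending = [], list(initiating_events)
    for _ in range(max_rounds):
        if not pending:
            break
        sequence.extend(pending)
        tripped = sim.run_dynamics(pending)   # dynamic events
        tripped += sim.solve_steady_state()   # steady-state events
        pending = tripped                     # feed new trips back in
    return sequence

sim = GridSimulator(overloaded=["line-B", "xfmr-C"])
print(simulate_cascade(sim, ["line-A"]))  # -> ['line-A', 'line-B', 'xfmr-C']
```

The design point is that one outage's dynamic response can create steady-state overloads, whose trips in turn seed the next dynamic round; a real implementation would replace the placeholder methods with solver calls and protection-model checks.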

  7. Impairment of alternative splice sites defining a novel gammaretroviral exon within gag modifies the oncogenic properties of Akv murine leukemia virus

    DEFF Research Database (Denmark)

    Sørensen, Annette Balle; Lund, Anders H; Kunder, Sandra


    BACKGROUND: Mutations of an alternative splice donor site located within the gag region have previously been shown to broaden the pathogenic potential of the T-lymphomagenic gammaretrovirus Moloney murine leukemia virus, while the equivalent mutations in the erythroleukemia-inducing Friend murine ... ) and histiocytic sarcoma. Interestingly, a broader spectrum of diagnoses was made from the two single splice-site mutants than from either the wild-type or the double splice-site mutant. Both single- and double-spliced transcripts are produced in vivo using the SA' and/or the SD' sites, but the mechanisms ... to be associated with specific tumor diagnoses or individual viral mutants. CONCLUSION: We present here the first example of a doubly spliced transcript within the group of gammaretroviruses, and we show that mutation of the alternative splice sites that define this novel RNA product changes the oncogenic potential ...

  8. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
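The core FRAT calculation can be illustrated with a small sketch. It follows the general form of the NERC BAL-003-1 frequency response measure, MW of net interchange change per 0.1 Hz of frequency deviation between the pre-event (value A) and settled post-event (value B) points; the numbers and sign convention below are invented for illustration, not taken from the tool.

```python
# Hedged sketch of a frequency response measure: MW change per 0.1 Hz
# of frequency deviation between pre-event (A) and post-event (B)
# values, in the spirit of NERC BAL-003-1. Event values are invented.
def frequency_response(p_a_mw, p_b_mw, f_a_hz, f_b_hz):
    """Return response in MW per 0.1 Hz for one under-frequency event."""
    delta_p = p_a_mw - p_b_mw          # MW change in net actual interchange
    delta_f = (f_a_hz - f_b_hz) * 10   # frequency deviation in 0.1-Hz units
    return delta_p / delta_f

# a 60.000 Hz -> 59.950 Hz event with a 120-MW interchange swing
print(frequency_response(1000.0, 880.0, 60.000, 59.950))  # -> 240.0 MW/0.1 Hz
```

A tool like FRAT would compute this per event from PMU or SCADA traces and then baseline the median across the event database.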

  9. Identification of new splice sites used for generation of rev transcripts in human immunodeficiency virus type 1 subtype C primary isolates.

    Directory of Open Access Journals (Sweden)

    Elena Delgado

    Full Text Available The HIV-1 primary transcript undergoes a complex splicing process by which more than 40 different spliced RNAs are generated. One of the factors contributing to HIV-1 splicing complexity is the multiplicity of 3' splice sites (3'ss) used for generation of rev RNAs, with two 3'ss, A4a and A4b, being most commonly used, a third site, A4c, used less frequently, and two additional sites, A4d and A4e, reported in only two and one isolates, respectively. HIV-1 splicing has been analyzed mostly in subtype B isolates, and data on other group M clades are lacking. Here we examine splice site usage in three primary isolates of subtype C, the most prevalent clade in the HIV-1 pandemic, using an in vitro infection assay of peripheral blood mononuclear cells. Viral spliced RNAs were identified by RT-PCR amplification using a fluorescently labeled primer and software analyses, and by cloning and sequencing the amplified products. The results revealed that splice site usage for generation of rev transcripts in subtype C differs from that reported for subtype B, with most rev RNAs using two previously unreported 3'ss: one located 7 nucleotides upstream of 3'ss A4a, designated A4f, preferentially used by two isolates, and another located 14 nucleotides upstream of 3'ss A4c, designated A4g, preferentially used by the third isolate. A new 5' splice site, designated D2a, was also identified in one virus. Usage of the newly identified splice sites is consistent with sequence features commonly found in subtype C viruses. These results show that splice site usage may differ between HIV-1 subtypes.
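Novel acceptor sites like A4f are described by their offset upstream of an annotated site (7 nt upstream of A4a). A minimal Python sketch of that bookkeeping, using only the intron-terminal AG dinucleotide rule on a toy sequence rather than the study's fluorescent RT-PCR assay:

```python
# Illustrative sketch (not the authors' pipeline): enumerate candidate
# 3' splice sites (AG dinucleotides) in a window upstream of a known
# acceptor, reporting each as an offset in nt upstream of the reference
# junction, the way A4f is described relative to A4a.
def candidate_acceptors(seq, ref_pos, window=20):
    """Return offsets (nt upstream of ref_pos) where an AG dinucleotide
    immediately precedes a potential splice junction (intron ...AG|exon)."""
    hits = []
    for off in range(1, window + 1):
        j = ref_pos - off            # candidate junction position
        if j >= 2 and seq[j - 2:j] == "AG":
            hits.append(off)
    return hits

# toy intron/exon: an extra AG ends 7 nt upstream of the reference junction
seq = "TTTTTTTTTTTTTAGTTTTCAGGAA"
print(candidate_acceptors(seq, ref_pos=22))  # -> [7]
```

Real splice-site discovery would of course score the full acceptor context (polypyrimidine tract, branch point) and require junction-spanning read support, but the offset convention is the same.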

  10. Discovery of candidate disease genes in ENU-induced mouse mutants by large-scale sequencing, including a splice-site mutation in nucleoredoxin.

    Directory of Open Access Journals (Sweden)

    Melissa K Boles


    Full Text Available An accurate and precisely annotated genome assembly is a fundamental requirement for functional genomic analysis. Here, the complete DNA sequence and gene annotation of mouse Chromosome 11 was used to test the efficacy of large-scale sequencing for mutation identification. We re-sequenced the 14,000 annotated exons and boundaries from over 900 genes in 41 recessive mutant mouse lines that were isolated in an N-ethyl-N-nitrosourea (ENU) mutation screen targeted to mouse Chromosome 11. Fifty-nine sequence variants were identified in 55 genes from 31 mutant lines. 39% of the lesions lie in coding sequences and create primarily missense mutations. The other 61% lie in noncoding regions, many of them in highly conserved sequences. A lesion in the perinatal lethal line l11Jus13 alters a consensus splice site of nucleoredoxin (Nxn), inserting 10 amino acids into the resulting protein. We conclude that point mutations can be accurately and sensitively recovered by large-scale sequencing, and that conserved noncoding regions should be included for disease mutation identification. Only seven of the candidate genes we report have been previously targeted by mutation in mice or rats, showing that despite ongoing efforts to functionally annotate genes in the mammalian genome, an enormous gap remains between phenotype and function. Our data show that the classical positional mapping approach of disease mutation identification can be extended to large target regions using high-throughput sequencing.
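The coding/noncoding triage of exon-boundary variants described above can be sketched with a few lines of Python. The coordinates and 2-bp splice-consensus window are illustrative assumptions, not the study's actual annotation pipeline.

```python
# Illustrative sketch (hypothetical coordinates): classify a variant as
# coding, splice-site (within the 2-bp consensus flanking an exon), or
# noncoding - the class of lesion found in Nxn in line l11Jus13.
def classify_variant(pos, exons, splice_window=2):
    """exons: list of (start, end) half-open intervals on the genome."""
    for start, end in exons:
        if start <= pos < end:
            return "coding"
        if start - splice_window <= pos < start or end <= pos < end + splice_window:
            return "splice-site"
    return "noncoding"

exons = [(100, 200), (300, 400)]      # toy two-exon gene
print(classify_variant(150, exons))   # inside an exon -> coding
print(classify_variant(201, exons))   # donor +2 position -> splice-site
print(classify_variant(250, exons))   # deep intron -> noncoding
```

Including the splice windows and conserved noncoding intervals in the re-sequenced target is exactly what lets screens like this one recover non-missense lesions.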

  11. Clinical, in silico, and experimental evidence for pathogenicity of two novel splice site mutations in the SH3TC2 gene

    Czech Academy of Sciences Publication Activity Database

    Laššuthová, P.; Gregor, Martin; Sarnová, Lenka; Machalová, Eliška; Sedláček, Radislav; Seeman, P.


    Vol. 26, 3-4 (2012), pp. 413-420 ISSN 0167-7063 R&D Projects: GA ČR GAP303/10/2044 Institutional support: RVO:68378050 Keywords : exon trapping * peripheral neuropathy * SH3TC2 gene * splice site mutation Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 2.159, year: 2012

  12. Intronic PAH gene mutations cause a splicing defect by a novel mechanism involving U1snRNP binding downstream of the 5' splice site

    DEFF Research Database (Denmark)

    Martínez-Pizarro, Ainhoa; Dembic, Maja; Pérez, Belén


    Phenylketonuria (PKU), one of the most common inherited diseases of amino acid metabolism, is caused by mutations in the phenylalanine hydroxylase (PAH) gene. Recently, PAH exon 11 was identified as a vulnerable exon due to a weak 3' splice site, with different exonic mutations affecting exon 11...

  13. Hurricane Data Analysis Tool (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory


    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL:, to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures) from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high-temporal-resolution (every 30 minutes) dataset not only provides additional background information for TRMM and other satellite missions, but also allows observation of a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  14. Java Radar Analysis Tool (United States)

    Zaczek, Mariusz P.


    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  15. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL


    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total fuel imported into communities to secure all energy services by at least 50% in Alaska's remote microgrids, without increasing system life-cycle costs and while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (renewable energy) power systems can reduce total imported energy usage by 50% while reducing life-cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is drawn from the respective tool websites, the tool developers, and the authors' experience.

  16. A donor splice site mutation in CISD2 generates multiple truncated, non-functional isoforms in Wolfram syndrome type 2 patients. (United States)

    Cattaneo, Monica; La Sala, Lucia; Rondinelli, Maurizio; Errichiello, Edoardo; Zuffardi, Orsetta; Puca, Annibale Alessandro; Genovese, Stefano; Ceriello, Antonio


    Mutations in the gene that encodes CDGSH iron sulfur domain 2 (CISD2) are causative of Wolfram syndrome type 2 (WFS2), a rare autosomal recessive neurodegenerative disorder mainly characterized by diabetes mellitus, optic atrophy, peptic ulcer bleeding and defective platelet aggregation. Four mutations in the CISD2 gene have been reported. Among these mutations, the homozygous c.103 + 1G > A substitution was identified in the donor splice site of intron 1 in two Italian sisters and was predicted to cause exon 1 to be skipped. Here, we employed molecular assays to characterize the c.103 + 1G > A mutation using the patients' peripheral blood mononuclear cells (PBMCs). 5'-RACE coupled with RT-PCR was used to analyse the effect of the c.103 + 1G > A mutation on mRNA splicing. Western blot analysis was used to analyse the consequences of the CISD2 mutation on the encoded protein. We demonstrated that the c.103 + 1G > A mutation functionally impaired mRNA splicing, producing multiple splice variants characterized by the whole or partial absence of exon 1, which introduced amino acid changes and a premature stop. The affected mRNAs were either predicted targets for nonsense-mediated mRNA decay (NMD) or encoded non-functional isoforms. We concluded that the c.103 + 1G > A mutation results in the loss of functional CISD2 protein in the two Italian WFS2 patients.

  17. Transcriptome sequencing reveals potential mechanism of cryptic 3' splice site selection in SF3B1-mutated cancers.

    Directory of Open Access Journals (Sweden)

    Christopher DeBoever


    Full Text Available Mutations in the splicing factor SF3B1 are found in several cancer types and have been associated with various splicing defects. Using transcriptome sequencing data from chronic lymphocytic leukemia, breast cancer and uveal melanoma tumor samples, we show that hundreds of cryptic 3' splice sites (3'SSs) are used in cancers with SF3B1 mutations. We define the necessary sequence context for the observed cryptic 3'SSs and propose that cryptic 3'SS selection is a result of SF3B1 mutations causing a shift in the sterically protected region downstream of the branch point. While most cryptic 3'SSs are present at low frequency (<10% relative to nearby canonical 3'SSs), we identified ten genes that preferred out-of-frame cryptic 3'SSs. We show that cancers with mutations in the SF3B1 HEAT 5-9 repeats use cryptic 3'SSs downstream of the branch point and provide both a mechanistic model consistent with published experimental data and affected targets that will guide further research into the oncogenic effects of SF3B1 mutation.
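The <10% usage threshold mentioned above reduces to a simple ratio of junction-spanning read counts. A minimal sketch with invented counts (the gene names and numbers are placeholders, not the study's data):

```python
# Hedged sketch: per-gene cryptic 3'SS usage as the fraction of
# junction-spanning reads supporting the cryptic site versus the nearby
# canonical site, with the 10% threshold from the abstract. Counts are
# invented for illustration.
def cryptic_usage(cryptic_reads, canonical_reads):
    total = cryptic_reads + canonical_reads
    return cryptic_reads / total if total else 0.0

junctions = {
    "GENE1": (4, 96),    # (cryptic, canonical) junction reads
    "GENE2": (30, 70),
}
for gene, (cry, can) in junctions.items():
    frac = cryptic_usage(cry, can)
    label = "high" if frac >= 0.10 else "low"
    print(gene, round(frac, 2), label)
```

Genes like the ten highlighted in the study would be the ones where this fraction stays high and the cryptic junction shifts the reading frame.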

  18. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  19. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel


    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks that structure the inter-connections between these sites. Including contributions from a range of academic disciplines including Political Science, Media and Communication Studies, Economics, and Computer Science, this study showcases a new methodological approach that has been expressly designed to capture ...

  20. SQSTM1 splice site mutation in distal myopathy with rimmed vacuoles. (United States)

    Bucelli, Robert C; Arhzaouy, Khalid; Pestronk, Alan; Pittman, Sara K; Rojas, Luisa; Sue, Carolyn M; Evilä, Anni; Hackman, Peter; Udd, Bjarne; Harms, Matthew B; Weihl, Conrad C


    To identify the genetic etiology and characterize the clinicopathologic features of a novel distal myopathy. We performed whole-exome sequencing on a family with an autosomal dominant distal myopathy and targeted exome sequencing in 1 patient with sporadic distal myopathy, both with rimmed vacuolar pathology. We also evaluated the pathogenicity of identified mutations using immunohistochemistry, Western blot analysis, and expression studies. Sequencing identified a likely pathogenic c.1165+1 G>A splice donor variant in SQSTM1 in the affected members of 1 family and in an unrelated patient with sporadic distal myopathy. Affected patients had late-onset distal lower extremity weakness, myopathic features on EMG, and muscle pathology demonstrating rimmed vacuoles with both TAR DNA-binding protein 43 and SQSTM1 inclusions. The c.1165+1 G>A SQSTM1 variant results in the expression of 2 alternatively spliced SQSTM1 proteins: 1 lacking the C-terminal PEST2 domain and another lacking the C-terminal ubiquitin-associated (UBA) domain, both of which have distinct patterns of cellular and skeletal muscle localization. SQSTM1 is an autophagic adaptor that shuttles aggregated and ubiquitinated proteins to the autophagosome for degradation via its C-terminal UBA domain. Similar to mutations in VCP, dominantly inherited mutations in SQSTM1 are now associated with rimmed vacuolar myopathy, Paget disease of bone, amyotrophic lateral sclerosis, and frontotemporal dementia. Our data further suggest a pathogenic connection between the disparate phenotypes. © 2015 American Academy of Neurology.

  1. NOAA's Inundation Analysis Tool (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  2. Characterization of the Ryanodine Receptor Gene With a Unique 3'-UTR and Alternative Splice Site From the Oriental Fruit Moth. (United States)

    Sun, L N; Zhang, H J; Quan, L F; Yan, W T; Yue, Q; Li, Y Y; Qiu, G S


    The ryanodine receptor (RyR), the largest calcium channel protein, has been studied because of its key roles in calcium signaling in cells. Insect RyRs are molecular targets for novel diamide insecticides. This target has attracted wide attention because the diamides combine high activity against lepidopterous pests with safety for nontarget organisms. To further our understanding of the effects of diamides on RyR, we cloned the RyR gene from the oriental fruit moth, Grapholita molesta, the most serious pest of stone and pome tree fruits throughout the world, to investigate the modulation of RyR mRNA expression in G. molesta (GmRyR) by diamide insecticides. The full-length cDNA of GmRyR contains a unique 3'-UTR of 625 bp and an open reading frame of 15,402 bp with a predicted protein consisting of 5,133 amino acids. GmRyR possessed a high level of overall amino acid homology with insect and vertebrate isoforms, with 77-92% and 45-47% identity, respectively. Furthermore, five alternative splice sites were identified in GmRyR. Diagnostic PCR showed that the inclusion frequency of one optional exon (f) differed between developmental stages, a pattern found only in GmRyR. The lowest expression level of GmRyR mRNA was in larvae and the highest in male pupae; the relative expression level in male pupae was 25.67 times higher than that in larvae. The expression level of GmRyR in male pupae was 8.70 times higher than in female pupae, and that in male adults was 5.70 times higher than in female adults. © The Author 2016. Published by Oxford University Press on behalf of the Entomological Society of America.
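The reported ORF and protein lengths are internally consistent, which is worth a one-line arithmetic check: a 15,402-bp open reading frame holds 15,402 / 3 = 5,134 codons, i.e. 5,133 amino acids plus the stop codon.

```python
# Sanity check of the reported GmRyR numbers: ORF length in codons and
# in encoded residues (one codon is the stop and encodes no residue).
orf_nt = 15_402
codons = orf_nt // 3
residues = codons - 1  # subtract the stop codon
print(codons, residues)  # -> 5134 5133
```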

  3. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich


    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale Data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  4. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory


    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  5. Characterization of a new 5' splice site within the caprine arthritis encephalitis virus genome: evidence for a novel auxiliary protein

    Directory of Open Access Journals (Sweden)

    Perrin Cécile


    Full Text Available Abstract. Background: Lentiviral genomes encode multiple structural and regulatory proteins. Expression of the full complement of viral proteins is accomplished in part by alternative splicing of the genomic RNA. Caprine arthritis encephalitis virus (CAEV) and maedi-visna virus (MVV) are two highly related small-ruminant lentiviruses (SRLVs) that infect goats and sheep. Their genome seems to be less complex than those of primate lentiviruses, since SRLVs encode only three auxiliary proteins, namely Tat, Rev, and Vif, in addition to the products of the gag, pol, and env genes common to all retroviruses. Here, we investigated the central part of the SRLV genome to identify new splice elements and their relevance in viral mRNA and protein expression. Results: We demonstrated the existence of a new 5' splice donor (SD) site located within the central part of the CAEV genome, 17 nucleotides downstream from the SD site used for the rev mRNA synthesis, and perfectly conserved among SRLV strains. This new SD site was found to be functional in both transfected and infected cells, leading to the production of a transcript containing an open reading frame generated by the splice junction with the 3' splice site used for the rev mRNA synthesis. This open reading frame encodes two major protein isoforms of 18 and 17 kDa, named Rtm, in which the N-terminal domain shared by the Env precursor and Rev proteins is fused to the entire cytoplasmic tail of the transmembrane glycoprotein. Immunoprecipitations using monospecific antibodies provided evidence for the expression of the Rtm isoforms in infected cells. The Rtm protein interacts specifically with the cytoplasmic domain of the transmembrane glycoprotein in vitro, and its expression impairs the fusion activity of the Env protein. Conclusion: The characterization of a novel CAEV protein, named Rtm, which is produced by an additional multiply-spliced mRNA, indicated that the splicing pattern of the CAEV genome is more complex than

  6. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael


    .... This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  7. Mild recessive epidermolytic hyperkeratosis associated with a novel keratin 10 donor splice-site mutation in a family of Norfolk terrier dogs. (United States)

    Credille, K M; Barnhart, K F; Minor, J S; Dunstan, R W


    Epidermolytic hyperkeratosis in humans is caused by dominant-negative mutations in suprabasal epidermal keratins 1 and 10. However, spontaneous keratin mutations have not been confirmed in a species other than human. To describe an autosomal recessive, mild, nonpalmar/plantar epidermolytic ichthyosis segregating in an extended pedigree of Norfolk terrier dogs due to a splice-site mutation in the gene encoding keratin 10 (KRT10). Dogs were evaluated clinically, and skin samples were examined by light and electron microscopy. Genomic DNA samples and cDNA from skin RNA were sequenced and defined a mutation in KRT10. Consequences of the mutation were evaluated by assessing protein expression with immunohistochemistry and Western blotting and gene expression with real-time RT-PCR (reverse transcriptase-polymerase chain reaction). Adult dogs with the disease had generalized, pigmented hyperkeratosis with epidermal fragility. Light microscopic examination defined epidermolysis with hyperkeratosis; ultrastructural changes included a decrease in tonofilaments and abnormal filament aggregation in upper spinous and granular layer keratinocytes. Affected dogs were homozygous for a single base GT-->TT change in the consensus donor splice site of intron 5 in KRT10. Keratin 10 protein was not detected with immunoblotting in affected dogs. Heterozygous dogs were normal based on clinical and histological appearance and keratin 10 protein expression. The mutation caused activation of at least three cryptic or alternative splice sites. Use of the cryptic sites resulted in transcripts containing premature termination codons. One transcript could result in shortening of the proximal portion of the 2B domain before the stutter region. Quantitative real-time PCR indicated a significant decrease in KRT10 mRNA levels in affected dogs compared with wild-type dogs. This disease is the first confirmed spontaneous keratin mutation in a nonhuman species and is the first reported recessive form
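The relative mRNA quantification mentioned above is typically done with the comparative 2^-ΔΔCt (Livak) method; the sketch below assumes that method with invented Ct values, purely to illustrate the calculation, not the study's actual measurements.

```python
# Illustrative 2^-ddCt (Livak) calculation of the kind used to compare
# KRT10 mRNA between affected and wild-type dogs. Ct values are invented.
def fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression of test sample vs. control, normalized to a
    reference gene: 2 ** -((dCt_test) - (dCt_control))."""
    d_ct_test = ct_target_test - ct_ref_test
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(d_ct_test - d_ct_ctrl)

# affected dog: KRT10 Ct 30, reference gene Ct 20
# wild-type dog: KRT10 Ct 25, reference gene Ct 20
print(fold_change(30, 20, 25, 20))  # -> 0.03125, i.e. ~3% of wild-type level
```

A fold change well below 1, as in this toy case, is consistent with cryptic-splice transcripts bearing premature termination codons being degraded by nonsense-mediated decay.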

  8. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas,

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  9. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  10. Homozygosity for the common GAA gene splice site mutation c.-32-13T>G in Pompe disease is associated with the classical adult phenotypical spectrum. (United States)

    Musumeci, Olimpia; Thieme, Andrea; Claeys, Kristl G; Wenninger, Stephan; Kley, Rudolf A; Kuhn, Marius; Lukacs, Zoltan; Deschauer, Marcus; Gaeta, Michele; Toscano, Antonio; Gläser, Dieter; Schoser, Benedikt


    Homozygosity for the common Caucasian splice site mutation c.-32-13T>G in intron 1 of the GAA gene is rather rare in Pompe patients. We report on the clinical, biochemical, morphological, muscle imaging, and genetic findings of six adult Pompe patients from five unrelated families carrying the c.-32-13T>G GAA gene mutation in the homozygous state. All patients had decreased GAA activity and elevated creatine kinase levels. Five patients, aged between 43 and 61 years (median 53 years), initially presented with myalgia, hyperCKaemia, and/or exercise-induced fatigue, with ages of onset ranging from 12 to 55 years. All but one had proximal lower limb weakness combined with axial weakness and moderate respiratory insufficiency; the sixth patient presented with hyperCKaemia only. Muscle biopsies showed PAS-positive vacuolar myopathy with lysosomal changes and reduced GAA activity. Muscle MRI of the lower limbs revealed moderate adipose substitution of the gluteal muscles and biceps femoris and slight fatty infiltration of all thigh muscles. One MRI of the respiratory muscles revealed diaphragmatic atrophy with unilateral diaphragm elevation. Thus, the common Caucasian, so-called mild, splice site mutation c.-32-13T>G in intron 1 of the GAA gene in the homozygous state reflects the full severity spectrum of the adult Pompe disease phenotype. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Correction of a splice-site mutation in the beta-globin gene stimulated by triplex-forming peptide nucleic acids

    DEFF Research Database (Denmark)

    Chin, Joanna Y; Kuan, Jean Y; Lonkar, Pallavi S


    Splice-site mutations in the beta-globin gene can lead to aberrant transcripts and decreased functional beta-globin, causing beta-thalassemia. Triplex-forming DNA oligonucleotides (TFOs) and peptide nucleic acids (PNAs) have been shown to stimulate recombination in reporter gene loci in mammalian cells via site-specific binding and creation of altered helical structures that provoke DNA repair. We have designed a series of triplex-forming PNAs that can specifically bind to sequences in the human beta-globin gene. We demonstrate here that these PNAs, when cotransfected with recombinatory donor DNA fragments, can promote single base-pair modification at the start of the second intron of the beta-globin gene, the site of a common thalassemia-associated mutation. This single base pair change was detected by the restoration of proper splicing of transcripts produced from a green fluorescent...

  12. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.


    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters through a user-friendly GUI. The second part of the reload safety analysis calculations is handled by CycleKit, a code linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable HTML format. Using this set of tools for reload safety analysis simplifies
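    The final stage described above, limit verification with uncertainties applied, can be sketched as follows. This is a minimal illustration only; the parameter names, uncertainty factors and limits are invented and are not those of QUADRIGA, CycleKit or ANDREA:

```python
# Hypothetical reload-safety results: multiply each computed value by its
# uncertainty factor, then compare against the applicable limit.
results = {"peak_pin_power": 1.82, "max_fuel_temp_C": 1650.0}
uncertainty = {"peak_pin_power": 1.05, "max_fuel_temp_C": 1.08}
limits = {"peak_pin_power": 2.00, "max_fuel_temp_C": 1800.0}

def verify_limits(results, uncertainty, limits):
    """Return {parameter: bool} -- True if the uncertainty-adjusted
    value stays within its limit."""
    return {k: results[k] * uncertainty[k] <= limits[k] for k in results}

print(verify_limits(results, uncertainty, limits))
```

A real tool would also report margins and flag which calculation produced each value; here the dictionary of booleans stands in for that report.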

  13. Climate Data Analysis Tools - (CDAT) (United States)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.


    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as the Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS), together with ANL's Globus middleware...

  14. Integrated Radiation Analysis and Design Tools (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  15. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.


    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  16. Sustainability Tools Inventory - Initial Gaps Analysis | Science ... (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  17. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.


    Full Text Available During recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.



    Bhavesh Patel; Jyotindra Dharwa


    SPSS is a statistical analysis tool used for analyzing large volumes of available data and extracting useful information and knowledge to support major decision-making processes. SPSS can be applied in the educational sector to improve student performance by identifying the parameters that most strongly affect it. This research study was carried out by collecting student performance parameters and the related dataset. In this research study we have col...
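    The core of the analysis described above, ranking candidate parameters by how strongly they track student performance, can be sketched in plain Python. The data and parameter names below are invented for illustration; SPSS itself would do this via its correlation procedures:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical student records: candidate parameters vs. final score.
attendance = [60, 75, 80, 90, 95]
study_hours = [2, 3, 1, 5, 6]
final_score = [55, 65, 60, 85, 92]

# Rank parameters by absolute correlation with the final score.
params = {"attendance": attendance, "study_hours": study_hours}
ranked = sorted(params, key=lambda p: abs(pearson(params[p], final_score)),
                reverse=True)
print(ranked[0])  # the parameter most strongly associated with the score
```

With these toy numbers, study hours correlate slightly more strongly with the final score than attendance does, so it ranks first.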

  19. Gene trap mutagenesis of hnRNP A2/B1: a cryptic 3' splice site in the neomycin resistance gene allows continued expression of the disrupted cellular gene

    Directory of Open Access Journals (Sweden)

    DeGregori James V


    Full Text Available Abstract Background Tagged sequence mutagenesis is a process for constructing libraries of sequenced insertion mutations in embryonic stem cells that can be transmitted into the mouse germline. To better predict the functional consequences of gene entrapment on cellular gene expression, the present study characterized the effects of a U3Neo gene trap retrovirus inserted into an intron of the hnRNP A2/B1 gene. The mutation was selected for analysis because it occurred in a highly expressed gene and yet did not produce obvious phenotypes following germline transmission. Results Sequences flanking the integrated gene trap vector in 1B4 cells were used to isolate a full-length cDNA whose predicted amino acid sequence is identical to the human A2 protein at all but one of 341 amino acid residues. hnRNP A2/B1 transcripts extending into the provirus utilize a cryptic 3' splice site located 28 nucleotides downstream of the neomycin phosphotransferase start codon. The inserted Neo sequence and proviral poly(A) site function as a 3' terminal exon that is utilized to produce hnRNP A2/B1-Neo fusion transcripts, or skipped to produce wild-type hnRNP A2/B1 transcripts. This results in only a modest disruption of hnRNP A2/B1 gene expression. Conclusions Expression of the occupied hnRNP A2/B1 gene and utilization of the viral poly(A) site are consistent with an exon definition model of pre-mRNA splicing. These results reveal a mechanism by which U3 gene trap vectors can be expressed without disrupting cellular gene expression, thus suggesting ways to improve these vectors for gene trap mutagenesis.
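    Locating a candidate cryptic 3' splice site like the one described above comes down to finding AG dinucleotides downstream of a reference position. The sketch below uses an invented toy sequence (not the actual neo gene) and checks only the invariant AG; a real predictor would also score the polypyrimidine tract and branch point:

```python
def candidate_acceptors(seq, start_codon_pos):
    """Return offsets (relative to the start codon) of AG dinucleotides
    downstream of the start codon -- candidate cryptic 3' splice sites.
    Only the invariant AG is checked; context scoring is omitted."""
    offsets = []
    for i in range(start_codon_pos + 3, len(seq) - 1):
        if seq[i:i + 2] == "AG":
            offsets.append(i - start_codon_pos)
    return offsets

# Toy sequence (hypothetical): ATG followed by an AG at offset 28 from
# the start codon, mimicking the cryptic site reported in the abstract.
toy = "ATG" + "C" * 25 + "AG" + "T" * 10
print(candidate_acceptors(toy, 0))  # → [28]
```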

  20. Exome Sequencing Identifies a Novel LMNA Splice-Site Mutation and Multigenic Heterozygosity of Potential Modifiers in a Family with Sick Sinus Syndrome, Dilated Cardiomyopathy, and Sudden Cardiac Death.

    Directory of Open Access Journals (Sweden)

    Michael V Zaragoza

    Full Text Available The goals are to understand the primary genetic mechanisms that cause Sick Sinus Syndrome and to identify potential modifiers that may result in intrafamilial variability within a multigenerational family. The proband is a 63-year-old male with a family history of individuals (>10) with sinus node dysfunction, ventricular arrhythmia, cardiomyopathy, heart failure, and sudden death. We used exome sequencing of a single individual to identify a novel LMNA mutation and demonstrated the importance of Sanger validation and family studies when evaluating candidates. After initial single-gene studies were negative, we conducted exome sequencing for the proband, which produced 9 gigabases of sequencing data. Bioinformatics analysis showed 94% of the reads mapped to the reference and identified 128,563 unique variants with 108,795 (85%) located in 16,319 genes of 19,056 target genes. We discovered multiple variants in known arrhythmia, cardiomyopathy, or ion channel associated genes that may serve as potential modifiers in disease expression. To identify candidate mutations, we focused on ~2,000 variants located in 237 genes of 283 known arrhythmia, cardiomyopathy, or ion channel associated genes. We filtered the candidates to 41 variants in 33 genes using zygosity, protein impact, database searches, and clinical association. Only 21 of 41 (51%) variants were validated by Sanger sequencing. We selected nine confirmed variants with minor allele frequencies G, a novel heterozygous splice-site mutation as the primary mutation with rare or novel variants in HCN4, MYBPC3, PKP4, TMPO, TTN, DMPK and KCNJ10 as potential modifiers and a mechanism consistent with haploinsufficiency.
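    The candidate-filtering step described above (zygosity, protein impact, allele frequency) can be sketched as a simple filter over variant records. The records, field names and thresholds below are illustrative stand-ins, not the study's actual pipeline:

```python
# Hypothetical variant records with the three filterable attributes.
variants = [
    {"gene": "LMNA",  "zygosity": "het", "impact": "splice_site", "maf": 0.0000},
    {"gene": "TTN",   "zygosity": "het", "impact": "missense",    "maf": 0.0004},
    {"gene": "SCN5A", "zygosity": "het", "impact": "synonymous",  "maf": 0.0100},
    {"gene": "MYH7",  "zygosity": "hom", "impact": "missense",    "maf": 0.2000},
]

DAMAGING = {"splice_site", "missense", "frameshift", "nonsense"}

def filter_candidates(records, max_maf=0.01):
    """Keep heterozygous, potentially damaging, rare variants --
    a simplified stand-in for the zygosity/impact/frequency filters."""
    return [v for v in records
            if v["zygosity"] == "het"
            and v["impact"] in DAMAGING
            and v["maf"] < max_maf]

print([v["gene"] for v in filter_candidates(variants)])  # → ['LMNA', 'TTN']
```

Database and clinical-association filters would then prune this list further, as the abstract describes.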



    Österlind, Magnus


    The Enterprise Architecture Analysis Tool, EAAT, is a software tool developed by the department of Industrial Information- and Control systems, ICS, at the Royal Institute of Technology, Stockholm, Sweden. EAAT is a modeling tool that combines Enterprise Architecture (EA) modeling with probabilistic relational modeling. Therefore EAAT makes it possible to design, describe and analyze the organizational structure, business processes, information systems and infrastructure within an enterprise....

  2. A novel splice site mutation in the dentin sialophosphoprotein gene in a Chinese family with dentinogenesis imperfecta type II

    International Nuclear Information System (INIS)

    Wang Haoyang; Hou Yanning; Cui Yingxia; Huang Yufeng; Shi Yichao; Xia Xinyi; Lu Hongyong; Wang Yunhua; Li Xiaojun


    Twenty-four individuals spanning six generations of a Chinese family affected with an apparently autosomal dominant form of dentinogenesis imperfecta type II (DGI-II, OMIM 125490) were investigated. All affected individuals presented with typical clinical and radiographic features of DGI-II, but without bilateral progressive high-frequency sensorineural hearing loss. To identify the mutated molecule, a positional candidate approach was used to determine the mutated gene in this family. Genomic DNA was obtained from 24 affected individuals, 18 unaffected relatives of the family and 50 controls. Haplotype analysis was performed using leukocyte DNA for 6 short tandem repeat (STR) markers present on chromosome 4 (D4S1534, GATA62A11, DSPP, DMP1, SPP1 and D4S1563). In the critical region between D4S1534 and DMP1, the dentin sialophosphoprotein (DSPP) gene (OMIM *125485) was considered the strongest candidate gene. The first four exons and exon/intron boundaries of the gene were analyzed using DNA from 24 affected individuals and 18 unaffected relatives of the same family. DNA sequencing revealed a heterozygous deletion mutation in intron 2 (at positions -3 to -25), which resulted in a frameshift mutation that changed the acceptor site sequence from CAG to AAG (IVS2-3C→A) and may also have disrupted the branch point consensus sequence in intron 2. The mutation was found in the 24 affected individuals, but not in the 18 unaffected relatives and 50 controls. The deletion was identified by allele-specific sequencing and denaturing high-performance liquid chromatography (DHPLC) analysis. We conclude that the heterozygous deletion mutation contributed to the pathogenesis of DGI-II
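    The reported CAG→AAG change at the acceptor site can be illustrated with a trivial consensus check. The flanking bases below are invented; only the terminal three bases (and the invariant AG) matter for this sketch:

```python
def acceptor_is_intact(intron_seq, consensus_end="CAG"):
    """Check the last three intronic bases against the reported wild-type
    acceptor sequence; the terminal AG is the invariant part of a
    3' splice acceptor."""
    return intron_seq[-3:] == consensus_end

# Toy intron ends (hypothetical flanking bases). The IVS2-3C>A change
# turns the acceptor CAG into AAG, as described for this family.
wt_end, mut_end = "TTTTCCAG", "TTTTCAAG"
print(acceptor_is_intact(wt_end), acceptor_is_intact(mut_end))  # → True False
```

Note the mutant still ends in AG; the pathogenic effect reported here comes from the altered -3 position and the disrupted branch-point region, which a three-base check like this can flag but not fully explain.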

  3. Integrating Reliability Analysis with a Performance Tool (United States)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael


    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  4. A spontaneous Fatp4/Slc27a4 splice site mutation in a new murine model for congenital ichthyosis. (United States)

    Tao, Jianning; Koster, Maranke I; Harrison, Wilbur; Moran, Jennifer L; Beier, David R; Roop, Dennis R; Overbeek, Paul A


    Congenital ichthyoses are life-threatening conditions in humans. We describe here the identification and molecular characterization of a novel recessive mutation in mice that results in newborn lethality with severe congenital lamellar ichthyosis. Mutant newborns have a taut, shiny, non-expandable epidermis that resembles cornified manifestations of autosomal-recessive congenital ichthyosis in humans. The skin is stretched so tightly that the newborn mice are immobilized. The genetic defect was mapped to a region near the proximal end of chromosome 2 by SNP analysis, suggesting Fatp4/Slc27a4 as a candidate gene. FATP4 mutations in humans cause ichthyosis prematurity syndrome (IPS), and mutations of Fatp4 in mice have previously been found to cause a phenotype that resembles human congenital ichthyoses. Characterization of the Fatp4 cDNA revealed a fusion of exon 8 to exon 10, with deletion of exon 9. Genomic sequencing identified an A to T mutation in the splice donor sequence at the 3'-end of exon 9. Loss of exon 9 results in a frame shift mutation upstream from the conserved very long-chain acyl-CoA synthase (VLACS) domain. Histological studies revealed that the mutant mice have defects in keratinocyte differentiation, along with hyperproliferation of the stratum basale of the epidermis, a hyperkeratotic stratum corneum, and reduced numbers of secondary hair follicles. Since Fatp4 protein is present primarily at the stratum granulosum and the stratum spinosum, the hyperproliferation and the alterations in hair follicle induction suggest that very long chain fatty acids, in addition to being required for normal cornification, may influence signals from the stratum corneum to the basal cells that help to orchestrate normal skin differentiation.
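    The frameshift logic behind the exon 9 deletion described above is simple arithmetic: skipping an internal exon shifts the downstream reading frame unless the exon length is a multiple of 3. The exon lengths below are illustrative values, not the real Fatp4 exons:

```python
def skipping_causes_frameshift(exon_len):
    """Skipping an internal exon shifts the downstream reading frame
    iff the exon length is not a multiple of 3."""
    return exon_len % 3 != 0

# Hypothetical exon lengths (illustrative, not the actual Fatp4 gene).
exons = {8: 120, 9: 100, 10: 87}
print(skipping_causes_frameshift(exons[9]))  # 100 % 3 == 1 → True (frameshift)
print(skipping_causes_frameshift(exons[8]))  # 120 % 3 == 0 → False (in frame)
```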

  5. Quick Spacecraft Thermal Analysis Tool, Phase II (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  6. Characterization of the Ryanodine Receptor Gene With a Unique 3′-UTR and Alternative Splice Site From the Oriental Fruit Moth (United States)

    Sun, L. N.; Zhang, H. J.; Quan, L. F.; Yan, W. T.; Yue, Q.; Li, Y. Y.; Qiu, G. S.


    The ryanodine receptor (RyR), the largest calcium channel protein, has been studied because of its key roles in calcium signaling in cells. Insect RyRs are molecular targets for novel diamide insecticides. This target has attracted wide attention because the diamides combine high activity against lepidopterous pests with safety for nontarget organisms. To improve our understanding of the effects of diamides on RyR, we cloned the RyR gene from the oriental fruit moth, Grapholita molesta, the most serious pest of stone and pome tree fruits throughout the world, to investigate the modulation of RyR mRNA expression in G. molesta (GmRyR) by diamide insecticides. The full-length cDNA of GmRyR contains a unique 3′-UTR of 625 bp and an open reading frame of 15,402 bp encoding a predicted protein of 5,133 amino acids. GmRyR possesses a high level of overall amino acid identity with insect and vertebrate isoforms, at 77-92% and 45-47%, respectively. Furthermore, five alternative splice sites were identified in GmRyR. Diagnostic PCR showed that the inclusion frequency of one optional exon (f) differed between developmental stages, a pattern found only in GmRyR. The lowest expression level of GmRyR mRNA was in larvae and the highest in male pupae; the relative expression level in male pupae was 25.67 times higher than in larvae. The expression level of GmRyR in male pupae was 8.70 times higher than in female pupae, and that in male adults was 5.70 times higher than in female adults. PMID:28076278

  7. A unique LAMB3 splice-site mutation with founder effect from the Balkans causes lethal epidermolysis bullosa in several European countries. (United States)

    Mayer, B; Silló, P; Mazán, M; Pintér, D; Medvecz, M; Has, C; Castiglia, D; Petit, F; Charlesworth, A; Hatvani, Zs; Pamjav, H; Kárpáti, S


    We have encountered repeated cases of recessive lethal generalized severe (Herlitz-type) junctional epidermolysis bullosa (JEB gen sev) in infants born to Hungarian Roma parents residing in a small region of Hungary. To identify the disease-causing mutation and to investigate the genetic background of its unique carrier group. The LAMB3 gene was analysed in peripheral-blood genomic DNA samples, and the pathological consequences of the lethal defect were confirmed by cutaneous LAMB3 cDNA sequencing. A median joining haplotype network within the Y chromosome H1a-M82 haplogroup of individuals from the community was constructed, and LAMB3 single-nucleotide polymorphism (SNP) patterns were also determined. An unconventional intronic splice-site mutation (LAMB3, c.1133-22G>A) was identified. Thirty of 64 voluntarily screened Roma from the closed community carried the mutation, but none of the 306 Roma from other regions of the country did. The age of the mutation was estimated to be 548 ± 222 years. Within the last year, more patients with JEB gen sev carrying the same unusual mutation have been identified in three unrelated families, all immigrants from the Balkans. Two were compound heterozygous newborns, in Germany and Italy, and one homozygous newborn died in France. Only the French family recognized their Roma background. LAMB3 SNP haplotyping confirmed the link between the apparently unrelated Hungarian, German and Italian male cases, but could not verify the same background in the female newborn from France. The estimated age of the mutation corresponds to the time period when Roma were wandering in the Balkans. © 2016 British Association of Dermatologists.

  8. Paediatric Automatic Phonological Analysis Tools (APAT). (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T


    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE produced by 24 children with phonological delay or phonological disorder was recorded, transcribed, and then inserted into the APAT. Reliability and validity of APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis, which allowed the analysis of different corpora.
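    The inter- and intrajudge reliability figures quoted above are simple percent-agreement computations, which can be sketched as follows. The transcriptions are invented examples, not APAT data:

```python
def percent_agreement(judge_a, judge_b):
    """Simple inter-judge agreement: the share of items transcribed
    identically by both judges, as a percentage."""
    if len(judge_a) != len(judge_b):
        raise ValueError("judges must rate the same items")
    matches = sum(a == b for a, b in zip(judge_a, judge_b))
    return 100.0 * matches / len(judge_a)

# Hypothetical phonetic transcriptions of the same 8 items by two judges.
a = ["p", "b", "t", "d", "k", "g", "s", "z"]
b = ["p", "b", "t", "d", "k", "g", "s", "s"]
print(percent_agreement(a, b))  # 7 of 8 items agree → 87.5
```

Agreement above 97%, as reported for APAT, would correspond to judges differing on fewer than 3 items in 100.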

  9. Photogrammetry Tool for Forensic Analysis (United States)

    Lane, John


    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
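    Once each cube's pose is known in a common frame, merging a point measured in one cube's local system into the global system is a rigid transform. The sketch below uses a 2D transform for brevity (the real procedure solves the analogous 3D problem); the cube pose and point are invented:

```python
import math

def to_global(point, frame_origin, frame_angle):
    """Express a point known in a cube's local frame in the global frame,
    given the cube's origin and rotation angle (2D rigid transform)."""
    x, y = point
    c, s = math.cos(frame_angle), math.sin(frame_angle)
    return (frame_origin[0] + c * x - s * y,
            frame_origin[1] + s * x + c * y)

# Hypothetical: cube 2 sits at (10, 0) in cube 1's system, rotated 90 deg.
# A point at (1, 0) in cube 2's local frame lands at (10, 1) globally.
p = to_global((1.0, 0.0), (10.0, 0.0), math.pi / 2)
print(round(p[0], 6), round(p[1], 6))  # → 10.0 1.0
```

Steps 1-3 of the procedure supply the pairwise origins and rotations; step 4 is repeated application of transforms like this one.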

  10. Surface analysis of stone and bone tools (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.


    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed, as are the future directions in the study of microwear on stone and bone tools.

  11. A short in-frame deletion in NTRK1 tyrosine kinase domain caused by a novel splice site mutation in a patient with congenital insensitivity to pain with anhidrosis

    Directory of Open Access Journals (Sweden)

    Arístegui Javier


    Full Text Available Abstract Background Congenital insensitivity to pain with anhidrosis (CIPA is a rare autosomal recessive genetic disease characterized by the lack of reaction to noxious stimuli and anhidrosis. It is caused by mutations in the NTRK1 gene, which encodes the high affinity tyrosine kinase receptor I for Neurotrophic Growth Factor (NGF). Case Presentation We present the case of a female patient diagnosed with CIPA at the age of 8 months. The patient is currently 6 years old and her psychomotor development conforms to her age (MRI, SPECT and psychological study are in the range of normality). PCR amplification of DNA, followed by direct sequencing, was used to investigate the presence of NTRK1 gene mutations. Reverse transcriptase (RT)-PCR amplification of RNA, followed by cloning and sequencing of isolated RT-PCR products, was used to characterize the effect of the mutations on NTRK1 mRNA splicing. The clinical diagnosis of CIPA was confirmed by the detection of two splice-site mutations in NTRK1, revealing that the patient was a compound heterozygote at this gene. One of these alterations, c.574+1G>A, is located at the splice donor site of intron 5. We also found a second mutation, c.2206-2 A>G, not previously reported in the literature, which is located at the splice acceptor site of intron 16. Each parent was confirmed to be a carrier for one of the mutations by DNA sequencing analysis. It has been proposed that the c.574+1G>A mutation would cause exon 5 skipping during NTRK1 mRNA splicing. We could confirm this prediction and, more importantly, we provide evidence that the novel c.2206-2A>G mutation also disrupts normal NTRK1 splicing, leading to the use of an alternative splice acceptor site within exon 17. As a consequence, this mutation would result in the production of a mutant NTRK1 protein with a seven-amino-acid in-frame deletion in its tyrosine kinase domain. Conclusions We present the first description of a CIPA-associated NTRK1 mutation

  12. Splice-site mutations cause Rrp6-mediated nuclear retention of the unspliced RNAs and transcriptional down-regulation of the splicing-defective genes.

    Directory of Open Access Journals (Sweden)

    Andrea B Eberle

    Full Text Available BACKGROUND: Eukaryotic cells have developed surveillance mechanisms to prevent the expression of aberrant transcripts. An early surveillance checkpoint acts at the transcription site and prevents the release of mRNAs that carry processing defects. The exosome subunit Rrp6 is required for this checkpoint in Saccharomyces cerevisiae, but it is not known whether Rrp6 also plays a role in mRNA surveillance in higher eukaryotes. METHODOLOGY/PRINCIPAL FINDINGS: We have developed an in vivo system to study nuclear mRNA surveillance in Drosophila melanogaster. We have produced S2 cells that express a human beta-globin gene with mutated splice sites in intron 2 (mut beta-globin). The transcripts encoded by the mut beta-globin gene are normally spliced at intron 1 but retain intron 2. The levels of the mut beta-globin transcripts are much lower than those of wild-type (wt) beta-globin mRNAs transcribed from the same promoter. We have compared the expression of the mut and wt beta-globin genes to investigate the mechanisms that down-regulate the production of defective mRNAs. Both wt and mut beta-globin transcripts are processed at the 3' end, but the mut beta-globin transcripts are less efficiently cleaved than the wt transcripts. Moreover, the mut beta-globin transcripts are less efficiently released from the transcription site, as shown by FISH, and this defect is reversed by depletion of Rrp6 by RNAi. Furthermore, transcription of the mut beta-globin gene is significantly impaired, as revealed by ChIP experiments that measure the association of RNA polymerase II with the transcribed genes. We have also shown that the mut beta-globin gene shows reduced levels of H3K4me3. CONCLUSIONS/SIGNIFICANCE: Our results show that there are at least two surveillance responses that operate cotranscriptionally in insect cells and probably in all metazoans. One response requires Rrp6 and results in the inefficient release of defective mRNAs from the transcription site.

  13. Performance analysis of GYRO: a tool evaluation

    International Nuclear Information System (INIS)

    Worley, P; Candy, J; Carrington, L; Huck, K; Kaiser, T; Mahinthakumar, G; Malony, A; Moore, S; Reed, D; Roth, P; Shan, H; Shende, S; Snavely, A; Sreepathi, S; Wolf, F; Zhang, Y


    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses

  14. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David


    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides the concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
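The least squares estimation the book is built around can be sketched in a few lines. The data points below are made up for illustration (roughly y = 2x + 1 with small noise); the slope and intercept follow the standard closed-form formulas.

```python
# Ordinary least-squares fit of a straight line using the closed-form
# estimators: slope = sum((xi - x̄)(yi - ȳ)) / sum((xi - x̄)^2).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [3.1, 4.8, 7.05, 9.0, 10.9]   # hypothetical observations, ~ y = 2x + 1

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x   # the fitted line passes through (x̄, ȳ)

print(round(slope, 2), round(intercept, 2))  # 1.98 1.03
```

For multiple regressors the same idea generalizes to solving the normal equations, which is what statistical packages do internally.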

  15. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.


    Work is in progress on interactive tools for linear and non-linear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  16. Rapid Benefit Indicators (RBI) Spatial Analysis Tools (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  18. KAFE - A Flexible Image Analysis Tool (United States)

    Burkutean, Sandra


    We present KAFE, the Keywords of Astronomical FITS-Images Explorer - a web-based FITS images post-processing analysis tool designed to be applicable in the radio to sub-mm wavelength domain. KAFE was developed to enrich selected FITS files with metadata based on a uniform image analysis approach as well as to provide advanced image diagnostic plots. It is ideally suited for data mining purposes and multi-wavelength/multi-instrument data samples that require uniform data diagnostic criteria.

  19. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.


    AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  20. Decision Analysis Tools for Volcano Observatories (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.


    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
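The multiple-branch event-tree architecture described above reduces to a simple computation: each root-to-leaf path multiplies its branch probabilities, and the expected loss sums path probability times the loss at the leaf. The tree, probabilities and loss values below are entirely hypothetical, chosen only to show the mechanics.

```python
# Toy event tree: a node is (label, children) with children as (prob, node)
# pairs, or (label, loss) at a leaf. All numbers are invented for illustration.
tree = ("eruption?", [
    (0.9, ("no eruption", 0)),            # leaf: no loss
    (0.1, ("eruption style?", [
        (0.7, ("effusive", 10)),          # leaf: loss in arbitrary units
        (0.3, ("explosive", 200)),
    ])),
])

def expected_loss(node, p=1.0):
    label, sub = node
    if isinstance(sub, list):             # internal node: recurse into branches
        return sum(expected_loss(child, p * q) for q, child in sub)
    return p * sub                        # leaf: path probability times loss

print(round(expected_loss(tree), 2))  # 0.9*0 + 0.1*0.7*10 + 0.1*0.3*200 = 6.7
```

In a real decision-support tool the branch probabilities would come from expert elicitation or monitoring data rather than fixed constants.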

  1. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan


    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to coding the content of the sampled chapters of the books inductively. The findings from this coding process were then combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we put forward is for systematic and well-theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines.

  2. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste


    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters, in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive on the front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters, and has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.

  3. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.


    BBAT was written to meet the need for an interactive graphical tool to explore longitudinal phase space, and is designed for quickly testing new ideas or tricks. It is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for data analysis. The heart of the package is a set of C routines that do the number crunching. The graphics part is built with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single RF system. For a double RF system, one can use Dr. BBAT, which stands for Double RF Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat-bunch creation

  4. Enhancement of Local Climate Analysis Tool (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.


    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  5. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter


    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... and the value of associated rewards in states of interest for a real-world example from a case company in the Danish baked goods industry. The developments are presented in a generalised fashion to make them relevant to the general problem of implementing quantitative probabilistic model checking of graph...

  6. SEAT: A strategic engagement analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.; Michelsen, C.; Morgeson, D.


    The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

  7. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do


    During the diagnostic process of various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' Several decision-making tools are suggested to help clinicians make proper judgements. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concepts of the ORAD program
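The Bayesian step the abstract refers to combines a test's sensitivity and specificity with the pre-test (prior) probability of disease to give the post-test probability. The numbers below are hypothetical test characteristics, not values from the article.

```python
def positive_predictive_value(sens: float, spec: float, prior: float) -> float:
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sens * prior                 # P(T+ and D+)
    false_pos = (1 - spec) * (1 - prior)    # P(T+ and D-)
    return true_pos / (true_pos + false_pos)

# Hypothetical test: 90% sensitivity, 80% specificity, 10% disease prevalence.
print(round(positive_predictive_value(0.90, 0.80, 0.10), 3))  # 0.333
```

Even a fairly accurate test yields a modest post-test probability when the prior is low, which is exactly why the decision matrix and ROC analysis are taught alongside Bayes' theorem.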

  8. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota


    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises which are not yet convinced to invest in setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, yet many do nothing to reduce them. A long setup is not, by itself, a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to expose the underlying problems. The methodology proposed can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; there, the goal is to convince management to begin actions concerning setup improvement. The last three steps are related to a particular setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis, FMEA and others were used.
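The Pareto analysis mentioned above amounts to ranking changeover-delay causes and keeping the smallest set that accounts for roughly 80% of the total. The causes and minute counts below are invented for illustration.

```python
# Minimal Pareto analysis of setup-delay causes (hypothetical data):
# sort causes by minutes lost, accumulate until ~80% of the total is covered.
delays = {                      # cause -> minutes lost per month (made up)
    "searching for tools": 310,
    "waiting for crane": 240,
    "missing fixtures": 180,
    "cleaning": 90,
    "paperwork": 60,
    "other": 40,
}

total = sum(delays.values())
vital_few, cumulative = [], 0
for cause, minutes in sorted(delays.items(), key=lambda kv: kv[1], reverse=True):
    vital_few.append(cause)
    cumulative += minutes
    if cumulative / total >= 0.8:
        break

print(vital_few)  # the "vital few" causes to attack first with SMED
```

The resulting short list is what makes the case to management concrete: a few causes dominate the lost time, so targeted SMED actions on those causes pay off first.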

  9. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.


    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful, since standardised procedures can then be derived for efficient performance of the risk investigations. This approach was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development of the generic risk analysis involved an intense discussion between industry and the authorities about the assessment methodology and the acceptance criteria. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is discussed, where this method was successfully employed. Although this pipeline traverses densely populated areas of Switzerland, the risk problems could be solved without delaying the planning process by using this established communication method. (authors)

  10. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)


    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  11. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story


    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  12. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y


    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes grinding components accuracy and metrology shear stress in cutting cutting temperature and analysis chatter They also address non-traditional machining, such as: electrical discharge machining electrochemical machining laser and electron beam machining A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  13. Eight nucleotide substitutions inhibit splicing to HPV-16 3'-splice site SA3358 and reduce the efficiency by which HPV-16 increases the life span of primary human keratinocytes.

    Directory of Open Access Journals (Sweden)

    Xiaoze Li

    Full Text Available The most commonly used 3'-splice site on the human papillomavirus type 16 (HPV-16) genome, named SA3358, is used to produce HPV-16 early mRNAs encoding E4, E5, E6 and E7, and late mRNAs encoding L1 and L2. We have previously shown that SA3358 is suboptimal and is totally dependent on a downstream splicing enhancer containing multiple potential ASF/SF2 binding sites. Here we show that only one of the predicted ASF/SF2 sites accounts for the majority of the enhancer activity. We demonstrate that single nucleotide substitutions in this predicted ASF/SF2 site impair enhancer function and that this correlates with less efficient binding to ASF/SF2 in vitro. We provide evidence that HPV-16 mRNAs that are spliced to SA3358 interact with ASF/SF2 in living cells. In addition, mutational inactivation of the ASF/SF2 site weakened the enhancer at SA3358 in episomal forms of the HPV-16 genome, indicating that the enhancer is active in the context of the full HPV-16 genome. This resulted in induction of HPV-16 late gene expression as a result of competition from late splice site SA5639. Furthermore, inactivation of the ASF/SF2 site of the SA3358 splicing enhancer reduced the ability of E6- and E7-encoding HPV-16 plasmids to increase the life span of primary keratinocytes in vitro, demonstrating a requirement for an intact splicing enhancer of SA3358 for efficient production of the E6 and E7 mRNAs. These results link the strength of the HPV-16 SA3358 splicing enhancer to expression of E6 and E7 and to the pathogenic properties of HPV-16.

  14. Reversion of the Arabidopsis rpn12a-1 exon-trap mutation by an intragenic suppressor that weakens the chimeric 5’ splice site [v2; ref status: indexed,

    Directory of Open Access Journals (Sweden)

    Jasmina Kurepa


    Full Text Available Background: In the Arabidopsis 26S proteasome mutant rpn12a-1, an exon-trap T-DNA is inserted 531 base pairs downstream of the RPN12a STOP codon. We have previously shown that this insertion activates a STOP codon-associated latent 5' splice site that competes with the polyadenylation signal during processing of the pre-mRNA. As a result of this dual input from splicing and polyadenylation in the rpn12a-1 mutant, two RPN12a transcripts are produced and they encode the wild-type RPN12a and a chimeric RPN12a-NPTII protein. Both proteins form complexes with other proteasome subunits, leading to the formation of wild-type and mutant proteasome versions. The net result of this heterogeneity of proteasome particles is a reduction of total cellular proteasome activity. One of the consequences of reduced proteasomal activity is decreased sensitivity to the major plant hormone cytokinin. Methods: We performed ethyl methanesulfonate mutagenesis of rpn12a-1 and isolated revertants with wild-type cytokinin sensitivity. Results: We describe the isolation and analyses of suppressor of rpn12a-1 (sor1). The sor1 mutation is intragenic and located at the fifth position of the chimeric intron. This mutation weakens the activated 5' splice site associated with the STOP codon and tilts the processing of the RPN12a mRNA back towards polyadenylation. Conclusions: These results validate our earlier interpretation of the unusual nature of the rpn12a-1 mutation. Furthermore, the data show that optimal 26S proteasome activity requires RPN12a accumulation beyond a critical threshold. Finally, this finding reinforces our previous conclusion that proteasome function is critical for the cytokinin-dependent regulation of plant growth.

  15. Suppression of HPV-16 late L1 5′-splice site SD3632 by binding of hnRNP D proteins and hnRNP A2/B1 to upstream AUAGUA RNA motifs (United States)

    Li, Xiaoze; Johansson, Cecilia; Glahder, Jacob; Mossberg, Ann-Kristin; Schwartz, Stefan


    Human papillomavirus type 16 (HPV-16) 5′-splice site SD3632 is used exclusively to produce late L1 mRNAs. We identified a 34-nt splicing inhibitory element located immediately upstream of HPV-16 late 5′-splice site SD3632. Two AUAGUA motifs located in these 34 nt inhibited SD3632. Two nucleotide substitutions in each of the HPV-16 specific AUAGUA motifs alleviated splicing inhibition and induced late L1 mRNA production from episomal forms of the HPV-16 genome in primary human keratinocytes. The AUAGUA motifs bind specifically not only to the heterogeneous nuclear RNP (hnRNP) D family of RNA-binding proteins, including hnRNP D/AUF, hnRNP DL and hnRNP AB, but also to hnRNP A2/B1. Knock-down of these proteins induced HPV-16 late L1 mRNA expression, and overexpression of hnRNP A2/B1, hnRNP AB, hnRNP DL and the two hnRNP D isoforms hnRNP D37 and hnRNP D40 further suppressed L1 mRNA expression. This inhibition may allow HPV-16 to hide from the immune system and establish long-term persistent infections with an enhanced risk of progressing to cancer. There is an inverse correlation between expression of hnRNP D proteins and hnRNP A2/B1 and HPV-16 L1 production in the cervical epithelium, as well as in cervical cancer, supporting the conclusion that hnRNP D proteins and A2/B1 inhibit HPV-16 L1 mRNA production. PMID:24013563
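Locating short motifs such as the AUAGUA element described above is a simple string-matching task. The RNA fragment below is made up for illustration; a real analysis would scan the actual HPV-16 sequence at genomic coordinates.

```python
import re

# Hypothetical RNA fragment containing two copies of the AUAGUA motif.
rna = "GCAUAGUACCGGAUAGUAUUC"

# Report the 0-based start position of each non-overlapping match.
positions = [m.start() for m in re.finditer("AUAGUA", rna)]
print(positions)  # [2, 12]
```

Motif-discovery tools add statistics on top of this (background models, overlapping matches, degenerate positions), but the core operation is this scan.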

  16. A novel 'splice site' HCN4 Gene mutation, c.1737+1 G>T, causes familial bradycardia, reduced heart rate response, impaired chronotropic competence and increased short-term heart rate variability. (United States)

    Hategan, Lidia; Csányi, Beáta; Ördög, Balázs; Kákonyi, Kornél; Tringer, Annamária; Kiss, Orsolya; Orosz, Andrea; Sághy, László; Nagy, István; Hegedűs, Zoltán; Rudas, László; Széll, Márta; Varró, András; Forster, Tamás; Sepp, Róbert


    The most important molecular determinants of heart rate regulation in sino-atrial pacemaker cells are the hyperpolarization-activated, cyclic nucleotide-gated ion channels, the major isoform of which is encoded by the HCN4 gene. Mutations affecting the HCN4 gene are associated primarily with sick sinus syndrome. A novel c.1737+1 G>T 'splice-site' HCN4 mutation was identified in a large family with familial bradycardia; it co-segregated with the disease, providing a two-point LOD score of 4.87. Twelve out of the 22 investigated family members [4 males, 8 females; average age 36 (SD 6) years] were considered clinically affected on the basis of low heart rate. Average heart rates [62 (SD 8) vs. 73 (SD 8) bpm, p=0.0168] were significantly lower in carriers on 24-hour Holter recordings. During maximal exercise testing, carriers achieved significantly lower heart rates than non-carrier family members, and percent heart rate reserve and percent corrected heart rate reserve were significantly lower in carriers. Applying rigorous criteria for chronotropic incompetence, a higher number of carriers exhibited chronotropic incompetence. Parameters characterizing short-term variability of heart rate (i.e. rMSSD and pNN50) were increased in carrier family members, even after normalization for heart rate, in the 24-hour ECG recordings, with the same relative increase in 5-minute recordings. The identified novel 'splice site' HCN4 gene mutation, c.1737+1 G>T, causes familial bradycardia and leads to reduced heart rate response, impaired chronotropic competence and increased short-term heart rate variability in the mutation carriers. Copyright © 2017 Elsevier B.V. All rights reserved.
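
    rMSSD and pNN50, the short-term variability parameters named above, are standard heart-rate-variability statistics with fixed definitions; a minimal sketch computing both from a made-up series of RR intervals:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def pnn50(rr_ms):
    """Percentage of successive RR-interval differences exceeding 50 ms."""
    diffs = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    return 100.0 * sum(d > 50 for d in diffs) / len(diffs)

rr = [810, 790, 850, 815, 900, 870]  # invented RR intervals in ms
print(round(rmssd(rr), 1), pnn50(rr))  # → 51.7 40.0
```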

  17. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel


    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  18. Airborne LIDAR Data Processing and Analysis Tools (United States)

    Zhang, K.


    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, the LIDAR systems collect voluminous irregularly-spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of the ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable sized tiles, typically 4 square kilometers in dimension; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster based analysis; and 5) converting the gridded data into standard GIS import formats. ALDPAT 1.0 is available through
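
    Step 4 of the workflow above (interpolating irregularly spaced elevations onto a regular grid) can be illustrated by the simplest possible scheme, per-cell averaging. The coordinates below are invented, and this is a sketch of the idea rather than ALDPAT's actual interpolation code:

```python
import numpy as np

def grid_points(x, y, z, cell=1.0):
    """Average irregularly spaced elevations onto a regular grid.
    Returns the grid plus its origin; empty cells are NaN."""
    x0, y0 = x.min(), y.min()
    ix = ((x - x0) // cell).astype(int)
    iy = ((y - y0) // cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1
    ssum = np.zeros((ny, nx))
    count = np.zeros((ny, nx))
    np.add.at(ssum, (iy, ix), z)    # unbuffered accumulation per cell
    np.add.at(count, (iy, ix), 1)
    with np.errstate(invalid="ignore"):
        grid = ssum / count          # empty cells become NaN (0/0)
    return grid, (x0, y0)

x = np.array([0.2, 0.8, 1.5, 1.9])
y = np.array([0.1, 0.4, 0.6, 1.7])
z = np.array([10.0, 12.0, 20.0, 30.0])
grid, origin = grid_points(x, y, z)
print(grid)
```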

  19. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.


    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  20. Sustainability Tools Inventory Initial Gap Analysis (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...


  1. The Small Body Geophysical Analysis Tool (SBGAT) (United States)

    Bercovici, Benjamin; McMahon, Jay


    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows. The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features; note that SBGAT Core can be utilized independently from SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT's main developer. Along with the commented code, one can find the code documentation, which is constantly updated to reflect new functionalities. SBGAT's user's manual contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented, so future work will consist in broadening SBGAT's capabilities with the Spherical Harmonics Expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  2. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.


    From 1983 - 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de

  3. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten


    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex...

  4. Parallel Enhancements of the General Mission Analysis Tool Project (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  5. Parallel Enhancements of the General Mission Analysis Tool, Phase I (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  6. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.


    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  7. Ball Bearing Analysis with the ORBIS Tool (United States)

    Halpin, Jacob D.


    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection and stresses.

  8. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan


    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a media for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show how all feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools and removing any content that may have been hidden, along with any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available for the general public.
The results presented in this work can also be seen as a useful

  9. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens


    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's structure. The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  10. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley


    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
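
    The spherical-coordinate FDTD equations themselves are not given in the abstract; as a generic illustration of how an FDTD update loop is structured, here is a minimal one-dimensional free-space Yee scheme in normalized units (Courant number 0.5), not the authors' formulation:

```python
import math

# Minimal 1-D free-space FDTD (Yee) update in normalized units.
nz, nsteps = 200, 250
ez = [0.0] * nz   # electric field samples
hy = [0.0] * nz   # magnetic field samples, staggered half a cell
for n in range(nsteps):
    for k in range(1, nz):
        ez[k] += 0.5 * (hy[k - 1] - hy[k])         # Courant number 0.5 folded in
    ez[100] += math.exp(-((n - 30) ** 2) / 100.0)  # soft Gaussian pulse source
    for k in range(nz - 1):
        hy[k] += 0.5 * (ez[k] - ez[k + 1])
print(max(abs(v) for v in ez) > 0)  # the injected pulse propagates on the grid
```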

  11. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley


    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  12. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri


    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  13. Buffer$--An Economic Analysis Tool (United States)

    Gary Bentrup


    Buffer$ is an economic spreadsheet tool for use by resource professionals in analyzing the cost-benefits of conservation buffers. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  14. 5D Task Analysis Visualization Tool Phase II, Phase II (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  15. 5D Task Analysis Visualization Tool, Phase I (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  16. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)


    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts, which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
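
    Score-based likelihood ratios of the general kind discussed can be sketched by fitting a normal density to each reference sample of comparison scores and evaluating a questioned score under both hypotheses. The scores below are invented, and the simple normal model is an illustration, not the thesis' actual profilometry-based statistic:

```python
import math
import statistics

def normal_logpdf(x, mu, sigma):
    """Log density of a normal distribution."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def log_likelihood_ratio(score, match_scores, nonmatch_scores):
    """log LR = log p(score | same tool) - log p(score | different tools),
    each hypothesis modelled by a normal fit to reference comparisons."""
    mu_m, sd_m = statistics.mean(match_scores), statistics.stdev(match_scores)
    mu_n, sd_n = statistics.mean(nonmatch_scores), statistics.stdev(nonmatch_scores)
    return normal_logpdf(score, mu_m, sd_m) - normal_logpdf(score, mu_n, sd_n)

# Invented similarity scores (e.g. maximum profile correlations)
same_tool = [0.82, 0.88, 0.85, 0.90, 0.86]
diff_tool = [0.35, 0.42, 0.30, 0.38, 0.45]
llr = log_likelihood_ratio(0.84, same_tool, diff_tool)
print(llr > 0)  # a positive log-LR favours the "same tool" hypothesis
```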

  17. Graphical Acoustic Liner Design and Analysis Tool (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)


    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
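
    One-dimensional transmission line models of the kind mentioned rest on the standard impedance-transformation formula Zin = Z0 (ZL + j Z0 tan kL)/(Z0 + j ZL tan kL). A sketch in normalized units with invented channel dimensions, not the tool's implementation:

```python
import cmath
import math

def input_impedance(z_load, z0, k, length):
    """Standard 1-D transmission-line impedance transformation."""
    t = cmath.tan(k * length)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

# Illustrative numbers: normalized z0, 1 kHz tone, 4 cm channel depth in air
z0, freq, c, depth = 1.0, 1000.0, 343.0, 0.04
k = 2 * math.pi * freq / c

# A rigid backing (z_load -> infinity) reduces to the cavity reactance -j*z0/tan(kL)
z_rigid = -1j * z0 / math.tan(k * depth)
z_approx = input_impedance(1e9, z0, k, depth)  # very large load approximates rigid wall
print(abs(z_approx - z_rigid) < 1e-3)  # → True
```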

  18. An Integrated Tool for System Analysis of Sample Return Vehicles (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.


    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  19. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool (United States)

    Maul, William A.; Fulton, Christopher E.


    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  20. Navigating freely-available software tools for metabolomics analysis. (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph


    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review compiles a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available online, classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  1. Surrogate Analysis and Index Developer (SAID) tool (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.


    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
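
    Surrogate regression models of this kind commonly take a log-linear form, log10(SSC) = a·B + b, fitted to paired backscatter and sediment-sample data. The numbers below are invented, and this is a sketch of the model form rather than SAID's actual fitting code:

```python
import numpy as np

# Invented paired observations: acoustic backscatter (dB) and
# measured suspended-sediment concentration (mg/L)
backscatter = np.array([60.0, 65.0, 70.0, 75.0, 80.0])
ssc = np.array([20.0, 45.0, 95.0, 210.0, 430.0])

# Fit the log-linear surrogate model: log10(SSC) = a * B + b
a, b = np.polyfit(backscatter, np.log10(ssc), 1)

def predict_ssc(b_db):
    """Real-time SSC estimate from a new backscatter reading."""
    return 10 ** (a * b_db + b)

print(round(predict_ssc(72.0), 1))
```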

  2. A Lexical Analysis Tool with Ambiguity Support


    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.


    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
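
    The set of token sequences that a lexical analysis graph encodes can be made concrete by brute-force enumeration (the graph itself is just a compact representation of the same set). The lexicon and input below are invented for illustration:

```python
def tokenizations(s, lexicon):
    """Enumerate every way to segment `s` into tokens from `lexicon` --
    a flattened view of the token-sequence set described above."""
    if not s:
        return [[]]
    out = []
    for tok in lexicon:
        if s.startswith(tok):
            out.extend([tok] + rest for rest in tokenizations(s[len(tok):], lexicon))
    return out

# Ambiguous lexicon: "intx" can start with either "in" or "int"
lex = ["in", "int", "t", "x"]
print(tokenizations("intx", lex))  # → [['in', 't', 'x'], ['int', 'x']]
```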

  3. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike


    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  4. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.


    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  5. Novel mutation in the 5' splice site of exon 4 of the TCOF1 gene in the patient with Treacher Collins syndrome. (United States)

    Marszalek, Bozena; Wisniewski, Slawomir A; Wojcicki, Piotr; Kobus, Kazimierz; Trzeciak, Wieslaw H


    Treacher Collins syndrome (TCS) is caused by mutations in the TCOF1 gene. This gene encodes a serine/alanine-rich protein called treacle. The structure of the entire TCOF1 gene was investigated in a patient with TCS. We detected a novel deletion (376delAAGGTGAGTGGGACTGCC) spanning 3 bp of exon 4 and 15 bp of the adjacent intronic sequence. This mutation causes premature termination of translation, resulting in a truncated protein devoid of the nucleolar localization signal and potential phosphorylation sites. Real-time PCR analysis showed different melting temperatures of the amplified fragment containing the normal allele and that harboring the 18 bp deletion, thus providing a rapid screening assay for this and other deletions of the TCOF1 gene. Copyright 2003 Wiley-Liss, Inc.

  6. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were car...

  7. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault


    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  8. A Tool for the Concise Analysis of Patient Safety Incidents. (United States)

    Pham, Julius Cuong; Hoffman, Carolyn; Popescu, Ioana; Ijagbemi, O Mayowa; Carson, Kathryn A


    Patient safety incidents, sometimes referred to as adverse events, incidents, or patient safety events, are too common an occurrence in health care. Most methods for incident analysis are time and labor intensive. Given the significant resource requirements of a root cause analysis, for example, there is a need for a more targeted and efficient method of analyzing a larger number of incidents. Although several concise incident analysis tools are in existence, there are no published studies regarding their usability or effectiveness. Building on previous efforts, a Concise Incident Analysis (CIA) methodology and tool were developed to facilitate analysis of no- or low-harm incidents. Staff from 11 hospitals in five countries-Australia, Canada, Hong Kong, India, and the United States-pilot tested the tool in two phases. The tool was evaluated and refined after each phase on the basis of user perceptions of usability and effectiveness. From September 2013 through January 2014, 52 patient safety incidents were analyzed. A broad variety of incident types were investigated, the most frequent being patient falls (25%). Incidents came from a variety of hospital work areas, the most frequent being from the medical ward (37%). Most incidents investigated resulted in temporary harm or no harm (94%). All or most sites found the tool "understandable" (100%), "easy to use" (89%), and "effective" (89%). Some 95% of participants planned to continue to use all or some parts of the tool after the pilot. Qualitative feedback suggested that the tool allowed analysis of incidents that were not currently being analyzed because of insufficient resources. The tool was described as simple to use, easy to document, and aligned with the flow of the incident analysis. A concise tool for the investigation of patient safety incidents with low or no harm was well accepted across a select group of hospitals from five countries.

  9. RNAmute: RNA secondary structure mutation analysis tool

    Directory of Open Access Journals (Sweden)

    Barash Danny


    Full Text Available Abstract Background RNAMute is an interactive Java application that calculates the secondary structure of all single point mutations, given an RNA sequence, and organizes them into categories according to their similarity with respect to the wild type predicted structure. The secondary structure predictions are performed using the Vienna RNA package. Several alternatives are used for the categorization of single point mutations: Vienna's RNAdistance based on dot-bracket representation, as well as tree edit distance and second eigenvalue of the Laplacian matrix based on Shapiro's coarse grain tree graph representation. Results Selecting a category in each one of the processed tables lists all single point mutations belonging to that category. Selecting a mutation displays a graphical drawing of the single point mutation and the wild type, and includes basic information such as associated energies, representations and distances. RNAMute can be used successfully with very little previous experience and without choosing any parameter value alongside the initial RNA sequence. The package runs under the LINUX operating system. Conclusion RNAMute is a user friendly tool that can be used to predict single point mutations leading to conformational rearrangements in the secondary structure of RNAs. In several cases of substantial interest, notably in virology, a point mutation may lead to a loss of important functionality such as the RNA virus replication and translation initiation because of a conformational rearrangement in the secondary structure.
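
    The mutant set that RNAMute folds and categorises is simply every single-nucleotide substitution of the input sequence; enumerating it is straightforward (the folding step, done via the Vienna RNA package, is not sketched here). The toy sequence below is invented:

```python
def single_point_mutants(rna):
    """Yield (label, sequence) for every single-nucleotide substitution
    of an RNA sequence -- the mutant set described above."""
    bases = "ACGU"
    for i, orig in enumerate(rna):
        for b in bases:
            if b != orig:
                yield f"{orig}{i + 1}{b}", rna[:i] + b + rna[i + 1:]

muts = list(single_point_mutants("GCAU"))
print(len(muts))    # → 12 (3 substitutions per position, 4 positions)
print(muts[0])      # first mutant, labelled in standard orig-position-new style
```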

  10. Using a minigene approach to characterize a novel splice site mutation in human F7 gene causing inherited factor VII deficiency in a Chinese pedigree. (United States)

    Yu, T; Wang, X; Ding, Q; Fu, Q; Dai, J; Lu, Y; Xi, X; Wang, H


    Factor VII deficiency, which is transmitted as an autosomal recessive disorder, is a rare haemorrhagic condition. The aim of this study was to identify the molecular genetic defect and determine its functional consequences in a Chinese pedigree with FVII deficiency. The proband was diagnosed with inherited coagulation FVII deficiency on the basis of reduced plasma levels of FVII activity (4.4%) and antigen (38.5%). All nine exons and their flanking sequences of the F7 gene were amplified by polymerase chain reaction (PCR) for the proband, and the PCR products were directly sequenced. The compound heterozygous mutations F7 (NM_000131.3) c.572-1G>A and F7 (NM_000131.3) c.1165T>G; p.Cys389Gly were identified in the proband's F7 gene. To investigate the splicing patterns associated with F7 c.572-1G>A, ectopic transcripts in leucocytes of the proband were analyzed. F7 minigenes, spanning from intron 4 to intron 7 and carrying either an A or a G at position -1 of intron 5, were constructed and transiently transfected into human embryonic kidney (HEK) 293T cells, followed by RT-PCR analysis. The aberrant transcripts from the F7 c.572-1G>A mutant allele were not detected by the ectopic transcription study. Sequencing of the RT-PCR products from the mutant transfectant demonstrated the production of an erroneously spliced mRNA with exon 6 skipping, whereas normal splicing occurred in the wild-type transfectant. The aberrant mRNA produced from the F7 c.572-1G>A mutant allele is responsible for the factor VII deficiency in this pedigree.

  11. Bayesian data analysis tools for atomic physics (United States)

    Trassinelli, Martino


    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows probabilities to be assigned to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modeling uniquely. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed over recent years for the analysis of atomic spectra. As indicated by its name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
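The role the Bayesian evidence plays in assigning probabilities to competing hypotheses can be illustrated with a minimal, closed-form example: a textbook coin-bias comparison, not the Nested_fit code or an atomic-physics model.

```python
from math import comb

def evidence_fair(heads, tails):
    """P(D | M1): the coin is fair, p fixed at 0.5."""
    n = heads + tails
    return comb(n, heads) * 0.5 ** n

def evidence_biased(heads, tails):
    """P(D | M2): unknown bias p with a uniform prior on [0, 1].
    The integral of C(n,h) p^h (1-p)^t dp over [0,1] is exactly 1/(n+1)."""
    return 1.0 / (heads + tails + 1)

h, t = 8, 2
z1, z2 = evidence_fair(h, t), evidence_biased(h, t)
# With equal prior model odds, the posterior probability of the fair model
# is its evidence divided by the total evidence.
p_fair = z1 / (z1 + z2)
print(round(z1, 4), round(z2, 4), round(p_fair, 3))  # 0.0439 0.0909 0.326
```

Even for this toy case, the evidence automatically penalizes the more flexible model unless the data favor it, which is the behavior nested sampling is designed to compute for realistic spectra.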

  12. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios (United States)


    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  13. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert


    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  14. The environment power system analysis tool development program (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.


    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  15. Failure Modes and Effects Analysis (FMEA) Assistant Tool (United States)

    National Aeronautics and Space Administration — The FMEA Assistant tool offers a new and unique approach to assist hardware developers and safety analysts in performing failure analysis by using model based systems...

  16. Surface Operations Data Analysis and Adaptation Tool, Phase II (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  17. Constructing Social Networks From Secondary Storage With Bulk Analysis Tools (United States)



  18. A novel intronic splice site deletion of the IL-2 receptor common gamma chain results in expression of a dysfunctional protein and T-cell-positive X-linked Severe combined immunodeficiency. (United States)

    Gray, P E A; Logan, G J; Alexander, I E; Poulton, S; Roscioli, T; Ziegler, J


    X-linked severe combined immunodeficiency is caused by mutations in the IL-2 receptor common gamma chain and classically presents in the first 6 months of life with predisposition to bacterial, viral and fungal infections. In most instances, affected individuals are lymphopenic with near complete absence of T cells and NK cells. We report a boy who presented at 12 months of age with Pneumocystis jiroveci pneumonia and a family history consistent with X-linked recessive inheritance. He had a normal lymphocyte count including the presence of T cells and a broad T-cell-receptor diversity, as well as normal surface expression of the common gamma chain (CD132) protein. He however had profound hypogammaglobulinaemia, and IL-2-induced STAT5 phosphorylation was absent. Sequencing of IL-2RG demonstrated a 12-base pair intronic deletion close to the canonical splice site of exon 5, which resulted in a variety of truncated IL2RG mRNA species. A review of the literature identified 4 other patients with T-cell-positive X-SCID, with the current patient being the first associated with an mRNA splicing defect. This case raises the question of how a dysfunctional protein incapable of mediating STAT5 phosphorylation might nonetheless support T-cell development. Possible explanations are that STAT5-mediated signal transduction may be less relevant to IL7-receptor-mediated T-cell development than are other IL7R-induced intracellular transduction pathways or that a low level of STAT5 phosphorylation, undetectable in the laboratory, may be sufficient to support some T-cell development. © 2014 John Wiley & Sons Ltd.

  19. Clinical presentation and molecular identification of four uncommon alpha globin variants in Thailand. Initiation codon mutation of α2-globin Gene (HBA2:c.1delA), donor splice site mutation of α1-globin gene (IVSI-1, HBA1:c.95 + 1G>A), hemoglobin Queens Park/Chao Pra Ya (HBA1:c.98T>A) and hemoglobin Westmead (HBA2:c.369C>G). (United States)

    Viprakasit, Vip; Ekwattanakit, Supachai; Chalaow, Nipon; Riolueang, Suchada; Wijit, Sirirat; Tanyut, Porntep; Chat-Uthai, Nunthawut; Tachavanich, Kalaya


    Alpha thalassemia is the most common genetic disease in the world, with the prevalence of carriers ranging from 5% to 50% in several populations. Coinheritance of two defective α-globin genes usually gives rise to a symptomatic condition, hemoglobin (Hb) H disease. Previously, it has been suggested from several studies in different populations that nondeletional Hb H disease (--/α(T)α or --/αα(T)) is generally more severe than the deletional type (--/-α). In this report, we describe four rare nondeletional α-thalassemia mutations in Thai individuals, including an initiation codon mutation (HBA2:c.1delA), a donor splice site mutation (IVSI-1, HBA1:c.95 + 1G>A), Hb Queens Park (HBA1:c.98T>A) [α32(B13)Met>Lys], and Hb Westmead (HBA2:c.369C>G) [α122(H5)His>Gln]. Interactions of the first three mutations with α(0)-thalassemia resulted in nondeletional Hb H disease; however, their clinical presentations were rather mild and some were detected incidentally. This suggests that the genotype-phenotype correlation of the α-thalassemia syndromes might be more heterogeneous than assumed, and that the type of mutation alone does not predict the resulting phenotype. Our data will be of use in future genetic counseling of such conditions, which are increasingly identified thanks to the improvement of molecular analysis in routine laboratories. © 2013 S. Karger AG, Basel.

  20. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp


    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  1. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language...... is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor....... In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type...
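What a static pointer analysis computes can be shown with a deliberately tiny, flow-insensitive solver in the Andersen style. This is a toy over two statement forms, not the analysis developed in the work above, and all names are invented for illustration.

```python
def points_to(statements):
    """Tiny flow-insensitive points-to analysis (Andersen style).
    statements: ('alloc', x, site)  for  x = new object at allocation site
                ('copy',  x, y)     for  x = y
    Iterates the constraints to a fixed point and returns {var: set of sites}."""
    pts = {}
    changed = True
    while changed:
        changed = False
        for stmt in statements:
            if stmt[0] == 'alloc':
                _, x, site = stmt
                targets = pts.setdefault(x, set())
                if site not in targets:
                    targets.add(site)
                    changed = True
            else:  # copy: everything y may point to, x may point to as well
                _, x, y = stmt
                new = pts.get(y, set()) - pts.setdefault(x, set())
                if new:
                    pts[x] |= new
                    changed = True
    return pts

pts = points_to([('alloc', 'a', 'o1'), ('copy', 'b', 'a'), ('alloc', 'b', 'o2')])
print(sorted(pts['b']))  # ['o1', 'o2']
```

Real JavaScript analyses must additionally model fields, closures, prototypes, and dynamic property names, which is where the complexity mentioned above comes from.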

  2. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Qumica


    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects in the conformational equilibrium. Moreover, the huge amount of data available in the literature, on this research field, imposed some limitations which will be detailed in the Introduction, but it can be reminded in advance that these limitations include mostly the period when these results were published. (author)

  3. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.


    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects in the conformational equilibrium. Moreover, the huge amount of data available in the literature, on this research field, imposed some limitations which will be detailed in the Introduction, but it can be reminded in advance that these limitations include mostly the period when these results were published. (author)

  4. Advanced tools for in vivo skin analysis. (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna


    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  5. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)


    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
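The discounted-cash-flow arithmetic underlying any such station-level financial analysis can be sketched in a few lines. This is a generic net-present-value illustration with made-up numbers, not NREL's actual H2FAST model.

```python
def npv(rate, cash_flows):
    """Net present value of a cash-flow series; cash_flows[0] occurs at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical station: $1.2M capital outlay, then ten years of $200k net revenue
flows = [-1_200_000] + [200_000] * 10
print(round(npv(0.08, flows)))  # positive at an 8% discount rate
```

A full analysis like H2FAST's layers depreciation, taxes, and financing terms on top of this core computation.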

  6. SAGE Research Methods Datasets: A Data Analysis Educational Tool. (United States)

    Vardell, Emily


    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  7. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge poerink, H.J.; van Slooten, F.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.


    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number
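The worst-case evaluation such a tool automates can be illustrated for a one-dimensional stack-up. The three-part chain below is hypothetical; real tolerancing tools generate and solve full sets of assembly equations rather than a single linear chain.

```python
def worst_case_gap(dims):
    """Worst-case tolerance stack-up for a 1-D assembly chain.
    dims: list of (nominal, tolerance, direction) where direction is
    +1 if the dimension opens the gap and -1 if it closes it."""
    nominal = sum(d * n for n, _, d in dims)
    tol = sum(t for _, t, _ in dims)  # tolerances always add in the worst case
    return nominal - tol, nominal + tol

# Hypothetical chain: a 50±0.1 housing minus parts of 20±0.05 and 29.8±0.05
chain = [(50.0, 0.1, +1), (20.0, 0.05, -1), (29.8, 0.05, -1)]
lo, hi = worst_case_gap(chain)
print(round(lo, 3), round(hi, 3))  # 0.0 0.4
```

The nominal gap is 0.2, but in the worst case the parts can just touch (gap 0.0) or leave 0.4 of clearance, which is exactly the kind of result the designer checks after specifying tolerances.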

  8. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran


    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper connections between Dirac structures and well known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  9. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396). Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  10. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa


    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
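The unreachable-code check such analysers perform reduces to reachability over the program's control-flow graph. A minimal sketch, with hypothetical basic-block labels rather than the tool set's own representation:

```python
from collections import deque

def unreachable(cfg, entry):
    """cfg maps each basic-block label to its successor labels.
    Returns the set of blocks not reachable from the entry block,
    which is the core check behind 'unreachable code' reports."""
    seen, work = {entry}, deque([entry])
    while work:
        for succ in cfg.get(work.popleft(), ()):
            if succ not in seen:
                seen.add(succ)
                work.append(succ)
    return set(cfg) - seen

cfg = {'start': ['loop'], 'loop': ['loop', 'end'],
       'end': [], 'dead': ['end']}
print(sorted(unreachable(cfg, 'start')))  # ['dead']
```

The same graph traversal also underlies the loop analysis: a block that can reach itself but never the exit is a candidate syntactically infinite loop.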

  11. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis


    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  12. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup


    The purpose of this paper is to develop a simplified ship collision analysis tool in order to rapidly estimate the structural damage and energy absorption of both striking and struck ships, as well as to predict rupture of the cargo oil tanks of struck tankers. The present tool calculates external...... to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large-scale FEA, and fairly good agreement is achieved. The applicability...

  13. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were...... carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  14. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)


    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  15. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.


    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off-line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  16. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool (United States)


    Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool, by Jonathan M. Swan, June 2017 (thesis). This thesis conducts an analysis of the system requirements for the Logistics Analysis and Wargame Support Tool (LAWST). It studies...

  17. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U


    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quanti...

  18. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
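The two headline metrics, area coverage and the branching index, are simple ratios once vessel pixels and branch points have been segmented. A toy sketch with made-up counts; the formulas mirror the definitions in the abstract, not AngioTool's source code:

```python
def vessel_metrics(vessel_pixels, branch_points, total_pixels, pixel_area_um2=1.0):
    """Vascular morphometrics in the spirit of AngioTool (toy formulas).
    Assumes the image has already been segmented into vessel/background
    and branch points have been counted on the vessel skeleton."""
    total_area = total_pixels * pixel_area_um2
    return {
        "vessel_area_pct": 100.0 * vessel_pixels / total_pixels,
        "branching_index": branch_points / total_area,  # branch points per unit area
    }

m = vessel_metrics(vessel_pixels=2500, branch_points=40, total_pixels=10000)
print(m["vessel_area_pct"], m["branching_index"])  # 25.0 0.004
```

The hard part in practice is the segmentation and skeletonization that produce these counts, which is what the tool automates.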

  19. Analysis of design tool attributes with regards to sustainability benefits (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.


    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regard to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  20. A Comparative Analysis of Life-Cycle Assessment Tools for ... (United States)

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment tool for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they are originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  1. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning


    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lay within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered... provide the possibility for the designer to work both with the aesthetics as well as the technical aspects of architectural design.

  2. Tools for T-RFLP data analysis using Excel. (United States)

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie


    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
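Two of the core steps named above, applying a noise baseline threshold and normalizing peak heights so that replicate profiles become comparable, can be sketched outside Excel as well. This is a minimal illustration with invented fragment sizes and heights, not the template's macros:

```python
def normalize_profile(peaks, noise_threshold):
    """Apply a noise baseline threshold, then normalize a T-RF profile so the
    retained peak heights (or areas) sum to 1, giving relative abundances."""
    kept = {frag: h for frag, h in peaks.items() if h >= noise_threshold}
    total = sum(kept.values())
    return {frag: h / total for frag, h in kept.items()} if total else {}

# Hypothetical profile: fragment size (bp) -> peak height
profile = {72: 1500.0, 145: 300.0, 210: 40.0, 388: 700.0}
rel = normalize_profile(profile, noise_threshold=50.0)
print({f: round(a, 3) for f, a in sorted(rel.items())})  # {72: 0.6, 145: 0.12, 388: 0.28}
```

Alignment of replicate profiles and consensus-profile generation then operate on these normalized vectors, which is why the choice of threshold and normalization method matters for the final comparison.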

  3. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.


    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  4. An Online Image Analysis Tool for Science Education (United States)

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.


    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  5. Tools of Audience Analysis in Contemporary Political Campaigns. (United States)

    Friedenberg, Robert V.

    This paper examines two basic tools of audience analysis as they are used in contemporary political campaigning: public opinion polls and interpretations of voter statistics. The raw data used in the statistical analyses reported in this investigation come from national polls and voter statistics provided to Republican candidates running in local…

  6. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    ABSTRACT: Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a ...

  7. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory


    The Adversarial Route Analysis Tool is, in effect, a Google Maps for adversaries: a web-based geospatial application that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and simple to use without much training.

  8. Rapid Benefit Indicators (RBI) Spatial Analysis Tools - Manual (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  9. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila


    Thus, MorExAn provides the possibility to relate histopathological data with neuropsychological and clinical variables. This interactive visualization tool makes it possible to reach unexpected conclusions beyond the insight provided by simple statistical analysis, and to improve neuroscientists’ productivity.

  10. An Automated Data Analysis Tool for Livestock Market Data (United States)

    Williams, Galen S.; Raper, Kellie Curry


    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  11. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  12. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P


    Decreasing tolerances on parts manufactured, or inspected, on machine tools increases the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, it considers the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
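
    The core idea of linking easily measured temperatures to the thermal errors they cause can be illustrated with a simple regression sketch (synthetic data and coefficients, not the paper's models):

```python
# Illustrative condition-monitoring sketch: fit a linear model relating
# spindle/structure temperatures (measurable in production) to thermal
# displacement error (not measurable in production). All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
temps = rng.uniform(20.0, 45.0, size=(50, 3))   # 3 temperature sensors, deg C
true_coefs = np.array([1.8, 0.6, 1.1])          # um of error per deg C (invented)
error_um = temps @ true_coefs + 5.0 + rng.normal(0.0, 0.5, 50)

# Ordinary least squares: error ~ b0 + b1*T1 + b2*T2 + b3*T3
X = np.column_stack([np.ones(len(temps)), temps])
coefs, *_ = np.linalg.lstsq(X, error_um, rcond=None)

def predict_error(temp_readings):
    """Estimate thermal error from live temperature readings."""
    return coefs[0] + temp_readings @ coefs[1:]
```

    Once calibrated from a thermal-profiling test, such a model can estimate (and compensate) the error continuously from temperature readings alone.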

  13. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)


    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  14. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study (United States)

    Flores, Melissa; Malin, Jane T.


    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  15. Analysis for Non-Traditional Security Challenges: Methods and Tools (United States)


    Only fragments of the report's acronym list survive in this excerpt: Course of Action; COCOM (Combatant Commander); COI (Community of Interest); CPB (Cultural Preparation of the Battlefield); CPM (Critical Path Method); DARPA (Defense...); ...Secretary of Defense (Program Analysis and Evaluation); PACOM (United States Pacific Command); PERT (Program Evaluation Review Technique); PMESII (Political...).

  16. A dataflow analysis tool for parallel processing of algorithms (United States)

    Jones, Robert L., III


    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
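
    For illustration, one of the performance bounds such graph analysis yields, the critical-path lower bound on schedule length for a task graph executed on identical parallel processors, can be sketched as follows (invented task graph; not the tool's implementation):

```python
# Sketch of a graph-analysis performance bound: for a dataflow task graph,
# the schedule length on p identical processors is bounded below by both the
# critical (longest) path and total work / p. The task graph is invented.
def critical_path_length(durations, edges):
    """Longest path through a DAG of tasks with given execution times."""
    preds = {n: [] for n in durations}
    succs = {n: [] for n in durations}
    for u, v in edges:
        succs[u].append(v)
        preds[v].append(u)
    # Topological order via Kahn's algorithm
    indeg = {n: len(preds[n]) for n in durations}
    order = [n for n, d in indeg.items() if d == 0]
    i = 0
    while i < len(order):
        for s in succs[order[i]]:
            indeg[s] -= 1
            if indeg[s] == 0:
                order.append(s)
        i += 1
    finish = {}
    for n in order:  # earliest finish = own time + latest predecessor finish
        finish[n] = durations[n] + max((finish[p] for p in preds[n]), default=0)
    return max(finish.values())

def schedule_lower_bound(durations, edges, processors):
    """max(critical path, total work / p) lower-bounds any valid schedule."""
    work_bound = sum(durations.values()) / processors
    return max(critical_path_length(durations, edges), work_bound)

tasks = {"A": 2, "B": 3, "C": 4, "D": 1}
deps = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
```

    Comparing the two bounds also indicates how many processors the graph can usefully exploit before the critical path dominates.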

  17. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.


    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology that is concerned with students’ acquisition of culture encoded in symbols and the way students’ sign consciousness formed in the context of learning affects their world cognition and interpersonal communication. The article introduces a set of tools that would enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of younger generation’s language picture of the world.

  18. The Development of a Humanitarian Health Ethics Analysis Tool. (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa


    Introduction: Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools have become increasingly prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  19. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin


    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  20. SMART: Statistical Metabolomics Analysis-An R Tool. (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou


    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from .

  1. Formal Analysis Tools for the Synchronous Aspect Language Larissa

    Directory of Open Access Journals (Sweden)

    Stauch David


    Full Text Available We present two tools for the formal analysis of the aspect language Larissa, which extends the simple synchronous language Argos. The first tool concerns the combination of design-by-contract with Larissa aspects, and shows how we can apply an aspect not only to a program, but to a specification of programs in form of a contract, and obtain a new contract. The second concerns aspect interferences, that is, aspects that influence each other in an unintended way if they are applied to the same program. We present a way to weave aspects in a less conflict-prone manner, and a means to detect remaining conflicts statically. These tools are quite powerful, compared to those available for other aspect languages.

  3. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen


    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study the plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up for implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.

  4. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.


    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy

  5. Is motion analysis a valid tool for assessing laparoscopic skill? (United States)

    Mason, John D; Ansell, James; Warren, Neil; Torkington, Jared


    The use of simulation for laparoscopic training has led to the development of objective tools for skills assessment. Motion analysis represents one area of focus. This study was designed to assess the evidence for the use of motion analysis as a valid tool for laparoscopic skills assessment. Embase, MEDLINE and PubMed were searched using the following domains: (1) motion analysis, (2) validation and (3) laparoscopy. Studies investigating motion analysis as a tool for assessment of laparoscopic skill in general surgery were included. Common endpoints in motion analysis metrics were compared between studies according to a modified form of the Oxford Centre for Evidence-Based Medicine levels of evidence and recommendation. Thirteen studies were included from 2,039 initial papers. Twelve (92.3 %) reported the construct validity of motion analysis across a range of laparoscopic tasks. Of these 12, 5 (41.7 %) evaluated the ProMIS Augmented Reality Simulator, 3 (25 %) the Imperial College Surgical Assessment Device (ICSAD), 2 (16.7 %) the Hiroshima University Endoscopic Surgical Assessment Device (HUESAD), 1 (8.33 %) the Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) and 1 (8.33 %) the Robotic and Video Motion Analysis Software (ROVIMAS). Face validity was reported by 1 (7.7 %) study each for ADEPT and ICSAD. Concurrent validity was reported by 1 (7.7 %) study each for ADEPT, ICSAD and ProMIS. There was no evidence for predictive validity. Evidence exists to validate motion analysis for use in laparoscopic skills assessment. Valid parameters are time taken, path length and number of hand movements. Future work should concentrate on the conversion of motion data into competency-based scores for trainee feedback.

  6. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles (United States)

    Ivanco, Thomas G.


    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
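
    The lock-in condition described above follows from the Strouhal relation f = St·U/D; a back-of-envelope sketch of the screening calculation (illustrative numbers, not the tool's method):

```python
# Back-of-envelope sketch of the vortex-shedding concern: the Strouhal
# relation f = St * U / D gives the shedding frequency, and lock-in becomes
# a risk when it nears a lowly-damped structural mode. Numbers are invented.
def shedding_frequency(wind_speed, diameter, strouhal=0.2):
    """Vortex-shedding frequency in Hz; St ~ 0.2 for circular cylinders."""
    return strouhal * wind_speed / diameter

def lock_in_risk(wind_speed, diameter, mode_freq_hz, band=0.2):
    """True if the shedding frequency falls within +/- band of the mode."""
    f = shedding_frequency(wind_speed, diameter)
    return abs(f - mode_freq_hz) <= band * mode_freq_hz
```

    For example, a 5 m diameter vehicle in a 10 m/s ground wind sheds vortices at roughly 0.4 Hz, so a lowly-damped bending mode near that frequency would warrant the detailed dynamic-load analysis the tool provides.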

  7. Procrustes rotation as a diagnostic tool for projection pursuit analysis. (United States)

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda


    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool for evaluating the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification.
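
    The orthogonal Procrustes rotation underlying this diagnostic has a standard closed-form SVD solution; a minimal sketch with synthetic score matrices (not the authors' code):

```python
# Sketch of orthogonal Procrustes rotation as a similarity diagnostic for
# projection-pursuit score matrices: rotate one set of scores onto another
# and report the residual misfit. The score matrices here are synthetic.
import numpy as np

def procrustes_residual(A, B):
    """Best orthogonal R for min ||A - B R||_F (SVD solution) and the misfit."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    R = (U @ Vt).T
    return np.linalg.norm(A - B @ R), R

rng = np.random.default_rng(1)
scores = rng.normal(size=(30, 2))            # one PP projection (30 samples, 2D)
theta = 0.7                                   # a second projection, identical up to rotation
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
residual, R = procrustes_residual(scores, scores @ rot)
```

    A near-zero residual flags two projections as equivalent up to rotation/reflection; plotting residuals across a grid of regularization settings gives the "Procrustes map" of stable regions.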

  8. Volumetric measurements of pulmonary nodules: variability in automated analysis tools (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot


    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  9. Multi-Spacecraft Analysis with Generic Visualization Tools (United States)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.


    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  10. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.


    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  11. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov


    Full Text Available Introducing CASE technologies for database design into the educational process requires significant software costs for the institution. A possible solution is to use free software analogues, but such substitution should be based on a like-for-like comparison of the functional characteristics and operational features of these programs. The purpose of the article is a review of free and non-profit CASE tools for database design, together with their classification based on an analysis of functionality. Materials from the official websites of the tool developers were used in writing the article, and the functional characteristics of the CASE tools were evaluated empirically through direct work with the software products. The analysis of functionality distinguishes two categories of CASE tools for database design. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: managing connections to database servers; visual tools to create and modify database objects (tables, views, triggers, procedures); the ability to enter and edit data in table mode; user and privilege management tools; an SQL code editor; and means of exporting/importing data. CASE systems of the first category can be used to design and develop simple databases, to manage data, and to administer database servers. The distinctive feature of the second category of CASE tools for database design (full-featured systems) is a visual designer that allows the user to construct a database model and automatically create the database on the server from this model. CASE systems of this category can be used to design and develop databases of any structural complexity, and as database-server administration tools. The article concluded that the

  12. The RUMBA software: tools for neuroimaging data analysis. (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio


    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  13. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric


    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences of numerous natural, man-made, and technological threats. The resulting software tool is intended for use by various decision makers and analysts, and is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. The tool, called E-CAT (Economic Consequence Analysis Tool), accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and toward stimulating further research into complex but rapid-turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced-form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of the economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output sides of the analysis.
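    The reduced-form idea described above, replacing many CGE runs with a single regression estimated on their synthetic output, can be sketched as follows. The variable names, the linear functional form, and all numbers here are illustrative assumptions, not E-CAT's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "CGE runs": each row is one simulated threat scenario.
# Hypothetical explanatory variables: magnitude, duration, resilience.
n_runs = 500
X = np.column_stack([
    rng.uniform(1, 10, n_runs),      # threat magnitude (arbitrary index)
    rng.uniform(1, 90, n_runs),      # duration in days
    rng.uniform(0.0, 1.0, n_runs),   # resilience factor (0 = none, 1 = full)
])

# Stand-in for the complex CGE model's output (GDP loss, $B).
# In E-CAT this would come from actual model runs, not a formula.
gdp_loss = 2.0 * X[:, 0] + 0.05 * X[:, 1] - 8.0 * X[:, 2] \
           + rng.normal(0, 0.5, n_runs)

# Fit the "reduced form": one regression that stands in for the full model.
A = np.column_stack([np.ones(n_runs), X])              # add intercept column
coef, *_ = np.linalg.lstsq(A, gdp_loss, rcond=None)

# Rapid-turnaround prediction for a new scenario.
scenario = np.array([1.0, 7.0, 30.0, 0.5])             # intercept term + 3 variables
print(f"predicted GDP loss: ${scenario @ coef:.1f}B")
```

    Once the coefficients are estimated, evaluating a new scenario is a single dot product, which is why such a reduced form can live comfortably in a spreadsheet.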

  14. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao


    Micromilling can fabricate miniaturized components using a micro-end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating tool life, and optimizing the process. A numerical analysis and an experimental method are presented to investigate chatter stability in the micro-end milling process with variable milling tool geometry. A schematic model of the micromilling process is constructed, and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation of micro-end milling with straight-tooth and helical-tooth end mills is conducted, based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted, and the simulation results for the different cutter geometries are compared in detail. The simulation results are of practical significance for the actual milling process.

  15. Aerospace Power Systems Design and Analysis (APSDA) Tool (United States)

    Truong, Long V.


    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PCs), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.


    Directory of Open Access Journals (Sweden)

    A. V. Tyurin


    A concept for the organization and planning of computational experiments aimed at multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculations tree. The logical and structural schemes of the tree are given, together with software tools that automate work with it: generating calculations, carrying them out, and analyzing the results. Computer modeling systems, including such special-purpose systems as RACS and PRADIS, do not solve the problems of carrying out a computational experiment effectively, namely its organization, planning, execution, and the analysis of results. To organize the computational experiment, calculation data are stored as a tree of input and output data. Each tree node holds a reference to the model step calculated earlier. The calculations tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying a design scheme that stores the structure of one branch of the calculation tree, with a view to effective planning of multivariate calculations. A set of special-purpose software tools makes it possible to quickly generate and modify the tree and to add calculations with step-by-step changes in the model factors. To perform calculations, a software environment with a graphical user interface for creating and modifying calculation scripts has been developed. This environment traverses the calculation tree in a given order and initiates computational modules serially or in parallel. To analyze the results, a software tool operating on a tag tree has been developed: a special tree that stores the input and output data of the calculations as sets of changes of the corresponding model factors. The tool makes it possible to select the factors and responses of the model at various steps.
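    The calculations-tree idea, where each node records one model step's factors and result plus a reference to the earlier calculation it extends, can be sketched as follows. The class, field, and method names are hypothetical, invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class CalcNode:
    """One node of the calculations tree: a single model step."""
    factors: dict                       # input factors in effect at this step
    result: "float | None" = None
    parent: "CalcNode | None" = None    # reference to the earlier calculation
    children: list = field(default_factory=list)

    def branch(self, **factor_changes):
        """Add a calculation that varies some factors of this one."""
        child = CalcNode(factors={**self.factors, **factor_changes}, parent=self)
        self.children.append(child)
        return child

    def run(self, model):
        self.result = model(self.factors)
        return self.result

def traverse(node):
    """Depth-first traversal, e.g. for serial initiation of modules."""
    yield node
    for child in node.children:
        yield from traverse(child)

# Toy model: the response is a product of two factors.
model = lambda f: f["load"] * f["gain"]

root = CalcNode(factors={"load": 1.0, "gain": 2.0})
for load in (1.0, 2.0, 3.0):            # multivariate sweep: one branch per value
    root.branch(load=load)

for node in traverse(root):
    node.run(model)
    print(node.factors, "->", node.result)
```

    Persisting each node to its own directory, as the abstract describes, would then amount to serializing `factors` and `result` per node while keeping the parent link as a path reference.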

  17. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs emerge. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  18. Design and Application of the Exploration Maintainability Analysis Tool (United States)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew


    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regard to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew

  19. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci


    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  20. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.


    In the nuclear power industry, the increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  1. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid


    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was undertaken to discover and resolve safety and human engineering problems. The analyses were conducted to determine the safety, ergonomic, and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. High-level three-dimensional graphics software, incorporating various ergonomic analysis algorithms, was utilized to determine whether the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  2. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu


    Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but they all boil down to the same idea: a process carried out to extract new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only answers the question of what is happening but at the same time argues and shows why it is happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  3. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.


    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  4. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.


    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction in the plant, considering the constraints on other parameters. The analysis results so obtained give a clear idea of the parameter values to choose before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
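    The thermodynamic core of a simple Linde-Hampson liquefaction analysis is an energy balance around the heat exchanger, Joule-Thomson valve, and separator, which a simulator like HYSYS evaluates with real property data. A minimal sketch of that balance, using assumed round-number enthalpies rather than property-table values, is:

```python
def linde_yield(h1, h2, hf):
    """Liquid yield fraction y of a simple Linde-Hampson cycle.

    Steady-state energy balance on the cold box (HX + JT valve + separator):
        h2 = y*hf + (1 - y)*h1   =>   y = (h1 - h2) / (h1 - hf)

    h1: enthalpy of low-pressure gas leaving the cycle (kJ/kg)
    h2: enthalpy of high-pressure gas entering the cold box (kJ/kg)
    hf: enthalpy of the saturated liquid product (kJ/kg)
    """
    return (h1 - h2) / (h1 - hf)

# Illustrative enthalpies for air at roughly 1 bar / 200 bar and 300 K.
# These are assumed round numbers, not property-table values; a real
# analysis would pull them from HYSYS or a property database.
h1, h2, hf = 460.0, 430.0, 25.0   # kJ/kg
y = linde_yield(h1, h2, hf)
print(f"liquid yield: {y:.3f} kg liquid per kg gas compressed")
```

    The optimization the paper describes then amounts to varying compressor discharge pressure and precooling so as to lower h2 (and hence raise y) subject to equipment constraints.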

  5. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III


    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  6. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.


    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  7. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.


    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc's Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  8. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.
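    The pattern of extending a CSS3.0-style core schema with processing-specific tables can be sketched with SQLite. The table and column names below are hypothetical illustrations in the spirit of such an extension, not infrapy's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A heavily abridged CSS3.0-style core table describing waveform segments.
cur.execute("""
CREATE TABLE wfdisc (
    sta    TEXT,                 -- station code
    chan   TEXT,                 -- channel code
    time   REAL,                 -- epoch start time of the segment
    wfid   INTEGER PRIMARY KEY   -- waveform id
)""")

# A hypothetical extension table for infrasound detection results,
# linked back to the core schema by foreign key.
cur.execute("""
CREATE TABLE fd_results (
    fdid     INTEGER PRIMARY KEY,  -- detection id
    wfid     INTEGER,              -- link to the core waveform row
    back_az  REAL,                 -- back-azimuth estimate (degrees)
    fstat    REAL,                 -- detection F-statistic
    FOREIGN KEY (wfid) REFERENCES wfdisc(wfid)
)""")

cur.execute("INSERT INTO wfdisc VALUES ('I57US', 'BDF', 1.6e9, 1)")
cur.execute("INSERT INTO fd_results VALUES (1, 1, 241.5, 3.2)")

# The entity-relationship in action: join extension rows to core rows.
row = cur.execute("""
    SELECT w.sta, f.back_az, f.fstat
    FROM fd_results f JOIN wfdisc w ON f.wfid = w.wfid
""").fetchone()
print(row)
```

    Keeping the processing-specific attributes in separate tables, as the schema document does, leaves the core tables compatible with other CSS3.0/KB-core consumers.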

  9. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima


    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Integrating multi-omics data spanning the genome, transcriptome, proteome, and metabolome with mathematical models is expected to expand our knowledge of complex plant metabolism.

  10. Software Tools for the Analysis of Functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mehdi Behroozi


    Functional magnetic resonance imaging (fMRI) has become the most popular method for imaging brain function. Currently, there is a large variety of software packages for the analysis of fMRI data, each providing many features for users. Since no single package can provide all the necessary analyses of fMRI data, it is helpful to know the features of each software package. In this paper, several software tools are introduced and evaluated, comparing their functionality and features. The description of each program is discussed and summarized.


  12. Smart roadside initiative macro benefit analysis : user's guide for the benefit-cost analysis tool. (United States)

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State ...

  13. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C


    Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results: The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion: The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines.
UCSF Chimera is free for non-commercial use and is

  14. Analysis tools for the interplay between genome layout and regulation. (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François


    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli, and on the other hand, the tools' capability to improve TFBS prediction in microbes. Finally, we highlight, through multivariate visualisation techniques, the interplay between position and sequence information for effective transcription regulation.

  15. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen


    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone Java application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
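    The first analysis step the abstract describes, turning raw spot intensities into normalized log ratios before any segmentation, can be sketched as follows. The intensities and the 0.3 gain threshold are illustrative assumptions; CGHPRO itself offers several normalization methods and uses CBS or HMM rather than a fixed cutoff:

```python
import numpy as np

def log2_ratios(test, ref):
    """Per-clone log2(test/reference) intensity ratios,
    median-centred so that unchanged regions sit at 0."""
    r = np.log2(test / ref)
    return r - np.median(r)

# Illustrative spot intensities for 8 clones along a chromosome:
# the last three simulate a single-copy gain (ratio ~ 3/2).
test = np.array([100., 95., 110., 105., 98., 150., 145., 155.])
ref  = np.array([100., 100., 100., 100., 100., 100., 100., 100.])

r = log2_ratios(test, ref)
gained = r > 0.3   # naive threshold for a gain; real tools segment instead
print(np.round(r, 2))
print(gained)
```

    Segmentation algorithms such as Circular Binary Segmentation then look for change-points in exactly this ratio profile, which is far more robust than a per-clone threshold.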

  16. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva


    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
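
    The filtering-plus-triggering-rule combination can be sketched as a regex match feeding a sliding-window counter: fire when enough matching messages arrive within a time window. The rule format, class, and log lines below are invented for illustration (the real Message Analyzer uses its own DSL and C++ plugins).

```python
# Hypothetical sketch: a counting trigger rule over filtered log messages.
import re
from collections import deque

class TriggerRule:
    """Fire when `count` messages match `pattern` within `window` seconds."""
    def __init__(self, pattern, count, window):
        self.pattern = re.compile(pattern)
        self.count, self.window = count, window
        self.hits = deque()

    def feed(self, timestamp, message):
        if not self.pattern.search(message):
            return False
        self.hits.append(timestamp)
        # drop matches that have aged out of the window
        while self.hits and timestamp - self.hits[0] > self.window:
            self.hits.popleft()
        return len(self.hits) >= self.count

rule = TriggerRule(r"buffer overflow on node (\d+)", count=3, window=10.0)
log = [(0.0, "dcm-01: buffer overflow on node 12"),
       (4.0, "dcm-02: heartbeat ok"),
       (5.0, "dcm-03: buffer overflow on node 7"),
       (8.0, "dcm-01: buffer overflow on node 12")]
fired = [t for t, msg in log if rule.feed(t, msg)]
print(fired)  # [8.0] -- the third match arrives within the 10 s window
```

    A real engine would additionally correlate matches across components before declaring a system failure.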

  17. Bibliological analysis: security tool in collections of rare books

    Directory of Open Access Journals (Sweden)

    Raphael Diego Greenhalgh


    Full Text Available The historical, cultural and economic value of rare books makes them frequent targets of robbery and theft. Institutions that hold this type of collection therefore have to strengthen their security in order to inhibit such criminal practices. Through a literature review, this paper analyzes the possibility of using bibliological analysis as a security tool. A detailed description of each copy increases an institution's knowledge of its own collection and attributes unequivocal ownership to rare specimens, which do not always receive stamps or other marks; without such identification, returning and recovering books after robbery or theft is difficult. Bibliological analysis thus individualizes each copy, besides allowing extensive handling and knowledge of it, factors that are essential to the security of the books.

  18. Mechanical System Analysis/Design Tool (MSAT) Quick Guide (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack


    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. Once the sequence is set up by the user, the data transfer links automatically transport output from one application as relevant input to the next. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design phase of the product development process.

  19. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)]


    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and net energy analysis is an excellent accounting system for gauging the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for energy planners considering their electric power options in the future.
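
    At its core, net energy accounting compares lifetime energy delivered to total energy invested. The sketch below illustrates the bookkeeping only; every number in it is a made-up placeholder, not a figure from the study.

```python
# Toy net-energy accounting; all quantities are invented placeholders.

def net_energy_ratio(output_gwh, invested_gwh):
    """Lifetime energy delivered per unit of energy invested."""
    return output_gwh / invested_gwh

options = {
    # option: (lifetime electrical output, construction + fuel cycle +
    #          operation energy inputs), both in GWh
    "coal":  (250_000, 40_000),
    "solar": (45_000, 12_000),
}
for name, (out, inv) in options.items():
    print(name, round(net_energy_ratio(out, inv), 2))
```

    A ratio well above 1 indicates the option returns substantially more energy than it consumes over its lifetime; comparing ratios across options is the planning signal the abstract describes.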

  20. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab


    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  1. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.


    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and into performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
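
    The idea behind compiler-based sensitivity tools such as GRESS can be sketched with forward-mode automatic differentiation: each variable carries its value together with a derivative, so sensitivities fall out of the same pass that computes the result. The dual-number class and the `model` function below are illustrative assumptions, not the actual GRESS mechanism.

```python
# Minimal forward-mode AD with dual numbers (illustrative sketch).

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def model(k):
    # stand-in for a code block whose sensitivity to k is wanted
    return 3 * k * k + 2 * k + 1

k = Dual(2.0, 1.0)           # seed dk/dk = 1
out = model(k)
print(out.value, out.deriv)  # 17.0 14.0  (d/dk of 3k^2+2k+1 at k=2 is 14)
```

    A source-transforming compiler applies the same product and sum rules to every statement of an existing FORTRAN code instead of requiring an operator-overloading rewrite.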

  2. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.


    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we look at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Comparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  3. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter


    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parameterised reward annotations. SBOAT allows the optimisation of these processes by specifying optimisation goals by means of probabilistic computation tree logic (PCTL). Optimisation is performed by means of an evolutionary algorithm where stochastic model checking, in the form of the PRISM model checker, is used to compute the fitness, the performance of a candidate in terms of the specified goals, of variants of a process. Our evolutionary algorithm approach uses a matrix representation of process models to allow mutation and crossover of a process model to be performed efficiently, allowing broad...

  4. metaSNV: A tool for metagenomic strain level analysis.

    Directory of Open Access Journals (Sweden)

    Paul Igor Costea

    Full Text Available We present metaSNV, a tool for single nucleotide variant (SNV analysis in metagenomic samples, capable of comparing populations of thousands of bacterial and archaeal species. The tool uses as input nucleotide sequence alignments to reference genomes in standard SAM/BAM format, performs SNV calling for individual samples and across the whole data set, and generates various statistics for individual species including allele frequencies and nucleotide diversity per sample as well as distances and fixation indices across samples. Using published data from 676 metagenomic samples of different sites in the oral cavity, we show that the results of metaSNV are comparable to those of MIDAS, an alternative implementation for metagenomic SNV analysis, while data processing is faster and has a smaller storage footprint. Moreover, we implement a set of distance measures that allow the comparison of genomic variation across metagenomic samples and delineate sample-specific variants to enable the tracking of specific strain populations over time. The implementation of metaSNV is available at:
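
    Two of the per-species statistics mentioned above can be sketched in a few lines: per-sample allele frequency from read counts, and nucleotide diversity approximated here as the mean expected heterozygosity 2p(1-p) over SNV positions. This is a simplified stand-in for metaSNV's actual definitions, and the read counts are hypothetical.

```python
# Toy per-sample SNV statistics; counts and the diversity formula are
# illustrative simplifications, not metaSNV's implementation.

def allele_frequency(alt_count, coverage):
    return alt_count / coverage

def nucleotide_diversity(freqs):
    """Mean 2p(1-p) over biallelic SNV positions."""
    return sum(2 * p * (1 - p) for p in freqs) / len(freqs)

# hypothetical (alt reads, total coverage) at four positions in one sample
sample = [(5, 50), (25, 50), (0, 40), (10, 40)]
freqs = [allele_frequency(a, c) for a, c in sample]
pi = nucleotide_diversity(freqs)
print(freqs)  # [0.1, 0.5, 0.0, 0.25]
print(pi)
```

    Distances between samples can then be built from such per-position frequency vectors, which is what enables tracking strain populations over time.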

  5. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  6. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials. (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian


    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The new BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  7. msBiodat analysis tool, big data analysis for high-throughput experiments. (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver


    Mass spectrometry (MS) is a group of high-throughput techniques used to increase knowledge about biomolecules. It produces a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering gains value by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation, using their Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool gives researchers, whatever their programming experience, the advantages of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at
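
    The kind of filtering described, keeping only MS hits that carry a given Gene Ontology annotation, can be sketched with an in-memory lookup. The accessions, GO terms, and function name below are all hypothetical; msBiodat does this server-side against real annotation databases.

```python
# Toy GO-annotation filter over an MS protein hit list (all ids invented).

ms_hits = ["P12345", "Q67890", "P00001", "A11111"]

go_annotations = {            # UniProt accession -> set of GO terms
    "P12345": {"GO:0005739", "GO:0006915"},   # mitochondrion, apoptosis
    "Q67890": {"GO:0005634"},                 # nucleus
    "P00001": {"GO:0006915"},                 # apoptosis
}

def filter_by_go(hits, annotations, go_term):
    """Keep hits annotated with go_term; unannotated ids are dropped."""
    return [p for p in hits if go_term in annotations.get(p, set())]

apoptotic = filter_by_go(ms_hits, go_annotations, "GO:0006915")
print(apoptotic)  # ['P12345', 'P00001']
```

    In the real tool the same query can combine GO terms with experiment-specific conditions before the filtered list is returned by e-mail.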

  8. Web analytics tools and web metrics tools: An overview and comparative analysis


    Bekavac, Ivan; Garbin Praničević, Daniela


    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  9. Multi-Mission Power Analysis Tool (MMPAT) Version 3 (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.


    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven along with its user-programmable features can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
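
    The essence of such a simulation is a time-stepped power balance: the array covers the load when it can, and the battery absorbs the surplus or covers the deficit, clamped to its capacity. The sketch below is a deliberately crude, hedged stand-in for MMPAT's far higher-fidelity models; the sizes and the one-hour step are arbitrary assumptions.

```python
# Crude hourly power-balance sketch (all parameters are invented).

def simulate(array_w, load_w, capacity_wh, soc_wh, hours):
    """Return battery state of charge (Wh) after each hourly step."""
    history = []
    for h in range(hours):
        net_w = array_w[h] - load_w[h]          # surplus (+) or deficit (-)
        # 1 h per step, so W maps directly to Wh; clamp to [0, capacity]
        soc_wh = min(capacity_wh, max(0, soc_wh + net_w))
        history.append(soc_wh)
    return history

array = [0, 0, 120, 120, 120, 0]   # eclipse, then sunlight, then eclipse (W)
load  = [60, 60, 60, 60, 60, 60]   # constant 60 W equipment load
print(simulate(array, load, capacity_wh=200, soc_wh=150, hours=6))
# -> [90, 30, 90, 150, 200, 140]
```

    A mission planner would inspect such a profile for the minimum state of charge (here 30 Wh, during the first eclipse) when sizing the battery.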

  10. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli


    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that corpus analysis software has been developed with the linguist in mind, which means that it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  11. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov


    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
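
    The motivation for robust outlier detection can be shown in one dimension: flag points far from a robust centre (the median) in units of a robust scale (the MAD), so that gross outliers cannot inflate the scale and mask themselves, as they do with the mean and standard deviation. This univariate sketch only illustrates the principle the package generalises to high-dimensional robust distances; the cutoff is a common rule-of-thumb assumption.

```python
# Median/MAD outlier flagging (univariate sketch of the robust-distance idea).

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def robust_outliers(xs, cutoff=3.5):
    m = median(xs)
    mad = median([abs(x - m) for x in xs])
    # 1.4826 makes the MAD consistent with the std. dev. under normality
    return [x for x in xs if abs(x - m) / (1.4826 * mad) > cutoff]

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 25.0]  # one gross outlier
print(robust_outliers(data))  # [25.0]
```

    With the mean and standard deviation instead, the single point at 25.0 would drag both estimates toward itself and could escape detection.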

  12. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana


    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements on demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  13. Sensitivity analysis of an information fusion tool: OWA operator (United States)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc


    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
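
    The Fuzzy Linguistic Quantifiers route to OWA order weights can be made concrete with the common RIM quantifier Q(r) = r^a: the weights are the increments w_i = Q(i/n) - Q((i-1)/n), and the exponent a encodes the decision maker's optimism degree (a < 1 optimistic, a = 1 a plain average, a > 1 pessimistic). This is a standard textbook construction, sketched here under that assumption rather than taken from the paper's specific setup.

```python
# OWA with weights derived from the RIM quantifier Q(r) = r**a.

def owa_weights(n, a):
    """w_i = Q(i/n) - Q((i-1)/n) for the quantifier Q(r) = r**a."""
    return [(i / n) ** a - ((i - 1) / n) ** a for i in range(1, n + 1)]

def owa(values, weights):
    ordered = sorted(values, reverse=True)   # order inputs before weighting
    return sum(w * v for w, v in zip(weights, ordered))

scores = [0.9, 0.4, 0.7, 0.2]                # e.g. ratings of one alternative
for a in (0.5, 1.0, 2.0):
    w = owa_weights(len(scores), a)
    print(a, round(owa(scores, w), 3))       # aggregate falls as a grows
```

    Sensitivity analysis in the paper's sense then asks how the aggregated output moves as the optimism degree a is varied, which this loop makes directly visible.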

  14. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis. (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco


    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.

  15. ASOURCE: Source Term Analysis Tool for Advanced Fuel Cycle

    International Nuclear Information System (INIS)

    Cho, Dong Keun; Kook, Dong Hak; Choi, Jong Won; Choi, Heui Joo; Jeong, Jong Tae


    In 2007, the 3rd Comprehensive Nuclear Energy Promotion Plan, passed at the 254th meeting of the Atomic Energy Commission, was announced as an R and D action plan for the development of an advanced fuel cycle adopting a sodium-cooled fast reactor (SFR) in connection with a pyroprocess for a sustainable, stable energy supply and a reduction in the amount of spent fuel (SF). It is expected that this fuel cycle can greatly reduce the SF inventory through a recycling process in which transuranics (TRU) and long-lived nuclides are burned in the SFR and cesium and strontium are disposed of after sufficient interim storage. For the success of the R and D plan, there are several issues related to source term analysis: (a) generation of inflow and outflow source terms of mixed SF in each process for the design of the pyroprocess facility, (b) source terms of mixed radwaste in a canister for the design of storage and disposal systems, (c) overall inventory estimation of TRU and long-lived nuclides for the design of the SFR, and (d) best-estimate source terms for the practical design of the interim storage facility for SFs. A source term evaluation for a SF or radwaste with a single irradiation profile can be easily accomplished with a conventional computation tool. However, source term assessment for a batch of SFs or a mixture of radwastes generated from SFs with different irradiation profiles, a task that is essential to support the aforementioned activities, is not possible with the conventional tool. Therefore, a hybrid computing program for source term analysis to support the advanced fuel cycle was developed.

  16. Task Analysis Report Generation Tool (TARGET) (United States)
    Ortiz, C. J.


    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K

  17. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.


    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and on the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also invaluable at the design stage, where they influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner-diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermalhydraulic analysis and field data, fouling margins are calculated. Individual effects of primary- and secondary-side fouling are separated through analyses, which allow station operators to decide what type of maintenance activity to perform and when to perform it.
The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance


    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ


    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  19. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac


    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market have to offer to management, based on the growing need to understand and predict global market trends.

  20. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior. (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per


    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and they might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behavior of 9,528 online gamblers who voluntarily used an RG tool was analyzed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a greater extent and were found to have a greater risk of excessive gambling than the other classes.
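
    As a rough illustration of the latent class step described above, the sketch below groups synthetic usage counts (visits, self-tests, advice items) into five profiles with a Gaussian mixture model. This is a hedged stand-in: the study used a dedicated categorical latent class analysis, and the data, class count, and variable names here are illustrative assumptions only.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic usage counts per gambler: site visits, self-tests, advice used
light = rng.poisson([1.0, 0.2, 0.1], size=(300, 3))
heavy = rng.poisson([12.0, 4.0, 2.0], size=(100, 3))
X = np.vstack([light, heavy]).astype(float)

# Five components, mirroring the five user classes reported in the study
gm = GaussianMixture(n_components=5, random_state=0).fit(X)
labels = gm.predict(X)
for k in range(5):
    members = X[labels == k]
    if len(members):
        print(f"class {k}: n={len(members)}, mean usage={members.mean(axis=0).round(1)}")
    else:
        print(f"class {k}: n=0")
```

A proper LCA would model the indicators as categorical; the mixture here only conveys the idea of soft probabilistic class assignment.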

  1. Interaction tools for underwater shock analysis in naval platform design

    NARCIS (Netherlands)

    Aanhold, J.E.; Tuitman, J.T.; Trouwborst, W.; Vaders, J.A.A.


    In order to satisfy the need for good quality UNDerwater EXplosion (UNDEX) response estimates of naval platforms, TNO developed two 3D simulation tools: the Simplified Interaction Tool (SIT) and the hydro/structural code 3DCAV. Both tools are an add-on to LS-DYNA. SIT is a module of user routines

  2. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.


    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...... is reported. (C) 1999 Elsevier Science S.A. All rights reserved....

  3. Discourse Analysis: A Tool for Helping Educators to Teach Science

    Directory of Open Access Journals (Sweden)

    Katerina Plakitsi


    Full Text Available This article refers to a part of a collaborative action research project in three elementary science classrooms. The project aims at the transformation of the nature and type of teachers' discursive practices into more collaborative inquiries. The basic strategy is to give the teachers the opportunity to analyze their discourse using a three-dimensional context of analysis. The teachers analyzed their discursive repertoires when teaching science. They studied the companion meaning, i.e., the different layers of explicit and tacit messages they communicate about Nature of Science (NoS, Nature of Teaching (NoT, and Nature of Language (NoL. The question investigated is the following: Could an action research program, which involves teachers in the analysis of their own discursive practices, lead to the transformation of discourse modes that take place in the science classrooms to better communicate aspects of NoS, NoT and NoL in a collaborative, inquiry-based context? Results indicate that the teachers' involvement in their discourse analysis led to a transformation in the discursive repertoires in their science classrooms. Gradually, the companion meanings that the teachers created, implicitly or explicitly, in the dialogues taking place during science lessons became more appropriate for establishing a productive collaborative inquiry learning context. We argue that discourse analysis could be used for research purposes, as a training medium or as a reflective tool on how teachers communicate science.

  4. Kinematic Analysis of a 3-dof Parallel Machine Tool with Large Workspace

    Directory of Open Access Journals (Sweden)

    Shi Yan


    Full Text Available Kinematics of a 3-dof (degree of freedom) parallel machine tool with a large workspace was analyzed. The workspace volume and surface and the boundary posture angles of the 3-dof parallel machine tool are relatively large. Firstly, a three-dimensional simulation manipulator of the 3-dof parallel machine tool was constructed and its joint distribution described. Secondly, kinematic models of the 3-dof parallel machine tool were established, including displacement, velocity, and acceleration analyses. Finally, the kinematic models of the machine tool were verified by a numerical example. The results of the study have important significance for the application of the parallel machine tool.

  5. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.


    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histogram, Ntuple, Function, Vector, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA
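
    To make the decoupling concrete, here is a hypothetical Python transcription of the component idea described above; the real AIDA abstract interfaces are defined in Java and C++, and the names below are illustrative only. The plotter depends solely on the abstract histogram interface, so any histogram implementation can be swapped in.

```python
from abc import ABC, abstractmethod

class IHistogram(ABC):
    @abstractmethod
    def fill(self, x: float) -> None: ...
    @abstractmethod
    def heights(self) -> list: ...

class IPlotter(ABC):
    @abstractmethod
    def plot(self, hist: "IHistogram") -> str: ...

class Histogram1D(IHistogram):
    def __init__(self, nbins: int, lo: float, hi: float):
        self._lo, self._hi = lo, hi
        self._bins = [0] * nbins
    def fill(self, x: float) -> None:
        if self._lo <= x < self._hi:
            frac = (x - self._lo) / (self._hi - self._lo)
            self._bins[int(frac * len(self._bins))] += 1
    def heights(self) -> list:
        return list(self._bins)

class TextPlotter(IPlotter):
    # Coupled only to IHistogram, never to Histogram1D directly
    def plot(self, hist: IHistogram) -> str:
        return " ".join(str(n) for n in hist.heights())

h = Histogram1D(4, 0.0, 4.0)
for x in (0.5, 1.5, 1.7, 3.2):
    h.fill(x)
print(TextPlotter().plot(h))  # "1 2 0 1"
```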

  6. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri


    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
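
    As an illustration of the kind of measurement such a tool automates, the sketch below estimates total harmonic distortion (THD) from the FFT of a sine driven through a memoryless nonlinearity. The tanh waveshaper is a generic stand-in for a distorting device, not the paper's tool or its specific algorithms.

```python
import numpy as np

fs, f0, n = 48000, 1000.0, 48000   # sample rate, test tone, one-second record
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = np.tanh(3.0 * x)               # memoryless nonlinearity (stand-in device)

# Windowed spectrum of the output; f0 falls exactly on bin f0 * n / fs
spec = np.abs(np.fft.rfft(y * np.hanning(n))) / n
fund_bin = int(round(f0 * n / fs))
harm_bins = [fund_bin * k for k in range(2, 6) if fund_bin * k < n // 2]

fund = spec[fund_bin]
harm = np.sqrt(sum(spec[b] ** 2 for b in harm_bins))
thd = harm / fund
print(f"THD = {100 * thd:.1f} %")
```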

  7. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems (United States)

    Pakarinen, Jyri


    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  8. Trade-Space Analysis Tool for Constellations (TAT-C) (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja


    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost-risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. TAT-C's current version includes uniform Walker constellations as well as ad hoc constellations, and its cost model represents an aggregate model consisting of
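
    One of the constellation types mentioned above, the uniform Walker pattern, is simple to enumerate. The sketch below generates the RAAN and initial mean anomaly for a Walker delta pattern t/p/f; it is an illustrative assumption about that layout, not code from TAT-C.

```python
# Walker delta pattern t/p/f: t satellites in p equally spaced planes,
# with an inter-plane phasing of f * 360 / t degrees.
def walker_delta(t, p, f):
    s = t // p                        # satellites per plane
    sats = []
    for plane in range(p):
        raan = 360.0 * plane / p
        for slot in range(s):
            ma = (360.0 * slot / s + 360.0 * f * plane / t) % 360.0
            sats.append((raan, ma))
    return sats

sats = walker_delta(t=24, p=3, f=1)   # e.g. a 24/3/1 pattern
print(len(sats), sats[0], sats[8])
```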

  9. Actigraphy and motion analysis: new tools for psychiatry. (United States)

    Teicher, M H


    Altered locomotor activity is a cardinal sign of several psychiatric disorders. With advances in technology, activity can now be measured precisely. Contemporary studies quantifying activity in psychiatric patients are reviewed. Studies were located by a Medline search (1965 to present; English language only) cross-referencing motor activity and major psychiatric disorders. The review focused on mood disorders and attention-deficit hyperactivity disorder (ADHD). Activity levels are elevated in mania, agitated depression, and ADHD and attenuated in bipolar depression and seasonal depression. The percentage of low-level daytime activity is directly related to severity of depression, and change in this parameter accurately mirrors recovery. Demanding cognitive tasks elicit fidgeting in children with ADHD, and precise measures of activity and attention may provide a sensitive and specific marker for this disorder. Circadian rhythm analysis enhances the sophistication of activity measures. Affective disorders in children and adolescents are characterized by an attenuated circadian rhythm and an enhanced 12-hour harmonic rhythm (diurnal variation). Circadian analysis may help to distinguish between the activity patterns of mania (dysregulated) and ADHD (intact or enhanced). Persistence of hyperactivity or circadian dysregulation in bipolar patients treated with lithium appears to predict rapid relapse once medication is discontinued. Activity monitoring is a valuable research tool, with the potential to aid clinicians in diagnosis and in prediction of treatment response.


    Directory of Open Access Journals (Sweden)



    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI” is broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses.Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers and across functional areas (sales, finance, and operations. It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS, query and reporting, online analyticalprocessing (OLAP, statistical analysis, forecasting, and data mining. Another way of phrasing this is

  11. Study of academic achievements using spatial analysis tools (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.


    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the entrance examination to the university, and in which of the two sittings held per year the latter mark was obtained. Similarly, another group of 77 students were evaluated independently of the former group. These students were those who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by a geometric point in order to be correlated with their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used as a tool to calculate the probability of success or
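
    The distance variable mentioned above is straightforward once addresses are projected to UTM: within a single zone, home-to-College distance is the Euclidean distance between easting/northing pairs. The coordinates below are invented for illustration.

```python
import math

college = (440000.0, 4475000.0)      # hypothetical UTM easting, northing (m)
students = {
    "s01": (442500.0, 4478000.0),    # invented student home coordinates
    "s02": (431000.0, 4469500.0),
}

def utm_distance_km(a, b):
    # Planar Euclidean distance, valid when both points share a UTM zone
    return math.hypot(a[0] - b[0], a[1] - b[1]) / 1000.0

for sid, home in students.items():
    print(sid, round(utm_distance_km(home, college), 2), "km")
```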

  12. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: The funnel diagram is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, where PET is an estimate. The FAT-PET MRA is a meta regression analysis......, on the data of the funnel, which jointly estimates the FAT and the PET. Ideal funnels are lean and symmetric. Empirical funnels are wide, and most have asymmetries biasing the plain average. Many asymmetries are due to censoring made during the research-publication process. The PET is tooled to correct...

  13. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer


    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su


    Recently, the intelligent systems of technology have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the mobility act of the spindle unit determines the most frequent and important part such as automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  14. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools


    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.


    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnost...

  15. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.


    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used for the automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated ⁹⁰Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate ⁹⁰Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini-column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The ⁹⁰Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where ⁹⁰Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expanding the radiochemical applications of the FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements
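
    The quantification step described above (peak height or peak area of the transient detector signal) can be sketched as follows; the Gaussian transient and flat baseline are synthetic stand-ins for a real elution profile.

```python
import numpy as np

t = np.linspace(0, 60, 601)                              # time, s
baseline = 2.0                                           # background, counts/s
# Synthetic transient: Gaussian peak centered at 25 s, sigma = 4 s
signal = baseline + 150.0 * np.exp(-((t - 25.0) ** 2) / (2 * 4.0 ** 2))

net = signal - baseline
peak_height = net.max()
# Trapezoid-rule peak area above baseline
peak_area = np.sum((net[1:] + net[:-1]) / 2 * np.diff(t))
print(round(peak_height, 1), round(peak_area, 1))
```

Either quantity can then be converted to sample activity against a calibration standard.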

  16. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko


    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. Given the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable for NAA, because it is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crustal silicates in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas, because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air vary by season. For example, crustal elements increase in the dry season, and sea-salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine portion of PM, and increase in concentration during the winter season, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples and source apportionment techniques are useful. (author)

  17. Prenatal diagnosis and molecular genetic analysis of short rib-polydactyly syndrome type III (Verma-Naumoff) in a second-trimester fetus with a homozygous splice site mutation in intron 4 in the NEK1 gene

    Directory of Open Access Journals (Sweden)

    Chih-Ping Chen


    Conclusion: Polydactyly, micromelia, metaphyseal spurs, widened humeral metaphyses, and shortened ribs can be prominent prenatal ultrasound findings of SRPS III. The present case provides evidence for a correlation of a mutation in the NEK1 gene with SRPS III.

  18. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigor with which all possible risks are identified and evaluated. The diversity in risk analysis procedures is such that

  19. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.


    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  20. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities, Phase I (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  1. Bioelectrical impedance analysis: A new tool for assessing fish condition (United States)

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith


    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
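
    The calibration step described above can be sketched as a simple regression; the length²/resistance predictor is a common choice in the BIA literature, and all data below are synthetic, chosen to match the benchmark sample size (n = 60) and checked against the R² ≈ 0.8 threshold mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60                                           # benchmark sample size
length = rng.uniform(20, 40, n)                  # fish length, cm
resistance = rng.uniform(300, 900, n)            # measured resistance, ohm
predictor = length ** 2 / resistance             # impedance-derived index

# Synthetic "true" percent dry fat, linear in the predictor plus noise
fat = 5.0 + 30.0 * (predictor - predictor.min()) / np.ptp(predictor)
fat += rng.normal(0, 2.0, n)

# Ordinary least squares: fat ~ slope * predictor + intercept
A = np.column_stack([predictor, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, fat, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((fat - pred) ** 2) / np.sum((fat - fat.mean()) ** 2)
print(f"slope={coef[0]:.2f}  intercept={coef[1]:.2f}  R^2={r2:.2f}")
```

Once such a model is calibrated, fat and energy content can be estimated from impedance alone, without killing the fish.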

  2. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies. (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André


    Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not provided, or is expected to be produced with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  3. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny


    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not provided, or is expected to be produced with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  4. VisIt: Interactive Parallel Visualization and Graphical Analysis Tool (United States)

    Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)


    VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. See the table below for more details about the tool’s features. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

  5. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool (United States)

    Lee, Nathaniel; Welch, Bryan W.


    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally requires add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK (Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
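
    The Brent's-method step mentioned above is, in essence, bracketed root finding on an elevation profile: contact-window edges are the times where pass elevation crosses the station's minimum elevation. The parabolic profile below is a synthetic stand-in for the geometry SCENIC would obtain from GMAT, and `scipy.optimize.brentq` is used as a readily available implementation of Brent's method.

```python
from scipy.optimize import brentq

def elevation_deg(t):
    # Synthetic pass: elevation peaks at 40 deg at t = 300 s
    return 40.0 - 0.004 * (t - 300.0) ** 2

min_elev = 10.0                       # station's minimum elevation, deg
f = lambda t: elevation_deg(t) - min_elev

rise = brentq(f, 0.0, 300.0)          # bracket before the peak
set_ = brentq(f, 300.0, 600.0)        # bracket after the peak
print(f"contact window: {rise:.1f} s to {set_:.1f} s")
```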

  6. Tools4miRs - one place to gather all the tools for miRNA analysis. (United States)

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr


    MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  7. Tools4miRs – one place to gather all the tools for miRNA analysis (United States)

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr


    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at Contact: Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  8. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone


    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie research project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed, decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  9. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.


    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection and for creating high local field gradients. Microfluidic structures are added to control flow and the positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology.

  10. An Analysis of Teacher Selection Tools in Pennsylvania (United States)

    Vitale, Tracy L.


    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  11. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt


    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  12. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel. (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari


    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have easy access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is therefore needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 on Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
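    The "basic genetic analysis" such an add-in performs is of the following flavor. The VBA source is not shown in the record, so this is a hypothetical Python equivalent of one typical calculation (allele frequencies and a Hardy-Weinberg goodness-of-fit statistic from genotype counts), not SNP_tools code:

```python
def allele_frequencies(n_aa, n_ab, n_bb):
    """Allele frequencies p (A) and q (B) from genotype counts."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)
    return p, 1 - p

def hwe_chi_square(n_aa, n_ab, n_bb):
    """Chi-square statistic of observed genotype counts against
    Hardy-Weinberg expected counts (p^2 : 2pq : q^2)."""
    n = n_aa + n_ab + n_bb
    p, q = allele_frequencies(n_aa, n_ab, n_bb)
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_aa, n_ab, n_bb)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

    For example, genotype counts (50, 40, 10) give an A-allele frequency of 0.7; a population exactly at Hardy-Weinberg equilibrium yields a chi-square of zero.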

  13. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin


    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.
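    The dual-use idea — one abstract tool interface usable from multiple frameworks — can be illustrated schematically. This is not ATLAS code (the real interfaces are C++ shared between Athena and ROOT-based frameworks); the class and method names below are invented for illustration:

```python
from abc import ABC, abstractmethod

class IAnalysisTool(ABC):
    """Framework-agnostic tool interface (hypothetical)."""

    @abstractmethod
    def initialize(self) -> None:
        ...

    @abstractmethod
    def execute(self, event: dict) -> dict:
        ...

class JetSmearingTool(IAnalysisTool):
    """Example concrete tool, also sketching a systematics knob in the
    spirit of the standardized systematics interface described above."""

    def __init__(self, systematic_shift: float = 0.0):
        # A systematic variation applied as a uniform scale factor.
        self.shift = systematic_shift

    def initialize(self) -> None:
        self.scale = 1.0 + self.shift

    def execute(self, event: dict) -> dict:
        event = dict(event)  # do not mutate the caller's event
        event["jet_pt"] = [pt * self.scale for pt in event["jet_pt"]]
        return event
```

    Because client code holds only the `IAnalysisTool` interface, the same tool can be scheduled by any framework that knows how to call `initialize` and `execute`, which is the essence of the dual-use design.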

  14. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration


    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  15. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool (United States)

    Halford, Keith


    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  16. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.


    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  17. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    International Nuclear Information System (INIS)

    Ourghanlian, Alain


    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Électricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.

  18. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna


    Full Text Available Technical analysis (TA) and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.
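    A minimal example of the kind of TA technique meant — a simple moving average, which could equally smooth a house-price series. This is a generic illustration, not a method from the paper:

```python
def simple_moving_average(series, window):
    """Simple moving average over a price series; returns one value per
    full window, aligned to the window's right edge."""
    if window < 1 or window > len(series):
        raise ValueError("window must be in [1, len(series)]")
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]
```

    For instance, a window of 2 over the series [1.0, 2.0, 3.0, 4.0] yields [1.5, 2.5, 3.5].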

  19. Neutrons and magnetic structures: analysis methods and tools (United States)

    Damay, Françoise


    After a short introduction on neutron diffraction and magnetic structures, this review focuses on the new computing tools available in magnetic crystallography nowadays. The appropriate neutron techniques and different steps required to determine a magnetic structure are also introduced.

  20. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael


    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.
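    A financial scenario tool of this kind automates standard discounted-cash-flow arithmetic. As a hedged illustration (these are textbook formulas, not H2FAST's actual implementation or parameter names):

```python
def npv(discount_rate, cashflows):
    """Net present value of a series of annual cash flows (year 0 first)."""
    return sum(cf / (1.0 + discount_rate) ** t
               for t, cf in enumerate(cashflows))

def payback_year(cashflows):
    """First year in which cumulative cash flow turns non-negative,
    or None if the project never pays back."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0.0:
            return t
    return None
```

    For example, spending 100 in year 0 and receiving 110 in year 1 has an NPV of zero at a 10% discount rate.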

  1. Scaffolding Assignments: Analysis of Assignmentor as a Tool to Support First Year Students' Academic Writing Skills (United States)

    Silva, Pedro


    There are several technological tools which aim to support first year students' challenges, especially when it comes to academic writing. This paper analyses one of these tools, Wiley's AssignMentor. The Technological Pedagogical Content Knowledge framework was used to systematise this analysis. The paper showed an alignment between the tools'…

  2. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom


    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  3. Modal Analysis and Experimental Determination of Optimum Tool Shank Overhang of a Lathe Machine

    Directory of Open Access Journals (Sweden)

    Nabin SARDAR


    Full Text Available Vibration of the tool shank of a cutting tool has a large influence on the tolerances and surface finish of products. The frequency and amplitude of vibrations depend on the overhang of the shank of the cutting tool. In turning operations, when the tool overhang is about two times the tool height, the amplitude of the vibration is almost zero, and the dimensional tolerances and surface finish of the product become high. In this paper, the above statement is verified, first by a finite element analysis of the cutting tool with the ANSYS software package and second by experimental verification with a piezoelectric sensor.
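    The dependence of shank vibration on overhang follows from simple beam dynamics: modeling the shank as a clamped-free (cantilever) beam of overhang length L, the first natural frequency scales as 1/L². The sketch below uses the textbook Euler-Bernoulli formula, not the paper's ANSYS model:

```python
import math

def cantilever_first_frequency(E, I, rho, A, L):
    """First natural frequency (Hz) of a clamped-free Euler-Bernoulli
    beam: f1 = (lambda1**2 / (2*pi)) * sqrt(E*I / (rho*A*L**4)),
    where lambda1 ~ 1.8751 is the first clamped-free eigenvalue.
    E: Young's modulus, I: second moment of area, rho: density,
    A: cross-section area, L: overhang length (SI units)."""
    lam1 = 1.8751040687
    return (lam1 ** 2 / (2.0 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))
```

    Doubling the overhang quarters the natural frequency, which is consistent with the observation that shorter overhangs vibrate less readily.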

  4. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools). (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M


    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.
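    The overlay logic described — flag grid cells whose environmental layers all fall within ranges the literature associates with transmission — can be sketched as follows. The threshold values here are placeholders, not the study's fitted parameters:

```python
def suitability_mask(temp_c, rain_mm, pop_density,
                     temp_range=(18.0, 34.0),
                     rain_min=450.0, pop_min=10.0):
    """Boolean suitability grid from co-registered raster layers,
    given as equal-shaped nested lists (one value per 1 km x 1 km cell).
    A cell is flagged only when every layer is in its suitable range;
    all thresholds are illustrative placeholders."""
    return [[temp_range[0] <= t <= temp_range[1]
             and r >= rain_min and p >= pop_min
             for t, r, p in zip(trow, rrow, prow)]
            for trow, rrow, prow in zip(temp_c, rain_mm, pop_density)]
```

    In a real GIS workflow the same conjunction is computed with raster algebra over projected layers; the list version just makes the cell-wise logic explicit.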

  5. Teaching Advanced Data Analysis Tools to High School Astronomy Students (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.


    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper-division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  6. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen


    Full Text Available Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit determines the most frequent and important operations, such as automatic tool changes. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, given the differences between mechanical configurations and the desired characteristics, it is difficult for a vibration detection system to be assembled directly from commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine's characteristics and states. We also launched the development of the functional parts of the system simultaneously. Finally, we entered the conditions and parameters generated from both the states and the characteristics into the developed system to verify its feasibility.

  7. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark


    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
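    The abstract notes that the data are analyzed using bootstrap statistics. A generic percentile-bootstrap sketch (not MSP-Tool's VBA; the function and parameter names are invented) of a confidence interval for a statistic such as a paleointensity ratio might look like:

```python
import random

def bootstrap_ci(values, statistic=lambda xs: sum(xs) / len(xs),
                 n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for `statistic`:
    resample `values` with replacement n_boot times, evaluate the
    statistic on each resample, and take the alpha/2 and 1-alpha/2
    percentiles of the resulting distribution."""
    rng = random.Random(seed)
    reps = sorted(statistic([rng.choice(values) for _ in values])
                  for _ in range(n_boot))
    lo_i = int((alpha / 2) * n_boot)
    hi_i = int((1 - alpha / 2) * n_boot) - 1
    return reps[lo_i], reps[hi_i]
```

    The percentile interval makes no normality assumption, which suits the small specimen counts typical of multispecimen experiments.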

  8. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)


    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
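    The core transformation described — spring forces to adhesive stresses, followed by a descending sort — is simple. A hypothetical Python rendition (the real tools operate on FE input/output files, and the data layout and names below are invented for illustration):

```python
import math

def adhesive_stresses(spring_forces, tributary_area):
    """Per-node shear and peel stresses from spring force components.

    spring_forces: {node_id: (fx, fy, fz)}, with fz normal to the
    bondline (peel) and fx, fy in the bondline plane (shear).
    tributary_area: adhesive area attributed to each spring.
    Returns (node_id, shear, peel) tuples sorted by |peel| descending.
    """
    rows = []
    for node, (fx, fy, fz) in spring_forces.items():
        shear = math.hypot(fx, fy) / tributary_area
        peel = fz / tributary_area
        rows.append((node, shear, peel))
    rows.sort(key=lambda r: abs(r[2]), reverse=True)
    return rows
```

    Sorting by peel magnitude puts the most failure-critical bondline locations first, which is what makes rapid trade-off studies possible.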

  9. Lagrangian analysis. Modern tool of the dynamics of solids (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors emphasize that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The

  10. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.


    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. 
In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  11. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool. (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco


    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets providing outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite T4M also includes one script that generates Metatool input: CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex-basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
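    ComparaEM's task — finding elementary flux modes common to two networks — reduces, in the simplest case, to comparing sets of active reactions. A schematic sketch under that simplifying assumption (real elementary modes also carry relative flux values, which this ignores, and T4M itself is written in PERL):

```python
def shared_elementary_modes(efms_a, efms_b):
    """Elementary flux modes present in both networks, where each mode
    is given as an iterable of active reaction names. Hashing each mode
    as a frozenset makes the comparison order-independent."""
    return ({frozenset(m) for m in efms_a}
            & {frozenset(m) for m in efms_b})
```

    Two modes listing the same reactions in different orders are thus recognized as identical.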

  12. Stability analysis of multipoint tool equipped with metal cutting ceramics (United States)

    Maksarov, V. V.; Khalimonenko, A. D.; Matrenichev, K. G.


The article highlights the issues of determining the stability of the cutting process by a multipoint cutting tool equipped with cutting ceramics. Based on the conducted research, recommendations are offered on the choice of parameters of replaceable ceramic cutting plates for milling. Ceramic plates for milling are proposed to be selected on the basis of the value of their electrical volume resistivity.

  13. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.


... energy consumption reductions by using “DryPack” are calculated. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram...
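One of the four unit operations listed above, mixing of two moist air streams, can be sketched with standard psychrometric mass and energy balances. This is a minimal illustration, not DryPack's actual implementation: the function names and the simplified moist-air enthalpy relation h = 1.006·T + w·(2501 + 1.86·T) kJ/kg dry air are assumptions.

```python
# Hedged sketch: adiabatic mixing of two moist air streams (one of the four
# unit operations described for the "DryPack" tool). Names and the simplified
# enthalpy relation are illustrative assumptions, not DryPack's API.

def enthalpy(t_c, w):
    """Specific enthalpy of moist air [kJ/kg dry air]; t_c in deg C, w in kg/kg."""
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

def mix_streams(m1, t1, w1, m2, t2, w2):
    """Mass- and energy-balance mixing of two moist air streams."""
    m = m1 + m2
    w = (m1 * w1 + m2 * w2) / m                              # water mass balance
    h = (m1 * enthalpy(t1, w1) + m2 * enthalpy(t2, w2)) / m  # energy balance
    t = (h - 2501.0 * w) / (1.006 + 1.86 * w)                # invert h(T, w) for T
    return m, t, w

# Equal mass flows of 20 degC and 40 degC air mix to roughly 30 degC.
m, t, w = mix_streams(1.0, 20.0, 0.007, 1.0, 40.0, 0.010)
```

The mixed state lands near the mass-weighted average on the Mollier diagram, which is exactly how such a mixing process is usually read off graphically.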

  14. Comparative analysis of diagnostic applications of autoscan tools (African Journals Online (AJOL))


changing the skills demanded of auto designers, engineers and production workers [1,5,6]. In automobile education, the use of autotronic simulators and demonstrators as teaching aids with computer software, auto scan tools for diagnosis, servicing and maintenance, auto-analyzers, solid work design and CAN-bus hard ...

  15. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.


Three fractured tool dies used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. Two of the specimens were produced using the traditional Böhler P/M steel grade s790, while the last specimen was a third generation P/M steel produced...... resistance towards abrasive wear compared with the traditional P/M steel....

  16. Analysis of Online Quizzes as a Teaching and Assessment Tool (United States)

    Salas-Morera, Lorenzo; Arauzo-Azofra, Antonio; García-Hernández, Laura


    This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test quizzes effectiveness on student performance when used, not only as an…

  17. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey (United States)

    Khamparia, Aditya; Pandey, Babita


Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web with the help of tools and reasoners which manage personalized information. The future of the semantic web lies in an ontology…

  18. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis (United States)

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka


Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  19. NREL Suite of Tools for PV and Storage Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Elgqvist, Emma M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Salasovich, James A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    Many different factors such as the solar resource, technology costs and incentives, utility cost and consumption, space available, and financial parameters impact the technical and economic potential of a PV project. NREL has developed techno-economic modeling tools that can be used to evaluate PV projects at a site.

  20. Stakeholder Analysis of an Executable Architecture Systems Engineering (EASE) Tool (United States)


The FCR tables and stakeholder feedback are then used as the foundation of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. Finally, the SWOT analysis and stakeholder feedback are translated into an EASE future development strategy; a series of recommendations regarding...

  1. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning


    This paper introduces an interactive tool developed to integrate structural analysis in the architectural design environment from the early conceptual design stage. The tool improves exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot...

  2. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging. (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal


    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou


    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams could predict chatter by providing graphical representations of the stable combinations of the axial depth of the cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the bearing preload influence on the system stability with uncertainty taken into account.
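The stability lobe construction the abstract describes can be illustrated with the textbook single-degree-of-freedom reduction, where the critical axial depth of cut is a_lim(ω) = −1 / (2·Kt·Re(G(iω))), valid where the real part of the tool-point FRF is negative. This is a hedged sketch of the classic formula only, not the article's 5-DOF Timoshenko-beam spindle model; all modal parameter values below are assumptions.

```python
import math

# Hedged sketch: classic 1-DOF chatter stability limit behind stability lobe
# diagrams. Not the article's 5-DOF spindle-bearing model; parameters assumed.

def receptance(w, m, c, k):
    """FRF G(iw) of a 1-DOF mass-spring-damper system [m/N]."""
    return 1.0 / complex(k - m * w * w, c * w)

def min_stable_depth(m, c, k, kt, w_lo, w_hi, n=20000):
    """Minimum critical axial depth of cut [m] over a frequency sweep."""
    best = float("inf")
    for i in range(n):
        w = w_lo + (w_hi - w_lo) * i / (n - 1)
        re = receptance(w, m, c, k).real
        if re < 0.0:                       # chatter is only possible here
            best = min(best, -1.0 / (2.0 * kt * re))
    return best

# Assumed modal parameters: m = 1 kg, c = 100 N s/m, k = 4e6 N/m, Kt = 6e8 N/m^2.
a_lim = min_stable_depth(1.0, 100.0, 4.0e6, 6.0e8, 500.0, 5000.0)
```

A Monte Carlo preload study like the one described would rerun this computation with the modal parameters sampled from their uncertainty distributions.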

  4. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.


Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  5. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    Rath, Frank


    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings
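The FMEA scoring step named above is quantitative: each failure mode is rated for Severity, Occurrence, and Detection (typically 1 to 10), and the Risk Priority Number RPN = S × O × D ranks which process steps to address first. The sketch below shows that calculation; the example failure modes and scores are illustrative, not taken from the article.

```python
# Hedged sketch of FMEA risk ranking: RPN = Severity * Occurrence * Detection.
# The listed failure modes and scores are illustrative assumptions.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("Wrong patient positioning", 8, 4, 3),
    ("Incorrect dose calculation", 10, 2, 2),
    ("Missed equipment calibration", 7, 3, 5),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number for one failure mode."""
    return severity * occurrence * detection

# Rank failure modes from highest to lowest risk priority.
ranked = sorted(
    ((desc, rpn(s, o, d)) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
```

Note how a moderate-severity mode with poor detectability can outrank a high-severity one, which is exactly why FMEA is used proactively rather than relying on severity alone.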

  6. An online database for plant image analysis software tools


    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire


Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is...

  7. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning


The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an a...... provide the possibility for the designer to work both with the aesthetics as well as the technical aspects of architectural design....

  8. Pyrosequencing data analysis software: a useful tool for EGFR, KRAS, and BRAF mutation analysis

    Directory of Open Access Journals (Sweden)

    Shen Shanxiang


Full Text Available Abstract Background Pyrosequencing is a new technology and can be used for mutation tests. However, its data analysis is a manual process and involves sophisticated algorithms. During this process, human errors may occur. A better way of analyzing pyrosequencing data is needed in the clinical diagnostic laboratory. Computer software is potentially useful for pyrosequencing data analysis. We have developed such software, which is able to perform pyrosequencing mutation data analysis for epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene homolog and v-raf murine sarcoma viral oncogene homolog B1. The input data for analysis include the targeted nucleotide sequence, common mutations in the targeted sequence, pyrosequencing dispensing order, pyrogram peak order and peak heights. The output includes the mutation type and the percentage of mutant gene in the specimen. Results The data from 1375 pyrosequencing test results were analyzed using the software in parallel with manual analysis. The software was able to generate correct results for all 1375 cases. Conclusion The software developed is a useful molecular diagnostic tool for pyrosequencing mutation data analysis. This software can increase laboratory data analysis efficiency and reduce the data analysis error rate. Virtual slides The virtual slide(s) for this article can be found here:
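The final output the abstract describes, the percentage of mutant gene in the specimen, can be sketched from pyrogram peak heights: for a single-base substitution, the mutant and wild-type alleles incorporate different nucleotides at the variant dispensation, so their relative peak heights approximate the allele fractions. The function name and the peak values are illustrative assumptions, not the article's software.

```python
# Hedged sketch: estimating mutant-allele percentage from the two pyrogram
# peak heights at the variant position. Values are illustrative assumptions.

def mutant_percentage(mutant_peak, wildtype_peak):
    """Percentage of mutant allele from variant-position peak heights."""
    total = mutant_peak + wildtype_peak
    if total == 0:
        raise ValueError("no signal at the variant position")
    return 100.0 * mutant_peak / total

# A mutant peak of 25 units against a wild-type peak of 75 units -> 25% mutant.
pct = mutant_percentage(25.0, 75.0)
```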

  9. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin


    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  10. Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide (United States)

    Smittle, Aaron M.; Shoberg, Thomas G.


    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
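The outlier rule described above, flagging a station whose Bouguer anomaly lies two or more standard deviations from nearby stations within a user-defined radius, can be sketched as follows. The data structures and the station values are illustrative, not drawn from the GDUS repository.

```python
import math

# Hedged sketch of the EDBAAT outlier rule: flag a gravity station when its
# Bouguer anomaly deviates >= 2 standard deviations from nearby stations.
# "Nearby" is a user-supplied radius, as in the tool. Data are illustrative.

def nearby(stations, x0, y0, radius):
    return [s for s in stations
            if math.hypot(s["x"] - x0, s["y"] - y0) <= radius]

def flag_outliers(stations, radius, z_cut=2.0):
    """Return stations whose anomaly is >= z_cut std devs from their neighbours."""
    flagged = []
    for s in stations:
        peers = [p for p in nearby(stations, s["x"], s["y"], radius) if p is not s]
        if len(peers) < 2:
            continue  # not enough neighbours to estimate a spread
        mean = sum(p["anom"] for p in peers) / len(peers)
        var = sum((p["anom"] - mean) ** 2 for p in peers) / (len(peers) - 1)
        std = math.sqrt(var)
        if std > 0 and abs(s["anom"] - mean) >= z_cut * std:
            flagged.append(s)
    return flagged

stations = [
    {"x": 0.0, "y": 0.0, "anom": -30.1},
    {"x": 1.0, "y": 0.0, "anom": -30.4},
    {"x": 0.0, "y": 1.0, "anom": -29.8},
    {"x": 1.0, "y": 1.0, "anom": -12.0},   # inconsistent with its neighbours
]
flagged = flag_outliers(stations, radius=2.0)
```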

  11. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  12. Propositional Analysis: A Tool for Library and Information Science Research. (United States)

    Allen, Bryce


    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended. (32…

  13. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak


Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in analysis of disease dynamics. Disease forecasting methods by simulation models for plant diseases have a great potentiality in practical disease control strategies. Common mathematical tools such as monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through construction of box and whisker plots has been suggested. The probable applications of recent advanced tools of linear and non-linear mixed models like the linear mixed model, generalized linear model, and generalized linear mixed models have been presented. The most recent technologies such as micro-array analysis, though cost effective, provide estimates of gene expressions for thousands of genes simultaneously and need attention by the molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering.
The rice research scientists should take advantage of these new opportunities adequately in
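One of the growth-curve models named above, the logistic disease progress curve, can be sketched directly: y(t) = 1 / (1 + ((1 − y0)/y0)·exp(−r·t)), where y is the diseased proportion, y0 the initial disease level, and r the apparent infection rate. The parameter values are illustrative assumptions, not from the article.

```python
import math

# Hedged sketch: logistic disease progress curve used in growth-curve analysis
# of epidemics. Parameter values (y0, r) are illustrative assumptions.

def logistic_progress(t, y0=0.01, r=0.25):
    """Diseased proportion at time t (days) under the logistic model."""
    return 1.0 / (1.0 + ((1.0 - y0) / y0) * math.exp(-r * t))

# Sample the epidemic every 10 days over two months.
curve = [logistic_progress(t) for t in range(0, 61, 10)]
# The epidemic starts at y0, rises fastest near y = 0.5, and saturates below 1.
```

Fitting such a curve to observed disease assessments yields r, the rate parameter that forecasting models and control comparisons are typically built on.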

  14. An auditory display tool for DNA sequence analysis. (United States)

    Temple, Mark D


DNA Sonification refers to the use of an auditory display to convey the information content of DNA sequence data. Six sonification algorithms are presented that each produce an auditory display. These algorithms are logically designed from the simple through to the more complex. Three of these parse individual nucleotides, nucleotide pairs or codons into musical notes to give rise to 4, 16 or 64 notes, respectively. Codons may also be parsed degenerately into 20 notes with respect to the genetic code. Lastly, nucleotide pairs can be parsed as two separate frames or codons can be parsed as three reading frames, giving rise to multiple streams of audio. The most informative sonification algorithm reads the DNA sequence as codons in three reading frames to produce three concurrent streams of audio in an auditory display. This approach is advantageous since start and stop codons in either frame have a direct effect, starting or stopping the audio in that frame and leaving the other frames unaffected. Using these methods, DNA sequences such as open reading frames or repetitive DNA sequences can be distinguished from one another. These sonification tools are available through a webpage interface in which an input DNA sequence can be processed in real time to produce an auditory display playable directly within the browser. The potential of this approach as an analytical tool is discussed with reference to auditory displays derived from test sequences including simple nucleotide sequences, repetitive DNA sequences and coding or non-coding genes. This study presents a proof-of-concept that some properties of a DNA sequence can be identified through sonification alone and argues for their inclusion within the toolkit of DNA sequence browsers as an adjunct to existing visual and analytical tools.
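The codon-level parsing described above can be sketched as follows: each of the 64 codons maps to one of 64 notes, the sequence is read in three reading frames, and a stop codon silences only its own frame. The MIDI-style note numbering and the use of None as a rest are illustrative assumptions, not the article's actual mapping.

```python
# Hedged sketch of codon sonification in three reading frames. Note numbering
# (MIDI-style, starting at 36) and the None-as-rest convention are assumptions.

BASES = "ACGT"
CODON_TO_NOTE = {a + b + c: 36 + i  # 64 notes, one per codon
                 for i, (a, b, c) in enumerate(
                     (a, b, c) for a in BASES for b in BASES for c in BASES)}
STOP_CODONS = {"TAA", "TAG", "TGA"}

def sonify_frame(seq, frame):
    """Notes for one reading frame; None = rest once a stop codon is hit."""
    notes, silenced = [], False
    for i in range(frame, len(seq) - 2, 3):
        codon = seq[i:i + 3]
        if silenced:
            notes.append(None)
        elif codon in STOP_CODONS:
            silenced = True
            notes.append(None)
        else:
            notes.append(CODON_TO_NOTE[codon])
    return notes

seq = "ATGGCGTAACGT"
streams = [sonify_frame(seq, f) for f in (0, 1, 2)]
# Frame 0 hits the stop codon TAA and falls silent; frames 1 and 2 keep playing.
```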

  15. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion


Full Text Available Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge and basic statistical analysis requirements. Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.

  16. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  17. A Comparative Analysis of Cockpit Display Development Tools

    National Research Council Canada - National Science Library

    Gebhardt, Matthew


    ..., Virtual Application Prototyping System (VAPS) and Display Editor. The comparison exploits the analysis framework establishing the advantages and disadvantages of the three software development suites...

  18. Data visualization and analysis tools for the MAVEN mission (United States)

    Harter, B.; De Wolfe, A. W.; Putnam, B.; Brain, D.; Chaffin, M.


    The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. We have developed new software tools for exploring and analyzing the science data. Our open-source Python toolkit for working with data from MAVEN and other missions is based on the widely-used "tplot" IDL toolkit. We have replicated all of the basic tplot functionality in Python, and use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our software is available on Github. We have also been exploring 3D graphics as a way to better visualize the MAVEN science data and models. We have constructed a 3D visualization of MAVEN's orbit using the CesiumJS library, which not only allows viewing of MAVEN's orientation and position, but also allows the display of selected science data sets and their variation over time.

  19. Cemented carbide cutting tool: Laser processing and thermal stress analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)]. E-mail:; Arif, A.F.M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia); Karatas, C. [Engineering Faculty, Hacettepe University, Ankara (Turkey); Ahsan, M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)


Laser treatment of a cemented carbide tool surface consisting of W, C, TiC and TaC is examined, and the thermal stress developed due to temperature gradients in the laser-treated region is predicted numerically. The temperature rise in the substrate material is computed numerically using the Fourier heating model. An experiment is carried out to treat the tool surfaces using a CO{sub 2} laser, while SEM, XRD and EDS are carried out for morphological and structural characterization of the treated surface. The laser parameters selected include the laser output power, duty cycle, assisting gas pressure, scanning speed, and nominal focus setting of the focusing lens. It is found that the temperature gradient attains significantly high values below the surface, particularly for titanium and tantalum carbides, which in turn results in high thermal stress generation in this region. SEM examination of the laser-treated surface and its cross section reveals that crack initiation occurs below the surface and the crack extends over the depth of the laser-treated region.
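A first-order feel for the Fourier heating estimate mentioned above comes from the standard closed-form solution for a constant absorbed flux q on a semi-infinite solid, where the surface temperature rise is ΔT(t) = 2·q·sqrt(α·t/π)/k. This is a hedged back-of-the-envelope sketch, not the article's numerical model; the property values are rough assumptions for a WC-based cemented carbide.

```python
import math

# Hedged sketch: surface temperature rise of a semi-infinite solid under a
# constant absorbed flux (standard closed-form Fourier conduction result).
# Flux and material property values below are assumptions.

def surface_temperature_rise(q, k, alpha, t):
    """Surface temperature rise [K] after heating time t [s]."""
    return 2.0 * q * math.sqrt(alpha * t / math.pi) / k

q = 1.0e9       # absorbed laser flux [W/m^2] (assumed)
k = 80.0        # thermal conductivity [W/m K] (assumed)
alpha = 2.0e-5  # thermal diffusivity [m^2/s] (assumed)

rises = [surface_temperature_rise(q, k, alpha, t) for t in (1e-5, 1e-4, 1e-3)]
# The rise grows with sqrt(t): a 100x longer pulse gives a 10x larger rise.
```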

  20. Pathway-based analysis tools for complex diseases: a review. (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi


Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which have rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods, powerful methods with the potential to uncover the biological depths of complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.
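The simplest and most widely used family of methods covered by such reviews is over-representation analysis, which asks whether an experimental gene list hits a pathway more often than chance via a hypergeometric tail probability. The sketch below shows that test with pure-stdlib combinatorics; the gene counts are illustrative assumptions.

```python
from math import comb

# Hedged sketch: pathway over-representation analysis via a hypergeometric
# tail probability. Gene counts below are illustrative assumptions.

def hypergeom_sf(x, N, K, n):
    """P(X >= x) when drawing n genes from N, of which K are pathway members."""
    return sum(comb(K, k) * comb(N - K, n - k)
               for k in range(x, min(K, n) + 1)) / comb(N, n)

N = 20000   # genes in the genome (assumed background)
K = 100     # genes annotated to the pathway
n = 200     # genes in the experimental hit list
x = 8       # hit-list genes that fall in the pathway

p_value = hypergeom_sf(x, N, K, n)
# Expected overlap by chance is n*K/N = 1 gene, so 8 hits is highly enriched.
```

In practice such a p-value is computed per pathway and then corrected for multiple testing across the whole pathway database.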

  1. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez


Full Text Available The objective of this article is to design and implement a procedure that integrates management control tools with a focus on processes, in order to improve efficiency and efficacy. An experimental study was carried out in which a procedure is defined, based on the Balanced Scorecard, that integrates process management into strategic planning and its evaluation. As results of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows visualizing and communicating the enterprise strategy. The indicators evaluate the key success factors, integrating the processes with the assistance of software. The implementation of the procedure in a commercialization enterprise helped integrate the process definition into the strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  2. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report (United States)


A Multiple Objective Decision Analysis (MODA) approach is used for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to...used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME

  3. ATENA – A tool for engineering analysis of fracture in concrete

    Indian Academy of Sciences (India)


    ATENA – A tool for engineering analysis of fracture in concrete. VLADIMIR CERVENKA, JAN CERVENKA and RADOMIR PUKL. Cervenka Consulting, Prague, Czech Republic e-mail: Abstract. Advanced constitutive models implemented in the finite element system ATENA serve as rational tools to ...

  4. HAWCStab2 with super element foundations: A new tool for frequency analysis of offshore wind turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Hansen, Anders Melchior; Kragh, Knud Abildgaard


    HAWCStab2 is a linear frequency domain aero-elastic tool, developed by DTU Wind Energy, suitable for frequency and stability analysis of horizontal axis 3 bladed wind turbines [1]. This tool has now been extended to also handle complex offshore foundation types, such as jacket structures...

  5. DEBRISK, a Tool for Re-Entry Risk Analysis (United States)

    Omaly, P.; Spel, M.


An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. Using this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads. This altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its birth criterion. In the simplest approach, the child is born after demise of the parent object. This could be the case of an object A containing an object B which is in the interior of object A and thus not exposed to the atmosphere. Each object is defined by its shape, attitude and dimensions; its materials along with their physical properties; and its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A new-born object inherits the state vector of the parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached. The mass of melted material is computed from the excess heat and the material properties.
After each step the amount of ablated material is determined using the lumped mass approach and is peeled off from the object, updating mass and shape of the
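The demise logic described above, integrating the heating rate until the melting temperature is reached and then ablating mass from the excess heat via the lumped-mass approach, can be sketched as a simple time-stepping loop. The constant heating rate and the material values below are illustrative stand-ins for DEBRISK's trajectory-coupled heat flux, not its actual implementation.

```python
# Hedged sketch of lumped-mass demise: heat the fragment to its melting point,
# then convert surplus heat into ablated (peeled-off) mass each time step.
# The constant flux and material values are illustrative assumptions.

def simulate_demise(mass, area, q_dot, t_melt, t0, cp, h_fusion,
                    dt=0.1, t_end=120.0):
    """Return remaining mass [kg] after integrating heating and ablation."""
    temp, t = t0, 0.0
    while t < t_end and mass > 0.0:
        heat_in = q_dot * area * dt              # heat absorbed this step [J]
        if temp < t_melt:
            # Heat the lump; any surplus past the melting point ablates mass.
            to_melt = mass * cp * (t_melt - temp)
            if heat_in <= to_melt:
                temp += heat_in / (mass * cp)
            else:
                temp = t_melt
                mass -= (heat_in - to_melt) / h_fusion
        else:
            mass -= heat_in / h_fusion           # peel off melted material
        t += dt
    return max(mass, 0.0)

# A 1 kg aluminium-like fragment under a constant 200 kW/m^2 flux (assumed)
# demises fully within the 120 s window; a 50 kg lump survives it.
remaining = simulate_demise(mass=1.0, area=0.05, q_dot=2.0e5,
                            t_melt=933.0, t0=300.0, cp=900.0, h_fusion=3.9e5)
```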

  6. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.


Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method which can be used by researchers to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and the features of Principal Component Analysis (PCA) in reducing a number of variables that could be correlated with each other to a small number of principal components that are uncorrelated. In this respect, the paper presents step-by-step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.

  7. dada - a web-based 2D detector analysis tool (United States)

    Osterhoff, Markus


    The data daemon, dada, is a server backend for unified access to 2D pixel-detector image data stored by different detectors, in different file formats, and saved under varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, from pixel binning and azimuthal integration to raster-scan processing. Users commonly interact with dada through a web frontend, but all parameters of an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.

  8. SCALE 5: Powerful new criticality safety analysis tools

    International Nuclear Information System (INIS)

    Bowman, Stephen M.; Hollenbach, Daniel F.; Dehart, Mark D.; Rearden, Bradley T.; Gauld, Ian C.; Goluoglu, Sedat


    Version 5 of the SCALE computer software system developed at Oak Ridge National Laboratory, scheduled for release in December 2003, contains several significant new modules and sequences for criticality safety analysis and marks the most important update to SCALE in more than a decade. This paper highlights the capabilities of these new modules and sequences, including continuous energy flux spectra for processing multigroup problem-dependent cross sections; one- and three-dimensional sensitivity and uncertainty analyses for criticality safety evaluations; two-dimensional flexible mesh discrete ordinates code; automated burnup-credit analysis sequence; and one-dimensional material distribution optimization for criticality safety. (author)

  9. Cluster analysis as a prediction tool for pregnancy outcomes. (United States)

    Banjari, Ines; Kenjerić, Daniela; Šolić, Krešimir; Mandić, Milena L


    Considering the specific physiological changes during gestation, and thinking of pregnancy as a "critical window", classification of pregnant women in early pregnancy can be considered crucial. The paper demonstrates the use of a method from intelligent data mining, cluster analysis. Cluster analysis is a statistical method that makes it possible to group individuals based on sets of identifying variables. The method was chosen to determine whether pregnant women can be classified in early pregnancy, and to analyse unknown correlations between different variables so that certain outcomes could be predicted. 222 pregnant women were recruited from two general obstetric offices. The focus was set on the characteristics of these pregnant women: their age, pre-pregnancy body mass index (BMI) and haemoglobin value. Cluster analysis achieved a 94.1% classification accuracy rate, with three branches or groups of pregnant women showing statistically significant correlations with pregnancy outcomes. The results show that pregnant women of both older age and higher pre-pregnancy BMI have a significantly higher incidence of delivering a baby of higher birth weight, but gain significantly less weight during pregnancy. Their babies are also longer, and these women have a significantly higher probability of complications during pregnancy (gestosis) and a higher probability of induced or caesarean delivery. We conclude that the cluster analysis method can appropriately classify pregnant women in early pregnancy to predict certain outcomes.
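The grouping step can be sketched with k-means, one common cluster-analysis variant; this assumes scikit-learn, uses synthetic data in place of the study's 222 records, and mirrors only the three predictors named above (age, pre-pregnancy BMI, haemoglobin).

```python
# Hedged k-means sketch (scikit-learn assumed); all values are synthetic
# stand-ins for the study's records, not its actual data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# columns: age [years], pre-pregnancy BMI [kg/m^2], haemoglobin [g/L]
women = np.vstack([
    rng.normal([24.0, 21.0, 125.0], [2.0, 1.5, 6.0], size=(40, 3)),  # younger, lean
    rng.normal([31.0, 27.0, 120.0], [2.0, 1.5, 6.0], size=(40, 3)),  # older, higher BMI
    rng.normal([28.0, 23.0, 105.0], [2.0, 1.5, 6.0], size=(40, 3)),  # low haemoglobin
])

X = StandardScaler().fit_transform(women)          # put the units on equal footing
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
```

Each woman's cluster label could then be cross-tabulated against recorded pregnancy outcomes to look for significant associations.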

  10. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas


    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic...

  11. EMGTools, an adaptive and versatile tool for detailed EMG analysis

    DEFF Research Database (Denmark)

    Nikolic, M; Krarup, C


    We have developed an EMG decomposition system called EMGTools that can extract the constituent MUAPs and firing patterns for quantitative analysis from the EMG signal recorded at slight effort for clinical evaluation. The aim was to implement a robust system able to handle the challenges...

  12. Gipsy 3D : Analysis, Visualization and Vo-Tools

    NARCIS (Netherlands)

    Ruiz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; Hulst, J. M. van der


    The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing

  13. Television as an Instructional Tool for Concept Analysis (United States)

    Benwari, Nnenna Ngozi


    This is a study of the perception of teachers on the use of television for concept analysis in the classroom. The population of the study is all the 9,784 Secondary School teachers in Bayelsa State of Nigeria out of which 110 teachers were randomly selected using the proportional sampling method. The instrument is a questionnaire designed by the…

  14. Principal Component Analysis: Most Favourite Tool in Chemometrics

    Indian Academy of Sciences (India)

    differentiate olive oils from non-olive vegetable oils. Moreover, manual analysis of such a large volume of data is laborious and time consuming, and may not provide any meaningful interpretation. [Figure 4: Amount of variance captured by different principal components (PCs). The plot indicates that the first two PCs are sufficient to …]

  15. Principal Component Analysis: Most Favourite Tool in Chemometrics

    Indian Academy of Sciences (India)

    Abstract. Principal component analysis (PCA) is the most commonly used chemometric technique. It is an unsupervised pattern recognition technique. PCA has found applications in chemistry, biology, medicine and economics. The present work attempts to explain how PCA works and how we can interpret its results.

  16. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.


    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  17. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.; Lester, P.B.


    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  18. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology. (United States)

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan


    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 ( ). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
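The core statistic behind most per-term GO enrichment analyses of this kind is a hypergeometric (one-sided Fisher) test; a minimal sketch assuming SciPy, with invented counts (this is not the Comparative GO implementation).

```python
# Per-term GO enrichment as a hypergeometric test (SciPy assumed);
# all counts below are invented for illustration.
from scipy.stats import hypergeom

N = 20000   # annotated genes in the (assumed) zebrafish background
K = 150     # background genes annotated with the GO term
n = 300     # genes in the study list
k = 12      # study genes annotated with the term

# P(X >= k): chance of drawing at least k term-bearing genes in n draws
p_value = hypergeom.sf(k - 1, N, K, n)
```

In practice a tool repeats this for every term in "Molecular Function", "Biological Process", and "Cellular Component" and then corrects the p-values for multiple testing.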

  19. Evaluation of fatigue damage in nuclear power plants: evolution and new tools of analysis

    International Nuclear Information System (INIS)

    Cicero, R.; Corchon, F.


    This paper presents new fatigue mechanisms requiring analysis, the tools developed for their evaluation, and the latest trends and studies currently under way in the nuclear field, which together allow proper management of this degradation mechanism at the facilities concerned.

  20. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.


    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
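The descriptor idea can be illustrated with a Fourier decomposition; a NumPy sketch on synthetic strain fields, with the descriptor count and field size chosen only for illustration (Zernike moments would play the same role).

```python
# Descriptor idea on synthetic "strain fields": keep a handful of low-order
# Fourier coefficient magnitudes instead of raw pixels (NumPy only; data synthetic).
import numpy as np

def descriptors(field, n=8):
    """Magnitudes of the n x n lowest-frequency 2-D Fourier coefficients."""
    return np.abs(np.fft.fft2(field)[:n, :n]).ravel()

y, x = np.mgrid[0:64, 0:64] / 64.0
model = np.sin(2 * np.pi * x) * np.cos(np.pi * y)            # "predicted" strain
noise = 0.01 * np.random.default_rng(1).normal(size=model.shape)
experiment = model + noise                                   # "measured" strain

d_model, d_exp = descriptors(model), descriptors(experiment)
# 64 descriptors stand in for 4,096 pixels; compare those statistically
agreement = np.corrcoef(d_model, d_exp)[0, 1]
```

A statistical comparison of the two descriptor vectors (here a simple correlation) replaces a pixel-by-pixel comparison of the full fields.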

  1. Cerebrotendinous xanthomatosis (CTX): an association of pulverulent cataracts and pseudo-dominant developmental delay in a family with a splice site mutation in CYP27A1--a case report. (United States)

    Bourkiza, Rabia; Joyce, Sarah; Patel, Himanshu; Chan, Michelle; Meyer, Esther; Maher, Eamonn R; Reddy, M Ashwin


    A 15-year-old boy with developmental delay presented to the pediatric ophthalmology clinic with bilateral pulverulent cataracts. The family was examined for developmental delay, cataracts and systemic problems. The parents were consanguineous and originally from Bangladesh. All the children were born in the UK. The mother and 5 children had developmental delay. Three children had global developmental delay, diarrhea and pulverulent cataracts. Two children had microcephaly, developmental delay, constipation and no cataracts. The mother did not have microcephaly, cataracts or gastrointestinal problems. Linkage analysis via autozygosity testing was performed for detection of loci and candidate genes. The cataract phenotype segregated with a homozygous mutation in CYP27A1 (a G-to-A substitution at position +1 of intron 6). The complex nature of this family's findings initially suggested an unusual autosomal dominant condition with variable expression. Autozygosity testing demonstrated that three members had cerebrotendinous xanthomatosis (CTX), which is inherited in an autosomal recessive manner. The aetiology of the developmental delay in other family members remains unknown. Cerebrotendinous xanthomatosis is a rare autosomal recessive condition that can result in neurological deficits and early death if left untreated. In view of the reversible nature of the condition with appropriate treatment, there needs to be a high level of suspicion of CTX for any child with cataracts and developmental delay even if the pattern of inheritance is not straightforward at initial assessment.

  2. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro


    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  3. Accessibility Analyst: an integrated GIS tool for accessibility analysis in urban transportation planning


    Suxia Liu; Xuan Zhu


    The authors present an integrated GIS tool, Accessibility Analyst, for accessibility analysis in urban transportation planning, built as an extension to the desktop GIS software package, ArcView. Accessibility Analyst incorporates a number of accessibility measures, ranging from catchment profile analysis to cumulative-opportunity measures, gravity-type measures, and utility-based measures, contains several travel-impedance measurement tools for estimating the travel distance, time, or cost b...

  4. Tools for functional postgenomic analysis of listeria monocytogenes. (United States)

    Monk, Ian R; Gahan, Cormac G M; Hill, Colin


    We describe the development of genetic tools for regulated gene expression, the introduction of chromosomal mutations, and improved plasmid transfer by electroporation in the food-borne pathogen Listeria monocytogenes. pIMK, a kanamycin-resistant, site-specific, integrative listeriophage vector was constructed and then modified for overexpression (pIMK2) or for isopropyl-beta-d-thiogalactopyranoside (IPTG)-regulated expression (pIMK3 and pIMK4). The dynamic range of promoters was assessed by determining luciferase activity, P60 secretion, and internalin A-mediated invasion. These analyses demonstrated that pIMK4 and pIMK3 have a stringently controlled dynamic range of 540-fold. Stable gene overexpression was achieved with pIMK2, giving a range of expression for the three vectors of 1,350-fold. The lactococcal pORI280 system was optimized for the generation of chromosomal mutations and used to create five new prfA star mutants. The combination of pIMK4 and pORI280 allowed streamlined creation of "IPTG-dependent" mutants. This was exemplified by creation of a clean deletion mutant with deletion of the universally essential secA gene, and this mutant exhibited a rapid loss of viability upon withdrawal of IPTG. We also improved plasmid transfer by electroporation into three commonly used laboratory strains of L. monocytogenes. A 125-fold increase in transformation efficiency for EGDe compared with the widely used protocol of Park and Stewart (S. F. Park and G. S. Stewart, Gene 94:129-132, 1990) was observed. Maximal transformation efficiencies of 5.7 x 10^6 and 6.7 x 10^6 CFU per μg were achieved for EGDe and 10403S, respectively, with a replicating plasmid. An efficiency of 2 x 10^7 CFU per μg is the highest efficiency reported thus far for L. monocytogenes F2365.

  5. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan


    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  6. Flow Injection as a Teaching Tool for Gravimetric Analysis (United States)

    Sartini, Raquel P.; Zagatto, Elias A. G.; Oliveira, Cláudio C.


    A flow-injection system to carry out gravimetric analysis is presented. Students are faced with an instrumental approach for gravimetric procedures. Crucibles, muffle furnaces, and desiccators are not required. A flowing suspension is established by simultaneously injecting an aqueous sample and a precipitating reagent into two merging carrier streams. The precipitate is accumulated on a minifilter hanging under the plate of an analytical balance and is weighed inside the main stream. Since Archimedes' principle holds, a drying step is not needed. After measurement, the precipitate is dissolved and disposed of. As an application, the determination of phosphate based on precipitation with ammonium and magnesium ions in slightly alkaline medium is chosen. The proposed system is very stable and well suited for demonstration. When applied to analysis of fertilizer extracts with 0.10-1.00% w/v P, it yields precise results (RSD < 0.042) in agreement with an official spectrophotometric method.

  7. Analysis of live cell images: Methods, tools and opportunities. (United States)

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens


    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. This review also includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.

  8. A tool for public analysis of scientific data


    Haglin, D; Roiger, R; Hakkila, J; Giblin, T


    The scientific method encourages sharing data with other researchers to independently verify conclusions. Currently, technical barriers impede such public scrutiny. A strategy for offering scientific data for public analysis is described. With this strategy, effectively no requirements of software installation (other than a web browser) or data manipulation are imposed on other researchers to prepare for perusing the scientific data. A prototype showcasing this strategy is described.

  9. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.


    For the description and the optimal treatment of complex processes, the methods of Systems Analysis are used as the most promising approach in recent times. In general every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice

  10. A tool for public analysis of scientific data

    Directory of Open Access Journals (Sweden)

    D Haglin


    Full Text Available The scientific method encourages sharing data with other researchers to independently verify conclusions. Currently, technical barriers impede such public scrutiny. A strategy for offering scientific data for public analysis is described. With this strategy, effectively no requirements of software installation (other than a web browser) or data manipulation are imposed on other researchers to prepare for perusing the scientific data. A prototype showcasing this strategy is described.

  11. Implementing Second-Order Decision Analysis: Concepts, Algorithms, and Tool

    Directory of Open Access Journals (Sweden)

    Aron Larsson


    Full Text Available We present implemented concepts and algorithms for a simulation approach to decision evaluation with second-order belief distributions, in a common framework for interval decision analysis. The rationale behind this work is that decision analysis with interval-valued probabilities and utilities may lead to overlapping expected utility intervals, yielding difficulties in discriminating between alternatives. By allowing for second-order belief distributions over interval-valued utility and probability statements, these difficulties may not only be remedied; such distributions also enable decision evaluation concepts and techniques that provide additional insight into a decision problem. The approach is based upon sets of linear constraints together with generation of random probability distributions and utility values from implicitly stated uniform second-order belief distributions over the polytopes given by the constraints. The result is an interactive method for decision evaluation with second-order belief distributions, complementing earlier methods for decision evaluation with interval-valued probabilities and utilities. The method has been implemented for trial use in user-oriented decision analysis software.
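The simulation idea can be illustrated with a crude Monte Carlo sketch: draw probabilities and utilities uniformly from their intervals (renormalising uniform draws is a simplification of sampling the constraint polytope) and record how often one alternative beats the other. All intervals below are invented.

```python
# Crude Monte Carlo stand-in for second-order decision evaluation;
# intervals invented, polytope sampling simplified to renormalised uniforms.
import numpy as np

rng = np.random.default_rng(7)

def sample_probs(intervals):
    """Draw a probability vector inside the intervals, renormalised to sum to 1."""
    p = np.array([rng.uniform(lo, hi) for lo, hi in intervals])
    return p / p.sum()

def sample_utils(intervals):
    return np.array([rng.uniform(lo, hi) for lo, hi in intervals])

# Two alternatives, two outcomes each: probability and utility intervals
A = {"p": [(0.4, 0.6), (0.4, 0.6)], "u": [(0.6, 0.9), (0.1, 0.3)]}
B = {"p": [(0.2, 0.4), (0.6, 0.8)], "u": [(0.7, 1.0), (0.2, 0.4)]}

draws = 5000
wins_A = sum(
    sample_probs(A["p"]) @ sample_utils(A["u"])
    > sample_probs(B["p"]) @ sample_utils(B["u"])
    for _ in range(draws)
)
support_A = wins_A / draws     # fraction of belief mass favouring A over B
```

A fraction near 0.5 reproduces exactly the discrimination difficulty that overlapping expected utility intervals cause, while a fraction near 0 or 1 supports a clear choice.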

  12. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.


    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
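The levelized-cost roll-up at the heart of such a model can be sketched as follows; the formula (annualised capital via a capital recovery factor, plus fixed O&M and fuel) is standard, but every figure below is a placeholder, not a NETL or Sandia value.

```python
# Levelized-cost sketch; the capital-recovery-factor formula is standard,
# but all numbers are placeholders (not NETL/Sandia figures).

def lcoe(capital, fixed_om, fuel_per_mwh, capacity_mw, capacity_factor,
         rate, years):
    """Levelized cost of electricity [$/MWh]."""
    # capital recovery factor: spreads the overnight cost over the plant life
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    mwh_per_year = capacity_mw * capacity_factor * 8760
    return (capital * crf + fixed_om) / mwh_per_year + fuel_per_mwh

# e.g. a hypothetical 500 MW plant at $2,000/kW overnight cost, 90% capacity factor
cost = lcoe(capital=2000.0 * 500 * 1000, fixed_om=30e6, fuel_per_mwh=20.0,
            capacity_mw=500, capacity_factor=0.9, rate=0.08, years=30)
```

Sensitivity analysis of the kind the model supports amounts to re-evaluating this roll-up while sweeping inputs such as the interest rate, capacity factor, or fuel cost.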

  13. The Revised Time-Frequency Analysis (R-TFA) Tool for the Swarm Mission (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Thanasis; Giannakis, Omiros; Giamini, Sigiava A.; Vasalos, Georgios; Daglis, Ioannis A.


    The time-frequency analysis (TFA) tool is a suite of algorithms based on wavelet transforms and neural networks, tailored to the analysis of Level 1b data from the Swarm mission [1], [2]. The aim of the TFA tool has been to combine the advantages of multi-spacecraft and ground-based monitoring of the geospace environment in order to analyze and study magnetospheric ultra-low-frequency (ULF) waves. Additionally, the tool has offered a useful platform for monitoring wave evolution from the outer boundaries of Earth's magnetosphere through the topside ionosphere down to the surface. Here, we present the revised TFA (R-TFA) tool, introducing a new user-friendly interface and a number of applications and capabilities not included in earlier versions of the tool.

  14. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool. (United States)

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda


    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and a weighted average of local sensitivity analyses; it also handles systems with discontinuous events and offers an intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
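One of the global methods listed, the partial rank correlation coefficient (PRCC), can be sketched as follows; this assumes NumPy/SciPy and a toy two-parameter model, and is not SBML-SAT's actual code.

```python
# PRCC sketch on a toy model (NumPy/SciPy assumed; not SBML-SAT's code).
# The output depends strongly on k1 and weakly on k2; PRCC should show that.
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(3)
n = 500
k1 = rng.uniform(0.1, 1.0, n)                       # sampled parameter 1
k2 = rng.uniform(0.1, 1.0, n)                       # sampled parameter 2
y = 5.0 * k1 + 0.1 * k2 + rng.normal(0.0, 0.1, n)   # toy model output

def prcc(x, out, others):
    """Rank-transform, regress out the other parameters, correlate residuals."""
    rx, ry = rankdata(x), rankdata(out)
    Z = np.column_stack([np.ones(len(x))] + [rankdata(o) for o in others])
    res_x = rx - Z @ np.linalg.lstsq(Z, rx, rcond=None)[0]
    res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
    return np.corrcoef(res_x, res_y)[0, 1]

prcc_k1 = prcc(k1, y, [k2])    # close to 1: dominant parameter
prcc_k2 = prcc(k2, y, [k1])    # much smaller in magnitude
```

In a real application, the parameter samples would drive repeated simulations of the SBML model, with a scalar observable of interest playing the role of y.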

  15. Nickel and cobalt release from metal alloys of tools--a current analysis in Germany. (United States)

    Kickinger-Lörsch, Anja; Bruckner, Thomas; Mahler, Vera


    The former 'EU Nickel Directive' and, since 2009, the REACH Regulation (item 27 of Annex XVII) do not cover all metallic objects. The nickel content of tools is not regulated by the REACH Regulation, even if they come into prolonged contact with the skin. Tools might be possible sources of nickel and cobalt sensitization, and may contribute to the elicitation and maintenance of hand eczema. The aim was to perform a current analysis of the frequency of nickel or cobalt release from new handheld tools purchased in Germany. Six hundred unused handheld tools from the German market were investigated with the dimethylglyoxime test for nickel release and with disodium-1-nitroso-2-naphthol-3,6-disulfonate solution for cobalt release. Nickel release was detected in 195 of 600 (32.5%) items, and cobalt in only six (1%) of them. Positive nickel results were nearly twice as frequent in tools 'made in Germany' as in tools without a mark of origin. Tools made in other European countries did not release nickel. Cobalt release was only found in pliers and a saw. A correlation was found between price level and nickel release. Among toolkits, 34.2% were inhomogeneous with regard to nickel release. The German market currently provides a large number of handheld tools that release nickel, especially tools 'made in Germany'. For consumer protection, it seems appropriate to include handheld tools in the REACH Regulation on nickel. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models (United States)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines


    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community. PMID:27536246

  17. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton


    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as "pipelines" of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of

  18. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB) (United States)

    O'Neil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.


    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.
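
    A parametric sizing/cost model of the kind ATLAS applies can be illustrated with a toy cost-estimating relationship (CER) plus a learning curve. All names and coefficients below are hypothetical, not values from the Technology Tool Box:

```python
import math

def unit_cost(mass_kg, a=1.2, b=0.8):
    """First-unit cost (illustrative M$) from a power-law CER: a * mass^b."""
    return a * mass_kg ** b

def learning_factor(n_units, rate=0.9):
    """Cumulative-average learning-curve factor for n_units (90% curve)."""
    slope = math.log(rate) / math.log(2.0)
    return sum((i + 1) ** slope for i in range(n_units)) / n_units

def lifecycle_cost(mass_kg, n_units, ops_cost_per_year, years):
    """Production plus operations cost over the programme life (toy model)."""
    production = n_units * unit_cost(mass_kg) * learning_factor(n_units)
    return production + ops_cost_per_year * years

cost = lifecycle_cost(mass_kg=500.0, n_units=4, ops_cost_per_year=2.0, years=10)
print(f"lifecycle cost: {cost:.1f} M$")
```

    A spreadsheet tool such as ATLAS evaluates many such relationships at once, with the coefficients drawn from the TTB database rather than hard-coded.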


    Directory of Open Access Journals (Sweden)

    Edgar Antonio Reyes Montaño


    Full Text Available Producing polyclonal antibodies (IgY) in chickens has advantages over those obtained in other animal models, since they have been used as a tool for studying different proteins (the NMDA glutamate receptor in our case, specifically the NR1 subunit). We produced specific antibodies against expression products of the alternative splicing of the gene encoding the NMDA receptor NR1 subunit in adult rat brain. Three peptides corresponding to the splicing sites (N1, C1 and C2′ cassettes) were designed, synthesised and used individually as antigens in hens. Specific immunoglobulins were purified from yolks. The antibodies were then used for purifying the NMDA receptor NR1 subunit using affinity chromatography, coupling the three antibodies to the support.

  20. Battery Simulation Tool for Worst Case Analysis and Mission Evaluations

    Directory of Open Access Journals (Sweden)

    Lefeuvre Stéphane


    The first part of this paper presents the PSpice models, including their respective variable parameters at SBS and cell level. The second part introduces the model parameters that were chosen and identified to perform Monte Carlo Analysis simulations. The third part presents some MCA results for a VES16 battery module. Finally, some further simulations are shown in which the battery model was re-used for another Saft battery cell type (MP XTD) in a specific high-temperature space application.
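
    The Monte Carlo Analysis approach can be sketched generically: draw cell parameters from their tolerance distributions and inspect the tail of the resulting module behaviour. The numbers below are illustrative, not Saft VES16 or MP XTD data:

```python
import numpy as np

# Hedged sketch of a worst-case Monte Carlo Analysis: vary cell EMF and
# internal resistance within tolerances and look at the end-of-discharge
# voltage distribution of a hypothetical 6-cell series module.
rng = np.random.default_rng(0)
n_runs, n_cells = 10_000, 6
emf = rng.normal(3.6, 0.02, (n_runs, n_cells))       # cell EMF (V)
r_int = rng.normal(0.030, 0.003, (n_runs, n_cells))  # internal resistance (ohm)
i_load = 5.0                                         # discharge current (A)

module_v = (emf - r_int * i_load).sum(axis=1)        # series stack voltage
worst_case = module_v.min()
print(f"mean={module_v.mean():.2f} V, worst case={worst_case:.2f} V")
```

    A circuit-level tool such as PSpice does the same parameter dispersion on the full battery model rather than on a closed-form voltage expression.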

  1. Stakeholder analysis: a useful tool for biobank planning. (United States)

    Bjugn, Roger; Casati, Bettina


    Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway.

  2. Application of Multivariate Analysis Tools to Industrial Scale Fermentation Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregård, Rasmus; Stocks, Stuart M.

    The analysis of batch process data can provide insight into the process operation, and there is a vast amount of historical data available for data mining. Empirical modelling utilising this data is desirable where there is a lack of understanding regarding the underlying process (Formenti et al. …). … prediction error of 7.6%. The success of the final regression model was heavily dependent on the decisions made in the pre-processing stages, where the issues of different batch lengths, different measurement intervals, and variable scaling are considered. Therefore a methodology is presented for future application of multivariate methods to industrial scale process data to cover these considerations.

  3. Neutron activation analysis: a powerful tool in provenance investigations

    International Nuclear Information System (INIS)

    Meloni, Sandro; Oddone, Massimo


    It is well known that neutron activation analysis (NAA), both instrumental and destructive, allows the simultaneous determination of a number of elements, mostly trace elements, with high levels of precision and accuracy. These peculiar properties of NAA are very useful when applied to provenance studies, i.e. to the identification of the origin of raw materials with which artifacts had been manufactured in ancient times. Data reduction by statistical procedures, especially multivariate analysis techniques, provides a statistical 'fingerprint' of investigated materials, both raw materials and archaeological artifacts, that, upon comparison, allows the identification of the provenance of the raw materials used for artifact manufacturing. Thus information can be obtained on the exploitation of quarries and flows in antiquity, on technological processing of raw materials, on trade routes, and on the circulation of fakes. In the present paper two case studies are reported. The first one deals with the identification of the provenance of clay used to make ceramic materials, mostly bricks and tiles, recovered from the excavation of a Roman 'villa' in Lomello (Roman name Laumellum) and of Roman settlements in Casteggio (Roman name Clastidium). Both sites are located in the Province of Pavia, in areas called Lomellina and Oltrepo respectively. The second one investigates the origin of the white marble used to build medieval arks, Carolingian age, located in the church of San Felice, now property of the University of Pavia. Experimental set-up, analytical results and data reduction procedures are presented and discussed. (author)
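
    The 'fingerprint' comparison underlying provenance attribution can be sketched as follows; the trace-element data are invented for illustration, and real studies use many more elements, samples, and multivariate techniques:

```python
import numpy as np

# Toy provenance attribution: trace-element concentrations (ppm) for
# reference clays from two hypothetical quarries, compared with an
# artifact by distance to each group centroid.
quarry_a = np.array([[12.1, 5.3, 0.8], [11.8, 5.1, 0.9], [12.4, 5.6, 0.7]])
quarry_b = np.array([[7.2, 9.8, 2.1], [6.9, 10.2, 2.0], [7.5, 9.5, 2.3]])
artifact = np.array([12.0, 5.4, 0.85])

centroids = {"A": quarry_a.mean(axis=0), "B": quarry_b.mean(axis=0)}
dists = {q: float(np.linalg.norm(artifact - c)) for q, c in centroids.items()}
provenance = min(dists, key=dists.get)
print(provenance, {q: round(d, 2) for q, d in dists.items()})
```

    Real data reductions typically replace the plain Euclidean distance with covariance-aware measures (e.g. Mahalanobis distance) or cluster analysis over the full element suite.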


    Directory of Open Access Journals (Sweden)

    Dewi Rusnita


    Full Text Available A Single Nucleotide Polymorphism (SNP) is a genetic variant found in more than 1% of a large population. Haplotypes, combinations of SNPs/alleles that appear as associated blocks on one chromosome, tend to be inherited together by the next generation and can be used as genetic markers to trace disease-causing genes. This article aims to explain the application of SNP analysis in the diagnosis of several syndromes caused by genetic disorders. Previous studies have shown that syndromes caused by uniparental disomy (UPD), as well as autosomal recessive diseases arising from consanguineous marriage, can be detected by SNP array through analysis of blocks of homozygosity in a chromosome. Another advantage of SNP arrays is their ability to detect low-level mosaicism that is not detected by conventional cytogenetic examination. SNP arrays are even being trialled in IVF to obtain healthy babies, by testing embryos for the presence or absence of single-gene disorders before the embryos are implanted in the uterus. SNP analysis with SNP arrays has many advantages over other SNP assay methods and is expected to be widely used in molecular genetic diagnostics in the future.

  5. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian


    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.
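
    A numerical stand-in for this kind of sensitivity analysis (simplified; not the paper's screw-theory model) perturbs one source error at a time and measures the resulting tool-tip pose change:

```python
import numpy as np

# Simplified geometric-error sensitivity sketch: propagate small source
# errors through a chain of homogeneous transforms (one rotary axis, two
# translations) and measure the tool-tip position error per unit error.
def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans(x, y, z):
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t

def tool_tip(dtheta=0.0, dx=0.0):
    """Nominal chain with optional source errors (units: m, rad)."""
    chain = rot_z(0.5 + dtheta) @ trans(0.4 + dx, 0, 0) @ trans(0, 0, 0.2)
    return (chain @ np.array([0, 0, 0.1, 1]))[:3]

nominal = tool_tip()
eps = 1e-6
sens = {
    "rotary angle (m/rad)": float(np.linalg.norm(tool_tip(dtheta=eps) - nominal) / eps),
    "x offset (m/m)": float(np.linalg.norm(tool_tip(dx=eps) - nominal) / eps),
}
print({k: round(v, 3) for k, v in sens.items()})
```

    Here the rotary-axis sensitivity equals the 0.4 m working radius, illustrating why angular source errors are weighted by lever arms in precision design, while the linear offset maps one-to-one.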


    Directory of Open Access Journals (Sweden)

    Ion Danut I. JUGANARU


    Full Text Available This study aims at analyzing the distribution of tourist flows in 2014, from 25 European countries, across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists’ countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups united by similar characteristics. Since the decision to choose/purchase is based on purposes, knowing them proves useful in designing strategies to increase the satisfaction level provided to the customer. The statistical method used in this paper is factorial correspondence analysis. In our opinion, identifying by this method the existence of differences or similarities between the tourists’ countries of residence and their trip purposes can represent a useful step in studying the tourism market and in choosing or reformulating strategies.

  7. Thorough in silico and in vitro cDNA analysis of 21 putative BRCA1 and BRCA2 splice variants and a complex tandem duplication in BRCA2 allowing the identification of activated cryptic splice donor sites in BRCA2 exon 11. (United States)

    Baert, Annelot; Machackova, Eva; Coene, Ilse; Cremin, Carol; Turner, Kristin; Portigal-Todd, Cheryl; Asrat, Marie Jill; Nuk, Jennifer; Mindlin, Allison; Young, Sean; MacMillan, Andree; Van Maerken, Tom; Trbusek, Martin; McKinnon, Wendy; Wood, Marie E; Foulkes, William D; Santamariña, Marta; de la Hoya, Miguel; Foretova, Lenka; Poppe, Bruce; Vral, Anne; Rosseel, Toon; De Leeneer, Kim; Vega, Ana; Claes, Kathleen B M


    For 21 putative BRCA1 and BRCA2 splice site variants, the concordance between mRNA analysis and predictions by in silico programs was evaluated. Aberrant splicing was confirmed for 12 alterations. In silico prediction tools were helpful to determine for which variants cDNA analysis is warranted; however, predictions for variants in the Cartegni consensus region but outside the canonical sites were less reliable. Learning algorithms like Adaboost and Random Forest outperformed the classical tools. Further validations are warranted prior to implementation of these novel tools in clinical settings. Additionally, we report here for the first time activated cryptic donor sites in the large exon 11 of BRCA2, by evaluating the effect at the cDNA level of a novel tandem duplication (5' breakpoint in intron 4; 3' breakpoint in exon 11) and of a variant disrupting the splice donor site of exon 11 (c.6841+1G > C). Additional sites were predicted, but not activated. These sites warrant further research to increase our knowledge on cis and trans acting factors involved in the conservation of correct transcription of this large exon. This may contribute to adequate design of ASOs (antisense oligonucleotides), an emerging therapy to render cancer cells sensitive to PARP inhibitor and platinum therapies. © 2017 Wiley Periodicals, Inc.
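
    How in silico tools score the strength of a (putative) donor splice site can be illustrated with a toy position weight matrix; the matrix below is invented and merely stands in for the dedicated tools evaluated in the paper:

```python
import numpy as np

# Toy donor-site position weight matrix (PWM) over the 9-mer spanning an
# exon/intron boundary (rows A,C,G,T; columns -3..+6 around the GT).
# Values are illustrative, not from MaxEntScan or any cited tool.
bases = "ACGT"
pwm = np.array([
    [0.3, 0.6, 0.1, 0.0, 0.0, 0.5, 0.7, 0.1, 0.15],
    [0.4, 0.1, 0.0, 0.0, 0.0, 0.0, 0.1, 0.1, 0.15],
    [0.2, 0.2, 0.8, 1.0, 0.0, 0.4, 0.1, 0.7, 0.2],
    [0.1, 0.1, 0.1, 0.0, 1.0, 0.1, 0.1, 0.1, 0.5],
])

def score(seq9, background=0.25):
    """Log-odds score of a 9-mer against a uniform background (bits)."""
    idx = [bases.index(b) for b in seq9]
    probs = np.clip(pwm[idx, range(9)], 1e-3, None)  # avoid log(0)
    return float(np.log2(probs / background).sum())

ref = "CAGGTAAGT"          # canonical-looking donor site
var = "CAGATAAGT"          # G>A at the invariant +1 position
delta = score(var) - score(ref)
print(f"score change: {delta:.1f} bits")
```

    A large score drop between reference and variant 9-mer is the kind of signal that flags a disrupted donor site; conversely, a score gain at a non-canonical position suggests a newly created or activated cryptic site.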

  8. Design and analysis of lifting tool assemblies to lift different engine block (United States)

    Sawant, Arpana; Deshmukh, Nilaj N.; Chauhan, Santosh; Dabhadkar, Mandar; Deore, Rupali


    Engine blocks are required to be lifted from one place to another while they are being processed. The human effort required for this purpose is high, and the engine block may also get damaged if it is not handled properly. There is a need for designing a proper lifting tool which will be able to conveniently lift the engine block and place it at the desired position without any accident or damage to the engine block. In the present study, lifting tool assemblies are designed and analyzed in such a way that they may lift different categories of engine blocks. The lifting tool assembly consists of a lifting plate, lifting ring, cap screws and washers. A parametric model and assembly of the lifting tool is done in the 3D modelling software CREO 2.0, and analysis is carried out in ANSYS Workbench 16.0. A test block of weight equivalent to that of an engine block is considered for the purpose of analysis. In the preliminary study, without washers, the stresses obtained on the lifting tool were more than the safety margin. In the present design, washers with appropriate dimensions are used, which helps to bring the stresses on the lifting tool within the safety margin. Analysis is carried out to verify that the tool design meets the ASME BTH-1 required safety margin.

  9. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David


    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  10. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology. (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton


    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed.

  11. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock


    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed.

  12. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis (United States)

    Burks, Jason Eric; Sperow, Ken


    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  13. ChAsE: chromatin analysis and exploration tool. (United States)

    Younesy, Hamid; Nielsen, Cydney B; Lorincz, Matthew C; Jones, Steven J M; Karimi, Mohammad M; Möller, Torsten


    We present ChAsE, a cross-platform desktop application developed for interactive visualization, exploration and clustering of epigenomic data such as ChIP-seq experiments. ChAsE is designed and developed in close collaboration with several groups of biologists and bioinformaticians with a focus on usability and interactivity. Data can be analyzed through k-means clustering, specifying presence or absence of signal in epigenetic data and performing set operations between clusters. Results can be explored in an interactive heat map and profile plot interface and exported for downstream analysis or as high quality figures suitable for publications. Software, source code (MIT License), data and video tutorials are available online. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
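
    The clustering step can be illustrated with a minimal k-means over a toy region-by-bin signal matrix (not the ChAsE implementation):

```python
import numpy as np

# Toy ChIP-seq-style clustering: rows are genomic regions, columns are
# binned signal values; k-means separates signal-present from
# signal-absent regions. Data are synthetic.
rng = np.random.default_rng(1)
present = rng.normal(5.0, 0.5, (20, 10))   # regions with signal
absent = rng.normal(0.5, 0.3, (20, 10))    # regions without signal
X = np.vstack([present, absent])

def kmeans(X, centers, iters=20):
    """Plain Lloyd's algorithm with given initial centers."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers))])
    return labels

# Deterministic init: one seed row from each planted group.
labels = kmeans(X, centers=X[[0, -1]].copy())
print("cluster sizes:", np.bincount(labels))
```

    In a tool like ChAsE the cluster memberships then feed set operations and the interactive heat map rather than a plain printout.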

  14. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili


    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of a MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. An overall assessment of the state-of-the-art of analytical microplasma research is provided

  15. The geomatic like a tool for biodiversity analysis in Colombia

    International Nuclear Information System (INIS)

    Galindo, G; Armenteras, D; Franco, C; Sua S and others


    Current biodiversity research recognizes geographic information and its variability in space as an essential characteristic that helps understand the relationships between the components of biological communities and their environment. The description and quantification of their spatial and temporal attributes adds important elements for their adequate management. The Biological Diversity Convention (Law 165 of 1994) reaffirmed the importance of biodiversity and the necessity of its conservation and sustainable use, and emphasized that its components should be characterized and monitored, and the data and information related to them should be maintained and organized. The Alexander von Humboldt biological research institute is the Colombian entity in charge of promoting, coordinating and undertaking research that helps in the conservation and sustainable use of biodiversity; this institution has defined the inventory of all the fauna and flora resources in the country as one of its priority research lines. Using geomatic techniques, the Humboldt institute has implemented and developed technologies to capture, debug, geocode and analyze geographic data related to biodiversity (Armenteras, 2001), among others. This has helped in the development, structure and management of projects such as the mapping of the Colombian Amazonian, Andean and Orinoco ecosystems (GIS-RS), finding conservation opportunities in rural landscapes (GIS-RS), a gazetteer of biological localities (GIS, databases, programming), development of models that predict and explain species distribution (GIS, database management, modeling techniques), conservation vulnerability (GIS-RS) and environmental indicators (GIS, geostatistical analysis)

  16. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo


    Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of an object motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined as either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
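
    The density-based part of such a scheme can be sketched as follows (a simplified stand-in for the paper's KDE/IB pipeline): trajectories are reduced to angle and speed features, a Gaussian KDE is fitted on normal data, and low-density trajectories are flagged as anomalies:

```python
import numpy as np

# Synthetic "normal" trajectories described by (heading angle, speed).
rng = np.random.default_rng(2)
normal = np.column_stack([rng.normal(0.0, 0.1, 200),   # angle (rad)
                          rng.normal(1.0, 0.1, 200)])  # speed (m/s)
tests = np.array([[0.05, 1.02],    # resembles the training trajectories
                  [1.50, 3.00]])   # sharp turn at high speed: anomalous

def kde_logdensity(x, data, h=0.15):
    """Log of an isotropic Gaussian kernel density estimate at x."""
    d2 = ((data - x) ** 2).sum(axis=1)
    return float(np.log(np.exp(-d2 / (2 * h * h)).mean() + 1e-300))

train_scores = [kde_logdensity(x, normal) for x in normal]
threshold = np.quantile(train_scores, 0.01)  # 1% of training mass below
flags = [kde_logdensity(x, normal) < threshold for x in tests]
print(flags)  # anomaly flagged only for the second trajectory
```

    The paper's method is considerably richer (IB clustering with an adaptive cluster count and an entropy-based abnormality measure), but the low-density-equals-infrequent intuition is the same.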

  17. Single Molecule Cluster Analysis Identifies Signature Dynamic Conformations along the Splicing Pathway (United States)

    Blanco, Mario R.; Martin, Joshua S.; Kahlscheuer, Matthew L.; Krishnan, Ramya; Abelson, John; Laederach, Alain; Walter, Nils G.


    The spliceosome is the dynamic RNA-protein machine responsible for faithfully splicing introns from precursor messenger RNAs (pre-mRNAs). Many of the dynamic processes required for the proper assembly, catalytic activation, and disassembly of the spliceosome as it acts on its pre-mRNA substrate remain poorly understood, a challenge that persists for many biomolecular machines. Here, we developed a fluorescence-based Single Molecule Cluster Analysis (SiMCAn) tool to dissect the manifold conformational dynamics of a pre-mRNA through the splicing cycle. By clustering common dynamic behaviors derived from selectively blocked splicing reactions, SiMCAn was able to identify signature conformations and dynamic behaviors of multiple ATP-dependent intermediates. In addition, it identified a conformation adopted late in splicing by a 3′ splice site mutant, invoking a mechanism for substrate proofreading. SiMCAn presents a novel framework for interpreting complex single molecule behaviors that should prove widely useful for the comprehensive analysis of a plethora of dynamic cellular machines. PMID:26414013

  18. Development of a User Interface for a Regression Analysis Software Tool (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.


    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.
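
    The principle of defaulting analysis settings to values that succeed for most data sets can be sketched generically (illustrative only; not the NASA tool's code):

```python
import numpy as np

def fit_regression(x, y, degree=2, normalize=True):
    """Least-squares polynomial fit with conservative default settings:
    a modest polynomial degree and input normalization to guard against
    ill-conditioning, so a novice user only has to supply x and y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if normalize:
        x = (x - x.mean()) / x.std()
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    rmse = float(np.sqrt((residuals ** 2).mean()))
    return coeffs, rmse

x = np.linspace(0, 10, 50)
y = 3.0 + 2.0 * x              # linear data still fits fine under defaults
coeffs, rmse = fit_regression(x, y)
print(f"rmse={rmse:.2e}")
```

    The defaults are deliberately permissive: a degree-2 fit recovers linear data exactly, so the same settings work across many experimental data sets without user intervention.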

  19. Data analysis techniques: a tool for cumulative exposure assessment. (United States)

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine


    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g., green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, in view to assess the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census Block Group (BG) scale. We used as environmental indicators ambient air NO2 annual concentrations, noise levels and proximity to green spaces, to industrial plants, to polluted sites and to road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by a Hierarchical Clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering produced five classes: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with less negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with higher negative exposures than average and less green space. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem to be overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies or to compare the environmental burden across study areas in an epidemiological framework.
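
    The construction of a composite exposure index can be sketched with synthetic data, using plain PCA as a stand-in for the Multiple Factor Analysis used in the study:

```python
import numpy as np

# Synthetic block groups: three environmental indicators driven by a
# latent exposure burden. Standardize, extract the first principal
# component, and read its score as a composite exposure index.
rng = np.random.default_rng(3)
n = 300
burden = rng.uniform(0, 1, n)                    # latent exposure level
no2 = 20 + 30 * burden + rng.normal(0, 2, n)     # NO2 (ug/m3)
noise = 50 + 20 * burden + rng.normal(0, 2, n)   # noise (dB)
green = 40 - 30 * burden + rng.normal(0, 2, n)   # green space (%)
X = np.column_stack([no2, noise, green])

Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize indicators
_, _, vt = np.linalg.svd(Z, full_matrices=False)
pc1 = vt[0] if vt[0, 0] > 0 else -vt[0]          # orient: high NO2 = high index
index = Z @ pc1
corr = float(np.corrcoef(index, burden)[0, 1])
print(f"correlation with latent burden: {corr:.2f}")
```

    In the study, MFA (which balances groups of indicators) plays the role of the PCA here, and a hierarchical clustering over the factor scores then yields the BG classes.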

  20. Landscape Change Index as a Tool for Spatial Analysis (United States)

    Krajewski, Piotr; Solecka, Iga; Mastalska-Cetera, Barbara


    This study analysed spatial and temporal changes in the protected landscape of Ślęża Landscape Park in Poland, covering an area of 7724 ha. The main objective was to determine the level of landscape change of the research area after Polish accession to the European Union by comparing land-cover maps from 2004, 2009 and 2014. With the use of the prepared land cover maps, we developed a database of the surface of the main elements constituting the background landscape of the research area. The data obtained made it feasible to assess the level of change in two different periods of time (2004-2009 and 2009-2014) by means of the landscape change index (LCI). This indicator is described by one value which is the result of all the change types taking place in the background landscape in a given period of time. Comparing the index of different parts of Ślęża Landscape Park helped to identify areas where the landscape changes were the highest and areas where the changes were hardly noticeable. The results show that, taking the whole research area into account, landscape changes are much more intense in the second of the analysed periods (2009-2014, LCI = 1.91) than in 2004-2009 (LCI = 0.71). The same analysis was done for each part of the municipalities located within the Park. This made it possible to determine which part of the park is the most threatened by spatial transformations. In this context, it should be emphasized that the highest rates of landscape change were recorded in the municipalities where the largest new residential areas were located, Sobótka and Łagiewniki. The municipality of Dzierżoniów, with a high percentage of forests inside the Park and an unchanging area of arable land, has the lowest landscape change index.
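
    Since the abstract does not reproduce the LCI formula, the sketch below uses a plausible stand-in (half the sum of absolute land-cover area changes, as a percentage of the study area) purely to illustrate comparing change intensity between the two periods; the land-cover areas are hypothetical:

```python
# Hedged stand-in for a landscape change index: half the sum of absolute
# per-class area changes between two dates, as a percentage of the total
# area. This is an assumption for illustration, not the paper's formula.
def change_index(area_t0, area_t1):
    total = sum(area_t0.values())
    changed = sum(abs(area_t1[c] - area_t0[c]) for c in area_t0)
    return 100.0 * changed / (2.0 * total)

# Hypothetical land-cover areas (ha) for a 7724 ha study area.
a2004 = {"forest": 3000, "arable": 4000, "residential": 500, "other": 224}
a2009 = {"forest": 2990, "arable": 3980, "residential": 530, "other": 224}
a2014 = {"forest": 2950, "arable": 3890, "residential": 660, "other": 224}

lci_1 = change_index(a2004, a2009)   # 2004-2009
lci_2 = change_index(a2009, a2014)   # 2009-2014
print(round(lci_1, 2), round(lci_2, 2))
```

    Dividing by two avoids double-counting, since every hectare lost by one class is gained by another; computing the index per municipality, as in the study, localizes where change concentrates.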

  1. Synchrotron radiation micro-X-ray fluorescence analysis: A tool to increase accuracy in microscopic analysis

    CERN Document Server

    Adams, F


    Microscopic X-ray fluorescence (XRF) analysis has potential for development as a certification method and as a calibration tool for other microanalytical techniques. The interaction of X-rays with matter is well understood, and modelling studies show excellent agreement between experimental data and calculations using Monte Carlo simulation. The method can be used for a direct iterative calculation of concentrations using available high-accuracy physical constants. Average accuracy is in the range of 3-5% for micron-sized objects at concentration levels of less than 1 ppm with focused radiation from SR sources. The end-station ID18F of the ESRF is dedicated to accurate quantitative micro-XRF analysis, including fast 2D scanning with collection of full X-ray spectra. Important aspects of the beamline are the precise monitoring of the intensity of the polarized, variable-energy beam and the high reproducibility of the set-up (measurement geometry and instrumental parameters) together with its long-term stability.

  2. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank V [Los Alamos National Laboratory


    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost of the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of States' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open-source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a.: 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant

  3. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT): Semi-Annual Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, D N


    This report summarizes work carried out by the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of July 1, 2011 through December 31, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. The UV-CDAT team is positioned to address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, Visualization interfaces.

  4. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul


    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models) have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  5. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis. (United States)

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab


    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or users can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages) and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available. Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
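The Nussinov dynamic programming algorithm mentioned in the record can be sketched as follows. This is the textbook base-pair maximization adapted to DNA (A-T and G-C pairs), not RDNAnalyzer's actual implementation; the minimum-loop parameter is a common convention, assumed here.

```python
# Sketch: Nussinov DP for the maximum number of nested complementary
# base pairs in a DNA sequence, with a minimum hairpin-loop length.
def nussinov_max_pairs(seq: str, min_loop: int = 3) -> int:
    """Maximum number of nested complementary base pairs (Nussinov DP)."""
    pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
    n = len(seq)
    if n == 0:
        return 0
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):          # subsequence lengths
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                  # case: j unpaired
            for k in range(i, j - min_loop):     # case: j pairs with k
                if (seq[k], seq[j]) in pairs:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov_max_pairs("GGGAAATCCC"))  # hairpin-like sequence → 3
```

The DP runs in O(n^3) time; a traceback over the same table would recover the actual pairing, which is what a structure-drawing tool needs.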

  6. Transana Video Analysis Software as a Tool for Consultation: Applications to Improving PTA Meeting Leadership (United States)

    Rush, Craig


    The chief aim of this article is to illustrate the potential of using Transana, a qualitative video analysis tool, for effective and efficient school-based consultation. In this illustrative study, the Transana program facilitated analysis of excerpts of video from a representative sample of Parent Teacher Association (PTA) meetings over the…

  7. Conception of a PWR simulator as a tool for safety analysis

    International Nuclear Information System (INIS)

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.


    A simulator can be a very useful tool for safety analysis, to study accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) have therefore been selected according to the objectives of safety analysis.

  8. Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis (United States)

    Moffitt, Kevin Christopher


    The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

  9. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.


    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  10. On the blind use of statistical tools in the analysis of globular cluster stars (United States)

    D'Antona, Francesca; Caloi, Vittoria; Tailo, Marco


    As with most data analysis methods, the Bayesian method must be handled with care. We show that its application to determine stellar evolution parameters within globular clusters can lead to paradoxical results if used without the necessary precautions. This is a cautionary tale on the use of statistical tools for big data analysis.

  11. A structured approach to forensic study of explosions: The TNO Inverse Explosion Analysis tool

    NARCIS (Netherlands)

    Voort, M.M. van der; Wees, R.M.M. van; Brouwer, S.D.; Jagt-Deutekom, M.J. van der; Verreault, J.


    Forensic analysis of explosions consists of determining the point of origin, the explosive substance involved, and the charge mass. Within the EU FP7 project Hyperion, TNO developed the Inverse Explosion Analysis (TNO-IEA) tool to estimate the charge mass and point of origin based on observed damage

  12. Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) : An Open-source Tool to Compute Climate Statistic GIS Layers from Argonne Climate Modeling Results (United States)


    DRAFT Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR): An Open-source Tool to Compute Climate Statistic GIS...Unidata 2017). This report includes: 1. User documentation for, an open-source Python tool that uses Environmental Systems Research...of climate change. GATOR can compute many additional statistics by using new combinations of the existing input parameters. The code is open source.

  13. New Geant4 based simulation tools for space radiation shielding and effects analysis

    International Nuclear Information System (INIS)

    Santin, G.; Nieminen, P.; Evans, H.; Daly, E.; Lei, F.; Truscott, P.R.; Dyer, C.S.; Quaghebeur, B.; Heynderickx, D.


    We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility will first be described. An overview of the main features of the MUlti-LAyered Shielding SImulation Software tool (MULASSIS) will follow. The tool specifically addresses shielding optimization and effects analysis. A Java interface allows the use of MULASSIS by the space community over the World Wide Web, integrated in the widely used SPENVIS package. The analysis of the particle transport output automatically provides radiation fluence, ionising and NIEL dose, and effects analysis. ESA is currently funding the porting of these tools to a low-cost parallel-processor facility using GRID technology under the ESA SpaceGRID initiative. Other present and future Geant4 projects related to the study of space-environment effects on spacecraft will also be presented.

  14. Database tools for enhanced analysis of TMX-U data. Revision 1

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.


    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  15. Identification of five novel FBN1 mutations by non-radioactive single-strand conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Liu, W.; Qian, C.; Comeau, K.; Francke, U. [Stanford Univ. Medical Center, Stanford, CA (United States)


    Marfan syndrome (MFS), one of the most common genetic disorders of connective tissue, is characterized by variable manifestations in the skeletal, cardiovascular and ocular systems. Mutations in the fibrillin gene on chromosome 15 (FBN1) have been shown to cause MFS. To examine the relationship between FBN1 gene mutations, fibrillin protein function and MFS phenotypes, we screened for alterations in the fibrillin coding sequence in fibroblast-derived cDNA from MFS patients. To date, abnormally migrating bands in more than 20 unrelated MFS patients have been identified by using non-radioactive single-strand conformation analysis and silver staining. Five altered bands have been directly sequenced. Two missense mutations and three splice site mutations have been identified. Both missense mutations substitute another amino acid for a cysteine residue (C1402W and C1672R) in EGF-like motifs of the fibrillin polypeptide chain. The two splice site mutations are at nucleotide positions 6994+1 (G→A) and 7205-2 (A→G) and result in in-frame skipping of exons 56 and 58, respectively. Skipping of exon 56 occurs in 50% of mutant transcripts. Use of a cryptic splice site 51 bp upstream of the normal donor site results in half of the mutant transcripts containing part of exon 56. Both products contain in-frame deletions. Another splice site mutation, identified by exon screening from patient genomic DNA using intron primers, is at nucleotide position 2293+2 (T→A), but the predicted exon skipping has not been detected at the RT-PCR level. This may be due to instability of the mutant transcript. Including the mutations reported here, a total of 8 out of 36 published FBN1 gene mutations involve exon skipping. It may be inferred that FBN1 exon skipping plays an important pathogenic role in MFS.

  16. Risk analysis tools for force protection and infrastructure/asset protection

    International Nuclear Information System (INIS)

    Jaeger, C.D.; Duggan, R.A.; Paulus, W.K.


    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all of the risk equation and integrates the many components into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework
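The "risk equation" referenced above is commonly given in the Sandia security-assessment literature as Risk = P_A × (1 − P_E) × C, where P_A is the likelihood of attack, P_E the security-system effectiveness, and C the consequence. A minimal sketch assuming that form, with illustrative numbers (not EnSURE's actual model):

```python
# Sketch: the classic security risk equation used to compare the
# cost-effectiveness of candidate upgrades.
def risk(p_attack: float, p_effectiveness: float, consequence: float) -> float:
    """Risk = P_A * (1 - P_E) * C (all inputs illustrative)."""
    return p_attack * (1.0 - p_effectiveness) * consequence

baseline = risk(0.1, 0.80, 100.0)   # current protection system
upgraded = risk(0.1, 0.95, 100.0)   # after a proposed upgrade
print(baseline, upgraded)           # weigh the risk reduction against cost
```

Ranking candidate upgrades by (baseline − upgraded) risk per dollar is one simple way to operationalize "most cost-effective ways to reduce risk".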

  17. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization (United States)

    Meyn, Larry A.


    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use


    Directory of Open Access Journals (Sweden)

    Katarzyna MIDOR


    Full Text Available To maintain or strengthen its position in the market, a modern business needs to follow the principles of quality control in its actions. Especially important is the Zero Defects concept developed by Philip Crosby, which means flawless production. The concept consists in preventing the occurrence of defects and flaws at all production stages. To achieve that, we must, among other things, make use of quality management tools. This article presents an analysis of the reasons for the return of damaged or faulty goods in the automotive industry by means of quality management tools such as the Ishikawa diagram and Pareto analysis, which allow us to identify the causes of product defectiveness. Based on the results, preventive measures have been proposed. The actions presented in this article and the results of the analysis prove the effectiveness of the aforementioned quality management tools.
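A Pareto analysis like the one described, ranking return causes by frequency and isolating the "vital few" that account for roughly 80% of returns, can be sketched as follows. The cause names and counts below are invented for illustration.

```python
# Sketch: Pareto analysis of return causes - sort descending, accumulate,
# and stop once the cumulative share reaches the 80% threshold.
from collections import Counter

returns = Counter({
    "paint defect": 58, "missing fastener": 31, "scratched housing": 12,
    "wrong label": 7, "electrical fault": 5, "other": 4,
})

total = sum(returns.values())
cumulative, vital_few = 0, []
for cause, count in returns.most_common():
    cumulative += count
    vital_few.append((cause, count, round(100 * cumulative / total, 1)))
    if cumulative / total >= 0.80:
        break

for cause, count, cum_pct in vital_few:
    print(f"{cause:18s} {count:3d}  cum {cum_pct}%")
```

The same cumulative table is what a Pareto chart plots: bars for counts, a line for the cumulative percentage.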

  19. MEL-IRIS: An Online Tool for Audio Analysis and Music Indexing

    Directory of Open Access Journals (Sweden)

    Dimitrios Margounakis


    Full Text Available Chroma is an important attribute of music and sound, although it has not yet been adequately defined in the literature. As such, it can be used for further analysis of sound, resulting in interesting colorful representations that can be used in many tasks: indexing, classification, and retrieval. Especially in Music Information Retrieval (MIR), the visualization of the chromatic analysis can be used for comparison, pattern recognition, melodic sequence prediction, and color-based searching. MEL-IRIS is the tool which has been developed in order to analyze audio files and characterize music based on chroma. The tool implements specially designed algorithms and a unique way of visualizing the results. The tool is network-oriented and can be installed on audio servers, in order to manipulate large music collections. Several samples from world music have been tested and processed, in order to demonstrate the possible uses of such an analysis.

  20. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)


    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988...... in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited...... to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  1. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)


    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and the informal economy are characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of system thinking, life-cycle thinking, and decision-support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, particularly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  2. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.


    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and the informal economy are characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of system thinking, life-cycle thinking, and decision-support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, particularly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  3. RiboCAT: a new capillary electrophoresis data analysis tool for nucleic acid probing. (United States)

    Cantara, William A; Hatterschide, Joshua; Wu, Weixin; Musier-Forsyth, Karin


    Chemical and enzymatic probing of RNA secondary structure and RNA/protein interactions provides the basis for understanding the functions of structured RNAs. However, the ability to rapidly perform such experiments using capillary electrophoresis has been hampered by relatively labor-intensive data analysis software. While these computationally robust programs have been shown to calculate residue-specific reactivities to a high degree of accuracy, they often require time-consuming manual intervention and lack the ability to be easily modified by users. To alleviate these issues, RiboCAT (Ribonucleic acid capillary-electrophoresis analysis tool) was developed as a user-friendly, Microsoft Excel-based tool that reduces the need for manual intervention, thereby significantly shortening the time required for data analysis. Features of this tool include (i) the use of an Excel platform, (ii) a method of intercapillary signal alignment using internal size standards, (iii) a peak-sharpening algorithm to more accurately identify peaks, and (iv) an open architecture allowing for simple user intervention. Furthermore, a complementary tool, RiboDOG (RiboCAT data output generator) was designed to facilitate the comparison of multiple data sets, highlighting potential inconsistencies and inaccuracies that may have occurred during analysis. Using these new tools, the secondary structure of the HIV-1 5' untranslated region (5'UTR) was determined using selective 2'-hydroxyl acylation analyzed by primer extension (SHAPE), matching the results of previous work. © 2017 Cantara et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
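The intercapillary signal alignment via internal size standards can be illustrated as below. This is a hedged sketch, not RiboCAT's actual code: the ladder sizes and peak migration times are invented, and piecewise-linear interpolation is assumed as the mapping between migration time and the common nucleotide scale.

```python
# Sketch: map each capillary's migration-time axis onto a shared
# nucleotide scale using the peaks of a co-loaded size ladder.
import numpy as np

ladder_sizes = np.array([50, 100, 150, 200, 250])         # nucleotides
cap1_ladder_t = np.array([12.1, 20.4, 28.9, 37.2, 45.8])  # capillary 1 times
cap2_ladder_t = np.array([11.5, 19.6, 27.8, 35.9, 44.1])  # capillary 2 times

def to_nt(times: np.ndarray, ladder_t: np.ndarray) -> np.ndarray:
    """Piecewise-linear map from migration time to the nucleotide scale."""
    return np.interp(times, ladder_t, ladder_sizes)

# the same reactivity peak seen at different times in the two capillaries
print(to_nt(np.array([24.65]), cap1_ladder_t))
print(to_nt(np.array([23.7]), cap2_ladder_t))
```

Once every capillary is on the common nucleotide scale, residue-by-residue comparison of reactivity profiles across runs becomes a simple array operation.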

  4. The TREAT-NMD DMD Global Database: Analysis of More than 7,000 Duchenne Muscular Dystrophy Mutations (United States)

    Bladen, Catherine L; Salgado, David; Monges, Soledad; Foncuberta, Maria E; Kekou, Kyriaki; Kosma, Konstantina; Dawkins, Hugh; Lamont, Leanne; Roy, Anna J; Chamova, Teodora; Guergueltcheva, Velina; Chan, Sophelia; Korngut, Lawrence; Campbell, Craig; Dai, Yi; Wang, Jen; Barišić, Nina; Brabec, Petr; Lahdetie, Jaana; Walter, Maggie C; Schreiber-Katz, Olivia; Karcagi, Veronika; Garami, Marta; Viswanathan, Venkatarman; Bayat, Farhad; Buccella, Filippo; Kimura, En; Koeks, Zaïda; van den Bergen, Janneke C; Rodrigues, Miriam; Roxburgh, Richard; Lusakowska, Anna; Kostera-Pruszczyk, Anna; Zimowski, Janusz; Santos, Rosário; Neagu, Elena; Artemieva, Svetlana; Rasic, Vedrana Milic; Vojinovic, Dina; Posada, Manuel; Bloetzer, Clemens; Jeannet, Pierre-Yves; Joncourt, Franziska; Díaz-Manera, Jordi; Gallardo, Eduard; Karaduman, A Ayşe; Topaloğlu, Haluk; El Sherif, Rasha; Stringer, Angela; Shatillo, Andriy V; Martin, Ann S; Peay, Holly L; Bellgard, Matthew I; Kirschner, Jan; Flanigan, Kevin M; Straub, Volker; Bushby, Kate; Verschuuren, Jan; Aartsma-Rus, Annemieke; Béroud, Christophe; Lochmüller, Hanns


    Analyzing the type and frequency of patient-specific mutations that give rise to Duchenne muscular dystrophy (DMD) is an invaluable tool for diagnostics, basic scientific research, trial planning, and improved clinical care. Locus-specific databases allow for the collection, organization, storage, and analysis of genetic variants of disease. Here, we describe the development and analysis of the TREAT-NMD DMD Global database ( We analyzed genetic data for 7,149 DMD mutations held within the database. A total of 5,682 large mutations were observed (80% of total mutations), of which 4,894 (86%) were deletions (1 exon or larger) and 784 (14%) were duplications (1 exon or larger). There were 1,445 small mutations (smaller than 1 exon, 20% of all mutations), of which 358 (25%) were small deletions and 132 (9%) small insertions and 199 (14%) affected the splice sites. Point mutations totalled 756 (52% of small mutations) with 726 (50%) nonsense mutations and 30 (2%) missense mutations. Finally, 22 (0.3%) mid-intronic mutations were observed. In addition, mutations were identified within the database that would potentially benefit from novel genetic therapies for DMD including stop codon read-through therapies (10% of total mutations) and exon skipping therapy (80% of deletions and 55% of total mutations). PMID:25604253

  5. The TREAT-NMD DMD Global Database: analysis of more than 7,000 Duchenne muscular dystrophy mutations. (United States)

    Bladen, Catherine L; Salgado, David; Monges, Soledad; Foncuberta, Maria E; Kekou, Kyriaki; Kosma, Konstantina; Dawkins, Hugh; Lamont, Leanne; Roy, Anna J; Chamova, Teodora; Guergueltcheva, Velina; Chan, Sophelia; Korngut, Lawrence; Campbell, Craig; Dai, Yi; Wang, Jen; Barišić, Nina; Brabec, Petr; Lahdetie, Jaana; Walter, Maggie C; Schreiber-Katz, Olivia; Karcagi, Veronika; Garami, Marta; Viswanathan, Venkatarman; Bayat, Farhad; Buccella, Filippo; Kimura, En; Koeks, Zaïda; van den Bergen, Janneke C; Rodrigues, Miriam; Roxburgh, Richard; Lusakowska, Anna; Kostera-Pruszczyk, Anna; Zimowski, Janusz; Santos, Rosário; Neagu, Elena; Artemieva, Svetlana; Rasic, Vedrana Milic; Vojinovic, Dina; Posada, Manuel; Bloetzer, Clemens; Jeannet, Pierre-Yves; Joncourt, Franziska; Díaz-Manera, Jordi; Gallardo, Eduard; Karaduman, A Ayşe; Topaloğlu, Haluk; El Sherif, Rasha; Stringer, Angela; Shatillo, Andriy V; Martin, Ann S; Peay, Holly L; Bellgard, Matthew I; Kirschner, Jan; Flanigan, Kevin M; Straub, Volker; Bushby, Kate; Verschuuren, Jan; Aartsma-Rus, Annemieke; Béroud, Christophe; Lochmüller, Hanns


    Analyzing the type and frequency of patient-specific mutations that give rise to Duchenne muscular dystrophy (DMD) is an invaluable tool for diagnostics, basic scientific research, trial planning, and improved clinical care. Locus-specific databases allow for the collection, organization, storage, and analysis of genetic variants of disease. Here, we describe the development and analysis of the TREAT-NMD DMD Global database ( We analyzed genetic data for 7,149 DMD mutations held within the database. A total of 5,682 large mutations were observed (80% of total mutations), of which 4,894 (86%) were deletions (1 exon or larger) and 784 (14%) were duplications (1 exon or larger). There were 1,445 small mutations (smaller than 1 exon, 20% of all mutations), of which 358 (25%) were small deletions and 132 (9%) small insertions and 199 (14%) affected the splice sites. Point mutations totalled 756 (52% of small mutations) with 726 (50%) nonsense mutations and 30 (2%) missense mutations. Finally, 22 (0.3%) mid-intronic mutations were observed. In addition, mutations were identified within the database that would potentially benefit from novel genetic therapies for DMD including stop codon read-through therapies (10% of total mutations) and exon skipping therapy (80% of deletions and 55% of total mutations). © 2015 The Authors. Human Mutation published by Wiley Periodicals, Inc.

  6. SDA-Based Diagnostic and Analysis Tools for Collider Run II

    CERN Document Server

    Papadimitriou, Vaia; Lebrun, Paul; Panacek, S; Slaughter, Anna Jean; Xiao, Aimin


    Operating and improving the understanding of the Fermilab Accelerator Complex for the colliding beam experiments requires advanced software methods and tools. The Shot Data Acquisition and Analysis (SDA) system has been developed to fulfill this need. Data is stored in a relational database, and is served to programs and users via Web-based tools. Summary tables are systematically generated during and after a store. These tables, the Supertable, and the Recomputed Emittances and Recomputed Intensity tables are discussed here. This information is also accessible in JAS3 (Java Analysis Studio version 3).

  7. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    Full Text Available In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  8. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission (United States)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph


    This paper introduces new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.

  9. Development of a Method for Tool Wear Analysis Using 3D Scanning

    Directory of Open Access Journals (Sweden)

    Hawryluk Marek


    Full Text Available The paper deals with the evaluation of a 3D scanning method elaborated by the authors, applied to the analysis of the wear of forging tools. The method consists in applying 3D scanning to the analysis of changes in the geometry of a forging tool by comparing images of a worn tool with a CAD model or an image of a new tool. The method was evaluated in the context of the important measurement problems resulting from the extreme conditions present during industrial hot forging processes. It was used to evaluate the wear of tools with an increasing wear degree, which made it possible to determine the wear characteristics as a function of the number of produced forgings. The following stage was to use it for direct control of the quality and geometry changes of forging tools (without their disassembly) by way of a direct measurement of the geometry of periodically collected forgings (an indirect method based on forgings). The final part of the study points to the advantages and disadvantages of the elaborated method as well as the potential directions of its further development.

  10. Analysis, Design, Implementation and Evaluation of Graphical Design Tool to Develop Discrete Event Simulation Models Using Event Graphs and Simkit

    National Research Council Canada - National Science Library



    ... (OR) modeling and analysis. However, designing and implementing DES can be a time-consuming and error-prone task, This thesis designed, implemented and evaluated a tool, the Event Graph Graphical Design Tool (EGGDT...

  11. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi


    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible; ParaView is used for the 4D visualization of the results, whereas the analyses of die-away plots are done with the ROOT toolkit using a tool named “diana”. To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data into a format that can be read by ParaView and to ease the visualization. (author)

  12. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix


    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  13. SafetyBarrierManager, a software tool to perform risk analysis using ARAMIS's principles

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan


    The ARAMIS project resulted in a number of methodologies, dealing with, among others: the development of standard fault trees and “bowties”; the identification and classification of safety barriers; and the inclusion of the quality of safety management into the quantified risk assessment. After conclusion of the ARAMIS project, Risø National Laboratory started developing a tool that could implement these methodologies, leading to SafetyBarrierManager. The tool is based on the principles of “safety‐barrier diagrams”, which are very similar to “bowties”, with the possibility of performing quantitative analysis. The tool allows constructing comprehensive fault trees, event trees and safety‐barrier diagrams. The tool implements the ARAMIS idea of a set of safety barrier types, to which a number of safety management issues can be linked. By rating the quality of these management issues, the operational probability...

  14. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0 (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan


    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature, as well as data from NIST and other highly respected sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.

  15. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling. (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y


    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available in the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: .

  16. Implementation Analysis of Cutting Tool Carbide with Cast Iron Material S45 C on Universal Lathe (United States)

    Junaidi; hestukoro, Soni; yanie, Ahmad; Jumadi; Eddy


    Cutting tools are the working tools of a lathe. This study analyses the cutting process of a carbide tool on cast iron material on a universal lathe with respect to several aspects, namely cutting force, cutting speed, cutting power, indicated cutting power, temperature in zone 1 and temperature in zone 2. The purpose of this study was to determine the cutting speed, cutting power, electromotor power, and the temperatures in zones 1 and 2 involved in driving the carbide cutting tool when turning cast iron material. The cutting force was obtained from analysis of the relationship between the recommended cutting-force component and the plane of the cut, and the cutting speed was obtained from analysis of the relationship between the recommended cutting speed and the feed rate.

  17. Lean production tools and decision latitude enable conditions for innovative learning in organizations: a multilevel analysis. (United States)

    Fagerlind Ståhl, Anna-Carin; Gustavsson, Maria; Karlsson, Nadine; Johansson, Gun; Ekberg, Kerstin


    The effect of lean production on conditions for learning is debated. This study aimed to investigate how tools inspired by lean production (standardization, resource reduction, visual monitoring, housekeeping, value flow analysis) were associated with an innovative learning climate and with collective dispersion of ideas in organizations, and whether decision latitude contributed to these associations. A questionnaire was sent out to employees in public, private, production and service organizations (n = 4442). Multilevel linear regression analyses were used. Use of lean tools and decision latitude were positively associated with an innovative learning climate and collective dispersion of ideas. A low degree of decision latitude was a modifier in the association with collective dispersion of ideas. Lean tools can enable shared understanding and collective spreading of ideas, needed for the development of work processes, especially when decision latitude is low. Value flow analysis played a pivotal role in the associations. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  18. A compilation of Web-based research tools for miRNA analysis. (United States)

    Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu


    Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression post-transcriptionally in a sequence-specific manner, a number of tools useful for both basic and advanced applications have been released. This is because of the significance of miRNAs in many pathophysiological conditions, including cancer. The numerous bioinformatics tools that have been developed for miRNA analysis have utility for detection, expression, function, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any computing languages. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email:

  19. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.


    The evaluation and certification of packages for transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages that were certified have used a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data.

  20. Integrated analysis tools for trade studies of spacecraft controller and sensor locations (United States)

    Rowell, L. F.


    The present investigation was conducted with the aim of evaluating the practicality and difficulties of modern control design methods for large space structure controls. The evaluation is used as a basis for the identification of useful computer-based analysis tools which would provide insight into control characteristics of a spacecraft concept. A description is presented of the wrap-rib antenna and its packaging concept. Attention is given to active control requirements, a mathematical model of structural dynamics, aspects of sensor and actuator location, the analysis approach, controllability, observability, the concept of balanced realization, transmission zeros, singular value plots, analysis results, model reduction, and an interactive computer program. It is pointed out that the application of selected control analysis tools to the wrap-rib antenna demonstrates several capabilities which can be useful during conceptual design.

  1. R-based Tool for a Pairwise Structure-Activity Relationship Analysis. (United States)

    Klimenko, Kyrylo


    The Structure-Activity Relationship analysis is a complex process that can be enhanced by computational techniques. This article describes a simple tool for SAR analysis that has a graphical user interface and a flexible approach towards the input of molecular data. The application allows calculating molecular similarity, represented by the Tanimoto index and Euclidean distance, as well as determining activity cliffs by means of the Structure-Activity Landscape Index. The calculation is performed in a pairwise manner, either for the reference compound and other compounds or for all possible pairs in the data set. The results of SAR analysis are visualized using two types of plot. The application's capability is demonstrated by the analysis of a set of COX2 inhibitors with respect to Isoxicam. The tool is available online and includes a manual and input file examples. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
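    The similarity and activity-cliff metrics named in the abstract are simple to state. The sketch below is a minimal illustration, not code from the tool itself: fingerprints are represented as sets of on-bits, and the compound names and activity values are invented for demonstration.

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto index on binary fingerprints represented as sets of on-bits."""
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def sali(act_a: float, act_b: float, similarity: float) -> float:
    """Structure-Activity Landscape Index: activity difference divided by
    structural distance; large values flag activity cliffs."""
    if similarity >= 1.0:
        return float("inf")
    return abs(act_a - act_b) / (1.0 - similarity)

# Toy data: fingerprints as sets of on-bit indices, activities as pIC50-like values.
compounds = {
    "ref":   ({1, 2, 3, 5, 8}, 6.2),
    "cmpdA": ({1, 2, 3, 5, 9}, 8.9),  # similar structure, very different activity
    "cmpdB": ({2, 4, 7, 11},   6.0),  # dissimilar structure, similar activity
}

ref_fp, ref_act = compounds["ref"]
for name, (fp, act) in compounds.items():
    if name != "ref":
        sim = tanimoto(ref_fp, fp)
        print(f"{name}: Tanimoto={sim:.2f}, SALI={sali(ref_act, act, sim):.2f}")
```

In this toy data, cmpdA scores a much higher SALI than cmpdB, which is exactly the activity-cliff pattern the index is designed to surface.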

  2. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives

    Directory of Open Access Journals (Sweden)

    Bruno T. L. Nichio


    Full Text Available Nowadays, defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, annotation of genomes and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones for the prediction of orthologous groups. Literature in this field of research describes the problems that the majority of available tools show, such as those encountered in accuracy and time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques), and the automation of the process without requiring manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We listed the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST “all-against-all” methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses the Venn diagram to show the relationship of ortholog groups generated by its algorithm); or proteinOrtho (which improves the accuracy of ortholog groups); or ReMark (tackling the integration of the pipeline to make the entry process automatic); or OrthAgogue (using algorithms developed to

  3. Financial analysis of community-based forest enterprises with the Green Value tool (United States)

    S. Humphries; Tom Holmes


    The Green Value tool was developed in response to the need for simplified procedures that could be used in the field to conduct financial analysis for community-based forest enterprises (CFEs). Initially our efforts focused on a set of worksheets that could be used by both researchers and CFEs to monitor and analyze costs and income for one production period. The...

  4. Agroinjection of Tomato Fruits : a Tool for Rapid Functional Analysis of Transgenes Directly in Fruit

    NARCIS (Netherlands)

    Orzaéz Calatayud, D.V.; Mirabel, S.; Wieland, W.H.; Granell, A.


    Transient expression of foreign genes in plant tissues is a valuable tool for plant biotechnology. To shorten the time for gene functional analysis in fruits, we developed a transient methodology that could be applied to tomato (Solanum lycopersicum cv Micro Tom) fruits. It was found that injection

  5. Tool to estimate optical metrics from summary wave-front analysis data in the human eye

    NARCIS (Netherlands)

    Jansonius, Nomdo M.

    Purpose: Studies in the field of cataract and refractive surgery often report only summary wave-front analysis data, data that are too condensed to allow for a retrospective calculation of metrics relevant to visual perception. The aim of this study was to develop a tool that can be used to estimate

  6. Advances in geospatial analysis platforms and tools: Creating space for differentiated policy and investment responses

    CSIR Research Space (South Africa)

    Maritz, Johan


    Full Text Available Over the last 5 years a set of incremental advances within geospatial analysis platforms and tools developed by the CSIR's Planning Support Systems in collaboration with key stakeholders such as The Presidency, enabled a more nuanced regional level...

  7. The use of case tools in OPG safety analysis code qualification

    International Nuclear Information System (INIS)

    Pascoe, J.; Cheung, A.; Westbye, C.


    Ontario Power Generation (OPG) is currently qualifying its critical safety analysis software. The software quality assurance (SQA) framework is described. Given the legacy nature of much of the safety analysis software the reverse engineering methodology has been adopted. The safety analysis suite of codes was developed over a period of many years to differing standards of quality and had sparse or incomplete documentation. Key elements of the reverse engineering process require recovery of design information from existing coding. This recovery, if performed manually, could represent an enormous effort. Driven by a need to maximize productivity and enhance the repeatability and objectivity of software qualification activities the decision was made to acquire or develop and implement Computer Aided Software Engineering (CASE) tools. This paper presents relevant background information on CASE tools and discusses how the OPG SQA requirements were used to assess the suitability of available CASE tools. Key findings from the application of CASE tools to the qualification of the OPG safety analysis software are discussed. (author)

  8. UAV : Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian


    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  9. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.


    management techniques and a vast array of computer aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...

  10. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes (United States)

    Tahmasebi, Farhad; Pearce, Robert


    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  11. State Civic Education Policy: Framework and Gap Analysis Tool. Special Report (United States)

    Baumann, Paul; Brennan, Jan


    The civic education policy framework and gap analysis tool are intended to guide state leaders as they address the complexities of preparing students for college, career and civic life. They allow for adaptation to state- and site-specific circumstances and may be adopted in whole or in piecemeal fashion, according to states' individual…

  12. Indexing Combined with Statistical Deflation as a Tool for Analysis of Longitudinal Data. (United States)

    Babcock, Judith A.

    Indexing is a tool that can be used with longitudinal, quantitative data for analysis of relative changes and for comparisons of changes among items. For greater accuracy, raw financial data should be deflated into constant dollars prior to indexing. This paper demonstrates the procedures for indexing, statistical deflation, and the use of…
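    The two-step procedure described above, deflating raw financial data into constant dollars and then indexing against a base period, can be sketched as follows. The revenue and price-index figures below are invented for illustration only.

```python
# Hypothetical nominal revenues and a hypothetical price index (CPI-like).
nominal = {2018: 100_000, 2019: 104_000, 2020: 110_000}
cpi     = {2018: 251.1,   2019: 255.7,   2020: 258.8}

base_year = 2018

# Step 1: deflate nominal values into constant base-year dollars.
real = {yr: nominal[yr] * cpi[base_year] / cpi[yr] for yr in nominal}

# Step 2: index the deflated series so the base year equals 100.
index = {yr: 100.0 * real[yr] / real[base_year] for yr in real}

for yr in sorted(index):
    print(f"{yr}: real={real[yr]:,.0f}  index={index[yr]:.1f}")
```

Comparing the indexed series across items then shows relative change in constant-dollar terms, which is the comparison the paper advocates.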

  13. Analysis of Java Client/Server and Web Programming Tools for Development of Educational Systems. (United States)

    Muldner, Tomasz

    This paper provides an analysis of old and new programming tools for development of client/server programs, particularly World Wide Web-based programs. The focus is on development of educational systems that use interactive shared workspaces to provide portable and expandable solutions. The paper begins with a short description of relevant terms.…

  14. ULg Spectra: An Interactive Software Tool to Improve Undergraduate Students' Structural Analysis Skills (United States)

    Agnello, Armelinda; Carre, Cyril; Billen, Roland; Leyh, Bernard; De Pauw, Edwin; Damblon, Christian


    The analysis of spectroscopic data to solve chemical structures requires practical skills and drills. In this context, we have developed ULg Spectra, a computer-based tool designed to improve the ability of learners to perform complex reasoning. The identification of organic chemical compounds involves gathering and interpreting complementary…

  15. Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Brian; Melaina, Marc; Penev, Michael


    This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.

  16. Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)

    Energy Technology Data Exchange (ETDEWEB)

    Hogden, John Edward [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Unal, Cetin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has continued. In this paper we describe enhancements to the program as of 2014.

  17. On Fast Fourier Transform-A Popular Tool for Spectrum Analysis

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 3, Issue 10. On Fast Fourier Transform - A Popular Tool for Spectrum Analysis. V Umapathi Reddy, Electrical Communication Engineering, Indian Institute of Science, Bangalore 560012, India.
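    The record above carries only journal metadata; as a minimal illustration of the article's topic, the sketch below recovers the frequency of a sampled tone with NumPy's FFT. The signal parameters are arbitrary examples, not taken from the article.

```python
import numpy as np

fs = 1000                              # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)            # one second of samples
signal = np.sin(2 * np.pi * 50 * t)    # a 50 Hz tone

# Magnitude spectrum of the real-valued signal and its frequency axis.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak:.1f} Hz")   # expect 50.0 Hz
```

Because the tone falls exactly on an FFT bin here, the peak is recovered exactly; in general, windowing and interpolation refine the estimate.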

  18. TTCScope - An analysis tool for the Trigger, Timing and Control signals of the LHC experiments


    Moosavi, P; Ohm, C; Pauly, T


    This document describes a scope-based signal analysis tool for the Trigger, Timing and Control system of the LHC experiments. The TTC signal is digitized and read out using a digital sampling oscilloscope, and then analyzed in software. From the sampled signal, the BC clock is recovered along with the signal contents: level-1 accepts, commands, and trigger types. Two common use-cases in the ATLAS experiment are addressed: analysis of TTC signals and calibration of TTC crates. The latter inclu...

  19. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han


    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
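    As a hedged illustration of variance-based sensitivity indices, and not the authors' machine-tool error model, the sketch below assumes the output error is a linear combination of independent geometric error sources, for which the first-order indices have a closed form; all coefficients and variances are invented.

```python
# Toy additive model: E = sum(c_i * g_i) with independent error sources g_i.
# For this case the first-order variance-based sensitivity index is analytic:
#     S_i = c_i**2 * Var(g_i) / Var(E)
coeffs    = [1.0, 0.5, 2.0]      # hypothetical error-propagation coefficients
variances = [0.04, 0.09, 0.01]   # hypothetical variances of each geometric error

var_e = sum(c**2 * v for c, v in zip(coeffs, variances))
indices = [c**2 * v / var_e for c, v in zip(coeffs, variances)]

for i, s in enumerate(indices, start=1):
    print(f"S_{i} = {s:.3f}")
# For an additive model the first-order indices sum to 1.
print("sum =", round(sum(indices), 6))
```

Ranking the indices identifies which geometric error contributes most to the volumetric error variance, which is the practical use of such an analysis.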


    Directory of Open Access Journals (Sweden)

    Radu BISCA


    Full Text Available The main objective of this project is to develop a set of tools and to integrate techniques in a software package which is built on structure analysis applications based on Romanian engineers' experience in designing and analysing aerospace structures, consolidated with the most recent methods and techniques. The applications automate the structure's design and analysis processes and facilitate the exchange of technical information between the partners involved in a complex aerospace project without limiting the domain.

  1. FUTISTREFFIT : Participatory Action Research: analysis and evaluation of football as a community youth development tool


    Wesseh, Cucu


    Wesseh Cucu. Thesis: Futistreffit – analysis and evaluation. Language: English. Content: 53 pages, 2 appendices. Degree: Bachelor of Social Services. Focus: Community Development. Institution: Diaconia University of Applied Sciences, Järvenpää The aim of this research is to examine football as a positive youth development tool for Learning-Integration. It focuses on community youth work and uses action research as the prime method of analysis and evaluation. The subject of researc...

  2. HBS-Tools for Hairpin Bisulfite Sequencing Data Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Ming-an Sun


    Full Text Available The emerging genome-wide hairpin bisulfite sequencing (hairpin-BS-Seq technique enables the determination of the methylation pattern for DNA double strands simultaneously. Compared with traditional bisulfite sequencing (BS-Seq techniques, hairpin-BS-Seq can determine methylation fidelity and increase mapping efficiency. However, no computational tool has been designed for the analysis of hairpin-BS-Seq data yet. Here we present HBS-tools, a set of command line based tools for the preprocessing, mapping, methylation calling, and summarizing of genome-wide hairpin-BS-Seq data. It accepts paired-end hairpin-BS-Seq reads to recover the original (pre-bisulfite-converted sequences using global alignment and then calls the methylation statuses for cytosines on both DNA strands after mapping the original sequences to the reference genome. After applying to hairpin-BS-Seq datasets, we found that HBS-tools have a reduced mapping time and improved mapping efficiency compared with state-of-the-art mapping tools. The HBS-tools source scripts, along with user guide and testing data, are freely available for download.
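    The double-strand recovery idea can be illustrated with a toy example. The sketch below is a simplification of what HBS-tools does, with invented sequences: at a position where the converted top-strand read shows T, the paired bottom-strand read (here assumed already reverse-complemented into top-strand coordinates) reveals whether the original base was an unmethylated C.

```python
def recover_original(top_read: str, bottom_read: str):
    """Recover the original top-strand sequence from a hairpin read pair.

    `top_read` is the bisulfite-converted top strand; `bottom_read` is the
    converted bottom strand, reverse-complemented into top-strand coordinates.
    Unmethylated C reads as T on its own strand, but the paired strand still
    shows C at that position because its G is unaffected by conversion.
    """
    original, methylation = [], []
    for t, b in zip(top_read, bottom_read):
        if t == "T" and b == "C":
            original.append("C")   # unmethylated C, converted on the top strand
            methylation.append("u")
        else:
            original.append(t)
            methylation.append("m" if t == "C" else ".")
    return "".join(original), "".join(methylation)

# Original top strand ACGTCG, first C methylated, second C unmethylated:
top    = "ACGTTG"   # the unmethylated C reads as T after conversion
bottom = "ACATCA"   # bottom strand in top coordinates still shows that C
print(recover_original(top, bottom))   # → ('ACGTCG', '.m..u.')
```

Real hairpin-BS-Seq data additionally require alignment, quality filtering, and handling of both CpG and non-CpG contexts, which is what the full HBS-tools pipeline provides.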

  3. A web tool for age-period-cohort analysis of cancer incidence and mortality rates. (United States)

    Rosenberg, Philip S; Check, David P; Anderson, William F


    Age-period-cohort (APC) analysis can inform registry-based studies of cancer incidence and mortality, but concerns about statistical identifiability and interpretability, as well as the learning curves of statistical software packages, have limited its uptake. We implemented a panel of easy-to-interpret estimable APC functions and corresponding Wald tests in R code that can be accessed through a user-friendly Web tool. Input data for the Web tool consist of age-specific numbers of events and person-years over time, in the form of a rate matrix of paired columns. Output functions include model-based estimators of cross-sectional and longitudinal age-specific rates, period and cohort rate ratios that incorporate the overall annual percentage change (net drift), and estimators of the age-specific annual percentage change (local drifts). The Web tool includes built-in examples for teaching and demonstration. User data can be input from a Microsoft Excel worksheet or by uploading a comma-separated-value file. Model outputs can be saved in a variety of formats, including R and Excel. APC methodology can now be carried out through a freely available user-friendly Web tool. The tool can be accessed at The Web tool can help cancer surveillance researchers make important discoveries about emerging cancer trends and patterns. ©2014 American Association for Cancer Research.
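    The net drift reported by the tool is the overall annual percentage change in rates. As a hedged illustration, not the Web tool's estimator (which comes from a full APC model), the sketch below recovers a known drift from synthetic rates with a log-linear least-squares fit.

```python
import math

# Synthetic age-standardized rates with a true 2% annual increase.
years = list(range(2000, 2011))
rates = [50.0 * (1.02 ** (y - years[0])) for y in years]   # per 100,000

# Least-squares fit of log(rate) = a + b*year; net drift = 100*(e^b - 1).
n = len(years)
xbar = sum(years) / n
ybar = sum(math.log(r) for r in rates) / n
num = sum((x - xbar) * (math.log(r) - ybar) for x, r in zip(years, rates))
den = sum((x - xbar) ** 2 for x in years)
b = num / den

net_drift = 100.0 * (math.exp(b) - 1.0)
print(f"net drift: {net_drift:.2f}% per year")   # recovers the simulated 2%
```

In the actual APC framework the drift is estimated jointly with age, period, and cohort effects from the full rate matrix, but the interpretation as an overall annual percentage change is the same.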

  4. An analysis of the Grade 3 Department of Basic Education workbooks as curriculum tools

    Directory of Open Access Journals (Sweden)

    Ursula Hoadley


    Full Text Available Since 2011, the Department of Basic Education (DBE) has provided all Grade 1 to 6 learners in public schools with literacy/language, numeracy/mathematics and life skills workbooks. This study provides an assessment of the purpose to which the workbooks are best suited by analysing the Grade 3 Mathematics and Home Language English workbooks for 2015 in the light of the DBE’s intentions for the workbooks. The study considers alignment of the workbooks with the official national curriculum, the Curriculum and Assessment Policy Statement (CAPS), as well as ‘conceptual signalling’ and progression in the content of the workbooks. We then evaluate the kind of ‘curriculum tool’ the workbooks in their current format conform to. We explore three possibilities in the light of the DBE’s proposals for workbook use: a practice tool, an assessment tool, and a monitoring tool. We also reflect on the workbooks as a teaching tool. Our analysis suggests that, in line with the DBE’s intended purpose for the workbooks, they best represent a practice curriculum tool. We highlight the significance of the high level of curriculum compliance of the workbooks, and indicate that this would render them an effective monitoring tool for assessing the quantitative coverage of the curriculum at a systemic level.

  5. Does tool use extend peripersonal space? A review and re-analysis. (United States)

    Holmes, Nicholas P


    The fascinating idea that tools become extensions of our body appears in artistic, literary, philosophical, and scientific works alike. In the last 15 years, this idea has been reframed into several related hypotheses, one of which states that tool use extends the neural representation of the multisensory space immediately surrounding the hands (variously termed peripersonal space, peri-hand space, peri-cutaneous space, action space, or near space). This and related hypotheses have been tested extensively in the cognitive neurosciences, with evidence from molecular, neurophysiological, neuroimaging, neuropsychological, and behavioural fields. Here, I briefly review the evidence for and against the hypothesis that tool use extends a neural representation of the space surrounding the hand, concentrating on neurophysiological, neuropsychological, and behavioural evidence. I then provide a re-analysis of data from six published and one unpublished experiments using the crossmodal congruency task to test this hypothesis. While the re-analysis broadly confirms the previously reported finding that tool use does not literally extend peripersonal space, the overall effect sizes are small and statistical power is low. I conclude by questioning whether the crossmodal congruency task can indeed be used to test the hypothesis that tool use modifies peripersonal space.

  6. A library of cortical morphology analysis tools to study development, aging and genetics of cerebral cortex. (United States)

    Kochunov, Peter; Rogers, William; Mangin, Jean-Francois; Lancaster, Jack


    Sharing of analysis techniques and tools is among the main driving forces of modern neuroscience. We describe a library of tools developed to quantify global and regional differences in cortical anatomy in high-resolution structural MR images. This library is distributed as a plug-in application for the popular structural analysis software BrainVisa (BV). It contains tools to measure global and regional gyrification, gray matter thickness and sulcal and gyral white matter spans. We provide a description of each tool and examples from several case studies to demonstrate their use. These examples show how the BV library was used to study the cortical folding process during antenatal development and the recapitulation of this process during cerebral aging. Further, the BV library was used to perform translational research in humans and non-human primates on the genetics of cerebral gyrification. This library, including source code and self-contained binaries for popular computer platforms, is available from the NIH Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC) resource.

  7. ConfBuster: Open-Source Tools for Macrocycle Conformational Search and Analysis

    Directory of Open Access Journals (Sweden)

    Xavier Barbeau


    Full Text Available Macrocycles are cyclic macromolecules that have gained increased interest in drug development. To our knowledge, the bioinformatics tools currently available to investigate and predict macrocycles’ 3D conformations are limited in their availability. In this paper, we introduce ConfBuster, a suite of tools written in Python with the goal of sampling the lower-energy conformations of macrocycles. The suite also includes tools for the analysis and visualisation of the conformational search results. Coordinate sets of single molecules in MOL2 or PDB format are required as input, and a set of lower-energy conformation coordinates is returned as output, along with PyMOL scripts and graphics for results analysis. In addition to Python and the optional R programming language with freely available packages, the tools require Open Babel and PyMOL to work properly. For several examples, ConfBuster found macrocycle conformations within a few tenths of an Å of the experimental structures in minutes. To our knowledge, this is the only open-source tool suite for macrocycle conformational search available to the scientific community.

  8. A comprehensive comparison of tools for differential ChIP-seq analysis. (United States)

    Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland; Herrmann, Carl


    ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Besides detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. © The Author 2016. Published by Oxford University Press.

  9. Factor analysis as a tool in groundwater quality management: two southern African case studies (United States)

    Love, David; Hallbauer, Dieter; Amos, Amos; Hranova, Roumiana

    Although developed as a tool in the social sciences, R-mode factor analysis, a multivariate statistical technique, has proven highly effective in studies of groundwater quality. The technique examines the relationships between variables (such as chemical parameters in groundwater) across a number of cases (such as sampling points). In this study, two examples are presented. The first concerns groundwater around a southern African iron ore mine, and the second groundwater in the vicinity of a southern African municipal sewage disposal works. Groundwater samples were collected, their chemistry analysed and factor analysis performed on each of the chemical datasets. In the first case study, factor analysis successfully separated signatures due to uncontaminated groundwater (calcium, magnesium and bicarbonate), agricultural activities (potassium and ammonium) and mining activities (sodium, chloride and sulphate). In the second case study, factor analysis did identify a chemical signature (nitrate and phosphate; minor iron) related to the sewage works, but since this signature involved parameters that were within regulated limits, the finding was of limited value for management purposes. Thus, although R-mode factor analysis can be a valuable tool in studies of groundwater quality, this is not always the case. Multivariate statistical techniques like factor analysis should thus be used as a supplement to, not a replacement for, conventional groundwater quality data treatment methods.
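As an illustration of the technique named in the record above, a minimal R-mode factor-analysis sketch can be built from the eigen-decomposition of the correlation matrix of the chemical variables. The Python code below is a generic illustration on synthetic data, not the procedure or data of the case studies:

```python
import numpy as np

def factor_loadings(data, n_factors=2):
    """Principal-factor loadings from the correlation matrix of the
    variables (columns). Minimal R-mode sketch: variables that share a
    common source load strongly on the same factor."""
    corr = np.corrcoef(data, rowvar=False)      # variable-by-variable correlations
    vals, vecs = np.linalg.eigh(corr)           # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_factors]  # keep the largest ones
    # loading = eigenvector component scaled by sqrt(eigenvalue)
    return vecs[:, order] * np.sqrt(vals[order])

# synthetic "samples": two correlated variables (a shared source,
# e.g. a mining signature) plus one unrelated variable
rng = np.random.default_rng(0)
source = rng.normal(size=200)
data = np.column_stack([
    source + 0.1 * rng.normal(size=200),  # e.g. sodium
    source + 0.1 * rng.normal(size=200),  # e.g. chloride
    rng.normal(size=200),                 # unrelated parameter
])
L = factor_loadings(data, n_factors=1)
# the first two variables load strongly on the same factor; the third does not
```

The sign of a loading column is arbitrary (eigenvectors are defined up to sign), so interpretation rests on the relative magnitudes within a factor.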


    Directory of Open Access Journals (Sweden)

    K. Kovács


    Full Text Available The improvement of detailed surface documentation methods provides unique tool-mark study opportunities in the field of archaeological research. One of these data collection techniques is short-range laser scanning, which creates a digital copy of an object’s morphological characteristics from high-resolution datasets. The aim of our work was the accurate documentation of a Bronze Age sluice box from Mitterberg, Austria, at a spatial resolution of 0.2 mm, together with the investigation of the entirely preserved tool marks on the surface of this archaeological find using these datasets. The methodology of this tool-mark study can be summarized in the following way: first, a local hydrologic analysis was applied to separate the various tool patterns on the find’s surface. As a result, the XYZ coordinates of the special points which represent the edge lines of the sliding tool marks were calculated by buffer operations in a GIS environment. During the second part of the workflow, these edge points were used to manually clip the triangle meshes of these patterns in reverse-engineering software. Finally, circle features were generated and analysed to determine the different sections along these sliding tool marks. In conclusion, the movement of the hand tool could be reproduced by spatial analysis of the created features, since the horizontal and vertical positions of the defined circle centre points indicated the various phases of the movements. This research shows an exact workflow for determining the fine morphological structures on the surface of an archaeological find.

  11. Time series analysis of tool wear in sheet metal stamping using acoustic emission (United States)

    Vignesh Shanbhag, V.; Pereira, P. Michael; Rolfe, F. Bernard; Arunachalam, N.


    Galling is an adhesive wear mode that often limits the lifespan of stamping tools. Since stamping tools represent a significant economic cost, even a slight improvement in maintenance cost is of high importance for the stamping industry. In other manufacturing industries, online tool condition monitoring has been used to prevent tool wear-related failure. However, monitoring the acoustic emission signal from a stamping process is a non-trivial task, since the acoustic emission signal is non-stationary and non-transient. There have been numerous studies examining acoustic emissions in sheet metal stamping, but very few have focused in detail on how the signals change as wear on the tool surface progresses prior to failure. In this study, time domain analysis was applied to the acoustic emission signals to extract features related to tool wear. To understand the wear progression, accelerated stamping tests were performed using a semi-industrial stamping setup which can perform clamping, piercing and stamping in a single cycle. The time domain features related to stamping were computed from the acoustic emission signal of each part. The sidewalls of the stamped parts were scanned using an optical profilometer to obtain profiles of the worn part, and these were qualitatively correlated with the acoustic emission signal. Based on the wear behaviour, the wear data can be divided into three stages: in the first stage, no wear is observed; in the second stage, adhesive wear is likely to occur; and in the third stage, severe abrasive plus adhesive wear is likely to occur. Scanning electron microscopy showed the formation of lumps on the stamping tool, which represents galling behaviour. The correlation between the time domain features of the acoustic emission signal and the wear progression identified in this study lays the basis for tool diagnostics in the stamping industry.

  12. Playbook Data Analysis Tool: Collecting Interaction Data from Extremely Remote Users (United States)

    Kanefsky, Bob; Zheng, Jimin; Deliz, Ivonne; Marquez, Jessica J.; Hillenius, Steven


    Typically, user tests for software tools are conducted in person. At NASA, the users may be located at the bottom of the ocean in a pressurized habitat, above the atmosphere in the International Space Station, or in an isolated capsule on a simulated asteroid mission. The Playbook Data Analysis Tool (P-DAT) is a human-computer interaction (HCI) evaluation tool that the NASA Ames HCI Group has developed to record user interactions with Playbook, the group's existing planning-and-execution software application. Once the remotely collected user interaction data makes its way back to Earth, researchers can use P-DAT for in-depth analysis. Since a critical component of the Playbook project is to understand how to develop more intuitive software tools for astronauts to plan in space, P-DAT helps guide us in the development of additional easy-to-use features for Playbook, informing the design of future crew autonomy tools. P-DAT has demonstrated the capability of discreetly capturing usability data in a manner that is transparent to Playbook's end-users. In our experience, P-DAT data has already shown its utility, revealing potential usability patterns, helping diagnose software bugs, and identifying metrics and events that are pertinent to Playbook usage as well as spaceflight operations. As we continue to develop this analysis tool, P-DAT may yet provide a method for long-duration, unobtrusive human performance collection and evaluation for mission controllers back on Earth and researchers investigating the effects and mitigations related to future human spaceflight performance.

  13. Python Spectral Analysis Tool (PySAT) for Preprocessing, Multivariate Analysis, and Machine Learning with Point Spectra (United States)

    Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.


    We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.

  14. Error Propagation Analysis in the SAE Architecture Analysis and Design Language (AADL) and the EDICT Tool Framework (United States)

    LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.


    This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.

  15. Credit analysis using cognitive maps as a tool to support decision making in a factoring company

    Directory of Open Access Journals (Sweden)

    Sergio Renato Ferreira Decker


    Full Text Available This study presents a case study conducted in a factoring company in the city of Pelotas, Rio Grande do Sul. In order to support the company's managers, the research aimed to develop a tool to support the decision to grant credit to new customers. To better understand the problem, with all its complexity, a Cognitive Map of the credit analysis process was developed constructively together with the decision maker. Through the resulting Cognitive Map, the decision maker now possesses an important tool to describe their decisions. The research will proceed with the development of the remaining steps necessary to build a multicriteria decision model.

  16. Model and numerical analysis of mechanical phenomena of tools steel hardening

    Directory of Open Access Journals (Sweden)

    A. Bokota


    Full Text Available This paper presents a model of tool steel hardening that takes mechanical phenomena into consideration. Stress and strain fields are obtained from FEM solutions of the equilibrium equations in rate form. The stresses generated during hardening were assumed to result from thermal load, structural deformation, plastic deformation and transformation plasticity. Thermophysical values in the constitutive relations depend upon both the temperature and the phase composition. The Huber-Mises condition with isotropic strengthening is used for the creation of plastic strains, and the Leblond model is applied to determine transformation plasticity. The analysis of stresses associated with the hardening of elements made of tool steel was performed.



    Ivanov, Stanislav


    This master thesis is part of the ongoing research on the EAT development project. Its main goal is to research whether the Eclipse Modeling Project (EMP) can be used as an alternative to NetBeans as a platform for implementing the EAT tool. To fulfill this goal, it contains an analysis of the current EAT tool version and a design study for a new version using EMP. The design addresses most of the issues related to building a new version and eventually recommends porting EAT to EMP.

  18. Tool wear monitoring by machine learning techniques and singular spectrum analysis (United States)

    Kilundu, Bovic; Dehombreux, Pierre; Chiementin, Xavier


    This paper explores the use of data mining techniques for tool condition monitoring in metal cutting. Pseudo-local singular spectrum analysis (SSA) is performed on vibration signals measured on the toolholder. This is coupled to a band-pass filter to allow definition and extraction of features which are sensitive to tool wear. These features are defined, in some frequency bands, from sums of Fourier coefficients of reconstructed and residual signals obtained by SSA. This study highlights two important aspects: strong relevance of information in high frequency vibration components and benefits of the combination of SSA and band-pass filtering to get rid of useless components (noise).
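The SSA step named in the record above can be sketched generically: embed the signal in a trajectory matrix, take its SVD, and rebuild reconstructed and residual series by anti-diagonal averaging. The Python sketch below uses a synthetic signal and omits the paper's band-pass filtering and pseudo-local variant:

```python
import numpy as np

def ssa_reconstruct(x, window, n_components):
    """Generic singular spectrum analysis: embed the 1-D signal in a
    trajectory (Hankel) matrix, SVD it, and reconstruct the signal from
    the leading components by anti-diagonal averaging."""
    N = len(x)
    K = N - window + 1
    # trajectory matrix: column j is the window starting at sample j
    traj = np.column_stack([x[i:i + window] for i in range(K)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    approx = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # anti-diagonal (Hankel) averaging back to a 1-D series
    recon = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        recon[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return recon / counts

# synthetic example: a sinusoid buried in noise
t = np.linspace(0, 1, 400)
signal = np.sin(2 * np.pi * 10 * t)
noisy = signal + 0.3 * np.random.default_rng(1).normal(size=t.size)
clean = ssa_reconstruct(noisy, window=40, n_components=2)
# the residual (noisy - clean) collects the noise-like components
```

In the paper's setting, sums of Fourier coefficients of the reconstructed and residual series within selected frequency bands serve as the wear-sensitive features.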

  19. An introduction to TR-X: a simplified tool for standardized analysis

    Energy Technology Data Exchange (ETDEWEB)

    Johns, Russell C [Los Alamos National Laboratory]; Waters, Laurie S [Los Alamos National Laboratory]; Fallgren, Andrew J [Los Alamos National Laboratory]; Ghoreson, Gregory G [UNIV OF TEXAS]


    TR-X is a multi-platform program that provides a graphical interface to Monte Carlo N-Particle transport (MCNP) and Monte Carlo N-Particle transport eXtended (MCNPX) codes. Included in this interface are tools to reduce the tedium of input file creation, provide standardization of model creation and analysis, and expedite the execution of the created models. TR-X provides tools to make the rapid testing of multiple permutations of these models easier, while also building in standardization that allows multiple solutions to be compared.

  20. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard


    The utilization of microprocessor systems in nuclear installation control requires a high degree of safety in the operation of the installation and in the protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) will have at its disposal tools which permit controls to be carried out throughout the life of the software. The simulation and test tool (OST) which has been created is implemented entirely in software. It is used on VAX computers and can easily be ported to other computers [fr]

  1. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)


    Big data is prevalent in HPC. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance of such executions, involving terabytes or petabytes of workflow data or measurement data, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  2. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.


    As the determination of ultrahigh reliability figures for safety-critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time-consuming, effort-intensive and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many ''general-purpose'' software analysis tools, both static and dynamic, which help in analyzing source code. However, they are not designed to assess adherence to the specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in a high-level language, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high-level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  3. CloudMan as a platform for tool, data, and analysis distribution

    Directory of Open Access Journals (Sweden)

    Afgan Enis


    Full Text Available Abstract Background Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves the accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.

  4. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics (United States)

    Billeci, Lucia; Magliaro, Chiara; Pioggia, Giovanni; Ahluwalia, Arti


    Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so does the amount and complexity of data generated. The NEuronMOrphological analysis tool NEMO was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis (PCA). Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism. PMID:23420185

  5. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.


    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  6. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources. (United States)

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak


    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions.Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  7. RADYBAN: A tool for reliability analysis of dynamic fault trees through conversion into dynamic Bayesian networks

    International Nuclear Information System (INIS)

    Montani, S.; Portinale, L.; Bobbio, A.; Codetta-Raiteri, D.


    In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which makes it possible to analyze a dynamic fault tree by converting it into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network and exploits classical algorithms for inference on dynamic Bayesian networks in order to compute reliability measures. After describing the basic features of the tool, we show how it operates on a real-world example and we compare the unreliability results it generates with those returned by other methodologies, in order to verify the correctness and the consistency of the results obtained.

  8. SustainPro - A tool for systematic process analysis, generation and evaluation of sustainable design alternatives

    DEFF Research Database (Denmark)

    Carvalho, Ana; Matos, Henrique A.; Gani, Rafiqul


    Chemical processes are continuously facing challenges from the demands of the global market related to economics, environment and social issues. This paper presents the development of a software tool (SustainPro) and its application to chemical processes operating in batch or continuous modes. The software tool is based on the implementation of an extended systematic methodology for sustainable process design (Carvalho et al. 2008 and Carvalho et al. 2009). Using process information/data such as the process flowsheet, the associated mass/energy balance data and the cost data, SustainPro guides the user through the necessary steps according to the work-flow of the implemented methodology. At the end, the design alternatives are evaluated using environmental impact assessment tools and safety indices. The extended features of the methodology incorporate Life Cycle Assessment analysis and economic analysis.

  9. Biofac, a microbiological multimedia tool to perform the analysis of activated sludge

    International Nuclear Information System (INIS)

    Ferrer Torregrosa, C.; Llopis Nicolau, A.; Claramonte Santarrufina, J.; Alonso Hernandez, S.


    The composition and structure of the macrobiota that forms part of the activated sludge, its temporal evolution, and the analysis of its macroscopic and microscopic characteristics are a source of information of great help to plant operators in making decisions. Lack of training and of access to specific information, together with the missing standardization of analysis processes, hinders the implementation and interpretation of these analyses. Using a multimedia tool on DVD, Facsa has developed Biofac, an application that documents and illustrates the most relevant aspects that allow the user to perform the analysis of activated sludge. (Author)

  10. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm (United States)

    Pak, Chan-gi; Li, Wesley


    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
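    The core loop of a genetic algorithm of the kind described, selection, crossover, and mutation over a real-valued design vector, can be sketched as follows (the toy objective, population size, and operator choices are illustrative, not Dryden's actual implementation):

```python
import random

def fitness(x):
    # toy objective to minimize; a real MDAO run would call the analysis tools here
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def genetic_minimize(fitness, bounds, pop_size=40, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = [(a + b) / 2.0 for a, b in zip(p1, p2)]  # blend crossover
            i = rng.randrange(len(child))
            child[i] += rng.gauss(0.0, 0.1)                  # gaussian mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = genetic_minimize(fitness, [(-10.0, 10.0), (-10.0, 10.0)])
```

Because the algorithm only needs objective values, not gradients, the `fitness` call can wrap any combination of black-box structural, aerodynamic, or cost analyses, which is what makes it attractive for multidisciplinary preliminary design.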

  11. System analysis: A tool for determining the feasibility of PACS for radiology

    International Nuclear Information System (INIS)

    Parrish, D.M.; Thompson, B.G.; Creasy, J.L.; Wallace, R.J.


    In the emerging technology of picture archiving and communication systems (PACS), the real productivity improvements that such a system may provide are being evaluated with a systems analysis tool. This computer model allows a simulated comparison of manual versus digital departmental functions by allowing changes to operational parameter times, physical distances in the department, and equipment and manpower resources; examples are presented. The presentation focuses on the analysis approach, the operational parameters most important in the digital environment, and an analysis of the potential productivity improvements.
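    A minimal sketch of the kind of parameter-driven comparison such a model performs: per-study step times for the manual (film) and digital workflows are summed and compared. All step names and times below are invented placeholders, not measured departmental data:

```python
# step times in minutes per study (illustrative parameter values)
manual_steps = {"develop film": 5.0, "sort/hang films": 3.0,
                "transport to reading room": 4.0, "file film jacket": 2.0}
digital_steps = {"send to archive": 0.5, "auto-route to workstation": 0.2,
                 "retrieve priors": 0.3}

def turnaround(steps):
    """Total handling time per study for one workflow, in minutes."""
    return sum(steps.values())

def studies_per_shift(steps, shift_minutes=480, staff=1):
    """Crude throughput bound for one shift, ignoring queueing effects."""
    return int(staff * shift_minutes // turnaround(steps))

saved_per_study = turnaround(manual_steps) - turnaround(digital_steps)
```

A real systems analysis would add queueing, physical distances, and resource contention, but even this level of model makes explicit which operational parameters dominate the productivity comparison.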

  12. Virtual tool mark generation for efficient striation analysis in forensic science

    Energy Technology Data Exchange (ETDEWEB)

    Ekstrand, Laura [Iowa State Univ., Ames, IA (United States)]


    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5 degrees and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her analysis on a smaller range of angles.
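    The compare-virtual-marks-at-angle-increments idea can be sketched as follows; a made-up 1-D profile function stands in for the real projected tip geometry, and plain normalized cross-correlation stands in for the Chumbley et al. (2010) statistic:

```python
import math

def virtual_mark(angle_deg, n=200):
    """Toy 1-D mark profile: a stand-in for projecting the scanned tip
    geometry at a given attack angle in the 3D graphics package."""
    a = math.radians(angle_deg)
    return [math.sin(0.3 * i * math.cos(a)) + 0.2 * math.sin(1.7 * i * math.sin(a))
            for i in range(n)]

def correlation(x, y):
    """Normalized cross-correlation of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

evidence = virtual_mark(45.0)      # pretend this came from the scanned plate
candidates = range(30, 61, 5)      # virtual marks at 5-degree increments
best_angle = max(candidates, key=lambda a: correlation(evidence, virtual_mark(a)))
```

The examiner would then concentrate the physical comparison near `best_angle` rather than searching the whole angular range.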

  13. Analysis of Multiple Genomic Sequence Alignments: A Web Resource, Online Tools, and Lessons Learned From Analysis of Mammalian SCL Loci (United States)

    Chapman, Michael A.; Donaldson, Ian J.; Gilbert, James; Grafham, Darren; Rogers, Jane; Green, Anthony R.; Göttgens, Berthold


    Comparative analysis of genomic sequences is becoming a standard technique for studying gene regulation. However, only a limited number of tools are currently available for the analysis of multiple genomic sequences. An extensive data set for the testing and training of such tools is provided by the SCL gene locus. Here we have expanded the data set to eight vertebrate species by sequencing the dog SCL locus and by annotating the dog and rat SCL loci. To provide a resource for the bioinformatics community, all SCL sequences and functional annotations, comprising a collation of the extensive experimental evidence pertaining to SCL regulation, have been made available via a Web server. A Web interface to new tools specifically designed for the display and analysis of multiple sequence alignments was also implemented. The unique SCL data set and new sequence comparison tools allowed us to perform a rigorous examination of the true benefits of multiple sequence comparisons. We demonstrate that multiple sequence alignments are, overall, superior to pairwise alignments for identification of mammalian regulatory regions. In the search for individual transcription factor binding sites, multiple alignments markedly increase the signal-to-noise ratio compared to pairwise alignments. PMID:14718377
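    The signal-to-noise argument for multiple alignments can be made concrete with a toy column-conservation score over a hypothetical four-species alignment (real tools use substitution models and phylogenetic weighting; this sketch simply counts base agreement per column):

```python
def conservation(column):
    """Fraction of sequences agreeing with the most common base in a column."""
    return max(column.count(b) for b in set(column)) / len(column)

# toy alignment rows for four hypothetical species (invented bases)
alignment = ["ACGTAC",
             "ACGTTG",
             "ACGAAT",
             "ACGTAC"]

def column_scores(rows):
    return [conservation(col) for col in zip(*rows)]

scores = column_scores(alignment)
# a putative regulatory element = columns conserved in most of the species
site_cols = [i for i, s in enumerate(scores) if s >= 0.75]
```

With only a pairwise comparison, a column that happens to match between two species scores as well as a column conserved across all four; requiring agreement across many species is what suppresses that noise.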

  14. Syncope risk stratification tools vs clinical judgment: an individual patient data meta-analysis. (United States)

    Costantino, Giorgio; Casazza, Giovanni; Reed, Matthew; Bossi, Ilaria; Sun, Benjamin; Del Rosso, Attilio; Ungar, Andrea; Grossman, Shamai; D'Ascenzo, Fabrizio; Quinn, James; McDermott, Daniel; Sheldon, Robert; Furlan, Raffaello


    There have been several attempts to derive syncope prediction tools to guide clinician decision-making. However, they have not been widely adopted, possibly because of their lack of sensitivity and specificity. We sought to externally validate the existing tools and to compare them with clinical judgment, using an individual patient data meta-analysis approach. Electronic databases, bibliographies, and experts in the field were screened to find all prospective studies enrolling consecutive subjects presenting with syncope to the emergency department. Prediction tools and clinical judgment were applied to all patients in each dataset. Serious outcomes and death were considered separately during emergency department stay and at 10 and 30 days after presenting syncope. Pooled sensitivities, specificities, likelihood ratios, and diagnostic odds ratios, with 95% confidence intervals, were calculated. Thirteen potentially relevant papers were retrieved (11 authors). Six authors agreed to share individual patient data. In total, 3681 patients were included. Three prediction tools (Osservatorio Epidemiologico sulla Sincope del Lazio [OESIL], San Francisco Syncope Rule [SFSR], Evaluation of Guidelines in Syncope Study [EGSYS]) could be assessed by the available datasets. None of the evaluated prediction tools performed better than clinical judgment in identifying serious outcomes during emergency department stay, or at 10 and 30 days after syncope. Despite the use of an individual patient data approach to reduce heterogeneity among studies, a large variability was still present. Current prediction tools did not show better sensitivity, specificity, or prognostic yield compared with clinical judgment in predicting short-term serious outcome after syncope. Our systematic review strengthens the evidence that current prediction tools should not be strictly used in clinical practice. Copyright © 2014 Elsevier Inc. All rights reserved.
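    The pooled measures named above can be computed from a 2x2 table of rule prediction versus outcome; the sketch below uses the standard log-odds normal approximation for the diagnostic odds ratio CI, with hypothetical counts, not the study's data:

```python
import math

def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and diagnostic odds ratio with a 95% CI
    (log-odds normal approximation) from a pooled 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    dor = (tp * tn) / (fp * fn)
    se_log = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
    lo = math.exp(math.log(dor) - 1.96 * se_log)
    hi = math.exp(math.log(dor) + 1.96 * se_log)
    return sens, spec, (dor, lo, hi)

# hypothetical pooled counts for one rule against serious outcomes
sens, spec, (dor, lo, hi) = diagnostic_stats(tp=90, fp=300, fn=10, tn=600)
```

In an individual patient data meta-analysis the same table is built per study and per rule, which is what allows a head-to-head comparison against clinical judgment on identical patients.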

  15. A knowledge translation tool improved osteoporosis disease management in primary care: an interrupted time series analysis. (United States)

    Kastner, Monika; Sawka, Anna M; Hamid, Jemila; Chen, Maggie; Thorpe, Kevin; Chignell, Mark; Ewusie, Joycelyne; Marquez, Christine; Newton, David; Straus, Sharon E


    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems, yet gaps in management still exist. In response, we developed a multi-component osteoporosis knowledge translation (Op-KT) tool involving a patient-initiated risk assessment questionnaire (RAQ), which generates individualized best practice recommendations for physicians and customized education for patients at the point of care. The objective of this study was to evaluate the effectiveness of the Op-KT tool for appropriate disease management by physicians. The Op-KT tool was evaluated using an interrupted time series design. This involved multiple assessments of the outcomes 12 months before (baseline) and 12 months after tool implementation (52 data points in total). Inclusion criteria were family physicians and their patients at risk for osteoporosis (women aged ≥ 50 years, men aged ≥ 65 years). Primary outcomes were the initiation of appropriate osteoporosis screening and treatment. Analyses included segmented linear regression modeling and analysis of variance. The Op-KT tool was implemented in three family practices in Ontario, Canada, representing 5 family physicians with 2840 age-eligible patients (mean age 67 years; 76% women). Time series regression models showed an overall increase from baseline in the initiation of screening (3.4%; P […]) […] management addressed by their physician. Study limitations included the inherent susceptibility of our design compared with a randomized trial. The multicomponent Op-KT tool significantly increased osteoporosis investigations in three family practices and highlights its potential to facilitate patient self-management. Next steps include wider implementation and evaluation of the tool in primary care.
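    Segmented regression for an interrupted time series can be sketched by fitting separate OLS lines before and after the intervention and reading off the level and slope changes; the synthetic monthly data below embed a 3.4-point jump purely for illustration, not the study's actual series:

```python
def linfit(ts, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((a - mt) * (b - my) for a, b in zip(ts, ys))
             / sum((a - mt) ** 2 for a in ts))
    return my - slope * mt, slope

def its_effects(y, k):
    """Segmented fit: separate OLS lines before/after intervention month k.
    Returns (level change at k, slope change)."""
    pre_t, post_t = list(range(k)), list(range(k, len(y)))
    b0, b1 = linfit(pre_t, y[:k])
    c0, c1 = linfit(post_t, y[k:])
    level_change = (c0 + c1 * k) - (b0 + b1 * k)   # gap at the intervention point
    return level_change, c1 - b1

# 12 flat baseline months, then an immediate jump plus an upward trend
y = [5.0] * 12 + [8.4 + 0.5 * m for m in range(12)]
level, slope = its_effects(y, 12)
```

The level change estimates the immediate effect of tool implementation and the slope change the ongoing effect, which is why an interrupted time series needs many data points on each side of the intervention.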

  16. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant

    International Nuclear Information System (INIS)

    Shariff, Azmi Mohd; Zaini, Dzulkarnain


    Many major accidents involving toxic releases have caused large numbers of fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is to use inherently safer design techniques, which apply inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented at the preliminary design stage, where the consequences of a toxic release can be evaluated and the necessary design improvements implemented to reduce accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, no commercial tool with such capability is currently available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, built by integrating a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) is capable of eliminating or minimizing potential toxic release accidents by adopting inherent safety principles early in the preliminary design stage.
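    The worst-case consequence step can be sketched with a ground-level Gaussian plume estimate; the dispersion-coefficient power law below is a crude illustrative stand-in, not a validated Pasquill-Gifford fit, and all rates and thresholds are invented:

```python
import math

def plume_concentration(Q, u, x, H=0.0, a=(0.08, 0.06)):
    """Ground-level centerline concentration (kg/m^3) from a Gaussian plume.
    Q: release rate (kg/s), u: wind speed (m/s), x: downwind distance (m),
    H: release height (m). Coefficients `a` are illustrative only."""
    sy = a[0] * x ** 0.9           # crosswind dispersion, crude power law
    sz = a[1] * x ** 0.85          # vertical dispersion, crude power law
    return (Q / (math.pi * u * sy * sz)) * math.exp(-H ** 2 / (2 * sz ** 2))

def hazard_distance(Q, u, threshold, H=0.0):
    """Farthest downwind distance (m) where concentration exceeds threshold."""
    x = 10.0
    while plume_concentration(Q, u, x, H) > threshold and x < 1e6:
        x += 10.0
    return x

d_small = hazard_distance(1.0, 2.0, 1e-4)    # small leak
d_large = hazard_distance(10.0, 2.0, 1e-4)   # 10x larger release
```

Comparing hazard distances across flowsheet alternatives (e.g., lower inventories or milder conditions) is exactly the kind of screening an inherently-safer-design tool performs at the flowsheeting stage.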

  17. Multispectral analysis tools can increase utility of RGB color images in histology (United States)

    Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard


    Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
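    The spectral phasor approach reduces each pixel's spectrum to a single point via its first Fourier harmonic; with only three channels this is a simple weighted sum, as the sketch below shows (pixel values are invented):

```python
import math

def rgb_phasor(r, g, b):
    """First-harmonic spectral phasor of a 3-channel 'spectrum'.
    Each pixel maps to a point (G, S); pixels with similar spectra cluster,
    which is what makes phasor plots useful for segmentation."""
    I = [r, g, b]
    total = sum(I) or 1.0
    G = sum(v * math.cos(2 * math.pi * k / 3) for k, v in enumerate(I)) / total
    S = sum(v * math.sin(2 * math.pi * k / 3) for k, v in enumerate(I)) / total
    return G, S

# two stains with distinct colors land at different phasor positions;
# an equal mixture falls exactly between them (phasors are linear in intensity)
p_red = rgb_phasor(200, 10, 10)
p_blue = rgb_phasor(10, 10, 200)
p_mix = rgb_phasor(105, 10, 105)
```

This linearity is the basis for unmixing from the phasor plot: pure components sit at the cluster extremes and mixed pixels lie on the segments joining them.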

  18. Reproducible Analysis of Sequencing-Based RNA Structure Probing Data with User-Friendly Tools. (United States)

    Kielpinski, Lukasz Jan; Sidiropoulos, Nikolaos; Vinther, Jeppe


    RNA structure-probing data can improve the prediction of RNA secondary and tertiary structure and allow structural changes to be identified and investigated. In recent years, massive parallel sequencing has dramatically improved the throughput of RNA structure probing experiments, but at the same time also made analysis of the data challenging for scientists without formal training in computational biology. Here, we discuss different strategies for data analysis of massive parallel sequencing-based structure-probing data. To facilitate reproducible and standardized analysis of this type of data, we have made a collection of tools, which allow raw sequencing reads to be converted to normalized probing values using different published strategies. In addition, we also provide tools for visualization of the probing data in the UCSC Genome Browser and for converting RNA coordinates to genomic coordinates and vice versa. The collection is implemented as functions in the R statistical environment and as tools in the Galaxy platform, making them easily accessible for the scientific community. We demonstrate the usefulness of the collection by applying it to the analysis of sequencing-based hydroxyl radical probing data and comparing different normalization strategies. © 2015 Elsevier Inc. All rights reserved.
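    One widely used normalization strategy of the kind the collection implements is so-called 2-8% normalization, in which reactivities are scaled by the mean of the values between the top 2% and top 8% (the top 2% being treated as outliers). A sketch of the idea, not the toolset's exact code:

```python
def normalize_28(reactivities):
    """Scale raw probing reactivities by the mean of the values between the
    2nd and 8th percentile from the top; the top 2% are excluded as outliers."""
    vals = sorted(reactivities, reverse=True)
    n = len(vals)
    top2 = max(1, int(round(0.02 * n)))
    top8 = max(top2 + 1, int(round(0.08 * n)))
    norm = sum(vals[top2:top8]) / (top8 - top2)
    return [v / norm for v in reactivities]

# invented raw reactivities for a 100-nucleotide transcript
scaled = normalize_28([float(v) for v in range(1, 101)])
```

After scaling, highly reactive (likely single-stranded) positions sit near or above 1.0 regardless of sequencing depth, which is what makes normalized profiles comparable across experiments.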

  19. A population MRI brain template and analysis tools for the macaque. (United States)

    Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam


    The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis. (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier


    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect of guaranteeing the student's competence level. Our aim was to conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which comprised 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). Participants were 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients.
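    The internal-consistency figure reported above is typically Cronbach's alpha, which can be computed directly from item-level scores; the sketch below uses invented scores, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of score lists, one per item,
    each of length n_subjects (sample variance, n-1 denominator)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[j] for it in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# three identical items (perfect consistency) vs three noisier items
alpha_perfect = cronbach_alpha([[0, 1, 2, 1]] * 3)
alpha_noisy = cronbach_alpha([[0, 1, 2], [1, 1, 2], [0, 2, 2]])
```

Values above 0.80, as reported for the SAT-SPS, are conventionally taken to indicate good internal consistency for a multi-item competence scale.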