WorldWideScience

Sample records for computationally identified trifoliate

  1. EVALUATION OF COLOUR IN WHITE AND YELLOW TRIFOLIATE ...

    African Journals Online (AJOL)

    IBUKUN

    2010-03-20

    Department of Food Technology, University of Ibadan, Oyo State, Nigeria. ... Therefore, this work determines the colour in white and yellow trifoliate ... Freshly harvested trifoliate yam tubers were prepared into flour using four.

  2. Evaluation of colour in white and yellow trifoliate yam flours in ...

    African Journals Online (AJOL)

    Colour is one of the important sensory properties that determine the acceptability of food products. Therefore, this work determines the colour in white and yellow trifoliate yam flours in relation to harvesting periods and pre-processing methods. Freshly harvested trifoliate yam tubers were prepared into flour using four ...

  3. California mild CTV strains that break resistance in Trifoliate Orange

    Science.gov (United States)

    This is the final report of a project to characterize California isolates of Citrus tristeza virus (CTV) that replicate in Poncirus trifoliata (trifoliate orange). Next Generation Sequencing (NGS) of viral small interfering RNAs (siRNAs) and assembly of full-length sequences of mild California CTV i...

  4. Textural and sensory properties of trifoliate yam (Dioscorea dumetorum) flour and stiff dough 'amala'.

    Science.gov (United States)

    Abiodun, O A; Akinoso, R

    2015-05-01

    The use of trifoliate yam (Dioscorea dumetorum) flour for stiff dough 'amala' production is one way to curb under-utilization of the tuber. This study evaluates the textural and sensory properties of trifoliate yam flour and stiff dough. Freshly harvested trifoliate yam tubers were peeled, washed, sliced and blanched (60 °C for 10 min). The yam slices were soaked in water for 12 h, dried and milled into flour. Pasting viscosities, functional properties, brown index and sensory attributes of the flour and stiff dough were analyzed. Peak, holding-strength and final viscosities ranged from 84.09 to 213.33 RVU, 81.25 to 157.00 RVU and 127.58 to 236.17 RVU, respectively. White raw flour had higher viscosity than the yellow flours. The swelling index, water absorption capacity and bulk density ranged from 1.46 to 2.28, 2.11 to 2.92 ml H2O/g and 0.71 to 0.88 g/cm³, respectively. The blanching method improved the swelling index and water absorption capacity of the flour. The brown index values of the flour and stiff dough ranged from 6.73 to 18.36 and from 14.63 to 46.72, respectively. Sensory evaluation revealed significant differences in the colour, odour and general acceptability of the product when compared with stiff dough from white yam.

  5. Deep sequencing discovery of novel and conserved microRNAs in trifoliate orange (Citrus trifoliata)

    Directory of Open Access Journals (Sweden)

    Yu Huaping

    2010-07-01

    Background: MicroRNAs (miRNAs) play a critical role in post-transcriptional gene regulation and have been shown to control many genes involved in various biological and metabolic processes. There have been extensive studies to discover miRNAs and analyze their functions in model plant species, such as Arabidopsis and rice. Deep sequencing technologies have facilitated identification of species-specific or lowly expressed as well as conserved or highly expressed miRNAs in plants. Results: In this research, we used Solexa sequencing to discover new microRNAs in trifoliate orange (Citrus trifoliata), an important rootstock of citrus. A total of 13,106,753 reads representing 4,876,395 distinct sequences were obtained from a short RNA library generated from small RNA extracted from C. trifoliata flower and fruit tissues. Based on sequence similarity and hairpin structure prediction, we found that 156,639 reads representing 63 sequences from 42 highly conserved miRNA families have perfect matches to known miRNAs. We also identified 10 novel miRNA candidates whose precursors were all potentially generated from citrus ESTs. In addition, five miRNA* sequences were also sequenced. These sequences had not previously been described in other plant species, and accumulation of the 10 novel miRNAs was confirmed by qRT-PCR analysis. Potential target genes were predicted for most conserved and novel miRNAs. Moreover, four target genes, including one encoding an IRX12 copper ion binding/oxidoreductase and three encoding NB-LRR disease resistance proteins, have been experimentally verified by detection of miRNA-mediated mRNA cleavage in C. trifoliata. Conclusion: Deep sequencing of short RNAs from C. trifoliata flowers and fruits identified 10 new potential miRNAs and 42 highly conserved miRNA families, indicating that specific miRNAs exist in C. trifoliata. These results show that regulatory miRNAs exist in this agronomically important trifoliate orange.
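
    A minimal sketch of the classification step described above: matching collapsed small-RNA reads against known mature miRNA sequences by exact identity (the study additionally required hairpin-structure prediction, omitted here). The file names and the family-naming heuristic are hypothetical.

    ```python
    # Simplified stand-in for read-to-known-miRNA matching; not the paper's pipeline.
    from collections import Counter

    def load_fasta(path):
        """Parse a FASTA file into {header: sequence} (RNA converted to DNA)."""
        seqs, header = {}, None
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if line.startswith(">"):
                    header = line[1:].split()[0]
                    seqs[header] = ""
                elif header:
                    seqs[header] += line.upper().replace("U", "T")
        return seqs

    known = load_fasta("mirbase_mature.fa")                       # hypothetical path
    reads = Counter(load_fasta("citrus_small_rna.fa").values())   # collapsed reads

    by_family = Counter()
    for mirna_id, seq in known.items():
        if seq in reads:
            # Crude family label, e.g. "ptc-miR156a" -> "miR156" (heuristic only).
            family = mirna_id.split("-")[1].rstrip("abcdefg")
            by_family[family] += reads[seq]

    for family, count in by_family.most_common(10):
        print(f"{family}: {count} reads match known mature sequences")
    ```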

  6. Trifoliate hybrids as rootstocks for Pêra sweet orange tree

    Directory of Open Access Journals (Sweden)

    Jorgino Pompeu Junior

    2014-03-01

    The Rangpur lime (Citrus limonia) has been used as the main rootstock for Pêra sweet orange (C. sinensis) trees. However, its susceptibility to citrus blight and citrus sudden death has led to the use of disease-tolerant rootstocks, such as Cleopatra mandarin (C. reshni), Sunki mandarin (C. sunki) and Swingle citrumelo (C. paradisi x Poncirus trifoliata), which are more susceptible to drought than the Rangpur lime. These mandarin varieties are also less resistant to root rot caused by Phytophthora, and the Swingle citrumelo proved incompatible with the Pêra sweet orange. In search of new rootstock varieties, this study aimed at assessing the fruit precocity and yield, susceptibility to tristeza and blight, and occurrence of incompatibility of Pêra sweet orange trees grafted on 12 trifoliate hybrids, on Rangpur lime EEL and on Goutou sour orange, without irrigation. Tristeza and blight are endemic in the experimental area. The Sunki x English (1628) and Changsha x English Small (1710) citrandarins and two other selections of Cleopatra x Rubidoux provided the highest cumulative yields, in the first three crops and in the total of six crops evaluated. The Cleopatra x Rubidoux (1660) and Sunki x Benecke (1697) citrandarins induced early yield, while the Cravo x Swingle citromonia and C-13 citrange induced later yield. None of the rootstock varieties caused alternate bearing. Pêra sweet orange trees grafted on Swingle citrumelo, on the Cleopatra x Swingle (1654) citrandarin and on two selections of Rangpur lime x Carrizo citrange showed bud-union-ring symptoms of incompatibility. None of the plants presented symptoms of tristeza or blight.

  7. Identification and comparative profiling of miRNAs in an early flowering mutant of trifoliate orange and its wild type by genome-wide deep sequencing.

    Directory of Open Access Journals (Sweden)

    Lei-Ming Sun

    MicroRNAs (miRNAs) are a class of small, endogenous RNAs that play a regulatory role in various biological and metabolic processes by negatively affecting gene expression at the post-transcriptional level. While the number of known Arabidopsis and rice miRNAs is continuously increasing, information regarding miRNAs from woody plants such as citrus remains limited. In this study, Solexa sequencing was performed at different developmental stages on both an early flowering mutant of trifoliate orange (precocious trifoliate orange, Poncirus trifoliata L. Raf.) and its wild type, yielding 141 known miRNAs belonging to 99 families and 75 novel miRNAs across four libraries. A total of 317 potential target genes were predicted for the 51 novel miRNA families. GO and KEGG annotation revealed that the high-ranked miRNA-target genes are those implicated in diverse cellular processes in plants, including development, transcription, protein degradation and cross adaptation. To characterize the miRNAs expressed at the juvenile and adult developmental stages of the mutant and its wild type, the expression profiles of several miRNAs were further analyzed by real-time PCR. The results revealed that most miRNAs were down-regulated at the adult stage compared with the juvenile stage in both the mutant and its wild type. These results indicate that both conserved and novel miRNAs may play important roles in citrus growth and development, stress responses and other physiological processes.

  8. Identification and Characterization of Citrus tristeza virus Isolates Breaking Resistance in Trifoliate Orange in California.

    Science.gov (United States)

    Yokomi, Raymond K; Selvaraj, Vijayanandraj; Maheshwari, Yogita; Saponari, Maria; Giampetruzzi, Annalisa; Chiumenti, Michela; Hajeri, Subhas

    2017-07-01

    Most Citrus tristeza virus (CTV) isolates in California are biologically mild and symptomless in commercial cultivars on CTV-tolerant rootstocks. However, to better define California CTV isolates showing divergent serological and genetic profiles, selected isolates were subjected to deep sequencing of small RNAs. Full-length sequences were assembled and annotated, and trifoliate orange resistance-breaking (RB) isolates of CTV were identified. Phylogenetic relationships based on their full genomes placed three isolates in the RB clade: CA-RB-115, CA-RB-AT25, and CA-RB-AT35. The latter two isolates were obtained by aphid transmission from Murcott and Dekopon trees, respectively, containing CTV mixtures. The California RB isolates were further distinguished into two subclades. Group I included CA-RB-115 and CA-RB-AT25, with 99% nucleotide sequence identity to the RB type strain NZRB-G90; group II included CA-RB-AT35, with 99 and 96% sequence identity to Taiwan Pumelo/SP/T1 and HA18-9, respectively. The RB phenotype was confirmed by detecting CTV replication in graft-inoculated Poncirus trifoliata and transmission from P. trifoliata to sweet orange. The California RB isolates induced mild symptoms compared with severe isolates in greenhouse indexing tests. Further examination of 570 CTV accessions, acquired since approximately 1960 and maintained in planta at the Central California Tristeza Eradication Agency, revealed 16 RB-positive isolates based on partial p65 sequences. Six isolates, collected from 1992 to 2011 in Tulare and Kern counties, were CA-RB-115-like; 10 isolates, collected from 1968 to 2010 in Riverside, Fresno, and Kern counties, were CA-RB-AT35-like. The presence of the RB genotype is relevant because P. trifoliata and its hybrids are the most popular rootstocks in California.

  9. Híbridos de trifoliata como porta-enxertos para a laranjeira 'Valência' Trifoliate hybrids as rootstocks for sweet orange 'Valência'

    Directory of Open Access Journals (Sweden)

    Jorgino Pompeu Junior

    2009-07-01

    The objective of this work was to evaluate the yield and agronomic characteristics of 'Valência' sweet orange trees budded onto trifoliate (Poncirus trifoliata) hybrid rootstocks. Fruit production, total soluble solids per plant, and canopy dimensions and production efficiency of 'Valência' sweet orange trees budded onto 13 trifoliate hybrids, grown without irrigation, were evaluated over periods ranging from three to eight years. The trees were also assessed visually for symptoms of tristeza (Citrus tristeza virus) and citrus blight, and the dot immunobinding assay (DIBA) was used to detect blight before symptoms appeared. The trees were eight years old at the start of the evaluations. The 'Sunki' x 'English' citrandarin induced the highest fruit production over eight harvests, without differing significantly from 'Troyer' citrange. In three years of analysis, the 'Sunki' x 'English' citrandarin, not differing from the 'Troyer' and 'Carrizo' citranges, also induced the highest fruit and soluble solids production per plant. The 'Clementina' x trifoliate citrentin and the 'Cleópatra' x 'Swingle' (715 and 1614), 'Cleópatra' x 'Rubidoux' (1600) and 'Cleópatra' x 'Christian' citrandarins produced 'Valência' trees 2.5 m tall or shorter. None of the trees showed symptoms of tristeza or citrus blight. Incompatibility was observed between cultivar Valência and the 'Cravo' x 'Carrizo' trangpur.

  10. Avaliação de citrandarins e outros híbridos de trifoliata como porta-enxertos para citros em São Paulo Performance of citrandarins and others trifoliate hybrids rootstocks in Sao Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    Silvia Blumer

    2005-08-01

    Valencia sweet orange trees budded onto citrandarins and other trifoliate hybrid rootstocks from the USDA Horticultural Research Laboratory, Fort Pierce, Florida, were planted in 1988 in Itirapina, São Paulo State, Brazil, on a sandy-textured Red-Yellow Latosol (Oxisol) and managed without irrigation. Tristeza and blight diseases are endemic in this area. The Sunki x English citrandarin (1628), not differing statistically from Cleopatra x Rubidoux (1660), Cleopatra x English (710), Cleopatra x Swingle (715) and the Rangpur lime x Carrizo citrange (717), induced the highest fruit production in the first five harvests of the experiment (1991-1995), and the first three of these rootstocks were the most productive in the last three harvests. The Troyer and Carrizo citranges were significantly inferior to the Sunki x English (1628), Cleopatra x Rubidoux (1660) and Cleopatra x English (710) citrandarins in all years except 1994. None of the trees showed symptoms of susceptibility to tristeza or blight. Rootstock seedlings differed in the area of lesions caused by inoculation with Phytophthora parasitica. The Cleopatra x Swingle (1587), Cleopatra x Trifoliata (1574), Cleopatra x Rubidoux (1600) and Clementina x Trifoliata (1615) citrandarins and the Rangpur lime x Carrizo citrange (717) were significantly more resistant than Cleopatra x Christian (712), Sunki x English (1628), Cleopatra x Swingle (715) and Cleopatra x English (710).

  11. Direct and indirect effects of glomalin, mycorrhizal hyphae, and roots on aggregate stability in rhizosphere of trifoliate orange.

    Science.gov (United States)

    Wu, Qiang-Sheng; Cao, Ming-Qin; Zou, Ying-Ning; He, Xin-hua

    2014-07-25

    To test the direct and indirect effects of glomalin, mycorrhizal hyphae, and roots on aggregate stability, perspex pots divided in the middle by 37-μm nylon mesh were used to form root-free hyphae and root/hyphae chambers, and trifoliate orange (Poncirus trifoliata) seedlings were colonized by Funneliformis mosseae or Paraglomus occultum in the root/hyphae chamber. Both fungal species induced significantly higher plant growth, root total length, easily-extractable glomalin-related soil protein (EE-GRSP) and total GRSP (T-GRSP), and mean weight diameter (an aggregate stability indicator). Pearson correlations showed that root colonization and soil hyphal length were significantly positively correlated with EE-GRSP, difficultly-extractable GRSP (DE-GRSP), T-GRSP, and water-stable aggregates in the 2.00-4.00, 0.50-1.00, and 0.25-0.50 mm size fractions. Path analysis indicated that in the root/hyphae chamber, aggregate stability derived from a direct effect of root colonization, EE-GRSP or DE-GRSP, and the direct effect of EE-GRSP or DE-GRSP was stronger than that of mycorrhizal colonization. In the root-free hyphae chamber, mycorrhizal-mediated aggregate stability was due to the total, but not direct, effects of soil hyphal length, EE-GRSP and T-GRSP. Our results suggest that, among the tested factors, GRSP may be the primary contributor to aggregate stability in the citrus rhizosphere.
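
    A short sketch of the statistics behind this record, using made-up data: in path analysis, a variable's Pearson correlation with the outcome gives its total effect, the standardized regression coefficient gives its direct effect, and the difference is the indirect effect.

    ```python
    # Correlation vs. path (direct) effects on aggregate stability; data invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 40
    ee_grsp = rng.normal(size=n)                              # easily-extractable GRSP
    hyphae = 0.6 * ee_grsp + rng.normal(scale=0.8, size=n)    # soil hyphal length
    mwd = 0.5 * ee_grsp + 0.3 * hyphae + rng.normal(scale=0.5, size=n)  # stability

    z = lambda a: (a - a.mean()) / a.std()                    # standardize
    Z = np.column_stack([z(ee_grsp), z(hyphae)])

    direct, *_ = np.linalg.lstsq(Z, z(mwd), rcond=None)       # path coefficients
    total = [np.corrcoef(x, mwd)[0, 1] for x in (ee_grsp, hyphae)]

    for name, d, t in zip(["EE-GRSP", "hyphal length"], direct, total):
        print(f"{name}: direct={d:.2f}, total={t:.2f}, indirect={t - d:.2f}")
    ```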

  12. Alleviation of drought stress by mycorrhizas is related to increased root H2O2 efflux in trifoliate orange.

    Science.gov (United States)

    Huang, Yong-Ming; Zou, Ying-Ning; Wu, Qiang-Sheng

    2017-02-08

    The non-invasive micro-test technique (NMT) is used to measure dynamic changes of specific ions/molecules non-invasively, but NMT data on hydrogen peroxide (H2O2) fluxes in different classes of mycorrhizal roots are scarce. Effects of Funneliformis mosseae on plant growth, H2O2, superoxide radical (O2·-) and malondialdehyde (MDA) concentrations, and H2O2 fluxes in the taproot (TR) and lateral roots (LRs) of trifoliate orange seedlings under well-watered (WW) and drought stress (DS) conditions were studied. DS strongly inhibited mycorrhizal colonization in the TR and LRs, whereas mycorrhizal inoculation significantly promoted plant growth and biomass production. H2O2, O2·-, and MDA concentrations in leaves and roots were dramatically lower in mycorrhizal seedlings than in non-mycorrhizal seedlings under DS. Compared with non-mycorrhizal seedlings, mycorrhizal seedlings had relatively higher net root H2O2 effluxes in the TR and LRs, especially under WW, as well as significantly higher total root H2O2 effluxes in the TR and LRs under both WW and DS. Total root H2O2 effluxes were significantly positively correlated with root colonization but negatively with root H2O2 and MDA concentrations. This suggests that mycorrhizas induce greater H2O2 efflux from the TR and LRs, thus alleviating the oxidative damage of DS in the host plant.

  13. Boron alleviates the aluminum toxicity in trifoliate orange by regulating antioxidant defense system and reducing root cell injury.

    Science.gov (United States)

    Riaz, Muhammad; Yan, Lei; Wu, Xiuwen; Hussain, Saddam; Aziz, Omar; Wang, Yuhan; Imran, Muhammad; Jiang, Cuncang

    2018-02-15

    Aluminium (Al) toxicity is the most important soil constraint for plant growth and development in acid soils. Boron (B) is an essential micronutrient for the growth and development of higher plants. Previous studies suggest that B might ameliorate Al toxicity; however, no study had examined this effect in trifoliate orange. Thus, a hydroponic study was carried out comprising two Al concentrations (0 and 400 μM). For each concentration, two B treatments (0 and 10 μM as H3BO3) were applied to investigate B-induced alleviation of Al toxicity and to explore the underlying mechanisms. The results revealed that Al toxicity under B deficiency severely hampered root growth and plant physiology, caused oxidative stress and membrane damage, and led to severe root injury. However, application of B under Al toxicity improved root elongation and photosynthesis while reducing Al uptake and mobilization into plant parts. Moreover, B supply regulated the activities of antioxidant enzymes, proline and secondary-metabolite (phenylalanine ammonia lyase and polyphenol oxidase) contents, and stabilized the integrity of proteins. Our results imply that B supply promoted root growth as well as the defense system by reducing reactive oxygen species (ROS) and Al concentrations in plant parts, thus alleviating Al toxicity, a finding that might be significant for higher productivity of agricultural plants grown in acidic conditions.

  14. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Science.gov (United States)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-03-01

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.
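
    An illustrative sketch, not the patented implementation: treating the subcommunicator as a set of 2-D grid coordinates, enumerate the axis-aligned rectangles ("logical planes") anchored at a plane-building node whose member nodes all belong to the subcommunicator, scanning the first dimension in both directions and the second dimension in the positive direction only, as the record describes. The mesh is hypothetical.

    ```python
    # Enumerate logical planes anchored at a plane-building node (2-D sketch).
    def logical_planes(sub, builder):
        """sub: set of (x, y) nodes in the subcommunicator; builder: anchor node."""
        planes = []
        bx, by = builder
        max_dx = max(abs(x - bx) for x, _ in sub)
        max_dy = max(abs(y - by) for _, y in sub)
        for sx in (+1, -1):                      # positive/negative first dimension
            for dx in range(1, max_dx + 1):
                for dy in range(1, max_dy + 1):  # second dimension: positive only
                    corner = (bx + sx * dx, by + dy)
                    rect = {(bx + sx * i, by + j)
                            for i in range(dx + 1) for j in range(dy + 1)}
                    if rect <= sub:              # plane lies wholly in subcommunicator
                        planes.append((builder, corner))
        return planes

    nodes = {(x, y) for x in range(4) for y in range(3)}   # hypothetical 4x3 mesh
    print(logical_planes(nodes, (1, 0))[:5])
    ```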

  15. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
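
    A hedged sketch of the test loop described above; the scoring formula and performance figures are invented, since the record does not specify how the current test value is computed from the measured performances and the predetermined I/O value.

    ```python
    # Compare a test value against a tree-performance threshold, then drill in.
    def current_test_value(io_perf, node_perfs, predetermined_io_perf):
        # One plausible reading: combined measured performance relative to expectation.
        mean_nodes = sum(node_perfs) / len(node_perfs)
        return (io_perf + mean_nodes) / (2 * predetermined_io_perf)

    def identify_failure(io_perf, subsets, predetermined_io_perf, threshold):
        for nodes in subsets:                    # each subset maps node -> perf
            value = current_test_value(io_perf, list(nodes.values()),
                                       predetermined_io_perf)
            if value < threshold:
                continue                         # below threshold: try another subset
            # Not below threshold: single out the slowest members as suspects
            # for individual node and link testing.
            worst = min(nodes.values())
            return [n for n, p in nodes.items() if p == worst]
        return []

    subset_a = {"cn0": 9.8, "cn1": 9.7, "cn2": 4.1}   # hypothetical GB/s figures
    print(identify_failure(io_perf=9.5, subsets=[subset_a],
                           predetermined_io_perf=10.0, threshold=0.8))
    ```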

  16. A computer model for identifying security system upgrades

    International Nuclear Information System (INIS)

    Lamont, A.

    1988-01-01

    This paper describes a prototype safeguards analysis tool that automatically identifies system weaknesses against an insider adversary and suggests possible upgrades to improve the probability that the adversary will be detected. The tool is based on this premise: as the adversary acts, he or she creates a set of facts that can be detected by safeguards components. Whenever an adversary's planned set of actions creates a set of facts which the security personnel would consider irregular or unusual, we can improve the security system by implementing safeguards that detect those facts. Therefore, an intelligent computer program can suggest upgrades to the facility if we construct a knowledge base that contains information about: (1) the facts created by each possible adversary action, (2) the facts that each possible safeguard can detect, and (3) groups of facts which will be considered irregular whenever they occur together. The authors describe the structure of the knowledge base and show how the above information can be represented in it. They also describe the procedures that a computer program can use to identify missing or weak safeguards and to suggest upgrades.
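
    A small sketch of the knowledge-base logic under the stated premise; all actions, facts, and safeguards below are invented for illustration.

    ```python
    # Suggest safeguards that detect irregular fact groups an adversary plan creates.
    FACTS_BY_ACTION = {
        "forge_badge": {"badge_anomaly"},
        "enter_vault": {"door_open", "badge_anomaly"},
        "remove_item": {"inventory_mismatch"},
    }
    FACTS_BY_SAFEGUARD = {
        "badge_audit": {"badge_anomaly"},
        "door_sensor": {"door_open"},
        "inventory_check": {"inventory_mismatch"},
    }
    IRREGULAR_GROUPS = [{"door_open", "badge_anomaly"}]

    def suggest_upgrades(plan, installed):
        created = set().union(*(FACTS_BY_ACTION[a] for a in plan))
        detected = set().union(*(FACTS_BY_SAFEGUARD[s] for s in installed)) \
            if installed else set()
        upgrades = set()
        for group in IRREGULAR_GROUPS:
            if group <= created and not (group & detected):
                # Irregular facts occur together but nothing detects them.
                upgrades |= {s for s, f in FACTS_BY_SAFEGUARD.items() if f & group}
        return upgrades - set(installed)

    print(suggest_upgrades(["forge_badge", "enter_vault"],
                           installed=["inventory_check"]))
    ```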

  17. Disruption of mycorrhizal extraradical mycelium and changes in leaf water status and soil aggregate stability in rootbox-grown trifoliate orange

    Directory of Open Access Journals (Sweden)

    Ying-Ning eZou

    2015-03-01

    Arbuscular mycorrhizas possess a well-developed extraradical mycelium (ERM) network that explores the surrounding soil for better acquisition of water and nutrients, besides contributing to soil aggregation. Distinctions in ERM functioning were studied in a rootbox system consisting of root+hyphae and root-free hyphae compartments separated by 37-μm nylon mesh with an air gap. Trifoliate orange (Poncirus trifoliata) seedlings were inoculated with Funneliformis mosseae in the root+hyphae compartment, and the ERM network was established between the two compartments. The ERM network in the air gap was disrupted 8 h before harvest (one-time disruption) or repeatedly during seedling acclimation (multiple disruptions). Our results showed that mycorrhizal inoculation induced a significant increase in growth (plant height, stem diameter, and leaf, stem, and root biomass) and physiological characters (leaf relative water content, leaf water potential, and transpiration rate), irrespective of ERM status. Easily-extractable glomalin-related soil protein (EE-GRSP) and total GRSP (T-GRSP) concentrations and mean weight diameter (MWD, an indicator of soil aggregate stability) were significantly higher in the mycorrhizosphere of the root+hyphae and root-free hyphae compartments than in the non-mycorrhizosphere. One-time disruption of the ERM network did not influence plant growth and soil properties but notably decreased leaf water status. Periodic disruption of the ERM network at weekly intervals markedly inhibited the mycorrhizal effects on plant growth, leaf water, GRSP production, and MWD in the root+hyphae and hyphae chambers. EE-GRSP was the GRSP fraction most responsive to changes in leaf water and MWD under root+hyphae and hyphae conditions. This suggests that periodic disruption of the ERM network was more impactful than one-time disruption with regard to leaf water, plant growth, and aggregate stability responses, implying that the ERM network aids the host plant metabolically.

  18. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  19. Field performance of "marsh seedless" grapefruit on trifoliate orange inoculated with viroids in Brazil Desempenho do pomeleiro "marsh seedles" enxertado em trifoliata inoculado com viróides no Brasil

    Directory of Open Access Journals (Sweden)

    Eduardo Sanches Stuchi

    2007-12-01

    Some viroids reduce citrus tree growth and may be used for tree size control, aiming at the establishment of closely spaced orchards that may be more productive than conventional ones. To study the effects of citrus viroid inoculation on vegetative growth, yield and fruit quality of 'Marsh Seedless' grapefruit (Citrus paradisi Macf.) grafted on trifoliate orange [Poncirus trifoliata (L.) Raf.], an experiment was set up in January 1991 in Bebedouro, São Paulo State, Brazil. The experimental design was randomized blocks with four treatments and two plants per plot: one viroid isolate of Citrus exocortis viroid (CEVd) + Hop stunt viroid (HSVd, CVd-II, a non-cachexia variant) + Citrus viroid III (CVd-III); a second isolate of Hop stunt viroid (HSVd, CVd-II, a non-cachexia variant) + Citrus viroid III (CVd-III); and two controls: healthy buds (control) and no grafting (absolute control). Inoculation was done in the field, six months after planting, by bud grafting. Both isolates reduced tree growth (trunk diameter, plant height, canopy diameter and volume). Non-inoculated trees yielded better (average of eleven harvests) than inoculated ones, but productivity was the same after 150 months. Fruit quality was affected by viroid inoculation, but not in a restrictive way. The use of such severe dwarfing isolates for high-density plantings of grapefruit on trifoliate orange rootstock is not recommended.

  20. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    Science.gov (United States)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contours of component elements. This paper describes joint work by specialists in medicine and applied mathematics on elaborating new algorithms and methods for medical 2D and 3D imagery.
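
    A brief sketch of such a contour-identification step, assuming OpenCV's standard threshold-and-trace calls; the input file name and area cutoff are placeholders, not the authors' algorithm.

    ```python
    # Extract candidate element contours from a 2-D CT slice with OpenCV.
    import cv2

    img = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input
    blurred = cv2.GaussianBlur(img, (5, 5), 0)
    # Otsu's method picks a global threshold separating tissue from background.
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    # Keep only contours large enough to be structures rather than noise.
    elements = [c for c in contours if cv2.contourArea(c) > 100.0]
    print(f"{len(elements)} candidate element contours found")
    ```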

  1. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2011-06-24

    ... Business Information by Computer Sciences Corporation and Its Identified Subcontractors AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: EPA has authorized its contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access information which has...

  2. Dissection of the Mechanism for Compatible and Incompatible Graft Combinations of Citrus grandis (L.) Osbeck (‘Hongmian Miyou’)

    Directory of Open Access Journals (Sweden)

    Wen He

    2018-02-01

    ‘Hongmian miyou’ (Citrus grandis L. Osbeck) is a mutant of ‘Guanxi miyou’ with a different spongy layer coloration. Trifoliate orange (Poncirus trifoliata) is widely used as a rootstock in ‘Guanxi miyou’ grafting, whereas ‘Hongmian miyou’ is incompatible with available trifoliate orange rootstocks. To explore the reasons for the etiolation of leaves of ‘Hongmian miyou’/trifoliate orange, anatomical differences among graft unions, gene expression profiles, and auxin levels of the scion were investigated in this study. A histological assay indicated no significant difference in anatomical structure between the compatible and incompatible combinations. A total of 1950 significantly differentially expressed genes (DEGs) were identified and analyzed. Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway enrichment analysis revealed that genes involved in carbohydrate metabolism, energy metabolism, amino acid metabolism, and plant hormone signal transduction were significantly enriched. Moreover, nine genes in the auxin pathway were upregulated and three were downregulated in the compatible combination compared with the incompatible group. Further experiments verified that indole-3-acetic acid (IAA) content increased in the compatible graft combination, suggesting that IAA might promote graft compatibility.

  3. Global identifiability of linear compartmental models--a computer algebra algorithm.

    Science.gov (United States)

    Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C

    1998-01-01

    A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is, thus, a prerequisite for parameter estimation of biological dynamic models. Global identifiability is however difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in nonlinearity degree and number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability) is presented, which combines the topological transfer function method with the Buchberger algorithm, to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general structure compartmental models from general multi input-multi output experiments. Examples of usage of GLOBI to analyze a priori global identifiability of some complex biological compartmental models are provided.
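
    A toy version of the exhaustive-summary question GLOBI automates, using sympy's equation solver rather than GLOBI's topological transfer function method combined with the Buchberger algorithm; the two-compartment model and measured coefficient values are textbook assumptions, not taken from the paper.

    ```python
    # Global identifiability of a simple two-compartment model via its
    # transfer-function coefficients (the "exhaustive summary").
    import sympy as sp

    k01, k21, k12 = sp.symbols("k01 k21 k12", positive=True)
    # For dx1/dt = -(k01 + k21) x1 + k12 x2, dx2/dt = k21 x1 - k12 x2, y = x1,
    # the observable transfer-function coefficients are (signs dropped):
    phi1 = k01 + k21 + k12      # denominator s-coefficient
    phi2 = k01 * k12            # denominator constant term
    phi3 = k12                  # numerator constant term

    # Suppose an experiment yielded phi1 = 6, phi2 = 4, phi3 = 2 (assumed values):
    sols = sp.solve([phi1 - 6, phi2 - 4, phi3 - 2], [k01, k21, k12], dict=True)
    print(sols)   # a single solution => globally identifiable from this output
    ```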

  4. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words.

    Science.gov (United States)

    Wang, Bingkun; Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the spurt of online user-generated contents on web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying polarity of sentiment words only consider the positive and negative polarity by the Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve the performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment words datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.
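
    A loose illustration of the paper's idea, not its published model: derive a word's polarity intensity from morpheme-level scores, then grade it with triangular fuzzy membership functions instead of a crisp positive/negative label. The mini-lexicon and membership parameters are invented.

    ```python
    # Fuzzy polarity of Chinese sentiment words from morpheme scores (sketch).
    MORPHEME_POLARITY = {"美": 0.9, "丑": -0.8, "好": 0.8, "坏": -0.9, "不": -1.0}

    def word_intensity(word):
        """Average morpheme polarity; a negation morpheme flips the sign."""
        sign = -1.0 if "不" in word else 1.0
        body = [MORPHEME_POLARITY.get(ch, 0.0) for ch in word if ch != "不"]
        return sign * (sum(body) / len(body)) if body else 0.0

    def triangular(x, a, b, c):
        """Triangular fuzzy membership with peak at b on support [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_polarity(word):
        x = word_intensity(word)
        return {"positive": triangular(x, 0.0, 1.0, 2.0),
                "negative": triangular(x, -2.0, -1.0, 0.0)}

    for w in ["美好", "不好"]:
        print(w, fuzzy_polarity(w))
    ```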

  5. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words

    Directory of Open Access Journals (Sweden)

    Bingkun Wang

    2015-01-01

    With the spurt of online user-generated contents on web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying polarity of sentiment words only consider the positive and negative polarity by the Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve the performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment words datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.

  6. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words

    Science.gov (United States)

    Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the spurt of online user-generated contents on web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying polarity of sentiment words only consider the positive and negative polarity by the Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve the performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment words datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods. PMID:26106409

  7. A dark incubation period is important for Agrobacterium-mediated transformation of mature internode explants of sweet orange, grapefruit, citron, and a citrange rootstock.

    Science.gov (United States)

    Marutani-Hert, Mizuri; Bowman, Kim D; McCollum, Greg T; Mirkov, T Erik; Evens, Terence J; Niedz, Randall P

    2012-01-01

    Citrus has an extended juvenile phase and trees can take 2-20 years to transition to the adult reproductive phase and produce fruit. For citrus variety development this substantially prolongs the time before adult traits, such as fruit yield and quality, can be evaluated. Methods to transform tissue from mature citrus trees would shorten the evaluation period via the direct production of adult phase transgenic citrus trees. Factors important for promoting shoot regeneration from internode explants from adult phase citrus trees were identified and included a dark incubation period and the use of the cytokinin zeatin riboside. Transgenic trees were produced from four citrus types including sweet orange, citron, grapefruit, and a trifoliate hybrid using the identified factors and factor settings. The critical importance of a dark incubation period for shoot regeneration was established. These results confirm previous reports on the feasibility of transforming mature tissue from sweet orange and are the first to document the transformation of mature tissue from grapefruit, citron, and a trifoliate hybrid.

  8. A dark incubation period is important for Agrobacterium-mediated transformation of mature internode explants of sweet orange, grapefruit, citron, and a citrange rootstock.

    Directory of Open Access Journals (Sweden)

    Mizuri Marutani-Hert

    BACKGROUND: Citrus has an extended juvenile phase and trees can take 2-20 years to transition to the adult reproductive phase and produce fruit. For citrus variety development this substantially prolongs the time before adult traits, such as fruit yield and quality, can be evaluated. Methods to transform tissue from mature citrus trees would shorten the evaluation period via the direct production of adult phase transgenic citrus trees. METHODOLOGY/PRINCIPAL FINDINGS: Factors important for promoting shoot regeneration from internode explants from adult phase citrus trees were identified and included a dark incubation period and the use of the cytokinin zeatin riboside. Transgenic trees were produced from four citrus types including sweet orange, citron, grapefruit, and a trifoliate hybrid using the identified factors and factor settings. SIGNIFICANCE: The critical importance of a dark incubation period for shoot regeneration was established. These results confirm previous reports on the feasibility of transforming mature tissue from sweet orange and are the first to document the transformation of mature tissue from grapefruit, citron, and a trifoliate hybrid.

  9. Effects of Salinity Stress on Gas Exchange, Growth, and Nutrient Concentrations of Two Citrus Rootstocks

    Directory of Open Access Journals (Sweden)

    D. Khoshbakht

    2015-03-01

    A greenhouse study was undertaken to assess the salt tolerance of two citrus rootstocks, namely Bakraii (Citrus sp.) and Trifoliate orange (Poncirus trifoliata). A factorial experiment in a completely randomized design (CRD) with three replications and four salt levels (0, 20, 40 and 60 mM NaCl) was conducted. After eight weeks of treatment, leaf number, plant height, leaf area, fresh and dry weights of leaves, stems and roots, root length, chlorophyll content, net CO2 assimilation rate (ACO2), stomatal conductance (gs), transpiration (E), water use efficiency (WUE) and ion concentrations were measured. Salinity decreased growth and net gas exchange, with Trifoliate orange showing the greatest decreases in growth indices and net gas exchange compared with Bakraii. Trifoliate orange limited the transfer of sodium to leaves at low salt levels, but not at high salt levels. Chloride accumulation in leaves and roots was lower in Bakraii than in Trifoliate orange. The lower Cl- concentration in the leaves of Bakraii suggests that its salinity tolerance is associated with less transport of Cl- to the leaves. Salinity increased K+ and decreased Mg2+ and Ca2+ concentrations in the leaves of both rootstocks. It is proposed that salt stress acts on plant physiological processes through changes in plant growth, Cl- and Na+ toxicity, and mineral distribution, decreasing chlorophyll content and reducing the photosynthetic efficiency of these citrus species.

  10. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship of the characteristics of the examiners to the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  11. Application of artificial neural networks to identify equilibration in computer simulations

    Science.gov (United States)

    Leibowitz, Mitchell H.; Miller, Evan D.; Henry, Michael M.; Jankowski, Eric

    2017-11-01

    Determining which microstates generated by a thermodynamic simulation are representative of the ensemble for which sampling is desired is a ubiquitous, underspecified problem. Artificial neural networks are one type of machine learning algorithm that can provide a reproducible way to apply pattern recognition heuristics to underspecified problems. Here we use the open-source TensorFlow machine learning library and apply it to the problem of identifying which hypothetical observation sequences from a computer simulation are “equilibrated” and which are not. We generate training populations and test populations of observation sequences with embedded linear and exponential correlations. We train a two-neuron artificial network to distinguish the correlated and uncorrelated sequences. We find that this simple network is good enough for > 98% accuracy in identifying exponentially-decaying energy trajectories from molecular simulations.
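
    A simplified stand-in for the classifier described above: instead of the paper's TensorFlow two-neuron network, a NumPy logistic-regression sketch over two per-sequence features (linear drift and lag-1 autocorrelation) separates flat from exponentially decaying observation sequences.

    ```python
    # Classify observation sequences as "equilibrated" (flat) vs. decaying.
    import numpy as np

    rng = np.random.default_rng(1)

    def features(seq):
        t = np.arange(len(seq))
        slope = np.polyfit(t, seq, 1)[0]              # linear drift
        ac = np.corrcoef(seq[:-1], seq[1:])[0, 1]     # lag-1 autocorrelation
        return np.array([slope, ac])

    def make_seq(equilibrated, length=200):
        t = np.arange(length, dtype=float)
        decay = 0.0 if equilibrated else np.exp(-t / 60.0)   # embedded correlation
        return decay + rng.normal(scale=0.1, size=length)

    X = np.array([features(make_seq(k)) for k in (True, False) for _ in range(200)])
    y = np.array([1.0] * 200 + [0.0] * 200)

    w, b = np.zeros(2), 0.0
    for _ in range(2000):                             # plain gradient descent
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= 0.1 * (X.T @ grad) / len(y)
        b -= 0.1 * grad.mean()

    pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
    print(f"training accuracy: {(pred == (y == 1)).mean():.1%}")
    ```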

  12. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  13. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites with singular value decomposition (SVD) of the inferred motif activities, to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities was downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
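
    A hedged sketch of the SVD step layered on top of MARA, with a random placeholder matrix standing in for the inferred motif activities.

    ```python
    # Decompose a (motifs x samples) activity matrix into regulatory programs.
    import numpy as np

    rng = np.random.default_rng(42)
    n_motifs, n_samples = 190, 24                 # e.g. TF motifs x liver samples
    activities = rng.normal(size=(n_motifs, n_samples))   # placeholder for MARA output

    # Center each motif's activity across samples before decomposing.
    A = activities - activities.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    var_explained = s**2 / np.sum(s**2)
    print("variance explained by first 3 components:", var_explained[:3].round(3))

    # U[:, k] ranks motifs driving regulatory program k; Vt[k] is that
    # program's activity profile across samples/conditions.
    top = np.argsort(np.abs(U[:, 0]))[::-1][:5]
    print("top motifs in component 0:", top)
    ```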

  14. Identifying Benefits and risks associated with utilizing cloud computing

    OpenAIRE

    Shayan, Jafar; Azarnik, Ahmad; Chuprat, Suriayati; Karamizadeh, Sasan; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is an emerging computing model where IT and computing operations are delivered as services in highly scalable and cost effective manner. Recently, embarking this new model in business has become popular. Companies in diverse sectors intend to leverage cloud computing architecture, platforms and applications in order to gain higher competitive advantages. Likewise other models, cloud computing brought advantages to attract business but meanwhile fostering cloud has led to some ...

  15. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor but only a minority of these drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.

  16. Circadian rhythm in 15O-labeled water uptake manner of a soybean plant by PETIS (Positron Emitting Tracer Imaging System)

    International Nuclear Information System (INIS)

    Nakanishi, Tomoko M.; Yokota, Harumi; Tanoi, Keitaro; Furukawa, Jun; Ikeue, Natsuko; Ookuni, Yoko; Uchida, Hiroshi; Tsuji, Atsunori

    2001-01-01

    We present a circadian rhythm in the water uptake manner of a soybean plant through real-time imaging of water labeled with 15O. Nitrogen gas was irradiated with deuterons accelerated by a cyclotron at Hamamatsu Photonics Co. to produce 15O-labeled water. The 15O-labeled water was then supplied to a soybean plant from the root, and the real-time water uptake was measured for 20 min by the Positron Emitting Tracer Imaging System (PETIS). All measurement positions were on stems: two points on the internode between the root and the first leaves, one between the first leaves and the first trifoliates, and one between the first trifoliates and the second trifoliates. The water uptake gradually increased and peaked at around 13:00, especially at the basal part of the stem, then gradually decreased until 17:00. The amount of water taken up by the plant at 13:00 was about 40% higher than that at 17:00. (author)

  17. Identify and rank key factors influencing the adoption of cloud computing for a healthy Electronics

    Directory of Open Access Journals (Sweden)

    Javad Shukuhy

    2015-02-01

    Cloud computing, a new technology built on Internet infrastructure, can bring significant benefits to the electronic delivery of medical services. Applying this technology in e-health requires consideration of various factors. The main objective of this study is to identify and rank the factors influencing the adoption of cloud computing in e-health. Based on the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology fit (HOT-fit) model, 16 sub-factors were identified under four major factors. A survey of 60 experts and academics in health information technology, analyzed with the fuzzy analytic hierarchy process, was used to rank these factors and sub-factors. Given the newness of this topic, no previous internal or external study has considered this number of criteria. The results show that when deciding to adopt cloud computing in e-health, technological, human, organizational and environmental factors must be considered, in that order of importance.
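
    A minimal fuzzy-AHP sketch in the spirit of this study, assuming Buckley's fuzzy geometric mean and an invented 3x3 comparison matrix over three example factors; the actual study ranked 16 sub-factors under four factors.

    ```python
    # Fuzzy AHP with triangular fuzzy numbers (l, m, u); judgments invented.
    import numpy as np

    factors = ["technological", "human", "organizational"]
    # tfn[i][j] = (l, m, u) judgment of factor i relative to factor j
    tfn = np.array([
        [(1, 1, 1),           (1, 2, 3),           (2, 3, 4)],
        [(1/3., 1/2., 1),     (1, 1, 1),           (1, 2, 3)],
        [(1/4., 1/3., 1/2.),  (1/3., 1/2., 1),     (1, 1, 1)],
    ])

    # Fuzzy geometric mean per row (Buckley's method), component-wise.
    geo = np.prod(tfn, axis=1) ** (1.0 / len(factors))   # shape (3, 3)
    total = geo.sum(axis=0)                              # (sum_l, sum_m, sum_u)
    fuzzy_w = geo / total[::-1]                          # l/sum_u, m/sum_m, u/sum_l

    crisp = fuzzy_w.mean(axis=1)                         # centroid defuzzification
    crisp /= crisp.sum()
    for f, w in sorted(zip(factors, crisp), key=lambda p: -p[1]):
        print(f"{f}: {w:.3f}")
    ```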

  18. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at every position angle, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
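
    A sketch of the second tier described above, assuming OpenCV's circle Hough transform on a hypothetical lens-candidate cutout; the parameter values are illustrative and would need survey-specific tuning.

    ```python
    # Look for ring/arc patterns in a candidate-lens cutout image.
    import cv2
    import numpy as np

    img = cv2.imread("lens_candidate.png")            # hypothetical cutout
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                    # suppress pixel noise

    circles = cv2.HoughCircles(
        gray,
        cv2.HOUGH_GRADIENT,
        dp=1,           # accumulator resolution = image resolution
        minDist=20,     # minimum distance between detected ring centers
        param1=100,     # Canny edge upper threshold
        param2=30,      # accumulator votes needed to accept a circle
        minRadius=5,
        maxRadius=40,
    )

    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            print(f"ring candidate at ({x}, {y}), radius {r} px")
    else:
        print("no circular pattern found")
    ```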

  19. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10’s activity in multiple rodent models that is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to that of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  20. Circadian rhythm in 15O-labeled water uptake manner of a soybean plant by PETIS (Positron Emitting Tracer Imaging System)

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, Tomoko M.; Yokota, Harumi; Tanoi, Keitaro; Furukawa, Jun; Ikeue, Natsuko; Ookuni, Yoko [Tokyo Univ. (Japan). Graduate School of Agricultural and Life Sciences; Uchida, Hiroshi; Tsuji, Atsunori

    2001-05-01

    We present a circadian rhythm of water uptake manner in a soybean plant through realtime imaging of water, labeled with 15O. Nitrogen gas was irradiated with deuterons accelerated by a cyclotron at Hamamatsu Photonics Co. to produce 15O-labeled water. Then the 15O-labeled water was supplied to a soybean plant from the root and the realtime water uptake amount was measured for 20 min by Positron Emitting Tracer Imaging System (PETIS). All the targeting positions for the measurements were stems, two points at an internode between root and the first leaves, between the first leaves and the first trifoliates and between the first trifoliates and the second trifoliates. The water uptake amount was gradually increased and showed its maximum at around 13:00, especially at the basal part of the stem. Then the water uptake activity was gradually decreased until 17:00. The water amount taken up by a plant at 13:00 was about 40% higher than that at 17:00. (author)

  1. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    Science.gov (United States)

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications serve as major instructional tools to increase undergraduates' motivation at school. In the recreation field, the use of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  2. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made.

  3. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on the development of computer programs and the investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program-1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described, along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits that satisfy mission requirements and to evaluate the necessary injection accuracy.

  4. Tri-trophic level impact of host plant linamarin and lotaustralin on Tetranychus urticae and its predator Phytoseiulus persimilis.

    Science.gov (United States)

    Rojas, M Guadalupe; Morales-Ramos, Juan Alfredo

    2010-12-01

    The impact of linamarin and lotaustralin content in the leaves of lima beans, Phaseolus lunatus L., on the second and third trophic levels was studied in the two-spotted spider mite, Tetranychus urticae (Koch), and its predator Phytoseiulus persimilis Athias-Henriot. The content of linamarin was higher in terminal trifoliate leaves (435.5 ppm) than in primary leaves (142.1 ppm) of Henderson bush lima beans. However, linamarin concentrations were reversed at the second trophic level, with higher concentrations in spider mites feeding on primary leaves (429.8 ppm) than in those feeding on terminal trifoliate leaves (298.2 ppm). Concentrations of linamarin in the predatory mites were 18.4 and 71.9 ppm when feeding on spider mites grown on primary and terminal leaves, respectively. The concentration of lotaustralin in primary lima bean leaves was 103.12 ppm, and in spider mites feeding on these leaves it was 175.0 ppm. Lotaustralin was absent in lima bean terminal trifoliate leaves and in mites feeding on these leaves. Fecundity of spider mites feeding on lima bean leaves (primary or trifoliate) was not significantly different from that of mites feeding on red bean, Phaseolus vulgaris L., primary leaves. However, the progeny sex ratio (females per male) of spider mites feeding on lima bean leaves was significantly lower than that of the progeny of spider mites feeding on red bean leaves (control). Fecundity and progeny sex ratio of P. persimilis were both significantly affected by the concentration of linamarin present in the prey. Changes in the concentration of linamarin in living tissue across the three trophic levels are discussed.

  5. Características da laranjeira 'Valência' sobre clones e híbridos de porta-enxertos tolerantes à tristeza Characteristics of 'Valencia' sweet orange onto clones and hybrid rootstocks tolerant to the tristeza disease

    Directory of Open Access Journals (Sweden)

    Rita Bordignon

    2003-01-01

    …precocious yield and the high °Brix and juice ratio of this genitor, which are independent genetic determinants. Trifoliate orange induced high juice ratio values, and all of its hybrid groups were superior to Sunki and Rangpur. As to yield, Rangpur was superior to Sunki, and Sunki to Trifoliate orange, while the hybrids showed wide genetic variability: 228 were significantly more productive than Trifoliate orange, 100 superior to Sunki, and 47 to Rangpur. The results showed the high selection potential of these hybrids. Variability and selection potential of 396 hybrids of Rangpur lime 'Limeira' (Citrus limonia) (C), Sunki mandarin (C. sunki) (S), Sour orange 'São Paulo' (C. aurantium) (A) and Trifoliate orange 'Davis A' (Poncirus trifoliata) (T), all tolerant to the tristeza disease, were studied in comparison with the genitors Rangpur lime, Sunki and Trifoliate orange. Hybrids T×A, T×S, S×T, C×S, S×C, C×A and S×A were investigated as to the yield of the first three crops, productivity and several vegetative and industrial characteristics of Valencia sweet orange budded onto them. Rangpur lime, Trifoliate orange and the T×S, S×T, T×A and C×A hybrids began yielding before Sunki and the S×C, C×S and S×A hybrids. This result indicates a dominance of the precocious yield of Trifoliate orange even in the hybrids with Sunki and, conversely, the opposite trend of Sunki and its hybrids, except in the combination with Trifoliate orange. Yield per canopy area induced by Trifoliate orange was low, contrasting with Rangpur lime, Sunki mandarin and the T×S and S×T hybrids. A close relationship was observed between the diameter of scions, the diameter of rootstocks right after transplanting to the field, and the same parameters in the subsequent years. Height and canopy, rootstock and scion trunk diameters were highly correlated and useful for composing a vigor index. Trifoliate orange and Sunki mandarin are the most divergent genitors regarding vigor, and the

  6. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
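
    As an illustration only (the record describes a patented technique, not this code), here is a minimal Python sketch of a list-based replica index with checksum validation; the `ReplicaIndex` class and its layout are hypothetical:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ReplicaIndex:
    """List-based index: for each file, a list of every stored copy."""
    locations: dict = field(default_factory=dict)   # name -> [(node, path), ...]
    checksums: dict = field(default_factory=dict)   # name -> expected SHA-256

    def register(self, name, copies, data: bytes):
        # First entry is the primary copy; the rest are replicas.
        self.locations[name] = list(copies)
        self.checksums[name] = hashlib.sha256(data).hexdigest()

    def lookup(self, name):
        """Answer a query with all known storage locations."""
        return self.locations.get(name, [])

    def validate(self, name, data: bytes) -> bool:
        """Check a retrieved copy against the stored checksum."""
        return hashlib.sha256(data).hexdigest() == self.checksums.get(name)
```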

  7. 7 CFR 301.75-3 - Regulated articles.

    Science.gov (United States)

    2010-01-01

    ..., tangerine, satsuma, tangor, citron, sweet orange, sour orange, mandarin, tangelo, ethrog, kumquat, limequat, calamondin, trifoliate orange, and wampi. (b) Grass, plant, and tree clippings. (c) Any other product...

  8. Application of Computer Simulation to Identify Erosion Resistance of Materials of Wet-steam Turbine Blades

    Science.gov (United States)

    Korostelyov, D. A.; Dergachyov, K. V.

    2017-10-01

    The problem of evaluating, by means of computer simulation, the effectiveness of the materials, coatings, linings and solders used in wet-steam turbine rotor blades is considered. Numerical experiments to determine the erosion resistance of wet-steam turbine blade materials are described. Kinetic curves for the eroded area and the mass of worn rotor blade material have been obtained for the K-300-240 LMP turbine and for the turbine of the atomic icebreaker “Lenin”. A conclusion is drawn about the effectiveness of using different erosion-resistant materials and rotor blade protection configurations.

  9. A novel computational method identifies intra- and inter-species recombination events in Staphylococcus aureus and Streptococcus pneumoniae.

    Directory of Open Access Journals (Sweden)

    Lisa Sanguinetti

    Full Text Available Advances in high-throughput DNA sequencing technologies have led to an explosion in the number of sequenced bacterial genomes. Comparative sequence analysis frequently reveals evidence of homologous recombination occurring with different mechanisms and rates in different species, but the large-scale use of computational methods to identify recombination events is hampered by their high computational costs. Here, we propose a new method to identify recombination events in large datasets of whole genome sequences. Using a filtering procedure on the gene conservation profiles of a test genome against a panel of strains, this algorithm identifies sets of contiguous genes acquired by homologous recombination. The locations of the recombination breakpoints are determined using a statistical test that is able to account for the differences in the natural rate of evolution between different genes. The algorithm was tested on a dataset of 75 genomes of Staphylococcus aureus and 50 genomes comprising different streptococcal species, and was able to detect intra-species recombination events in S. aureus and in Streptococcus pneumoniae. Furthermore, we found evidence of an inter-species exchange of genetic material between S. pneumoniae and Streptococcus mitis, a closely related commensal species that colonizes the same ecological niche. The method has been implemented in an R package, Reco, which is freely available from the supplementary material, and provides a rapid screening tool to investigate recombination on a genome-wide scale from sequence data.
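
    The abstract describes the filtering idea only at a high level; below is a hedged Python sketch, not the Reco package's actual algorithm, that flags contiguous runs of genes whose conservation profile departs from the genome-wide background (the z-score threshold and minimum run length are illustrative assumptions):

```python
import numpy as np

def candidate_recombinant_blocks(profile, min_len=5, z=3.0):
    """Flag contiguous runs of genes whose conservation value deviates
    from the genome-wide background by more than z standard deviations.

    profile: one value per gene, e.g. mean identity of the test genome's
    gene against the panel of strains (illustrative input format)."""
    profile = np.asarray(profile, dtype=float)
    mu, sd = profile.mean(), profile.std()
    unusual = np.abs(profile - mu) > z * sd
    blocks, start = [], None
    for i, flag in enumerate(unusual):
        if flag and start is None:
            start = i                      # open a candidate block
        elif not flag and start is not None:
            if i - start >= min_len:
                blocks.append((start, i - 1))
            start = None
    if start is not None and len(unusual) - start >= min_len:
        blocks.append((start, len(unusual) - 1))
    return blocks                          # [(first_gene, last_gene), ...]
```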

  10. Coronary plaque quantification and fractional flow reserve by coronary computed tomography angiography identify ischaemia-causing lesions

    DEFF Research Database (Denmark)

    Gaur, Sara; Øvrehus, Kristian Altern; Dey, Damini

    2016-01-01

    AIMS: Coronary plaque characteristics are associated with ischaemia. Differences in plaque volumes and composition may explain the discordance between coronary stenosis severity and ischaemia. We evaluated the association between coronary stenosis severity, plaque characteristics, coronary computed tomography angiography (CTA)-derived fractional flow reserve (FFRCT), and lesion-specific ischaemia identified by FFR in a substudy of the NXT trial (Analysis of Coronary Blood Flow Using CT Angiography: Next Steps). METHODS AND RESULTS: Coronary CTA stenosis, plaque volumes, FFRCT, and FFR were assessed…

  11. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
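
    MXLKID itself is LRLTRAN code for a CDC7600; as a minimal modern analogue of the same idea, the sketch below assumes Gaussian measurement noise on a toy nonlinear model, so maximizing the likelihood reduces to minimizing a weighted sum of squared residuals:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(theta, t, y, sigma=0.1):
    """Gaussian negative log-likelihood (up to a constant) for a toy
    nonlinear model y(t) = a * exp(-b * t); theta = (a, b)."""
    a, b = theta
    resid = y - a * np.exp(-b * t)
    return 0.5 * np.sum((resid / sigma) ** 2)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 50)
y = 2.0 * np.exp(-0.7 * t) + rng.normal(0.0, 0.1, t.size)  # noisy measurements

fit = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(t, y))
print(fit.x)  # maximum likelihood estimates of (a, b), near (2.0, 0.7)
```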

  12. Diagnostic Accuracy of Periapical Radiography and Cone-beam Computed Tomography in Identifying Root Canal Configuration of Human Premolars.

    Science.gov (United States)

    Sousa, Thiago Oliveira; Haiter-Neto, Francisco; Nascimento, Eduarda Helena Leandro; Peroni, Leonardo Vieira; Freitas, Deborah Queiroz; Hassan, Bassam

    2017-07-01

    The aim of this study was to assess the diagnostic accuracy of periapical radiography (PR) and cone-beam computed tomographic (CBCT) imaging in the detection of the root canal configuration (RCC) of human premolars. PR and CBCT imaging of 114 extracted human premolars were evaluated by 2 oral radiologists. RCC was recorded according to Vertucci's classification. Micro-computed tomographic imaging served as the gold standard to determine RCC. Accuracy, sensitivity, specificity, and predictive values were calculated. The Friedman test compared both PR and CBCT imaging with the gold standard. CBCT imaging showed higher values for all diagnostic tests compared with PR. Accuracy was 0.55 and 0.89 for PR and CBCT imaging, respectively. There was no difference between CBCT imaging and the gold standard, whereas PR differed from both CBCT and micro-computed tomographic imaging (P < .0001). CBCT imaging was more accurate than PR for evaluating different types of RCC individually. Canal configuration types III, VII, and "other" were poorly identified on CBCT imaging with a detection accuracy of 50%, 0%, and 43%, respectively. With PR, all canal configurations except type I were poorly visible. PR presented low performance in the detection of RCC in premolars, whereas CBCT imaging showed no difference compared with the gold standard. Canals with complex configurations were less identifiable using both imaging methods, especially PR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
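
    For reference, the diagnostic quantities reported in records like this one reduce to simple ratios over a 2x2 confusion matrix; a minimal sketch with generic formulas (the counts shown are made up, not this study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic test quantities from confusion-matrix counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

print(diagnostic_metrics(tp=40, fp=5, tn=60, fn=9))
```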

  13. Rootstocks for 'Tahiti' lime

    Directory of Open Access Journals (Sweden)

    Stenzel Neusa Maria Colauto

    2004-01-01

    Full Text Available The 'Tahiti' lime (Citrus latifolia Tanaka) is an important commercial citrus cultivar in Brazil. 'Rangpur' lime has been used as its main rootstock, but it is susceptible to root rot caused by Phytophthora, reducing tree longevity. An experiment was set up in a randomized block design, with three trees per plot of each rootstock and four replicates, and run for 12 years, aiming to compare the performance of 'IAC-5 Tahiti' lime budded on 'Rangpur' lime (Citrus limonia Osb.); 'C-13' citrange [Citrus sinensis (L.) Osb. × Poncirus trifoliata (L.) Raf.]; 'African' rough lemon (Citrus jambhiri Lush.); 'Volkamer' lemon (Citrus volkameriana Ten. & Pasq.); trifoliate orange [Poncirus trifoliata (L.) Raf.]; 'Sunki' mandarin (Citrus sunki Hort. ex Tan.) and 'Cleopatra' mandarin (Citrus reshni Hort. ex Tan.). Eleven years after the establishment of the orchard, the trees with the greatest canopy development were those budded on 'C-13' citrange and 'African' rough lemon, and both differed significantly from trees budded on trifoliate orange and on 'Sunki' and 'Cleopatra' mandarins, which presented the smallest canopy development. Trees budded on 'Rangpur' lime and 'C-13' citrange had the highest cumulative yields, and differed from trees budded on trifoliate orange and on 'Cleopatra' and 'Sunki' mandarins. There was no rootstock effect on mean fruit weight or on the total soluble solids/acid ratio in the juice. The 'Rangpur' lime and 'Cleopatra' mandarin rootstocks reduced plant longevity.

  14. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems.  To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states.  As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity.  Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada.  This open-access computational program (JAVA code and executable file was developed and tested to support an analysis of thousands of individual records and up to 100 disease diagnoses or categories.  Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting.  The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset.  An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients.  Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity.  Its careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
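
    The Tool itself is a JAVA program; as a rough Python sketch of the core counting step it describes, one can treat each patient's unordered diagnosis set as a combination and each ordered diagnosis sequence as a permutation (the toy records below are hypothetical):

```python
from collections import Counter

# Hypothetical records: each patient's list of chronic disease categories.
records = [
    ["diabetes", "hypertension"],
    ["hypertension", "diabetes"],           # same combination, other order
    ["copd", "diabetes", "hypertension"],
]

combinations = Counter(frozenset(r) for r in records)  # order-free
permutations = Counter(tuple(r) for r in records)      # order-preserving

print(len(combinations), "unique combinations,",
      len(permutations), "unique permutations")
```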

  15. Abiodun et al (19)

    African Journals Online (AJOL)

    DELL

    2018-03-01

    Mar 1, 2018 ... Physicochemical properties of modified trifoliate yam starches were determined. ... but at ambient temperature, oxidized starch had high paste clarity during storage. Acid ... of other starches from corn, cassava, potato and.

  16. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria, and these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome, and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
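
    As one concrete example of screening for a "different genomic signature" (a common PAI cue, offered here as an illustrative assumption rather than a method prescribed by this review), a sliding-window GC-content scan can flag windows that deviate from the genome-wide average:

```python
def gc_outlier_windows(genome, win=10_000, step=5_000, dev=0.05):
    """Flag windows whose GC fraction deviates from the genome-wide GC
    by more than `dev` -- candidate horizontally acquired regions."""
    def gc(s):
        return (s.count("G") + s.count("C")) / max(len(s), 1)
    background = gc(genome)
    hits = []
    for start in range(0, len(genome) - win + 1, step):
        window = genome[start:start + win]
        if abs(gc(window) - background) > dev:
            hits.append((start, start + win, round(gc(window), 3)))
    return hits
```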

  17. Computed Tomography Fractional Flow Reserve Can Identify Culprit Lesions in Aortoiliac Occlusive Disease Using Minimally Invasive Techniques.

    Science.gov (United States)

    Ward, Erin P; Shiavazzi, Daniele; Sood, Divya; Marsden, Allison; Lane, John; Owens, Erik; Barleben, Andrew

    2017-01-01

    Currently, the gold standard diagnostic examination for significant aortoiliac lesions is angiography. Fractional flow reserve (FFR) has a growing body of literature in coronary artery disease as a minimally invasive diagnostic procedure. Improvements in numerical hemodynamics have allowed for an accurate and minimally invasive approach to estimating FFR, utilizing cross-sectional imaging. We aim to demonstrate a similar approach to aortoiliac occlusive disease (AIOD). A retrospective review evaluated 7 patients with claudication and cross-sectional imaging showing AIOD. FFR was subsequently measured during conventional angiogram with pull-back pressures in a retrograde fashion. To estimate computed tomography (CT) FFR, CT angiography (CTA) image data were analyzed using the SimVascular software suite to create a computational fluid dynamics model of the aortoiliac system. Inlet flow conditions were derived based on cardiac output, while 3-element Windkessel outlet boundary conditions were optimized to match the expected systolic and diastolic pressures, with outlet resistance distributed based on Murray's law. The data were evaluated with a Student's t-test and receiver operating characteristic curve. All patients had evidence of AIOD on CT and FFR was successfully measured during angiography. The modeled data were found to have high sensitivity and specificity between the measured and CT FFR (P = 0.986, area under the curve = 1). The average difference between the measured and calculated FFRs was 0.136, with a range from 0.03 to 0.30. CT FFR successfully identified aortoiliac lesions with significant pressure drops that were identified with angiographically measured FFR. CT FFR has the potential to provide a minimally invasive approach to identify flow-limiting stenosis for AIOD. Copyright © 2016 Elsevier Inc. All rights reserved.
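
    A hedged sketch of the outlet boundary-condition step mentioned above: Murray's law makes flow scale with vessel radius cubed, so for a shared downstream pressure each outlet's resistance can be set inversely proportional to r³ (illustrative only; SimVascular's actual Windkessel tuning also matches target systolic and diastolic pressures):

```python
def murray_outlet_resistances(r_total, radii_mm):
    """Distribute a net outflow resistance across outlets so that flow
    q_i scales with r_i**3 (Murray's law). For resistances in parallel,
    R_i = R_total * (sum_j r_j**3) / r_i**3 recovers R_total overall."""
    cubes = [r ** 3 for r in radii_mm]
    total = sum(cubes)
    return [r_total * total / c for c in cubes]

# Example: a slightly asymmetric iliac bifurcation.
print(murray_outlet_resistances(1000.0, [4.0, 3.5]))
```

    In the example, the larger outlet is assigned the lower resistance and therefore carries proportionally more flow.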

  18. Highly efficient computer algorithm for identifying layer thickness of atomically thin 2D materials

    Science.gov (United States)

    Lee, Jekwan; Cho, Seungwan; Park, Soohyun; Bae, Hyemin; Noh, Minji; Kim, Beom; In, Chihun; Yang, Seunghoon; Lee, Sooun; Seo, Seung Young; Kim, Jehyun; Lee, Chul-Ho; Shim, Woo-Young; Jo, Moon-Ho; Kim, Dohun; Choi, Hyunyong

    2018-03-01

    Research on layered materials, such as transition-metal dichalcogenides (TMDs), has demonstrated that their optical, electrical and mechanical properties strongly depend on the layer number N. Thus, efficient and accurate determination of N is the most crucial step before the associated device fabrication. The existing experimental technique using an optical microscope is the most widely used one to identify N. However, a critical drawback of this approach is that it relies on extensive laboratory experience to estimate N; it requires a very time-consuming image-searching task assisted by human eyes and secondary measurements, such as atomic force microscopy and Raman spectroscopy, which are necessary to confirm N. In this work, we introduce a computer algorithm based on the image analysis of a quantized optical contrast. We show that our algorithm can be applied to a wide variety of layered materials, including graphene, MoS2, and WS2, regardless of substrate. The algorithm largely consists of two parts. First, it sets up an appropriate boundary between target flakes and the substrate. Second, to compute N, it automatically calculates the optical contrast using an adaptive RGB estimation process for each target, which results in a matrix of different integer Ns and returns a matrix map of Ns onto the target flake positions. Using conventional desktop computational power, the time taken to display the final N matrix was 1.8 s on average for an image of 1280 by 960 pixels, with a high accuracy of 90% (six estimation errors among 62 samples) when compared to the other methods. To show the effectiveness of our algorithm, we also apply it to TMD flakes transferred onto optically transparent c-axis sapphire substrates and obtain a similarly high accuracy of 94% (two estimation errors among 34 samples).
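
    A minimal sketch of the quantized-contrast idea, assuming a fixed (hypothetical) contrast step per layer; the paper's adaptive RGB estimation is more involved than this:

```python
import numpy as np

def layer_number_map(image, flake_mask, contrast_per_layer=0.04):
    """Quantize per-pixel optical contrast into integer layer numbers.

    image: H x W x 3 RGB array; flake_mask: H x W boolean array (True on
    the flake). Assumes contrast grows by ~contrast_per_layer per layer."""
    img = image.astype(float)
    substrate_rgb = img[~flake_mask].mean(axis=0)        # mean substrate colour
    contrast = (substrate_rgb - img) / substrate_rgb     # per pixel, per channel
    c = contrast.mean(axis=-1)                           # average over R, G, B
    n_map = np.rint(c / contrast_per_layer).astype(int)  # quantize to integers
    n_map[~flake_mask] = 0                               # substrate -> N = 0
    return n_map
```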

  19. Photosynthate partitioning and distribution in soybean plant

    International Nuclear Information System (INIS)

    Latche, J.; Cavalie, G.

    1983-01-01

    Plants were grown in a controlled environment chamber and fed with a modified Hoagland solution containing nitrate as nitrogen source (N+ medium). Soybeans 33 days old (flowering stage) and 45 and 56 days old (pod formation and filling stages) were used for experimentation. In each experiment, the eighth trifoliate leaf (F{sub 8}) was exposed to {sup 14}CO{sub 2} (10 μCi; 400 vpm) in the light (80 W x m{sup -2}) for 30 min. After a 6 h chase period (22-25 deg C; 80 W x m{sup -2}), the radiocarbon distribution among plant parts was determined and the labelled compounds were identified. (orig.)

  20. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
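
    A hedged sketch of the sliding-window step described above, using a fixed window of neighbouring probes and a per-chromosome cutoff that keeps the most extreme ~10% of window means (swatCGH's adaptive thresholds of varying stringency are more elaborate):

```python
import numpy as np

def top_windows(log2_ratios, win=11, keep_frac=0.10):
    """Smooth probe log2 ratios with a sliding window of `win` neighbouring
    probes, then keep positions whose |window mean| is in the chromosome's
    top `keep_frac` -- candidate CNA regions."""
    x = np.asarray(log2_ratios, dtype=float)
    smoothed = np.convolve(x, np.ones(win) / win, mode="same")
    cutoff = np.quantile(np.abs(smoothed), 1.0 - keep_frac)
    return np.flatnonzero(np.abs(smoothed) >= cutoff)
```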

  1. Extraction of low molecular weight RNA from Citrus trifoliata tissues ...

    African Journals Online (AJOL)

    Jane

    2010-12-20

    Dec 20, 2010 ... many biological and metabolic processes, including tissue ... water before getting LMW RNA. ... 1 cm) were collected from five-year-old trifoliate orange (C. ... using 700 µl 75% ethanol and total RNA was precipitated by.

  2. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology, enables efficient data sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public-domain 3' arrays.
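
    A minimal sketch of flagging probes that contain G-spots, here defined, as an assumption, as a run of four or more guanines:

```python
import re

G_RUN = re.compile(r"G{4,}")  # four or more consecutive guanines

def has_g_spot(probe_seq: str) -> bool:
    """True if the probe contains a G-run that could form a quadruplex tetrad."""
    return bool(G_RUN.search(probe_seq.upper()))

probes = ["ATGGGGTAC", "ACGTACGT", "TTGGGGGGA"]
print([p for p in probes if has_g_spot(p)])  # ['ATGGGGTAC', 'TTGGGGGGA']
```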

  3. Integration of experimental and computational methods for identifying geometric, thermal and diffusive properties of biomaterials

    Science.gov (United States)

    Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz

    2016-04-01

    Knowledge of the physical properties of biomaterials is important in understanding and designing agri-food and wood processing industries. In the study presented in this paper, computational methods were developed and combined with experiments to enhance the identification of agri-food and forest product properties and to predict heat and water transport in such products. They were based on the finite element model of heat and water transport and supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties and using the estimated values to predict the examined processes, and then comparing predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as input for simulation of the examined processes, enabled a reduction in the uncertainty associated with the predictions.

  4. Performance of 'Okitsu' Satsuma Mandarin on nine rootstocks

    Directory of Open Access Journals (Sweden)

    Zuleide Hissano Tazima

    2013-12-01

    Full Text Available Mandarins have become increasingly valued as citrus fruits for the fresh market due to their easy peeling, attractive flavor, and health and nutritional properties. Plant growth and yield, and characteristics of fruits, of 'Okitsu' Satsuma mandarin (Citrus unshiu Marc.) trees grafted on nine rootstocks were evaluated in Londrina, northern Paraná, Brazil. The rootstocks were: 'Rangpur' lime (Citrus limonia Osb.); 'Cleopatra' (Citrus reshni hort. ex Tanaka) and 'Sunki' (Citrus sunki hort. ex Tanaka) mandarins; 'C-13' [Citrus sinensis × Poncirus trifoliata (L.) Raf.] and 'Carrizo' [C. sinensis × P. trifoliata (L.) Raf.] citranges; 'Volkamer' lemon (Citrus volkameriana V. Ten. & Pasq.); trifoliate orange [P. trifoliata (L.) Raf.]; 'Caipira DAC' sweet orange [C. sinensis (L.) Osb.] and 'Swingle' citrumelo [Citrus paradisi Macfad. cv. Duncan × P. trifoliata (L.) Raf.]. The highest plant growth was for the trees on 'Cleopatra' mandarin and 'Caipira DAC' sweet orange. In contrast, the smallest size was for the trees on 'Volkamer' lemon and trifoliate orange. The largest difference between the trunk diameter below and above the grafting point was induced by 'Swingle' citrumelo. Trees of 'Okitsu' Satsuma mandarin on 'Swingle' citrumelo presented the highest yield, while 'C-13', 'Carrizo', 'Sunki', and 'Swingle' induced the largest fruit masses. With regard to fruit characteristics, 'Carrizo' and trifoliate orange induced the best ratio and juice content. Based on theoretical values, 'Rangpur' lime and 'Volkamer' lemon induced the lowest yields

  5. Sentinel nodes identified by computed tomography-lymphography accurately stage the axilla in patients with breast cancer

    International Nuclear Information System (INIS)

    Motomura, Kazuyoshi; Sumino, Hiroshi; Noguchi, Atsushi; Horinouchi, Takashi; Nakanishi, Katsuyuki

    2013-01-01

    Sentinel node biopsy often results in the identification and removal of multiple nodes as sentinel nodes, although most of these nodes could be non-sentinel nodes. This study investigated whether computed tomography-lymphography (CT-LG) can distinguish sentinel nodes from non-sentinel nodes and whether sentinel nodes identified by CT-LG can accurately stage the axilla in patients with breast cancer. This study included 184 patients with breast cancer and clinically negative nodes. Contrast agent was injected interstitially. The location of sentinel nodes was marked on the skin surface using a CT laser light navigator system. Lymph nodes located just under the marks were first removed as sentinel nodes. Then, all dyed nodes or all hot nodes were removed. The mean number of sentinel nodes identified by CT-LG was significantly lower than that of dyed and/or hot nodes removed (1.1 vs 1.8, p < 0.0001). Twenty-three (12.5%) patients had ≥2 sentinel nodes identified by CT-LG removed, whereas 94 (51.1%) patients had ≥2 dyed and/or hot nodes removed (p < 0.0001). Pathological evaluation demonstrated that 47 (25.5%) of 184 patients had metastasis to at least one node. All 47 patients demonstrated metastases to at least one of the sentinel nodes identified by CT-LG. CT-LG can distinguish sentinel nodes from non-sentinel nodes, and sentinel nodes identified by CT-LG can accurately stage the axilla in patients with breast cancer. Successful identification of sentinel nodes using CT-LG may facilitate image-based diagnosis of metastasis, possibly leading to the omission of sentinel node biopsy

  6. Compression, Mechanical and Release Properties of Chloroquine ...

    African Journals Online (AJOL)

    Results: Tablet formulations containing trifoliate yam starch exhibited faster onset and higher amount of plastic deformation during compression than those containing corn starch. The crushing strength, disintegration and dissolution times of the tablets increased with binder concentration while friability values decreased.

  7. Novel PCA-VIP scheme for ranking MRI protocols and identifying computer-extracted MRI measurements associated with central gland and peripheral zone prostate tumors.

    Science.gov (United States)

    Ginsburg, Shoshana B; Viswanath, Satish E; Bloch, B Nicolas; Rofsky, Neil M; Genega, Elizabeth M; Lenkinski, Robert E; Madabhushi, Anant

    2015-05-01

    To identify computer-extracted features for central gland and peripheral zone prostate cancer localization on multiparametric magnetic resonance imaging (MRI). Preoperative T2-weighted (T2w), diffusion-weighted imaging (DWI), and dynamic contrast-enhanced (DCE) MRI were acquired from 23 men with confirmed prostate cancer. Following radical prostatectomy, the cancer extent was delineated by a pathologist on ex vivo histology and mapped to MRI by nonlinear registration of histology and corresponding MRI slices. In all, 244 features were computer-extracted from MRI, and principal component analysis (PCA) was employed to reduce the data dimensionality so that a generalizable classifier could be constructed. A novel variable importance on projection (VIP) measure for PCA (PCA-VIP) was leveraged to identify computer-extracted MRI features that discriminate between cancer and normal prostate, and these features were used to construct classifiers for cancer localization. Classifiers using features selected by PCA-VIP yielded an area under the curve (AUC) of 0.79 and 0.85 for peripheral zone and central gland tumors, respectively. For tumor localization in the central gland, T2w, DCE, and DWI MRI features contributed 71.6%, 18.1%, and 10.2%, respectively; for peripheral zone tumors, T2w, DCE, and DWI MRI contributed 29.6%, 21.7%, and 48.7%, respectively. PCA-VIP identified relatively stable subsets of MRI features that performed well in localizing prostate cancer on MRI. © 2014 Wiley Periodicals, Inc.
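
    The exact PCA-VIP measure is defined in the paper; as a hedged stand-in, a common VIP-like score weights each feature's squared component loadings by the variance explained per component:

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_feature_importance(X, n_components=10):
    """VIP-like score: squared PCA loadings weighted by explained variance.
    (Illustrative stand-in, not the paper's exact PCA-VIP definition.)"""
    pca = PCA(n_components=n_components).fit(X)
    loadings = pca.components_                  # shape: (components, features)
    weights = pca.explained_variance_ratio_
    score = (weights[:, None] * loadings ** 2).sum(axis=0)
    return np.argsort(score)[::-1], score       # ranked feature indices, scores

rng = np.random.default_rng(1)
X = rng.normal(size=(23, 244))   # e.g. 23 patients x 244 extracted features
ranking, scores = pca_feature_importance(X)
```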

  8. Use of computed tomography to identify atrial fibrillation associated differences in left atrial wall thickness and density.

    Science.gov (United States)

    Dewland, Thomas A; Wintermark, Max; Vaysman, Anna; Smith, Lisa M; Tong, Elizabeth; Vittinghoff, Eric; Marcus, Gregory M

    2013-01-01

    Left atrial (LA) tissue characteristics may play an important role in atrial fibrillation (AF) induction and perpetuation. Although frequently used in clinical practice, computed tomography (CT) has not been employed to describe differences in LA wall properties between AF patients and controls. We sought to noninvasively characterize AF-associated differences in LA tissue using CT. CT images of the LA were obtained in 98 consecutive patients undergoing AF ablation and in 89 controls. A custom software algorithm was used to measure wall thickness and density in four prespecified regions of the LA. On average, LA walls were thinner in AF patients than in controls (-15.5%, 95% confidence interval [CI] -23.2 to -7.8%). The analysis identified significant thinning of the LA wall and regional alterations in tissue density in patients with a history of AF. These findings suggest differences in LA tissue composition can be noninvasively identified and quantified using CT. ©2012, The Authors. Journal compilation ©2012 Wiley Periodicals, Inc.

  9. Establishment of an Off-Highway Vehicle (OHV) Program at Arnold Air Force Base, Tennessee Final Environmental Assessment

    Science.gov (United States)

    2010-05-01

    [Table excerpt, invasive species and threat levels:] Pinus spp. (pine spp.), High; Poncirus trifoliata (trifoliate orange), High; Alliaria petiolata (garlic mustard), Medium; Elaeagnus umbellata (autumn olive), … exposure, and whether other physical stresses such as drought are occurring around the time of exposure (Larkin, 1996). Studies have documented hearing

  10. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  11. Sepsis reconsidered: Identifying novel metrics for behavioral landscape characterization with a high-performance computing implementation of an agent-based model.

    Science.gov (United States)

    Cockrell, Chase; An, Gary

    2017-10-07

    Sepsis affects nearly 1 million people in the United States per year, has a mortality rate of 28-50% and accounts for more than $20 billion a year in hospital costs. Over a quarter century of research has not yielded a single reliable diagnostic test or a directed therapeutic agent for sepsis. Central to this insufficiency is the fact that sepsis remains a clinical/physiological diagnosis representing a multitude of molecularly heterogeneous pathological trajectories. Advances in computational capabilities offered by High Performance Computing (HPC) platforms call for an evolution in the investigation of sepsis, to attempt to define the boundaries of traditional research (bench, clinical and computational) through the use of computational proxy models. We present a novel investigatory and analytical approach, derived from how HPC resources and simulation are used in the physical sciences, to identify the epistemic boundary conditions of the study of clinical sepsis via the use of a proxy agent-based model of systemic inflammation. Current predictive models for sepsis use correlative methods that are limited by patient heterogeneity and data sparseness. We address this issue by using an HPC version of a system-level validated agent-based model of sepsis, the Innate Immune Response ABM (IIRABM), as a proxy system in order to identify boundary conditions for the possible behavioral space for sepsis. We then apply advanced analysis derived from the study of Random Dynamical Systems (RDS) to identify novel means of characterizing system behavior and providing insight into the tractability of traditional investigatory methods. The behavior space of the IIRABM was examined by simulating over 70 million sepsis patients for up to 90 days in a sweep across the following parameters: cardio-respiratory-metabolic resilience, microbial invasiveness, microbial toxigenesis, and degree of nosocomial exposure. In addition to using established methods for describing parameter space, we
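
    A hedged sketch of the kind of four-parameter sweep described, with hypothetical parameter grids and a toy stand-in for an IIRABM run, dispatched across local cores:

```python
from itertools import product
from multiprocessing import Pool

def run_iirabm(params):
    """Stand-in for one simulated patient trajectory (toy outcome score)."""
    resilience, invasiveness, toxigenesis, nosocomial = params
    outcome = resilience - invasiveness * toxigenesis - 0.1 * nosocomial
    return params, outcome

grid = list(product(
    [0.2, 0.4, 0.6, 0.8],   # cardio-respiratory-metabolic resilience
    [0.1, 0.5, 1.0],        # microbial invasiveness
    [0.1, 0.5, 1.0],        # microbial toxigenesis
    [0, 1, 2],              # degree of nosocomial exposure
))

if __name__ == "__main__":
    with Pool() as pool:
        results = pool.map(run_iirabm, grid)
    print(len(results), "parameter combinations simulated")
```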

  12. Environmental Assessment for Establishment of an Off-Highway Vehicle (OHV) Program at Arnold Air Force Base, Tennessee

    Science.gov (United States)

    2010-05-01

    [Table excerpt, invasive species and threat levels:] [Sorghum] halepense (Johnsongrass), High; Vinca minor (periwinkle), High; Wisteria sinensis (wisteria), High; Pinus spp. (pine spp.), High; Poncirus trifoliata (trifoliate… drought are occurring around the time of exposure (Larkin, 1996). Studies have documented hearing loss caused by the noise of dune buggies, dirt bikes

  13. LID: Computer code for identifying atomic and ionic lines below 3500 Angstroms

    International Nuclear Information System (INIS)

    Peek, J.M.; Dukart, R.J.

    1987-08-01

    An interactive computer code has been written to search a data base containing information useful for identifying lines in experimentally observed spectra or for designing experiments. The data base was the basis for the Kelly and Palumbo critical review of well-resolved lines below 2000 Angstroms, includes lines below 3500 Angstroms for atoms and ions of hydrogen through krypton, and was obtained from R.L. Kelly. This code allows the user to search the data base over a user-specified wavelength region, with the search either limited to atoms or ions of the user's choice or covering all atoms and ions contained in the data base. The line information found in the search is stored in a local file for later reference. A plotting capability is provided to graphically display the lines resulting from the search. Several options are available to control the nature of these graphs. It is also possible to bring in data from another source, such as an experimental spectrum, for display along with the lines from the data-base search. Options for manipulating the experimental spectrum's background intensity and wavelength scale are also available to the user.
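
    A minimal sketch of the kind of search LID performs, filtering a line list by wavelength window and an optional set of atoms/ions; the record layout below is hypothetical:

```python
# Hypothetical record layout: (wavelength in Angstroms, species, intensity).
LINES = [
    (1025.72, "H I", 500),
    (1031.91, "O VI", 300),
    (1215.67, "H I", 1000),
    (2796.35, "Mg II", 600),
]

def search(lo, hi, species=None):
    """Return lines inside [lo, hi], optionally limited to chosen atoms/ions."""
    return [rec for rec in LINES
            if lo <= rec[0] <= hi and (species is None or rec[1] in species)]

print(search(1000.0, 1300.0, species={"H I", "O VI"}))
```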

  14. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  15. Identification of water storage tissue in the stem of cowpea plant (Vigna unguiculata Walp) by neutron radiography

    International Nuclear Information System (INIS)

    Nakanishi, T.M.; Don-Jin, K.; Ishii, R.; Matsubayashi, M.

    1999-01-01

    Cowpea (Vigna unguiculata Walp) is considered one of the most drought-resistant species among the pulse crops. It has been suggested that, in the lower part of the stem, parenchymatous tissue for storing water has developed to support drought resistance. However, such tissue had not been identified. In order to identify the water-storing tissue in the stem of the cowpea plant, the authors performed neutron radiography, which provides a non-destructive image of the water distribution pattern in a plant. Common bean and soybean plants were used as references. Comparing the neutron radiographs of the stems of the three species, i.e., cowpea, common bean and soybean, parenchymatous tissue with a water-storing function was distinguished in the internode between the primary leaf and the first trifoliate leaf, specifically in the cowpea plant. (author)

  16. Site-Mutation of Hydrophobic Core Residues Synchronically Poise Super Interleukin 2 for Signaling: Identifying Distant Structural Effects through Affordable Computations

    Directory of Open Access Journals (Sweden)

    Longcan Mei

    2018-03-01

    Full Text Available A superkine variant of interleukin-2 with six site mutations away from the binding interface, developed using the yeast display technique, has previously been characterized as undergoing a distal structure alteration that is responsible for its super-potency; it provides an elegant case study for gaining insight into how to utilize allosteric effects to achieve desirable protein functions. By examining the dynamic network and the allosteric pathways related to those mutated residues using various computational approaches, we found that nanosecond time scale all-atom molecular dynamics simulations can identify the dynamic network as efficiently as an ensemble algorithm. The differentiated pathways for the six core residues form a dynamic network that outlines the area of structure alteration. The results suggest the potential of using affordable computing power to predict the allosteric structures of mutants in knowledge-based mutagenesis.

  17. Control of Bean Rust using Antibiotics Produced by Bacillus and ...

    African Journals Online (AJOL)

    Antibiotic culture filtrates produced by Bacillus (CA5) and Streptomyces spp. were tested for translocation and persistence when applied on snap beans inoculated with rust (Uromyces appendiculatus) in greenhouse pot experiments. The antibiotics were applied on the first trifoliate leaves and translocation was assessed as ...

  18. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    Energy Technology Data Exchange (ETDEWEB)

    Coupaud, Sylvie [University of Glasgow, Centre for Rehabilitation Engineering, Department of Mechanical Engineering, Glasgow (United Kingdom); Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom); McLean, Alan N.; Allan, David B. [Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom)

    2009-10-15

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments in the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites, at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than in thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than males or those with thoracic SCI, respectively. (orig.)

  20. Computer Education and Computer Use by Preschool Educators

    Science.gov (United States)

    Towns, Bernadette

    2010-01-01

    Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…

  1. Use of cone beam computed tomography in identifying postmenopausal women with osteoporosis.

    Science.gov (United States)

    Brasileiro, C B; Chalub, L L F H; Abreu, M H N G; Barreiros, I D; Amaral, T M P; Kakehasi, A M; Mesquita, R A

    2017-12-01

    The aim of this study was to correlate radiometric indices from cone beam computed tomography (CBCT) images with bone mineral density (BMD) in postmenopausal women; quantitative CBCT indices can be used to screen for women with low BMD. Osteoporosis is a disease characterized by the deterioration of bone tissue and the consequent decrease in BMD and increase in bone fragility. Several studies have assessed radiometric indices in panoramic images as low-BMD predictors. Sixty postmenopausal women with indications for dental implants and CBCT evaluation were selected. Dual-energy X-ray absorptiometry (DXA) was performed, and the patients were divided into normal, osteopenia, and osteoporosis groups, according to the World Health Organization (WHO) criteria. Cross-sectional images were used to evaluate the computed tomography mandibular index (CTMI), the computed tomography index (inferior) (CTI (I)) and the computed tomography index (superior) (CTI (S)). Student's t test was used to compare the differences between the indices of the groups, and the intraclass correlation coefficient (ICC) was used to assess observer agreement. Statistical analysis showed a high degree of interobserver and intraobserver agreement for all measurements (ICC > 0.80). The mean values of CTMI, CTI (S), and CTI (I) were lower in the osteoporosis group than in osteopenia and normal patients (p < 0.05). Comparing normal patients and women with osteopenia, there was no statistically significant difference in the mean value of CTI (I) (p = 0.075). Quantitative CBCT indices may help dentists screen for women with low spinal and femoral bone mineral density, so that they can refer postmenopausal women for bone densitometry.

  2. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components of such a hypothetical system, and their performance characteristics, can be studied as a model with predicted input, output, system, and environmental characteristics, using the identified objectives of computing, which can be used on any platform and any type of computing system, and for application automation, without modifications to structure, hardware, or software coding by an exte...

  3. Performance of ‘Okitsu’ satsuma mandarin trees on different rootstocks in Northwestern Parana State

    Directory of Open Access Journals (Sweden)

    Zuleide Hissano Tazima

    2014-10-01

    Full Text Available In the State of Paraná, citrus production is based mainly on Rangpur lime rootstock, which has given good results with the established cultivars. However, research is needed into rootstocks for use with cultivars that remain to be commercially exploited. The objective of this study was to evaluate the vegetative development and yield of 'Okitsu' satsuma mandarin plants (Citrus unshiu Marc.), as well as fruit quality, budded on nine rootstocks in the northwest of the State of Paraná, Brazil. The orchard was established at the Experimental Station of the Agronomic Institute of Paraná-IAPAR, Paranavaí, PR, in January 2001. The experimental design was randomized blocks with nine treatments, three replications, and two plants per plot. The rootstocks were Rangpur lime (Citrus limonia Osb.), Cleopatra mandarin (Citrus reshni hort. ex Tanaka), C-13 citrange [Citrus sinensis × Poncirus trifoliata (L.) Raf.], Volkamer lemon (Citrus volkameriana V. Ten. & Pasq.), Carrizo citrange [C. sinensis × P. trifoliata (L.) Raf.], Sunki mandarin (Citrus sunki hort. ex Tanaka), trifoliate orange [P. trifoliata (L.) Raf.], Swingle citrumelo [Citrus paradisi Macfad. cv. Duncan × P. trifoliata (L.) Raf.], and Caipira DAC sweet orange [C. sinensis (L.) Osb.]. The largest plant canopy for 'Okitsu' was induced by Cleopatra and the smallest by trifoliate orange, with 37.1 m3 and 9.9 m3, respectively. The highest ratio between scion and rootstock trunk diameter was observed for the plants budded on Swingle. The largest accumulated yields per plant over eight seasons were induced by Volkamer, Rangpur, Caipira DAC, Cleopatra, and Carrizo, ranging from 867.3 to 989.6 kg. These rootstocks, along with Sunki, also induced the largest fruit mass, ranging from 173.3 to 188.0 g. Trifoliate orange induced an accumulated production of 52.5% relative to Rangpur lime. Rangpur, Carrizo, trifoliate orange, and Swingle induced the largest averages for the ratio, ranging from 10.41 to 10.79. For orchard

  4. Active optical sensor assessment of spider mite damage on greenhouse beans and cotton

    Science.gov (United States)

    The two-spotted spider mite, Tetranychus urticae Koch, is an important pest of cotton in the mid-southern United States, causing yield reduction and deterioration of fiber quality. A greenhouse colony of the spider mite was used to infest cotton and pinto beans at the three-leaf and trifoliate stages, r...

  5. A Novel Imaging Technique (X-Map) to Identify Acute Ischemic Lesions Using Noncontrast Dual-Energy Computed Tomography.

    Science.gov (United States)

    Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi

    2017-01-01

    We evaluated whether X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours of onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with two X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after onset. Using these DECT scans, the X-map was reconstructed based on three-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) image and diffusion-weighted imaging (DWI). The X-map was more sensitive than a simulated standard CT in identifying the lesions as areas of lower attenuation value in all 6 patients. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in the attenuation value in the peri-infarct area within 1 day after onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions. However, the X-map cannot replace a simulated standard CT for diagnosing acute cerebral infarction. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
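
    The X-map's reconstruction itself is proprietary, but three-material decomposition from two energies is classically posed as a linear system: two attenuation measurements plus the constraint that volume fractions sum to one. A sketch with illustrative, uncalibrated coefficients:

```python
import numpy as np

# Rows: attenuation (1/cm) of three basis materials at the low and high
# energies, plus the volume-conservation row. The numbers are illustrative,
# not calibrated values for any scanner.
A = np.array([
    [0.22, 0.25, 0.60],   # low-kV attenuation of materials 1..3
    [0.18, 0.20, 0.30],   # high-kV attenuation of materials 1..3
    [1.00, 1.00, 1.00],   # volume fractions sum to one
])

def three_material_fractions(mu_low, mu_high):
    """Per-voxel material fractions from dual-energy measurements."""
    return np.linalg.solve(A, np.array([mu_low, mu_high, 1.0]))

print(three_material_fractions(0.24, 0.19))
```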

  6. Identifying Computer-Generated Portraits: The Importance of Training and Incentives.

    Science.gov (United States)

    Mader, Brandon; Banks, Martin S; Farid, Hany

    2017-09-01

    The past two decades have seen remarkable advances in photo-realistic rendering of everything from inanimate objects to landscapes, animals, and humans. We previously showed that despite these tremendous advances, human observers remain fairly good at distinguishing computer-generated from photographic images. Building on these results, we describe a series of follow-up experiments that reveal how to improve observer performance. Of general interest to anyone performing psychophysical studies on Mechanical Turk or similar platforms, we find that observer performance can be significantly improved with the proper incentives.

  7. Absorption, translocation and metabolism of polycarbamate, a dithiocarbamate fungicide, in kidney bean seedlings

    International Nuclear Information System (INIS)

    Kumagai, H.; Kiyohara, C.; Komiyama, S.; Guo, Y.; Hirose, S.; Ichikawa, Y.; Endo, J.; Ikari, H.

    1991-01-01

    Absorption, translocation and metabolism of dizinc bis(dimethyldithiocarbamate)-ethylenebis(dithiocarbamate), bisdithane, were studied in kidney bean seedlings using its ethylene-{sup 14}C-labeled [E-{sup 14}C] and dimethyl-{sup 14}C-labeled [D-{sup 14}C] compounds. Most of the radioactivity remained at the application sites when the labeled bisdithanes were applied to the surface of the first trifoliate leaves of the plants. A small amount of the radioactivity was absorbed through the treated leaves. Translocation of radioactivity from the treated leaves to other parts of the plant was very small. These results were supported by autoradiographic observations. The radioactive metabolites obtained from [E-{sup 14}C] bisdithane were identified as ethylenethiourea and ethyleneurea. Tetramethylthiuram monosulfide, tetramethylthiuram disulfide, thiazolidine-2-thione-4-carboxylic acid and 1-(dimethylthiocarbamoylthio)-β-glucoside were identified when [D-{sup 14}C] bisdithane was used. (author)

  8. Development of the regional EPR and PACS sharing system on the infrastructure of cloud computing technology controlled by patient identifier cross reference manager.

    Science.gov (United States)

    Kondoh, Hiroshi; Teramoto, Kei; Kawai, Tatsurou; Mochida, Maki; Nishimura, Motohiro

    2013-01-01

    The newly developed Oshidori-Net2 provides medical professionals with remote access to the electronic patient record systems (EPRs) and PACSs of four hospitals, each from a different vendor, using cloud computing technology and a patient identifier cross-reference manager. Operation started in April 2012. The system was applied to patients who moved between the hospitals. The objective is to show the merits and demerits of the new system.

  9. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

  10. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  11. Optimisation of multiplet identifier processing on a PLAYSTATION® 3

    Science.gov (United States)

    Hattori, Masami; Mizuno, Takashi

    2010-02-01

    To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is the core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output, remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating total computation speed more than five times. As a result, the system processed up to 2100 total microseismic events, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
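
    The record does not spell out the similarity measure; a minimal sketch of multiplet grouping by peak normalized cross-correlation (a common choice for this task, and an assumption here) could look like this in NumPy:

```python
import numpy as np

def peak_normalized_xcorr(a, b):
    """Peak of the normalized cross-correlation of two equal-length waveforms."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

def group_multiplets(waveforms, threshold=0.9):
    """Greedily group events whose waveforms correlate above `threshold`
    with the first member of an existing multiplet."""
    multiplets = []
    for i, w in enumerate(waveforms):
        for group in multiplets:
            if peak_normalized_xcorr(waveforms[group[0]], w) >= threshold:
                group.append(i)
                break
        else:
            multiplets.append([i])
    return multiplets
```

    The O(n²) pairwise correlation loop is exactly the hot spot that would benefit from offloading to the PS3™.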

  12. Lumichrome and riboflavin are two novel symbiotic signals eliciting developmental changes in both monocot and dicot plant species

    Directory of Open Access Journals (Sweden)

    Felix Dapare Dakora

    2015-09-01

    Full Text Available Lumichrome and riboflavin are novel molecules from rhizobial exudates that stimulate plant growth. Developmental changes elicited by lumichrome at very low nanomolar concentrations (5 nM) include early initiation of trifoliate leaves, expansion of unifoliate and trifoliate leaves, increased stem elongation and leaf area, and consequently greater biomass accumulation in monocots and dicots. However, a higher lumichrome concentration (50 nM) depressed root development and reduced growth of unifoliate and second trifoliate leaves. Applying either 10 nM lumichrome, 10 nM ABA, or 10 ml of infective rhizobial cells (0.2 OD600) to roots of monocots and dicots for 44 h produced identical effects, which included decreased stomatal conductance and leaf transpiration in Bambara groundnut, soybean and maize, increased stomatal conductance and transpiration in cowpea and lupin, and elevated root respiration in maize (19% by rhizobia and 20% by lumichrome). Extracellular exudation of lumichrome, riboflavin and IAA was greater in N2-fixing rhizobia than in non-fixing bacteria, indicating their role as symbiotic signals. Xylem concentration of lumichrome in cowpea and soybean was greatest in plants inoculated with infective rhizobia and treated with lumichrome (61.2 µmol lumichrome ml⁻¹ sap), followed by uninoculated plants receiving lumichrome (41.12 µmol lumichrome ml⁻¹ sap), and lowest in uninoculated, lumichrome-free plants (26.8 µmol lumichrome ml⁻¹ sap). Overall, soybean showed a greater xylem concentration of lumichrome and a correspondingly increased accumulation in leaves relative to cowpea. As a result, soybean exhibited more dramatic developmental changes than cowpea. Taken together, lumichrome and riboflavin secreted by soil rhizobia function as environmental cues for sensing stress. The fact that exogenous application of ABA to plant roots caused the same effect as lumichrome on stomatal functioning suggests molecular cross-talk in plant response to environmental

  13. Effectively identifying user profiles in network and host metrics

    Science.gov (United States)

    Murphy, John P.; Berk, Vincent H.; Gregorio-de Souza, Ian

    2010-04-01

    This work presents a collection of methods used to effectively identify users of computer systems based on their particular usage of software and the network. Not only are we able to identify individual computer users by their behavioral patterns, we are also able to detect significant deviations in their typical computer usage over time, or compared to a group of their peers. For instance, most people have a small and relatively unique selection of regularly visited websites, certain email services, daily work hours, and typical preferred applications for mandated tasks. We argue that these habitual patterns are sufficiently specific to identify fully anonymized network users. We demonstrate that with only a modest data collection capability, profiles of individual computer users can be constructed so as to uniquely identify a profiled user from among their peers. As time progresses and habits or circumstances change, the methods presented update each profile so that changes in user behavior can be reliably detected over both abrupt and gradual time frames, without losing the ability to identify the profiled user. The primary benefit of our methodology is the efficient detection of deviant behaviors, such as subverted user accounts or organizational policy violations. Thanks to their relative robustness, these techniques can be used in scenarios with very diverse data collection capabilities and data privacy requirements. In addition to behavioral change detection, the generated profiles can also be compared against pre-defined examples of known adversarial patterns.
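
    The profiling model itself is not given in this record; a minimal sketch, assuming a simple frequency-vector profile over observed items (visited domains, applications, login hours) and cosine distance as the deviation score:

```python
import numpy as np

def build_profile(events, vocabulary):
    """Habit profile: normalized frequency vector over a fixed vocabulary
    of items such as visited domains, applications, or active hours."""
    counts = np.array([events.count(item) for item in vocabulary], dtype=float)
    total = counts.sum()
    return counts / total if total else counts

def deviation(p, q):
    """Cosine distance between two profiles; values near 1 flag a drastic
    behavior change, e.g. a possibly subverted account."""
    denom = np.linalg.norm(p) * np.linalg.norm(q)
    return 1.0 - float(p @ q) / denom if denom else 1.0

# hypothetical example: one user's browsing on two different days
vocabulary = ["mail.example.com", "wiki.example.com", "news.example.com"]
monday = ["mail.example.com"] * 20 + ["wiki.example.com"] * 5
tuesday = ["news.example.com"] * 25
print(deviation(build_profile(monday, vocabulary),
                build_profile(tuesday, vocabulary)))
```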

  14. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    International Nuclear Information System (INIS)

    Park, Nam-Gyu; Kim, Kyoung-Joo; Kim, Kyoung-Hong; Suh, Jung-Min

    2013-01-01

    Highlights: ► An identification method of the optimal stiffness matrix for a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed form solution of the problem is derived. ► The method can be expanded to a system with a limited number of modes. ► Identification error due to the perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismic-induced vibration behaviors. The computational time could be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a model, based on an optimal design method, is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The method is verified by comparing test results and simulation results. This paper shows that the simplified model's dynamic behaviors are quite similar to experimental results and that the suggested method is suitable for identifying a reliable mathematical model for fuel assemblies
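
    The paper's own closed-form solution is not reproduced in this record; as a minimal sketch of the idea, here is one standard least-squares choice in NumPy, assuming the modal relation K·Phi = M·Phi·Lambda with Lambda = diag(omega_i²):

```python
import numpy as np

def identify_stiffness(M, Phi, omegas):
    """Least-squares stiffness estimate from K @ Phi = M @ Phi @ Lam.
    The Moore-Penrose pseudoinverse handles the case where only a
    limited number of measured modes is available (Phi is n x m, m < n)."""
    Lam = np.diag(np.asarray(omegas, dtype=float) ** 2)
    K = M @ Phi @ Lam @ np.linalg.pinv(Phi)
    return 0.5 * (K + K.T)   # symmetrize: a physical stiffness matrix is symmetric
```

    With as many well-conditioned modes as degrees of freedom, the pseudoinverse reduces to a plain inverse and the modal relation is satisfied exactly.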

  15. Report of the Task Force on Computer Charging.

    Science.gov (United States)

    Computer Co-ordination Group, Ottawa (Ontario).

    The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…

  16. Unusual presentation of metastatic carcinoma cervix with clinically silent primary identified by 18F-fluorodeoxyglucose positron emission tomography/computed tomography

    International Nuclear Information System (INIS)

    Senthil, Raja; Mohapatra, Ranjan Kumar; Srinivas, Shripriya; Sampath, Mouleeswaran Koramadai; Sundaraiya, Sumati

    2016-01-01

    Carcinoma cervix is the most common gynecological malignancy among Indian women. The common symptoms at presentation include abnormal vaginal bleeding, unusual discharge from the vagina, pain during coitus, and postmenopausal bleeding. Rarely, a few patients may present with distant metastases without local symptoms. We present two patients with an unusual presentation of metastatic disease without any gynecological symptoms, in whom ¹⁸F-fluorodeoxyglucose positron emission tomography/computed tomography helped identify the primary malignancy in the uterine cervix

  17. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    Science.gov (United States)

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit-finding procedures followed by extensive experimental confirmation in order to select high-priority hit series, which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits, and the organization of compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration.
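
    As one concrete illustration of the clustering step, here is a minimal sketch using RDKit (the authors do not name their toolkit, so this choice is an assumption) that groups hits into candidate series by Tanimoto distance of Morgan fingerprints with Butina clustering:

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from rdkit.ML.Cluster import Butina

smiles = ["CCO", "CCN", "c1ccccc1O", "c1ccccc1N"]   # hypothetical hit list
mols = [Chem.MolFromSmiles(s) for s in smiles]
fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]

# lower-triangle distance list in the order Butina.ClusterData expects
dists = []
for i in range(1, len(fps)):
    sims = DataStructs.BulkTanimotoSimilarity(fps[i], fps[:i])
    dists.extend(1.0 - s for s in sims)

# compounds closer than 0.35 Tanimoto distance fall into the same series
clusters = Butina.ClusterData(dists, len(fps), 0.35, isDistData=True)
print(clusters)   # tuples of molecule indices, one tuple per candidate series
```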

  18. Cloud Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  19. Cloud Computing (2/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  20. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    Directory of Open Access Journals (Sweden)

    Kumar Parijat Tripathi

    Full Text Available RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting this extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery). It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction-related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software for both NGS and array data. It helps users to characterize de-novo assembled reads obtained from NGS experiments for non-referenced organisms, and it also performs functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. Web application is

  1. Computer-based video analysis identifies infants with absence of fidgety movements.

    Science.gov (United States)

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background: Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods: Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post-term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results: Of 241 recordings from 150 infants, 48 (24.1%) were classified with absence of FMs or sporadic FMs using the GMA. The variation of the spatial center of motion (C_SD) during a recording was significantly lower in infants with normal (0.320; 95% confidence interval (CI) 0.309, 0.330) vs. absent or sporadic (0.380; 95% CI 0.361, 0.398) FMs (P<0.001). A triage model with C_SD thresholds chosen for a sensitivity of 90% and specificity of 80% gave a 40% referral rate for GMA. Conclusion: Quantitative video analysis during the FMs' period can be used to triage infants at high risk of CP to early intervention or observational GMA.

  2. Market-Oriented Cloud Computing: Vision, Hype, and Reality for Delivering IT Services as Computing Utilities

    OpenAIRE

    Buyya, Rajkumar; Yeo, Chee Shin; Venugopal, Srikumar

    2008-01-01

    This keynote paper: presents a 21st century vision of computing; identifies various computing paradigms promising to deliver the vision of computing utilities; defines Cloud computing and provides the architecture for creating market-oriented Clouds by leveraging technologies such as VMs; provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; presents...

  3. SABER: a computational method for identifying active sites for new reactions.

    Science.gov (United States)

    Nosrati, Geoffrey R; Houk, K N

    2012-05-01

    A software suite, SABER (Selection of Active/Binding sites for Enzyme Redesign), has been developed for the analysis of atomic geometries in protein structures, using a geometric hashing algorithm (Barker and Thornton, Bioinformatics 2003;19:1644-1649). SABER is used to explore the Protein Data Bank (PDB) to locate proteins with a specific 3D arrangement of catalytic groups, in order to identify active sites that might be redesigned to catalyze new reactions. As a proof-of-principle test, SABER was used to identify enzymes that have the same catalytic group arrangement present in o-succinyl benzoate synthase (OSBS). Among the highest-scoring scaffolds identified by the SABER search for enzymes with the same catalytic group arrangement as OSBS were L-Ala D/L-Glu epimerase (AEE) and muconate lactonizing enzyme II (MLE), both of which have been redesigned to become effective OSBS catalysts, as demonstrated experimentally. Next, we used SABER to search for naturally existing active sites in the PDB with catalytic groups similar to those present in the designed Kemp elimination enzyme KE07. From over 2000 geometric matches to the KE07 active site, SABER identified 23 matches that corresponded to residues from known active sites. The best of these matches, with a 0.28 Å catalytic atom RMSD to KE07, was then redesigned to be compatible with the Kemp elimination using RosettaDesign. We also used SABER to search for potential Kemp eliminases using a theozyme predicted to provide a greater rate acceleration than the active site of KE07, and used Rosetta to create a design based on the proteins identified.
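
    The 0.28 Å figure above is a catalytic-atom RMSD after optimal superposition. A minimal sketch of that scoring step via the Kabsch algorithm (a standard method, shown here as an illustration rather than SABER's geometric hashing itself):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (n, 3) arrays of matched catalytic atoms
    (e.g. Ser/His/Asp functional atoms) after optimal rigid superposition."""
    P = P - P.mean(axis=0)              # remove translation
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)   # SVD of the covariance H = P^T Q
    d = np.sign(np.linalg.det(U @ Vt))  # guard against improper rotation
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return float(np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1))))
```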

  4. Analogue particle identifier and test unit for automatic measuring of errors

    International Nuclear Information System (INIS)

    Boden, A.; Lauch, J.

    1979-04-01

    A high accuracy analogue particle identifier is described. The unit is used for particle identification or data correction of experiment-based errors in magnetic spectrometers. Signals which are proportional to the energy, the time-of-flight or the position of absorption of the particles are supplied to an analogue computation circuit (multifunction converter). Three computation functions are available for different applications. The output of the identifier produces correction signals or pulses whose amplitudes are proportional to the mass of the particles. Particle identification and data correction can be optimized by the adjustment of variable parameters. An automatic test unit has been developed for adjustment and routine checking of particle identifiers. The computation functions can be tested by this unit with an accuracy of 1%. (orig.)

  5. Influence of Cultivar on the Postharvest Hardening of Trifoliate Yam (Dioscorea dumetorum Tubers

    Directory of Open Access Journals (Sweden)

    Christian Siadjeu

    2016-01-01

    Full Text Available The influence of cultivar on the postharvest hardening of Dioscorea dumetorum tubers was assessed. 32 cultivars of D. dumetorum tubers were planted in April 2014, harvested at physiological maturity, and stored under prevailing tropical ambient conditions (19-28°C, 60-85% RH) for 0, 5, 14, 21, and 28 days. Samples were evaluated for cooked hardness. Results showed that one cultivar, Ibo sweet 3, was not affected by the hardening phenomenon. The remaining 31 were all subject to the hardening phenomenon to different degrees. Cooked hardness increased more rapidly in cultivars with many roots on the tuber surface than in cultivars with few roots on the tuber surface. When the characteristics flesh colour and number of roots on the tuber surface were considered together, cooked hardness in cultivars with yellow flesh and many roots increased more rapidly than in cultivars with white flesh and many roots, whereas cooked hardness in cultivars with yellow flesh and few roots increased more slowly than in cultivars with white flesh and few roots. Cooked hardness in accessions collected at high altitude increased more rapidly than in accessions collected at low altitude. The cultivar Ibo sweet 3 identified in this study could provide important information for breeding programs of D. dumetorum against the postharvest hardening phenomenon.

  6. Identified state-space prediction model for aero-optical wavefronts

    Science.gov (United States)

    Faghihi, Azin; Tesch, Jonathan; Gibson, Steve

    2013-07-01

    A state-space disturbance model and associated prediction filter for aero-optical wavefronts are described. The model is computed by system identification from a sequence of wavefronts measured in an airborne laboratory. Estimates of the statistics and flow velocity of the wavefront data are shown and can be computed from the matrices in the state-space model without returning to the original data. Numerical results compare velocity values and power spectra computed from the identified state-space model with those computed from the aero-optical data.
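
    The identified model in the paper is a full state-space system obtained from measured wavefront sequences; as a minimal stand-in, here is a scalar autoregressive predictor fit by least squares to the time series of one wavefront mode coefficient (an illustrative simplification, not the authors' filter):

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares AR(order) fit: x[t] ~ a[0]*x[t-1] + ... + a[p-1]*x[t-p]."""
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return a

def predict_next(x, a):
    """One-step-ahead prediction from the most recent len(a) samples."""
    return float(a @ x[-1 : -len(a) - 1 : -1])

# hypothetical usage on one wavefront-coefficient time series
coeffs = np.sin(0.3 * np.arange(500)) + 0.05 * np.random.default_rng(0).standard_normal(500)
a = fit_ar(coeffs[:400], order=10)
print(predict_next(coeffs[:400], a), coeffs[400])   # prediction vs. actual sample
```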

  7. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    Science.gov (United States)

    Beltran, Adriana; Salvador, James

    1997-01-01

    In this paper, we show how methods developed for solving graph isomorphism, a problem of theoretical computer science, are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.

  8. Advanced computational biology methods identify molecular switches for malignancy in an EGF mouse model of liver cancer.

    Directory of Open Access Journals (Sweden)

    Philip Stegmaier

    Full Text Available The molecular causes by which the epidermal growth factor receptor tyrosine kinase induces malignant transformation are largely unknown. To better understand EGF's transforming capacity, whole-genome scans were applied to a transgenic mouse model of liver cancer and subjected to advanced methods of computational analysis to construct de novo gene regulatory networks based on a combination of sequence analysis and entrained graph-topological algorithms. Here we identified transcription factors, processes, key nodes and molecules to connect as yet unknown interacting partners at the level of protein-DNA interaction. Many of these could be confirmed by electromobility band shift assay at recognition sites of gene-specific promoters and by western blotting of nuclear proteins. A novel cellular regulatory circuitry could therefore be proposed that connects cell cycle regulated genes with components of the EGF signaling pathway. Promoter analysis of differentially expressed genes suggested that the majority of regulated transcription factors display specificity to either the pre-tumor or the tumor state. A subsequent search for signal transduction key nodes upstream of the identified transcription factors and their targets suggested that the insulin-like growth factor pathway renders the tumor cells independent of EGF receptor activity. Notably, expression of IGF2, in addition to many components of this pathway, was highly upregulated in tumors. Together, we propose a switch in autocrine signaling to foster tumor growth that was initially triggered by EGF, and demonstrate the knowledge gained from promoter analysis combined with upstream key node identification.

  9. Effect of cassava starch substitution on the functional and sensory ...

    African Journals Online (AJOL)

    The starch cake was rinsed four times, dried in the oven at 40°C for 24 hrs, milled and sieved. The cassava starch was used to substitute 10, 20, 30, 40 and 50% of trifoliate yam flour. The control white yam (Dioscorea rotundata) tubers were peeled, washed and diced. The diced yam tubers were parboiled at a temperature of ...

  10. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights into future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide effectiveness analysis of the proposed countermeasures.

  11. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students' computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex relations by identifying different patterns of students' school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from......, raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy.

  12. Mobile computing initiatives within pharmacy education.

    Science.gov (United States)

    Cain, Jeff; Bird, Eleanora R; Jones, Mikael

    2008-08-15

    To identify mobile computing initiatives within pharmacy education, including how devices are obtained, supported, and utilized within the curriculum. An 18-item questionnaire was developed and delivered to academic affairs deans (or closest equivalent) of 98 colleges and schools of pharmacy. Fifty-four colleges and schools completed the questionnaire, for a 55% completion rate. Thirteen of those schools have implemented mobile computing requirements for students. Twenty schools reported they were likely to formally consider implementing a mobile computing initiative within 5 years. Numerous models of mobile computing initiatives exist in terms of device obtainment, technical support, infrastructure, and utilization within the curriculum. Responders identified flexibility in teaching and learning as the most positive aspect of the initiatives and computer-aided distraction as the most negative. Numerous factors should be taken into consideration when deciding if and how a mobile computing requirement should be implemented.

  13. Impact of new computing systems on finite element computations

    International Nuclear Information System (INIS)

    Noor, A.K.; Fulton, R.E.; Storaasi, O.O.

    1983-01-01

    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified

  14. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  15. Persistent Identifier Practice for Big Data Management at NCI

    Directory of Open Access Journals (Sweden)

    Jingbo Wang

    2017-04-01

    Full Text Available The National Computational Infrastructure (NCI) manages over 10 PB of research data, which is co-located with the high performance computer (Raijin) and an HPC-class 3000-core OpenStack cloud system (Tenjin). In support of this integrated High Performance Computing/High Performance Data (HPC/HPD) infrastructure, NCI's data management practices include building catalogues, DOI minting, data curation, data publishing, and data delivery through a variety of data services. The metadata catalogues, DOIs, THREDDS, and Vocabularies all use different Uniform Resource Locator (URL) styles. A Persistent IDentifier (PID) service provides an important utility to manage URLs in a consistent, controlled and monitored manner to support the robustness of our national 'Big Data' infrastructure. In this paper we demonstrate NCI's approach of utilising NCI's 'PID Service' to consistently manage its persistent identifiers with various applications.

  16. Exploiting intrinsic fluctuations to identify model parameters.

    Science.gov (United States)

    Zimmer, Christoph; Sahle, Sven; Pahle, Jürgen

    2015-04-01

    Parameterisation of kinetic models plays a central role in computational systems biology. Besides the lack of experimental data of high enough quality, some of the biggest challenges here are identification issues. Model parameters can be structurally non-identifiable because of functional relationships. Noise in measured data is usually considered to be a nuisance for parameter estimation. However, it turns out that intrinsic fluctuations in particle numbers can make parameters identifiable that were previously non-identifiable. The authors present a method to identify model parameters that are structurally non-identifiable in a deterministic framework. The method takes time course recordings of biochemical systems in steady state or transient state as input. Often a functional relationship between parameters presents itself as a one-dimensional manifold in parameter space containing parameter sets of optimal goodness. Although the system's behaviour cannot be distinguished on this manifold in a deterministic framework, it might be distinguishable in a stochastic modelling framework. Their method exploits this by using an objective function that includes a measure of the fluctuations in particle numbers. They show on three example models, immigration-death, gene expression and Epo-EpoReceptor interaction, that this resolves the non-identifiability even in the case of measurement noise with known amplitude. The method is applied to partially observed recordings of biochemical systems with measurement noise. It is simple to implement and usually very fast to compute. The optimisation can be realised in a classical or Bayesian fashion.
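
    To make the immigration-death intuition concrete: the deterministic steady state N* = k_in/k_out fixes only the ratio of the two rates, while the fluctuations around it decorrelate on a time scale 1/k_out, so time-resolved stochastic data separate the pair. A minimal Gillespie sketch of that model (an illustration of the idea, not the authors' code):

```python
import numpy as np

def gillespie_immigration_death(k_in, k_out, n0, t_end, seed=0):
    """Exact stochastic simulation of  0 -> X (rate k_in),  X -> 0 (rate k_out*n)."""
    rng = np.random.default_rng(seed)
    t, n, ts, ns = 0.0, n0, [0.0], [n0]
    while t < t_end:
        birth, death = k_in, k_out * n
        total = birth + death
        t += rng.exponential(1.0 / total)
        n += 1 if rng.random() < birth / total else -1
        ts.append(t)
        ns.append(n)
    return np.array(ts), np.array(ns)

# (k_in, k_out) = (10, 1) and (100, 10) share the steady state N* = 10 and are
# indistinguishable deterministically, but the second relaxes ten times faster:
for k_in, k_out in [(10.0, 1.0), (100.0, 10.0)]:
    ts, ns = gillespie_immigration_death(k_in, k_out, n0=10, t_end=50.0)
    print(k_in, k_out, ns.mean())
```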

  17. Binary Logistic Regression Analysis in Assessment and Identifying Factors That Influence Students' Academic Achievement: The Case of College of Natural and Computational Science, Wolaita Sodo University, Ethiopia

    Science.gov (United States)

    Zewude, Bereket Tessema; Ashine, Kidus Meskele

    2016-01-01

    An attempt has been made to assess and identify the major variables that influence student academic achievement at college of natural and computational science of Wolaita Sodo University in Ethiopia. Study time, peer influence, securing first choice of department, arranging study time outside class, amount of money received from family, good life…

  18. Quantum steady computation

    International Nuclear Information System (INIS)

    Castagnoli, G.

    1991-01-01

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition

  19. Quantum steady computation

    Energy Technology Data Exchange (ETDEWEB)

    Castagnoli, G. (Dipt. di Informatica, Sistemistica, Telematica, Univ. di Genova, Viale Causa 13, 16145 Genova (IT))

    1991-08-10

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

  20. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  1. [Key effect genes responding to nerve injury identified by gene ontology and computer pattern recognition].

    Science.gov (United States)

    Pan, Qian; Peng, Jin; Zhou, Xue; Yang, Hao; Zhang, Wei

    2012-07-01

    To screen important genes from the large datasets produced by gene microarrays after nerve injury, we combined the gene ontology (GO) method with computer pattern recognition technology to find key genes responding to nerve injury, and then verified one of the screened-out genes. Data mining and gene ontology analysis of the gene chip data GSE26350 were carried out with MATLAB software. Cd44 was selected from the screened-out key gene molecular spectrum by comparing the genes' different GO terms and positions on the principal component score map. Function interference was employed to disturb the normal binding of Cd44 and one of its ligands, chondroitin sulfate C (CSC), to observe neurite extension. Gene ontology analysis showed that the first genes on the score map (marked by red *) were mainly distributed among the molecular transducer activity, receptor activity, protein binding et al. molecular function GO terms. Cd44 is one of six effector protein genes, and attracted our attention with its functional diversity. After adding different reagents into the medium to interfere with the normal binding of CSC and Cd44, varying degrees of relief from CSC's inhibition of neurite extension were observed. CSC can inhibit neurite extension through binding Cd44 on the neuron membrane. This verifies that important genes in given physiological processes can be identified by gene ontology analysis of gene chip data.

  2. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways

  3. Monitoring the viability of citrus rootstocks seeds stored under refrigeration

    Directory of Open Access Journals (Sweden)

    Sérgio Alves de Carvalho

    2013-03-01

    Full Text Available The citrus nursery tree is produced through the bud grafting process, in which the rootstock is usually grown from seed germination. The objective of this research was to evaluate, in two dissimilar environmental conditions, the viability and polyembryony expression of seeds of five citrus rootstocks stored for different periods under refrigeration. The rootstock varieties evaluated were: Rangpur lime (Citrus limonia Osb. cv. Limeira), Trifoliate orange (Poncirus trifoliata Raf. cv. Limeira), Citrumelo (P. trifoliata x C. paradisi Macf. cv. Swingle), Sunki mandarin (C. sunki Hort. ex Tanaka) and Volkamer lemon (C. volkameriana Ten. & Pasq. cv. Catania 2). The experimental design was randomized blocks in an 11 x 5 x 2 factorial scheme, evaluating, from time zero to the tenth month of storage, the five rootstock varieties in two environments: a germination and growth B.O.D.-type chamber (Biological Oxygen Demand; Eletrolab Model FC 122) at 25 °C, and a greenhouse seedbed with partial temperature control (22 °C to 36 °C) and humidity control (75-85%). The plot had 24 seeds in four replicates, using trays with substrate in the greenhouse and Petri dishes with filter paper in the B.O.D. chamber. The seed germination rate and polyembryony expression were evaluated monthly. It was concluded that Trifoliate and Citrumelo Swingle seeds can be stored for up to seven months, while Volkamer lemon, Rangpur lime and Sunki seeds can be stored for up to ten months. The polyembryony expression rate was slightly higher when measured in the greenhouse than in the B.O.D. chamber and remained stable in both environments until the seventh month, after which it dropped sharply. Citrumelo Swingle seeds expressed the highest polyembryony rate (18.8%), followed by Rangpur lime and Volkamer lemon (average value of 13.7%), Sunki (9.4%) and Trifoliate (3.2%). Despite some differences among varieties, the viability of rootstock stored seeds can be monitored either in the greenhouse or in B

  4. Arbuscular Mycorrhizal Fungus Enhances Lateral Root Formation in Poncirus trifoliata (L.) as Revealed by RNA-Seq Analysis.

    Science.gov (United States)

    Chen, Weili; Li, Juan; Zhu, Honghui; Xu, Pengyang; Chen, Jiezhong; Yao, Qing

    2017-01-01

    Arbuscular mycorrhizal fungi (AMF) establish symbiosis with most terrestrial plants and greatly regulate lateral root (LR) formation. Phosphorus (P), sugar, and plant hormones have been proposed to be involved in this regulation; however, no global evidence regarding these factors is available so far, especially in woody plants. In this study, we inoculated trifoliate orange seedlings (Poncirus trifoliata L. Raf) with an AMF isolate, Rhizophagus irregularis BGC JX04B. After 4 months of growth, LR formation was characterized, and sugar contents in roots were determined. RNA-Seq analysis was performed to obtain the transcriptomes of LR tips from non-mycorrhizal and mycorrhizal seedlings. Quantitative real-time PCR (qRT-PCR) of selected genes was also conducted for validation. The results showed that AMF significantly increased LR number, as well as plant biomass and shoot P concentration. The contents of glucose and fructose in the primary root, and the sucrose content in LRs, were also increased. A total of 909 differentially expressed genes (DEGs) were identified in response to AMF inoculation, and qRT-PCR validated the transcriptomic data. The numbers of DEGs related to P, sugar, and plant hormones were 31, 32, and 25, respectively. For P metabolism, the most up-regulated DEGs mainly encoded phosphate transporters, and the most down-regulated DEGs encoded acid phosphatases. For sugar metabolism, the most up-regulated DEGs encoded polygalacturonase and chitinase. For plant hormones, the most up-regulated DEGs were related to auxin signaling, and the most down-regulated DEGs were related to ethylene signaling. PLS-SEM analysis indicates that P metabolism was the most important pathway by which AMF regulated LR formation in this study. These data reveal the changes of genome-wide gene expression in response to AMF inoculation in trifoliate orange and provide a solid basis for the future identification and characterization of key genes involved in LR formation induced by AMF.

  5. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at the LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  6. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Science.gov (United States)

    McBride, Sebastian; Huelse, Martin; Lee, Mark

    2013-01-01

    Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
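
    Requirements 2, 4, 5, and 6 in particular suggest a simple computational core. A minimal sketch of a priority map combining bottom-up saliency with a top-down excitation/inhibition ratio, thresholded saccade release, and inhibition of return (an illustration consistent with the description above, not the authors' implementation):

```python
import numpy as np

def priority_map(bottom_up, excitation, inhibition, gain=1.0):
    """Requirements 4 and 6: converge bottom-up saliency with top-down task
    relevance expressed as a ratio of excitation and inhibition."""
    relevance = excitation / (inhibition + 1e-9)
    return bottom_up * (1.0 + gain * relevance)

def next_saccade(pmap, threshold, visited, decay=0.2):
    """Requirement 5: release a saccade only if the peak priority crosses a
    threshold; requirement 2: suppress recently visited locations
    (medium-term inhibition of return)."""
    p = pmap.copy()
    for y, x in visited:
        p[y, x] *= decay
    y, x = np.unravel_index(np.argmax(p), p.shape)
    return (y, x) if p[y, x] >= threshold else None
```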

  7. Identifying Corneal Infections in Formalin-Fixed Specimens Using Next Generation Sequencing.

    Science.gov (United States)

    Li, Zhigang; Breitwieser, Florian P; Lu, Jennifer; Jun, Albert S; Asnaghi, Laura; Salzberg, Steven L; Eberhart, Charles G

    2018-01-01

    We test the ability of next-generation sequencing, combined with computational analysis, to identify a range of organisms causing infectious keratitis. This retrospective study evaluated 16 cases of infectious keratitis and four control corneas in formalin-fixed tissues from the pathology laboratory. Infectious cases were also analyzed in the microbiology laboratory using culture, polymerase chain reaction, and direct staining. Classified sequence reads were analyzed with two different metagenomics classification engines, Kraken and Centrifuge, and visualized using the Pavian software tool. Sequencing generated 20 to 46 million reads per sample. On average, 96% of the reads were classified as human, 0.3% corresponded to known vectors or contaminant sequences, 1.7% represented microbial sequences, and 2.4% could not be classified. The two computational strategies successfully identified the fungal, bacterial, and amoebal pathogens in most patients, including all four bacterial and mycobacterial cases, five of six fungal cases, three of three Acanthamoeba cases, and one of three herpetic keratitis cases. In several cases, additional potential pathogens also were identified. In one case, cytomegalovirus identified by both Kraken and Centrifuge was confirmed by direct testing, while in two cases Staphylococcus aureus or cytomegalovirus was identified by Centrifuge but not Kraken and could not be confirmed. Confirmation was not attempted for an additional three potential pathogens identified by Kraken and 11 identified by Centrifuge. Next-generation sequencing combined with computational analysis can identify a wide range of pathogens in formalin-fixed corneal specimens, with potential applications in clinical diagnostics and research.
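
    Downstream of a classifier such as Kraken, flagging candidate pathogens typically means filtering its report for non-human species with enough supporting reads. A minimal sketch, assuming the standard six-column Kraken report layout (an assumption; the authors used Pavian for this step):

```python
def candidate_pathogens(report_path, min_clade_reads=10):
    """Filter a Kraken-style report for non-human species-level hits.
    Assumes six tab-separated columns per line: percentage, clade reads,
    taxon reads, rank code, NCBI taxid, indented taxon name."""
    hits = []
    with open(report_path) as fh:
        for line in fh:
            pct, clade, _taxon, rank, _taxid, name = line.rstrip("\n").split("\t")
            name = name.strip()
            if rank == "S" and int(clade) >= min_clade_reads and name != "Homo sapiens":
                hits.append((name, int(clade), float(pct)))
    return sorted(hits, key=lambda h: -h[1])   # most-supported organisms first
```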

  8. Identifying shared genetic structure patterns among Pacific Northwest forest taxa: insights from use of visualization tools and computer simulations.

    Directory of Open Access Journals (Sweden)

    Mark P Miller

    2010-10-01

    Full Text Available Identifying causal relationships in phylogeographic and landscape genetic investigations is notoriously difficult, but can be facilitated by use of multispecies comparisons. We used data visualizations to identify common spatial patterns within single lineages of four taxa inhabiting Pacific Northwest forests (northern spotted owl: Strix occidentalis caurina; red tree vole: Arborimus longicaudus; southern torrent salamander: Rhyacotriton variegatus; and western white pine: Pinus monticola). Visualizations suggested that, despite occupying the same geographical region and habitats, species responded differently to prevailing historical processes. S. o. caurina and P. monticola demonstrated directional patterns of spatial genetic structure where genetic distances and diversity were greater in southern versus northern locales. A. longicaudus and R. variegatus displayed opposite patterns where genetic distances were greater in northern versus southern regions. Statistical analyses of directional patterns subsequently confirmed observations from visualizations. Based upon regional climatological history, we hypothesized that observed latitudinal patterns may have been produced by range expansions. Subsequent computer simulations confirmed that directional patterns can be produced by expansion events. We discuss phylogeographic hypotheses regarding historical processes that may have produced observed patterns. Inferential methods used here may become increasingly powerful as detailed simulations of organisms and historical scenarios become plausible. We further suggest that inter-specific comparisons of historical patterns take place prior to drawing conclusions regarding effects of current anthropogenic change within landscapes.

  9. Locating hardware faults in a data communications network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-01-12

    Locating hardware faults in a data communications network of a parallel computer. Such a parallel computer includes a plurality of compute nodes and a data communications network that couples the compute nodes for data communications and organizes the compute nodes as a tree. Locating hardware faults includes identifying a next compute node as a parent node and the root of a parent test tree, identifying for each child compute node of the parent node a child test tree having the child compute node as root, running the same test suite on the parent test tree and each child test tree, and identifying the parent compute node as having a defective link connected from the parent compute node to a child compute node if the test suite fails on the parent test tree and succeeds on all the child test trees.
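
    A minimal sketch of that test logic (an illustrative reading of the abstract, not the patented implementation), recursing down the tree and blaming a parent's outgoing links when the parent tree fails but every child tree passes:

```python
def locate_faults(tree, suite_passes, node):
    """Return suspected defective links as (parent, child) pairs.
    `tree` maps a node to its children; `suite_passes(n)` runs the test
    suite on the subtree rooted at n and reports success."""
    if suite_passes(node):                  # whole subtree healthy
        return []
    children = tree.get(node, [])
    if all(suite_passes(c) for c in children):
        # parent tree fails but every child tree passes:
        # the defect lies on the parent's outgoing link(s)
        return [(node, c) for c in children]
    faults = []
    for c in children:
        if not suite_passes(c):             # descend into failing subtrees
            faults.extend(locate_faults(tree, suite_passes, c))
    return faults
```

    In practice one would cache the suite results per node, since each subtree is otherwise tested more than once.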

  10. Identifying Non-Volatile Data Storage Areas: Unique Notebook Identification Information as Digital Evidence

    Directory of Open Access Journals (Sweden)

    Nikica Budimir

    2007-03-01

    Full Text Available The research reported in this paper introduces new techniques to aid in the identification of recovered notebook computers so they may be returned to the rightful owner. We identify non-volatile data storage areas as a means of facilitating the safe storing of computer identification information. A forensic proof of concept tool has been designed to test the feasibility of several storage locations identified within this work to hold the data needed to uniquely identify a computer. The tool was used to perform the creation and extraction of created information in order to allow the analysis of the non-volatile storage locations as valid storage areas capable of holding and preserving the data created within them.  While the format of the information used to identify the machine itself is important, this research only discusses the insertion, storage and ability to retain such information.

  11. E-pharmacovigilance: development and implementation of a computable knowledge base to identify adverse drug reactions.

    Science.gov (United States)

    Neubert, Antje; Dormann, Harald; Prokosch, Hans-Ulrich; Bürkle, Thomas; Rascher, Wolfgang; Sojer, Reinhold; Brune, Kay; Criegee-Rieck, Manfred

    2013-09-01

    Computer-assisted signal generation is an important issue for the prevention of adverse drug reactions (ADRs). However, due to poor standardization of patients' medical data and a lack of computable medical drug knowledge, the specificity of computerized decision support systems for early ADR detection is too low, and thus those systems are not yet implemented in daily clinical practice. We report on a method to formalize knowledge about ADRs based on the Summary of Product Characteristics (SmPCs) and to link it with structured patient data to generate safety signals automatically and with high sensitivity and specificity. A computable ADR knowledge base (ADR-KB) that inherently contains standardized concepts for ADRs (WHO-ART), drugs (ATC) and laboratory test results (LOINC) was built. The system was evaluated in study populations of paediatric and internal medicine inpatients. A total of 262 different ADR concepts related to laboratory findings were linked to 212 LOINC terms. The ADR knowledge base was retrospectively applied to a study population of 970 admissions (474 internal and 496 paediatric patients) who underwent intensive ADR surveillance. The specificity increased from 7% without the ADR-KB up to 73% in internal patients and from 19.6% up to 91% in paediatric inpatients, respectively. This study shows that contextual linkage of patients' medication data with laboratory test results is a useful and reasonable instrument for computer-assisted ADR detection and a valuable step towards a systematic drug safety process. The system enables automated detection of ADRs during clinical practice with a quality close to intensive chart review.
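
    Conceptually, the linkage reduces to lookups keyed by standardized codes. A minimal sketch with a single illustrative entry (the entry and its threshold are hypothetical, not taken from the paper's knowledge base), pairing an ATC drug code and a LOINC lab test with a WHO-ART reaction term:

```python
# Hypothetical mini knowledge base: (ATC drug code, LOINC lab code) ->
# (WHO-ART reaction term, direction, threshold in the lab's units).
ADR_KB = {
    # paracetamol (ATC N02BE01) with elevated ALT (LOINC 1742-6); illustrative limit
    ("N02BE01", "1742-6"): ("Hepatic enzymes increased", "above", 50.0),
}

def detect_signals(medications, lab_results):
    """medications: set of ATC codes; lab_results: dict LOINC -> numeric value.
    Returns automatically generated ADR safety signals for one patient."""
    signals = []
    for (atc, loinc), (term, direction, limit) in ADR_KB.items():
        if atc in medications and loinc in lab_results:
            value = lab_results[loinc]
            if (direction == "above" and value > limit) or \
               (direction == "below" and value < limit):
                signals.append({"drug": atc, "test": loinc, "adr": term, "value": value})
    return signals

print(detect_signals({"N02BE01"}, {"1742-6": 87.0}))
```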

  12. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.

  13. Trifoliate hybrids as rootstocks for Pêra sweet orange tree

    OpenAIRE

    Jorgino Pompeu Junior; Silvia Blumer

    2014-01-01

    The Rangpur lime (Citrus limonia) has been used as the main rootstock for Pêra sweet orange (C. sinensis) trees. However, its susceptibility to citrus blight and citrus sudden death has led to the use of disease-tolerant rootstocks, such as Cleopatra mandarin (C. reshni), Sunki mandarin (C. sunki) and Swingle citrumelo (C. paradisi x Poncirus trifoliata), which are more susceptible to drought than the Rangpur lime. These mandarin varieties are also less resistant to root rot caused by Phytophthor...

  14. An Introduction to Computer Forensics: Gathering Evidence in a Computing Environment

    Directory of Open Access Journals (Sweden)

    Henry B. Wolfe

    2001-01-01

    Full Text Available Business has become increasingly dependent on the Internet and computing to operate. It has become apparent that there are issues of evidence gathering in a computing environment, which by their nature are technical and different to other forms of evidence gathering, that must be addressed. This paper offers an introduction to some of the technical issues surrounding this new and specialized field of Computer Forensics. It attempts to identify and describe sources of evidence that can be found on disk data storage devices in the course of an investigation. It also considers sources of copies of email, which can be used in evidence, as well as case building.

  15. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  16. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  17. Collectively loading an application in a parallel computer

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
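
    The job-leader pattern in the claim maps naturally onto a collective broadcast. A minimal sketch with mpi4py (assumed available) follows; the file name is a hypothetical application image, and the leader rank would in practice be chosen by the control system.

        # Sketch of the "job leader" pattern: one selected rank reads the
        # application image and broadcasts it to the nodes executing the job.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        LEADER = 0  # the control system would select this rank

        if comm.Get_rank() == LEADER:
            with open("app_binary", "rb") as f:  # hypothetical application image
                image = f.read()
        else:
            image = None

        # Collective broadcast: every compute node receives the same image
        # without each node hitting the file system individually.
        image = comm.bcast(image, root=LEADER)
        print(f"rank {comm.Get_rank()} received {len(image)} bytes")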
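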

  18. CY15 Livermore Computing Focus Areas

    Energy Technology Data Exchange (ETDEWEB)

    Connell, Tom M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cupps, Kim C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); D'Hooge, Trent E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fahey, Tim J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fox, Dave M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Futral, Scott W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gary, Mark R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Goldstone, Robin J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hamilton, Pam G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Heer, Todd M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Long, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mark, Rich J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Morrone, Chris J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shoopman, Jerry D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Slavec, Joe A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, David W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Springmeyer, Becky R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stearman, Marc D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Watson, Py C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-20

    The LC team undertook a survey of primary Center drivers for CY15. Identified key drivers included enhancing user experience and productivity, pre-exascale platform preparation, process improvement, data-centric computing paradigms and business expansion. The team organized critical supporting efforts into three cross-cutting focus areas: Improving Service Quality; Monitoring, Automation, Delegation and Center Efficiency; and Next Generation Compute and Data Environments. In each area the team detailed high-level challenges and identified discrete actions to address these issues during the calendar year. Identifying the Center's primary drivers, issues, and plans is intended to serve as a lens focusing LC personnel, resources, and priorities throughout the year.

  19. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Directory of Open Access Journals (Sweden)

    Daniel Durstewitz

    2017-06-01

    Full Text Available The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover
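
    To make the model class concrete, the sketch below simulates latent PLRNN dynamics of the form z_t = A z_{t-1} + W max(0, z_{t-1}) + noise with linear-Gaussian observations. The parameters are random, purely to illustrate the generative model the EM scheme would fit, not an estimated model.

        # Tiny simulation of the piecewise-linear RNN generative model:
        #   z_t = A z_{t-1} + W max(0, z_{t-1}) + eps,   x_t = B z_t + eta
        import numpy as np

        rng = np.random.default_rng(0)
        dim_z, dim_x, T = 3, 10, 200
        A = np.diag(rng.uniform(0.6, 0.9, dim_z))      # linear "memory" part
        W = 0.1 * rng.standard_normal((dim_z, dim_z))  # piecewise-linear coupling
        B = rng.standard_normal((dim_x, dim_z))        # observation matrix

        z = np.zeros(dim_z)
        X = np.empty((T, dim_x))
        for t in range(T):
            z = A @ z + W @ np.maximum(z, 0.0) + 0.05 * rng.standard_normal(dim_z)
            X[t] = B @ z + 0.1 * rng.standard_normal(dim_x)

        print(X.shape)  # the observations one would feed to the EM estimation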

  20. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Full Text Available Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: (1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, (2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: (1) transformation of retinotopic to egocentric mappings, (2) spatial memory for the purposes of medium-term inhibition of return, (3) synchronization of 'where' and 'what' information from the two visual streams, (4) convergence of top-down and bottom-up information to a centralized point of information processing, (5) a threshold function to elicit saccade action, (6) a function to represent task relevance as a ratio of excitation and inhibition, and (7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.

  1. IDENTIFIABILITY VERSUS HETEROGENEITY IN GROUNDWATER MODELING SYSTEMS

    Directory of Open Access Journals (Sweden)

    A M BENALI

    2003-06-01

    Full Text Available Review of history matching of reservoir parameters in groundwater flow raises the problem of identifiability of aquifer systems. Lack of identifiability means that there exist parameters to which the heads are insensitive. From the guidelines of the study of the homogeneous case, we inspect the identifiability of the distributed transmissivity field of heterogeneous groundwater aquifers. These are derived from multiple realizations of a random function Y = log T whose probability distribution function is normal. We follow the identifiability of the autocorrelated block transmissivities through the measure of the sensitivity of the local derivatives DTh = ∂hi/∂Tj, computed for each sample of a population N(0; σY, αY). Results obtained from an analysis of Monte Carlo type suggest that the more a system is heterogeneous, the less it is identifiable.

  2. Performing stencil computations

    Energy Technology Data Exchange (ETDEWEB)

    Donofrio, David

    2018-01-16

    A method and apparatus for performing stencil computations efficiently are disclosed. In one embodiment, a processor receives an offset, and in response, retrieves a value from a memory via a single instruction, where the retrieving comprises: identifying, based on the offset, one of a plurality of registers of the processor; loading an address stored in the identified register; and retrieving from the memory the value at the address.
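
    The patent concerns hardware support for offset-based loads; the computation itself is the familiar neighbour-gather pattern. Below is a plain NumPy 5-point example, illustrative of the access pattern rather than of the patented register mechanism.

        # One Jacobi-style sweep of a 5-point stencil: each interior cell is
        # replaced by the average of its four neighbours.
        import numpy as np

        def five_point_stencil(u):
            """Apply one averaging sweep over the interior of a 2-D grid."""
            out = u.copy()
            out[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                      u[1:-1, :-2] + u[1:-1, 2:])
            return out

        u = np.zeros((6, 6))
        u[0, :] = 1.0  # hot top edge as a boundary condition
        print(five_point_stencil(u))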

  3. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  4. The Influence of Personal Characteristics, Interaction: (Computer/Individual), Computer Self-efficacy, Personal Innovativeness in Information Technology to Computer Anxiety in use of Mind your Own Business Accounting Software

    OpenAIRE

    Mayasari, Mega; ., Gudono

    2015-01-01

    The purpose of this study was to identify the factors that cause computer anxiety in the use of Mind Your Own Business (MYOB) accounting software, i.e., to assess whether age, gender, amount of training, ownership (usage of accounting software on a regular basis), computer self-efficacy, and personal innovativeness in information technology (IT) influence computer anxiety. The study also examined whether there is a relationship of trait anxiety and negative affect to computer self-eff...

  5. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  6. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters of scale 1000s of processors and be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and by implication to identify areas where some investment of money or effort is likely to be needed. (2) To compare and record experiences gained with such tools. (3) To produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP. (4) To identify and connect groups with similar interest within HENP and the larger clustering community

  7. ISD97, a computer program to analyze data from a series of in situ measurements on a grid and identify potential localized areas of elevated activity

    International Nuclear Information System (INIS)

    Reginatto, M.; Shebell, P.; Miller, K.M.

    1997-10-01

    A computer program, ISD97, was developed to analyze data from a series of in situ measurements on a grid and identify potential localized areas of elevated activity. The ISD97 code operates using a two-step process. A deconvolution of the data is carried out using the maximum entropy method, and a map of activity on the ground that fits the data within experimental error is generated. This maximum entropy map is then analyzed to determine the locations and magnitudes of potential areas of elevated activity that are consistent with the data. New deconvolutions are then carried out for each potential area of elevated activity identified by the code. Properties of the algorithm are demonstrated using data from actual field measurements
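
    A much simplified version of the two-step idea, maximizing entropy subject to fitting the measurements, can be written with an off-the-shelf optimizer. The response matrix, noise level and trade-off weight below are toy assumptions, not ISD97's actual formulation.

        # Simplified maximum-entropy deconvolution: find a non-negative
        # activity map that fits detector counts while maximizing entropy.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        n_cells, n_meas = 20, 8
        R = rng.uniform(0.0, 1.0, (n_meas, n_cells))  # toy detector response
        true = np.zeros(n_cells)
        true[7] = 5.0                                 # one localized hot spot
        d = R @ true + 0.05 * rng.standard_normal(n_meas)
        sigma = 0.05

        def objective(a, lam=10.0):
            a = np.maximum(a, 1e-12)                  # keep the log well-defined
            entropy = -np.sum(a * np.log(a / a.sum()))
            chi2 = np.sum(((R @ a - d) / sigma) ** 2)
            return -entropy + lam * chi2              # trade smoothness vs. fit

        res = minimize(objective, x0=np.full(n_cells, 0.5),
                       bounds=[(0.0, None)] * n_cells, method="L-BFGS-B")
        print(np.round(res.x, 2))  # mass should concentrate near cell 7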

  8. Quantum computation with Turaev-Viro codes

    International Nuclear Information System (INIS)

    Koenig, Robert; Kuperberg, Greg; Reichardt, Ben W.

    2010-01-01

    For a 3-manifold with triangulated boundary, the Turaev-Viro topological invariant can be interpreted as a quantum error-correcting code. The code has local stabilizers, identified by Levin and Wen, on a qudit lattice. Kitaev's toric code arises as a special case. The toric code corresponds to an abelian anyon model, and therefore requires out-of-code operations to obtain universal quantum computation. In contrast, for many categories, such as the Fibonacci category, the Turaev-Viro code realizes a non-abelian anyon model. A universal set of fault-tolerant operations can be implemented by deforming the code with local gates, in order to implement anyon braiding. We identify the anyons in the code space, and present schemes for initialization, computation and measurement. This provides a family of constructions for fault-tolerant quantum computation that are closely related to topological quantum computation, but for which the fault tolerance is implemented in software rather than coming from a physical medium.

  9. Quantum Internet: from Communication to Distributed Computing!

    OpenAIRE

    Caleffi, Marcello; Cacciapuoti, Angela Sara; Bianchi, Giuseppe

    2018-01-01

    In this invited paper, the authors discuss the exponential computing speed-up achievable by interconnecting quantum computers through a quantum internet. They also identify key future research challenges and open problems for quantum internet design and deployment.

  10. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model contains a generic representation that contains network relationships among the policy concepts to support inferencing based on information represented in the generic policy description

  11. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. Also, you will learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  12. Context-aware computing and self-managing systems

    CERN Document Server

    Dargie, Waltenegus

    2009-01-01

    Bringing together an extensively researched area with an emerging research issue, Context-Aware Computing and Self-Managing Systems presents the core contributions of context-aware computing in the development of self-managing systems, including devices, applications, middleware, and networks. The expert contributors reveal the usefulness of context-aware computing in developing autonomous systems that have practical application in the real world.The first chapter of the book identifies features that are common to both context-aware computing and autonomous computing. It offers a basic definit

  13. Heterogeneous compute in computer vision: OpenCL in OpenCV

    Science.gov (United States)

    Gasparakis, Harris

    2014-02-01

    We explore the relevance of Heterogeneous System Architecture (HSA) in Computer Vision, both as a long term vision, and as a near term emerging reality via the recently ratified OpenCL 2.0 Khronos standard. After a brief review of OpenCL 1.2 and 2.0, including HSA features such as Shared Virtual Memory (SVM) and platform atomics, we identify what genres of Computer Vision workloads stand to benefit by leveraging those features, and we suggest a new mental framework that replaces GPU compute with hybrid HSA APU compute. As a case in point, we discuss, in some detail, popular object recognition algorithms (part-based models), emphasizing the interplay and concurrent collaboration between the GPU and CPU. We conclude by describing how OpenCL has been incorporated in OpenCV, a popular open source computer vision library, emphasizing recent work on the Transparent API, to appear in OpenCV 3.0, which unifies the native CPU and OpenCL execution paths under a single API, allowing the same code to execute either on CPU or on a OpenCL enabled device, without even recompiling.

  14. Reducing power consumption during execution of an application on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-06-05

    Methods, apparatus, and products are disclosed for reducing power consumption during execution of an application on a plurality of compute nodes that include: executing, by each compute node, an application, the application including power consumption directives corresponding to one or more portions of the application; identifying, by each compute node, the power consumption directives included within the application during execution of the portions of the application corresponding to those identified power consumption directives; and reducing power, by each compute node, to one or more components of that compute node according to the identified power consumption directives during execution of the portions of the application corresponding to those identified power consumption directives.
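
    The directive mechanism can be sketched as annotated regions that lower component power on entry and restore it on exit. The directive names and the PowerController interface below are invented for illustration, not taken from the patent.

        # Directive-driven power reduction: the application marks regions with
        # hints, and each node lowers component power while executing them.
        from contextlib import contextmanager

        class PowerController:
            def set_cpu_freq(self, level): print(f"  cpu frequency -> {level}")
            def set_memory_power(self, level): print(f"  memory power  -> {level}")

        @contextmanager
        def power_directive(ctrl, cpu="high", memory="high"):
            ctrl.set_cpu_freq(cpu)
            ctrl.set_memory_power(memory)
            try:
                yield
            finally:  # restore defaults when leaving the annotated region
                ctrl.set_cpu_freq("high")
                ctrl.set_memory_power("high")

        ctrl = PowerController()
        print("I/O-bound phase:")
        with power_directive(ctrl, cpu="low"):     # CPU mostly idle while waiting
            pass  # ... read checkpoint files ...
        print("compute-bound phase:")
        with power_directive(ctrl, memory="low"):  # cache-resident kernel
            pass  # ... dense math kernel ...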

  15. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
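
    The grouping step is essentially a dictionary keyed on each thread's tuple of call-site addresses, so threads with identical stacks collapse into one group and outliers stand out. A minimal sketch with made-up addresses standing in for gathered instruction pointers:

        # Group threads by their call-address lists to spot defective ones.
        from collections import defaultdict

        def group_threads(callstacks):
            """Map each distinct call-address tuple to the threads showing it."""
            groups = defaultdict(list)
            for thread_id, stack in callstacks.items():
                groups[tuple(stack)].append(thread_id)
            return groups

        callstacks = {
            0: [0x4004f0, 0x400b12], 1: [0x4004f0, 0x400b12],
            2: [0x4004f0, 0x400b12], 3: [0x4004f0, 0x400c99],  # the odd one out
        }
        for stack, threads in group_threads(callstacks).items():
            print([hex(a) for a in stack], "->", threads)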

  16. Identifying Broadband Rotational Spectra with Neural Networks

    Science.gov (United States)

    Zaleski, Daniel P.; Prozument, Kirill

    2017-06-01

    A typical broadband rotational spectrum may contain several thousand observable transitions, spanning many species. Identifying the individual spectra, particularly when the dynamic range reaches 1,000:1 or even 10,000:1, can be challenging. One approach is to apply automated fitting routines. In this approach, combinations of 3 transitions can be created to form a "triple", which allows fitting of the A, B, and C rotational constants in a Watson-type Hamiltonian. On a standard desktop computer, with a target molecule of interest, a typical AUTOFIT routine takes 2-12 hours depending on the spectral density. A new approach is to utilize machine learning to train a computer to recognize the patterns (frequency spacing and relative intensities) inherent in rotational spectra and to identify the individual spectra in a raw broadband rotational spectrum. Here, recurrent neural networks have been trained to identify different types of rotational spectra and classify them accordingly. Furthermore, early results in applying convolutional neural networks for spectral object recognition in broadband rotational spectra appear promising. Perez et al. "Broadband Fourier transform rotational spectroscopy for structure determination: The water heptamer." Chem. Phys. Lett., 2013, 571, 1-15. Seifert et al. "AUTOFIT, an Automated Fitting Tool for Broadband Rotational Spectra, and Applications to 1-Hexanal." J. Mol. Spectrosc., 2015, 312, 13-21. Bishop. "Neural networks for pattern recognition." Oxford University Press, 1995.

  17. Nuclide identifier and grat data reader application for ORIGEN output file

    International Nuclear Information System (INIS)

    Arif Isnaeni

    2011-01-01

    ORIGEN is a one-group depletion and radioactive decay computer code developed at the Oak Ridge National Laboratory (ORNL). ORIGEN performs one-group neutronics calculations providing various nuclear material characteristics (the buildup, decay and processing of radioactive materials). The ORIGEN output is a text-based file that contains only numbers in the form of grouped nuclide data: nuclide identifier and grat. This application was created to facilitate the collection of nuclide identifier and grat data; it also has functions to acquire mass number data and to calculate the mass (grams) of each nuclide. Output from this application can be used as input data for computer codes for neutronic calculations, such as MCNP. (author)

  18. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  19. Structural parameter identifiability analysis for dynamic reaction networks

    DEFF Research Database (Denmark)

    Davidescu, Florin Paul; Jørgensen, Sten Bay

    2008-01-01

    This contribution addresses the structural parameter identifiability problem for the typical case of reaction network models, where for a given set of measured variables it is desirable to investigate which parameters may be estimated prior to spending computational effort on the actual estimation. The proposed analysis is performed in two phases. The first phase determines the structurally identifiable reaction rates based on reaction network stoichiometry. The second phase assesses the structural parameter identifiability of the specific kinetic rate expressions using a generating series expansion method based on Lie derivatives. The proposed systematic two-phase methodology is illustrated on a mass action based model for an enzymatically catalyzed reaction pathway network where only a limited set of variables is measured. The methodology clearly pinpoints the structurally identifiable parameters.

  20. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  1. Security Problems in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rola Motawie

    2016-12-01

    Full Text Available Cloud is a pool of computing resources which are distributed among cloud users. Cloud computing has many benefits, such as scalability, flexibility, cost savings, reliability, maintenance and mobile accessibility. Since cloud-computing technology is growing day by day, it comes with many security problems. Securing the data in the cloud environment is one of the most critical challenges and acts as a barrier when implementing the cloud. The many new concepts that the cloud introduces, such as resource sharing, multi-tenancy, and outsourcing, create new challenges for the security community. In this work, we provide a comparative study of cloud computing privacy and security concerns. We identify and classify known security threats, cloud vulnerabilities, and attacks.

  2. Method for Statically Checking an Object-oriented Computer Program Module

    Science.gov (United States)

    Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)

    2012-01-01

    A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.
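
    Although the patent describes a static check, the permission discipline itself can be illustrated at run time: each reference to a shared object carries a permission, and mutating calls through an insufficient permission are rejected. The permission names and rules below are simplified inventions for illustration only.

        # Toy illustration of a permission discipline on object references.
        class PermissionViolation(Exception):
            pass

        class Ref:
            """A reference to a shared object annotated with a permission."""
            def __init__(self, obj, perm):  # perm: "full" may mutate, "pure" may not
                self.obj, self.perm = obj, perm

            def call(self, method):
                # Simplified rule: state-changing methods need full permission.
                if method.startswith("set_") and self.perm != "full":
                    raise PermissionViolation(f"'{method}' needs full permission")
                print(f"ok: {method} via {self.perm} reference")

        account = object()  # the shared object with multiple clients
        writer, reader = Ref(account, "full"), Ref(account, "pure")
        writer.call("set_balance")      # allowed: full permission
        try:
            reader.call("set_balance")  # rejected: pure reference cannot mutate
        except PermissionViolation as e:
            print("violation:", e)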

  3. Identifying trace evidence in data wiping application software

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2012-06-01

    Full Text Available One area of particular concern for computer forensics examiners involves situations in which someone utilized software applications to destroy evidence. There are products available in the marketplace that are relatively inexpensive and advertised as being able to destroy targeted portions of data stored within a computer system. This study was undertaken to identify these tools and analyze them to determine the extent to which each of the evaluated data wiping applications perform their tasks and to identify trace evidence, if any, left behind on disk media after executing these applications. We evaluated five Windows 7 compatible software products whose advertised features include the ability for users to wipe targeted files, folders, or evidence of selected activities. We conducted a series of experiments that involved executing each application on systems with identical data, and we then analyzed the results and compared the before and after images for each application. We identified information for each application that is beneficial to forensics examiners when faced with similar situations. This paper describes our application selection process, our application evaluation methodology, and our findings. Following this, we describe limitations of this study and suggest areas of additional research that will benefit the study of digital forensics.

  4. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  5. Sparse Linear Identifiable Multivariate Modeling

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2011-01-01

    In this paper we consider sparse and identifiable linear latent variable (factor) and linear Bayesian network models for parsimonious analysis of multivariate data. We propose a computationally efficient method for joint parameter and model inference, and model comparison. It consists of a fully... and bench-marked on artificial and real biological data sets. SLIM is closest in spirit to LiNGAM (Shimizu et al., 2006), but differs substantially in inference, Bayesian network structure learning and model comparison. Experimentally, SLIM performs equally well or better than LiNGAM with comparable...

  6. Identifying a few foot-and-mouth disease virus signature nucleotide strings for computational genotyping

    Directory of Open Access Journals (Sweden)

    Xu Lizhe

    2008-06-01

    Full Text Available Abstract Background Serotypes of the Foot-and-Mouth disease viruses (FMDVs) were generally determined by biological experiments. Computational genotyping is not well studied even with the availability of whole viral genomes, due to uneven evolution among genes as well as frequent genetic recombination. Naively using sequence comparison for genotyping achieves only limited success. Results We used 129 FMDV strains with known serotype as training strains and selected as many as 140 most serotype-specific nucleotide strings. We then constructed a linear-kernel Support Vector Machine classifier using these 140 strings. Under the leave-one-out cross validation scheme, this classifier was able to assign the correct serotype to 127 of these 129 strains, achieving 98.45% accuracy. It also assigned serotypes correctly to an independent test set of 83 other FMDV strains downloaded separately from NCBI GenBank. Conclusion Computational genotyping is much faster and much cheaper than wet-lab based biological experiments, given the availability of the detailed molecular sequences. The high accuracy of our proposed method suggests the potential of utilizing a few signature nucleotide strings instead of whole genomes to determine the serotypes of novel FMDV strains.
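
    In the same spirit, character n-gram counts feeding a linear SVM with leave-one-out evaluation can be assembled from scikit-learn parts. The sequences and serotype labels below are toy stand-ins for the FMDV genomes, not real data.

        # Serotype classification from substring features: character n-gram
        # counts feed a linear SVM, scored leave-one-out as in the paper.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        seqs = ["ACGTACGGTT", "ACGTACGCTT", "TTGCAAGGTC", "TTGCATGGTC",
                "ACGAACGGTT", "TTGCAAGGCC"]
        labels = ["O", "O", "A", "A", "O", "A"]  # hypothetical serotypes

        clf = make_pipeline(
            CountVectorizer(analyzer="char", ngram_range=(3, 5)),  # nucleotide strings
            LinearSVC())
        scores = cross_val_score(clf, seqs, labels, cv=LeaveOneOut())
        print(f"leave-one-out accuracy: {scores.mean():.2f}")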

  7. Applications of X-ray Computed Tomography and Emission Computed Tomography

    International Nuclear Information System (INIS)

    Seletchi, Emilia Dana; Sutac, Victor

    2005-01-01

    Computed Tomography is a non-destructive imaging method that allows visualization of internal features within non-transparent objects such as sedimentary rocks. Filtering techniques have been applied to circumvent the artifacts and achieve high-quality images for quantitative analysis. High-resolution X-ray computed tomography (HRXCT) can be used to identify the position of the growth axis in speleothems by detecting subtle changes in calcite density between growth bands. HRXCT imagery reveals the three-dimensional variability of coral banding providing information on coral growth and climate over the past several centuries. The Nuclear Medicine imaging technique uses a radioactive tracer, several radiation detectors, and sophisticated computer technologies to understand the biochemical basis of normal and abnormal functions within the brain. The goal of Emission Computed Tomography (ECT) is to accurately determine the three-dimensional radioactivity distribution resulting from the radiopharmaceutical uptake inside the patient instead of the attenuation coefficient distribution from different tissues as obtained from X-ray Computer Tomography. ECT is a very useful tool for investigating the cognitive functions. Because of the low radiation doses associated with Positron Emission Tomography (PET), this technique has been applied in clinical research, allowing the direct study of human neurological diseases. (authors)

  8. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  9. Evaluation of valvular heart diseases with computed tomography

    International Nuclear Information System (INIS)

    Tomoda, Haruo; Hoshiai, Mitsumoto; Matsuyama, Seiya

    1982-01-01

    Forty-two patients with valvular heart diseases were studied with a third-generation computed tomographic system. The cardiac chambers (the atria and ventricles) were evaluated semiquantitatively, and valvular calcification was easily detected with computed tomography. Computed tomography was most valuable in revealing left atrial thrombi which were not identified by other diagnostic procedures in some cases. (author)

  10. Studi Perbandingan Layanan Cloud Computing [A Comparative Study of Cloud Computing Services]

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability and reducing computing cost, and it creates opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, and identifies issues and future directions in cloud computing. End-users' comprehension of the cloud computing delivery-service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  11. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Full Text Available Existing approaches to the accounting regulation of software as one of the types of intangible assets have been considered. The features and current state of the legal protection of computer programs have been analyzed. The reasons for the need to use patent law as a means of legal protection of individual elements of computer programs have been discovered. The influence of the legal aspects of the use of computer programs on their reflection in accounting under national legislation has been analyzed. The possible options for the transfer of rights from the copyright owners of computer programs, which should be considered when creating a software accounting system at an enterprise, have been analyzed. The characteristics of computer software as an intangible asset under current law have been identified and analyzed. The general economic characteristics of computer programs as one of the types of intangible assets have been grounded. The main distinguishing features of software compared to other types of intellectual property have been outlined.

  12. Development of computational fluid dynamics--habitat suitability (CFD-HSI) models to identify potential passage--Challenge zones for migratory fishes in the Penobscot River

    Science.gov (United States)

    Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael

    2012-01-01

    A two-dimensional computational fluid dynamics-habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream from two existing dams and as a result of the proposed future removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.

  13. Adiabatic graph-state quantum computation

    International Nuclear Information System (INIS)

    Antonio, B; Anders, J; Markham, D

    2014-01-01

    Measurement-based quantum computation (MBQC) and holonomic quantum computation (HQC) are two very different computational methods. The computation in MBQC is driven by adaptive measurements executed in a particular order on a large entangled state. In contrast in HQC the system starts in the ground subspace of a Hamiltonian which is slowly changed such that a transformation occurs within the subspace. Following the approach of Bacon and Flammia, we show that any MBQC on a graph state with generalized flow (gflow) can be converted into an adiabatically driven holonomic computation, which we call adiabatic graph-state quantum computation (AGQC). We then investigate how properties of AGQC relate to the properties of MBQC, such as computational depth. We identify a trade-off that can be made between the number of adiabatic steps in AGQC and the norm of H-dot as well as the degree of H, in analogy to the trade-off between the number of measurements and classical post-processing seen in MBQC. Finally the effects of performing AGQC with orderings that differ from standard MBQC are investigated. (paper)

  14. Computer virus information update CIAC-2301

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's virus database.

  15. ATLAS@Home: Harnessing Volunteer Computing for HEP

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2015-01-01

    The ATLAS collaboration has set up a volunteer computing project called ATLAS@home. Volunteers running Monte-Carlo simulation on their personal computers provide significant computing resources, but also belong to a community potentially interested in HEP. Four types of contributors have been identified, whose questions range from advanced technical details to the reasons why simulation is needed, how computing is organized and how it relates to society. The creation of relevant outreach material for simulation, event visualization and distributed production will be described, as well as lessons learned while interacting with the BOINC volunteer community.

  16. Computer/Information Science

    Science.gov (United States)

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  17. Animal-Computer Interaction (ACI) : An analysis, a perspective, and guidelines

    NARCIS (Netherlands)

    van den Broek, E.L.

    2016-01-01

    The founding elements of Animal-Computer Interaction (ACI) are discussed in relation to its overarching discipline, Human-Computer Interaction (HCI). Its basic dimensions are identified (agent, computing machinery, and interaction), together with their levels of processing: perceptual, cognitive, and affective.

  18. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the computing hardware that influences the choice of programming language and of the methods of analysis and compilation. In the first part, the author proposes attempts at a formalization of Chomsky grammar languages. In the second part, he studies analytical grammars, and then a compiler or analytic grammar for the Fortran language.

  19. A comparison of approaches for finding minimum identifying codes on graphs

    Science.gov (United States)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard and the computational complexity makes this research approach difficult using a standard brute force approach on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored and consist of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and lastly using satisfiability modulo theory (SMT) and corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
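
    For the small base cases mentioned above, a brute-force search is enough: a vertex set C is an identifying code if every closed neighbourhood intersected with C is non-empty and distinct across vertices. A minimal sketch, checked here on a 6-cycle (the larger instances are exactly where the parallel, quantum-annealing and SMT approaches come in):

        # Brute-force search for a minimum identifying code on a small graph.
        from itertools import combinations

        def closed_neighborhoods(adj):
            return {v: frozenset(nbrs) | {v} for v, nbrs in adj.items()}

        def min_identifying_code(adj):
            N = closed_neighborhoods(adj)
            verts = list(adj)
            for k in range(1, len(verts) + 1):
                for cand in combinations(verts, k):
                    c = frozenset(cand)
                    sigs = [N[v] & c for v in verts]  # each vertex's signature
                    if all(sigs) and len(set(sigs)) == len(verts):
                        return c
            return None  # e.g. twin vertices make a graph non-identifiable

        cycle6 = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}  # 6-cycle
        print(sorted(min_identifying_code(cycle6)))  # -> [0, 2, 4]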

  20. Multiscale Computation. Needs and Opportunities for BER Science

    Energy Technology Data Exchange (ETDEWEB)

    Scheibe, Timothy D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Jeremy C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-01

    The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: (1) identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and (2) identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.

  1. Comparative study of primary-metabolism compounds in duckweed (Lemna minor L.), trisulca duckweed (Lemna trisulca L.) and spirodela (Spirodela polyrrhiza (L.) Schleid.)

    Directory of Open Access Journals (Sweden)

    L. A. Nikiforov

    2017-01-01

    Full Text Available The purpose of the paper is to study the qualitative composition and quantitative content of primary-metabolism compounds in duckweed (Lemna minor L.), trisulca duckweed (Lemna trisulca L.) and spirodela (Spirodela polyrrhiza (L.) Schleid.). Materials and methods. The subjects of the study were air-dried samples of the herbs collected during the 2010-2011 growing seasons in low-flow and stagnant water bodies of the Kozhevnikovsky and Tomsk districts of the Tomsk region. The concentration of free monosaccharides was determined by direct-phase high-performance liquid chromatography. The concentration of bound sugars was determined by capillary electrophoresis using an Applied Biosystem 273T instrument (Thermo Fisher, USA). Data on the qualitative composition and quantitative content of amino acids were obtained with a Hitachi 835 amino acid analyzer (Japan). Results. The smallest amount of amino acids was found in the water extract of trisulca duckweed (96.14 mg), about half of that in the extracts of Lemna minor and spirodela (205.65 and 208.38 mg, respectively). Duckweed showed the minimum content of free and bound monosaccharides, 10.54%, while in trisulca duckweed and spirodela their content was 14.30% and 15.35%, respectively. This study showed qualitative and quantitative differences in the free and bound monosaccharide and amino acid composition among these species.

  2. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  3. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  4. Selection of personalized patient therapy through the use of knowledge-based computational models that identify tumor-driving signal transduction pathways.

    Science.gov (United States)

    Verhaegh, Wim; van Ooijen, Henk; Inda, Márcia A; Hatzis, Pantelis; Versteeg, Rogier; Smid, Marcel; Martens, John; Foekens, John; van de Wiel, Paul; Clevers, Hans; van de Stolpe, Anja

    2014-06-01

    Increasing knowledge about signal transduction pathways as drivers of cancer growth has elicited the development of "targeted drugs," which inhibit aberrant signaling pathways. They require a companion diagnostic test that identifies the tumor-driving pathway; however, currently available tests like estrogen receptor (ER) protein expression for hormonal treatment of breast cancer do not reliably predict therapy response, at least in part because they do not adequately assess functional pathway activity. We describe a novel approach to predict signaling pathway activity based on knowledge-based Bayesian computational models, which interpret quantitative transcriptome data as the functional output of an active signaling pathway, by using expression levels of transcriptional target genes. Following calibration on only a small number of cell lines or cohorts of patient data, they provide a reliable assessment of signaling pathway activity in tumors of different tissue origin. As proof of principle, models for the canonical Wnt and ER pathways are presented, including initial clinical validation on independent datasets from various cancer types. ©2014 American Association for Cancer Research.
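
    The gist of inferring pathway activity from transcriptional target genes can be caricatured as summing per-gene evidence, a naive-Bayes-style simplification of the paper's Bayesian network models. The target genes and log-odds weights below are invented for illustration.

        # Toy pathway-activity score: treat high expression of known target
        # genes as evidence that the upstream pathway is active.
        import math

        # Hypothetical target genes with invented log-odds weights.
        TARGETS = {"AXIN2": 1.2, "LGR5": 1.0, "MYC": 0.6, "CCND1": 0.4}

        def pathway_activity(expression, threshold=1.0):
            """Sum evidence over target genes; expression maps gene -> fold change."""
            score = 0.0
            for gene, weight in TARGETS.items():
                x = expression.get(gene, 1.0)
                score += weight if x > threshold else -weight
            return 1.0 / (1.0 + math.exp(-score))  # squash to a 0..1 score

        tumor = {"AXIN2": 3.2, "LGR5": 2.5, "MYC": 1.8, "CCND1": 0.7}
        print(f"pathway activity score: {pathway_activity(tumor):.2f}")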

  5. Academic Training Lecture Regular Programme: Cloud Computing

    CERN Multimedia

    2012-01-01

    Cloud Computing (1/2), by Belmiro Rodrigues Moreira (LIP Laboratorio de Instrumentacao e Fisica Experimental de Particulas).   Wednesday, May 30, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN (500-1-001 - Main Auditorium). Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  6. Machine Learning Classification to Identify the Stage of Brain-Computer Interface Therapy for Stroke Rehabilitation Using Functional Connectivity

    Directory of Open Access Journals (Sweden)

    Rosaleena Mohanty

    2018-05-01

    Full Text Available Interventional therapy using brain-computer interface (BCI) technology has shown promise in facilitating motor recovery in stroke survivors; however, the impact of this form of intervention on functional networks outside of the motor network specifically is not well understood. Here, we investigated resting-state functional connectivity (rs-FC) in stroke participants undergoing BCI therapy across stages, namely pre- and post-intervention, to identify discriminative functional changes using a machine learning classifier, with the goal of categorizing participants into one of the two therapy stages. Twenty chronic stroke participants with persistent upper-extremity motor impairment received neuromodulatory training using a closed-loop neurofeedback BCI device, and resting-state functional MRI (rs-fMRI) scans were collected at four time points: pre-, mid-, post-, and 1 month post-therapy. To evaluate the peak effects of this intervention, rs-FC was analyzed from two specific stages, namely pre- and post-therapy. In total, 236 seeds spanning both motor and non-motor regions of the brain were computed at each stage. A univariate feature selection was applied to reduce the number of features, followed by a principal component-based data transformation used by a linear binary support vector machine (SVM) classifier to classify each participant into a therapy stage. The SVM classifier achieved a cross-validation accuracy of 92.5% using a leave-one-out method. Outside of the motor network, seeds from the fronto-parietal task control, default mode, subcortical, and visual networks emerged as important contributors to the classification. Furthermore, a higher number of functional changes were observed to be strengthening from the pre- to post-therapy stage than weakening, both of which involved motor and non-motor regions of the brain. These findings may provide new evidence to support the potential clinical utility of BCI therapy as a form of stroke rehabilitation.
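
    The classification pipeline described (univariate feature selection, a principal-component transformation, and a linear SVM evaluated with leave-one-out cross-validation) can be sketched with scikit-learn; the data below are synthetic stand-ins, not the study's rs-FC measurements.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for seed-based rs-FC features:
# 40 scans (20 participants x pre/post), 236 seed features.
X = rng.normal(size=(40, 236))
y = np.repeat([0, 1], 20)               # 0 = pre-therapy, 1 = post-therapy
X[y == 1, :10] += 0.8                   # plant a weak "therapy effect"

clf = Pipeline([
    ("select", SelectKBest(f_classif, k=20)),   # univariate feature selection
    ("pca", PCA(n_components=10)),              # principal-component transform
    ("svm", SVC(kernel="linear")),              # linear binary SVM
])

acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.3f}")
```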

  7. IDENTIFYING THE DETERMINANTS OF CLOUD COMPUTING ADOPTION IN A GOVERNMENT SECTOR – A CASE STUDY OF SAUDI ORGANISATION

    OpenAIRE

    Alsanea, Majed; Wainwright, David

    2014-01-01

    The adoption of Cloud Computing technology is an essential step forward within both the public and private sectors, particularly in the context of the current economic crisis. However, the trend is struggling for many reasons. The purpose of this study is to establish the foundations for the development of a framework to guide government organisations through the process of transferring to Cloud Computing technology. The main aim of this research is to evaluate the factors affecting the adopt...

  8. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  9. Systematic procedure for identifying the five main ossification stages of the medial clavicular epiphysis using computed tomography: a practical proposal for forensic age diagnostics.

    Science.gov (United States)

    Wittschieber, Daniel; Schulz, Ronald; Pfeiffer, Heidi; Schmeling, Andreas; Schmidt, Sven

    2017-01-01

    In forensic age estimations of living individuals, computed tomography of the clavicle is widely used for determining the age of majority. To this end, the degree of ossification of the medial clavicular epiphysis can be determined by means of two classification systems complementing each other: a 5-stage system and an additional 6-stage system that further sub-classifies the stages 2 and 3. In recent years, practical experience and new data revealed that difficulties and even wrong stage determinations may occur especially when following the short descriptions of the fundamental 5-stage system only. Based on current literature, this article provides a systematic procedure for identifying the five main ossification stages by listing important preconditions and presenting an algorithm that is comprised of four specific questions. Each question is accompanied by comprehensive and detailed descriptions which specify the criteria used for differentiation. The information is subdivided into "single-slice view" and "multi-slice view." In addition, illustrative case examples and schematic drawings facilitate application of the procedure in forensic practice. The pitfalls associated with the criteria of stage determination will be discussed in detail. Eventually, two general rules will be inferred to assign correct ossification stages of the medial clavicular epiphysis by means of computed tomography.

  10. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out, and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real-time.
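
    As a hedged illustration of the evolutionary-computation half of such a framework (the paper's actual process model and algorithm are not reproduced here), a (1+1) evolution strategy can tune a controller gain for a toy first-order process:

```python
import numpy as np

rng = np.random.default_rng(1)

def control_cost(k: float) -> float:
    """Simulate a toy first-order process x' = -x + k*(setpoint - x)
    and return the integrated squared tracking error."""
    x, dt, cost = 0.0, 0.01, 0.0
    for _ in range(500):
        err = 1.0 - x          # setpoint = 1.0
        x += dt * (-x + k * err)
        cost += dt * err ** 2
    return cost

# (1+1) evolution strategy: mutate the gain, keep the better candidate.
k, sigma = 0.5, 0.5
for _ in range(100):
    cand = k + sigma * rng.normal()
    if control_cost(cand) < control_cost(k):
        k = cand
print(f"evolved gain: {k:.2f}, cost: {control_cost(k):.4f}")
```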

  11. Identifying the Key Weaknesses in Network Security at Colleges.

    Science.gov (United States)

    Olsen, Florence

    2000-01-01

    A new study identifies and ranks the 10 security gaps responsible for most outsider attacks on college computer networks. The list is intended to help campus system administrators establish priorities as they work to increase security. One network security expert urges that institutions utilize multiple security layers. (DB)

  12. Operating System Concepts for Reconfigurable Computing: Review and Survey

    Directory of Open Access Journals (Sweden)

    Marcel Eckert

    2016-01-01

    Full Text Available One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way to use reconfigurable computing systems for users who are unfamiliar with the underlying concepts. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of ideas and key concepts for including reconfigurable computing aspects in operating systems. The article also presents an overview of published and available operating systems targeting the area of reconfigurable computing. The purpose of this article is to identify and summarize common patterns among those systems that can be seen as de facto standards. Furthermore, open problems, not covered by these already available systems, are identified.

  13. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    Full Text Available The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism in response to ZKV. Here, we develop the rationale and strategy for a new approach to developing CTL vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection, designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DNV (Dengue) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open source Protein Data File (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.
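
    A sketch of the candidate-generation step, with a deliberately simplistic stand-in scoring function (a real pipeline would call a trained MHC binding predictor, and the sequence below is illustrative, not the actual ZKV polyprotein):

```python
# Slide a 9-mer window over a viral protein sequence and keep peptides whose
# predicted binding score clears a threshold. The score used here is a
# hypothetical stand-in based on hydrophobic residue content.
HYDROPHOBIC = set("AVILMFWYC")

def mock_binding_score(peptide: str) -> float:
    """Hypothetical score: fraction of hydrophobic anchor-like residues."""
    return sum(aa in HYDROPHOBIC for aa in peptide) / len(peptide)

def candidate_epitopes(protein: str, k: int = 9, threshold: float = 0.6):
    for i in range(len(protein) - k + 1):
        pep = protein[i:i + k]
        score = mock_binding_score(pep)
        if score >= threshold:
            yield i, pep, score

toy_sequence = "MKNPKKKSGGFRIVNMLKRGVARVSPFGGLKRLPAGLLLGHGPIRMVLAILAFLRFTAIKPS"
for pos, pep, s in candidate_epitopes(toy_sequence):
    print(pos, pep, round(s, 2))
```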

  14. Computer security threats faced by small businesses in Australia

    OpenAIRE

    Hutchings, Alice

    2012-01-01

    In this paper, an overview is provided of computer security threats faced by small businesses. Having identified the threats, the implications for small business owners are described, along with countermeasures that can be adopted to prevent incidents from occurring. The results of the Australian Business Assessment of Computer User Security (ABACUS) survey, commissioned by the Australian Institute of Criminology (AIC), are drawn upon to identify key risks (Challice 2009; Richards 2009). Addi...

  15. The benefit of enterprise ontology in identifying business components

    OpenAIRE

    Albani, Antonia

    2006-01-01

    The benefit of enterprise ontology in identifying business components / A. Albani, J. Dietz. - In: Artificial intelligence in theory and practice : IFIP 19th World Computer Congress ; TC 12: IFIP AI 2006 Stream, August 21-24, 2006, Santiago, Chile / ed. by Max Bramer. - New York : Springer, 2006. - S. 1-12. - (IFIP ; 217)

  16. Opportunities for discovery: Theory and computation in Basic Energy Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, Bruce; Kirby, Kate; McCurdy, C. William

    2005-01-11

    New scientific frontiers, recent advances in theory, and rapid increases in computational capabilities have created compelling opportunities for theory and computation to advance the scientific mission of the Office of Basic Energy Sciences (BES). The prospects for success in the experimental programs of BES will be enhanced by pursuing these opportunities. This report makes the case for an expanded research program in theory and computation in BES. The Subcommittee on Theory and Computation of the Basic Energy Sciences Advisory Committee was charged with identifying current and emerging challenges and opportunities for theoretical research within the scientific mission of BES, paying particular attention to how computing will be employed to enable that research. A primary purpose of the Subcommittee was to identify those investments that are necessary to ensure that theoretical research will have maximum impact in the areas of importance to BES, and to assure that BES researchers will be able to exploit the entire spectrum of computational tools, including leadership class computing facilities. The Subcommittee's Findings and Recommendations are presented in Section VII of this report.

  17. The importance of trust in computer security

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2014-01-01

    The computer security community has traditionally regarded security as a "hard" property that can be modelled and formally proven under certain simplifying assumptions. Traditional security technologies assume that computer users are either malicious, e.g. hackers or spies, or benevolent, competent… and well informed about the security policies. Over the past two decades, however, computing has proliferated into all aspects of modern society and the spread of malicious software (malware) like worms, viruses and botnets have become an increasing threat. This development indicates a failure in some… of the fundamental assumptions that underpin existing computer security technologies and that a new view of computer security is long overdue. In this paper, we examine traditional models, policies and mechanisms of computer security in order to identify areas where the fundamental assumptions may fail. In particular…

  18. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    Science.gov (United States)

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In the image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the grade of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods of different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients, and will gain importance in the diagnostics and therapy of the future. From a methodical point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or

  19. Computational Physics' Greatest Hits

    Science.gov (United States)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  20. Identifiability of PBPK Models with Applications to ...

    Science.gov (United States)

    Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology. We consider statistical analy
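
    For the linear time-invariant case mentioned above, a common structural-identifiability check is whether the map from parameters to the system's Markov parameters is locally injective. A sketch with sympy, using a generic two-compartment model rather than the paper's DMA(V) model:

```python
import sympy as sp

# Two-compartment PBPK-like model (a sketch, not the paper's model):
# x1' = -(k10 + k12)*x1 + k21*x2 + u,   x2' = k12*x1 - k21*x2,   y = x1/V
k10, k12, k21, V = sp.symbols("k10 k12 k21 V", positive=True)
theta = [k10, k12, k21, V]

A = sp.Matrix([[-(k10 + k12), k21], [k12, -k21]])
B = sp.Matrix([1, 0])
C = sp.Matrix([[1 / V, 0]])

# Stack the Markov parameters C B, C A B, C A^2 B, ...; for an LTI system
# the parameters are locally structurally identifiable iff the Jacobian of
# this map with respect to theta has full rank.
markov = sp.Matrix([(C * A**k * B)[0, 0] for k in range(4)])
J = markov.jacobian(theta)
print("rank:", J.rank(), "of", len(theta))  # full rank => locally identifiable
```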

  1. Computational Investigations in Rectangular Convergent and Divergent Ribbed Channels

    Science.gov (United States)

    Sivakumar, Karthikeyan; Kulasekharan, N.; Natarajan, E.

    2018-05-01

    Computational investigations of rib-turbulated flow inside convergent and divergent rectangular channels were carried out with square ribs of different heights at different Reynolds numbers (Re = 20,000, 40,000 and 60,000). The ribs were arranged in a staggered fashion between the upper and lower surfaces of the test section. The investigations were performed using the computational fluid dynamics software ANSYS Fluent 14.0. Suitable solver settings, such as turbulence models, were identified from the literature, and the boundary conditions were applied to simulations on a grid-independent solution. Computations were carried out for both convergent and divergent channels with 0 (smooth duct), 1.5, 3, 6, 9 and 12 mm rib heights to identify the ribbed channel with optimal performance, assessed using a thermo-hydraulic performance parameter. The convergent and divergent rectangular channels show higher Nu values than the standard correlation values.
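
    The ranking criterion can be illustrated as follows; this assumes the common definition of the thermo-hydraulic performance parameter with Dittus-Boelter and Blasius smooth-duct baselines, which may differ from the paper's exact formulation, and the input values are hypothetical:

```python
# eta = (Nu/Nu0) / (f/f0)**(1/3), comparing heat-transfer enhancement against
# the pressure-drop penalty at equal pumping power.
def nu0_dittus_boelter(re: float, pr: float = 0.71) -> float:
    """Smooth-duct Nusselt number baseline (Dittus-Boelter, air)."""
    return 0.023 * re**0.8 * pr**0.4

def f0_blasius(re: float) -> float:
    """Smooth-duct friction factor baseline (Blasius)."""
    return 0.316 * re**-0.25

def thermo_hydraulic_performance(nu: float, f: float, re: float) -> float:
    return (nu / nu0_dittus_boelter(re)) / (f / f0_blasius(re)) ** (1 / 3)

# Hypothetical CFD results for one rib height at Re = 40,000:
print(thermo_hydraulic_performance(nu=160.0, f=0.085, re=40_000))
```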

  2. Cloud Computing for Standard ERP Systems

    DEFF Research Database (Denmark)

    Schubert, Petra; Adisa, Femi

    Cloud Computing is a topic that has gained momentum in the last years. Current studies show that an increasing number of companies is evaluating the promised advantages and considering making use of cloud services. In this paper we investigate the phenomenon of cloud computing and its importance for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda.

  3. Computer Modeling of Platinum Reforming Reactors | Momoh ...

    African Journals Online (AJOL)

    This paper, instead of using a theoretical approach, has considered a computer model as a means of assessing the reformate composition for three-stage fixed bed reactors in a platforming unit. This is done by identifying many possible hydrocarbon transformation reactions that are peculiar to the process unit, identify the ...

  4. Can eye tracking boost usability evaluation of computer games?

    DEFF Research Database (Denmark)

    Johansen, Sune Alstrup; Noergaard, Mie; Soerensen, Janus Rau

    2008-01-01

    Good computer games need to be challenging while at the same time being easy to use. Accordingly, besides struggling with well known challenges for usability work, such as persuasiveness, the computer game industry also faces system-specific challenges, such as identifying methods that can provide...... data on players' attention during a game. This position paper discusses how eye tracking may address three core challenges faced by computer game producer IO Interactive in their on-going work to ensure games that are fun, usable, and challenging. These challenges are: (1) Persuading game designers...... about the relevance of usability results, (2) involving game designers in usability work, and (3) identifying methods that provide new data about user behaviour and experience....

  5. The Computer Backgrounds of Soldiers in Army Units: FY01

    National Research Council Canada - National Science Library

    Singh, Harnam

    2002-01-01

    A multi-year research effort was instituted in FY99 to examine soldiers' experiences with computers, self-perceptions of their computer skill, and their ability to identify frequently used, Windows-based icons...

  6. Fibonacci’s Computation Methods vs Modern Algorithms

    Directory of Open Access Journals (Sweden)

    Ernesto Burattini

    2013-12-01

    Full Text Available In this paper we discuss some computational procedures given by Leonardo Pisano Fibonacci in his famous Liber Abaci book, and we propose their translation into a modern language for computers (C++). Among others, we describe the method of "cross" multiplication, we evaluate its computational complexity in algorithmic terms, and we show the output of a C++ code that describes the development of the method applied to the product of two integers. In a similar way we show the operations performed on fractions introduced by Fibonacci. Thanks to the possibility of reproducing Fibonacci's different computational procedures on a computer, it was possible to identify some calculation errors present in the different versions of the original text.
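
    A sketch of the "cross" multiplication method in a modern language (Python here rather than the paper's C++): digit pairs whose place values sum to the same column are accumulated together, then carries are resolved column by column.

```python
def cross_multiply(a: int, b: int) -> int:
    """Digit-by-digit 'cross' multiplication in the spirit of Liber Abaci:
    units multiply units; units and tens 'cross' to form the tens column;
    and so on, with carries resolved as each column is summed."""
    xs = [int(d) for d in str(a)][::-1]   # least-significant digit first
    ys = [int(d) for d in str(b)][::-1]
    cols = [0] * (len(xs) + len(ys))
    for i, x in enumerate(xs):
        for j, y in enumerate(ys):
            cols[i + j] += x * y          # all pairs with the same i+j cross
    carry, digits = 0, []
    for c in cols:
        carry, d = divmod(c + carry, 10)
        digits.append(d)
    while carry:
        carry, d = divmod(carry, 10)
        digits.append(d)
    while len(digits) > 1 and digits[-1] == 0:
        digits.pop()
    return int("".join(map(str, digits[::-1])))

assert cross_multiply(607, 734) == 607 * 734  # example values
```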

  7. Computational methods in metabolic engineering for strain design.

    Science.gov (United States)

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as to suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.
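
    The gene-deletion style of strategy search can be illustrated with a toy flux-balance model: scan single-reaction knockouts and recompute the maximum achievable objective flux. The network below is invented for illustration; real strain-design tools operate on genome-scale models.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, B; cols: reactions).
# v1: -> A (uptake)   v2: A -> B   v3: B -> biomass   v4: A -> byproduct
S = np.array([
    [1, -1,  0, -1],   # metabolite A balance
    [0,  1, -1,  0],   # metabolite B balance
])
bounds = [(0, 10), (0, 10), (0, 10), (0, 10)]
BIOMASS = 2            # index of the biomass reaction (v3)

def max_biomass(knockout=None) -> float:
    b = list(bounds)
    if knockout is not None:
        b[knockout] = (0, 0)          # gene deletion: force flux to zero
    # linprog minimizes, so minimize -v_biomass subject to S v = 0.
    c = np.zeros(S.shape[1])
    c[BIOMASS] = -1
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=b)
    return -res.fun if res.success else 0.0

print("wild type:", max_biomass())
for r in range(S.shape[1]):
    print(f"knockout v{r + 1}:", max_biomass(knockout=r))
```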

  8. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  9. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work concerning the development of a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of software was developed for the identification and analysis of logistics costs in agricultural enterprises.

  10. (Some) Computer Futures: Mainframes.

    Science.gov (United States)

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  11. Proposal for a security management in cloud computing for health care.

    Science.gov (United States)

    Haufe, Knut; Dzombeta, Srdan; Brandis, Knud

    2014-01-01

    Cloud computing is currently one of the most popular themes of information systems research. Considering the nature of the information processed, health care organizations especially need to assess and treat the specific risks of cloud computing in their information security management system. Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing will be identified, considering the main risks regarding cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and establish and operate them at an appropriate level of maturity considering limited resources.

  12. Proposal for a Security Management in Cloud Computing for Health Care

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2014-01-01

    Full Text Available Cloud computing is currently one of the most popular themes of information systems research. Considering the nature of the information processed, health care organizations especially need to assess and treat the specific risks of cloud computing in their information security management system. Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing will be identified, considering the main risks regarding cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and establish and operate them at an appropriate level of maturity considering limited resources.

  13. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  14. Computer Breakdown as a Stress Factor during Task Completion under Time Pressure: Identifying Gender Differences Based on Skin Conductance

    Directory of Open Access Journals (Sweden)

    René Riedl

    2013-01-01

    Full Text Available In today’s society, as computers, the Internet, and mobile phones pervade almost every corner of life, the impact of Information and Communication Technologies (ICT on humans is dramatic. The use of ICT, however, may also have a negative side. Human interaction with technology may lead to notable stress perceptions, a phenomenon referred to as technostress. An investigation of the literature reveals that computer users’ gender has largely been ignored in technostress research, treating users as “gender-neutral.” To close this significant research gap, we conducted a laboratory experiment in which we investigated users’ physiological reaction to the malfunctioning of technology. Based on theories which explain that men, in contrast to women, are more sensitive to “achievement stress,” we predicted that male users would exhibit higher levels of stress than women in cases of system breakdown during the execution of a human-computer interaction task under time pressure, if compared to a breakdown situation without time pressure. Using skin conductance as a stress indicator, the hypothesis was confirmed. Thus, this study shows that user gender is crucial to better understanding the influence of stress factors such as computer malfunctions on physiological stress reactions.

  15. Foliar absorption of phosphorus by common bean

    International Nuclear Information System (INIS)

    Boaretto, A.E.; Rosa, J.P.P.

    1984-01-01

    The effect of urea and/or sucrose on P uptake from H3PO4 and monoammonium phosphate by bean leaves is studied. A solution containing 0.145% P with a specific activity of 10 μCi/ml is sprayed early in the morning or late in the afternoon. Besides the treatment without urea and sucrose, these substances are added at two concentrations: 0.66% N + sucrose, and 1.32% N + sucrose. Twenty-four hours after application, 52% of the applied P is absorbed by the bean trifoliate leaf. (M.A.C.) [pt

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  17. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering that are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  18. ASCR Workshop on Quantum Computing for Science

    Energy Technology Data Exchange (ETDEWEB)

    Aspuru-Guzik, Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Van Dam, Wim [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Farhi, Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gaitan, Frank [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Humble, Travis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Landahl, Andrew J [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lucas, Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Preskill, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Svore, Krysta [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wiebe, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Williams, Carl [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  19. Cloud Computing Principles and Paradigms

    CERN Document Server

    Buyya, Rajkumar; Goscinski, Andrzej M

    2010-01-01

    The primary purpose of this book is to capture the state-of-the-art in Cloud Computing technologies and applications. The book will also aim to identify potential research directions and technologies that will facilitate the creation of a global market-place of cloud computing services supporting scientific, industrial, business, and consumer applications. We expect the book to serve as a reference for a larger audience such as systems architects, practitioners, developers, new researchers and graduate level students. This area of research is relatively recent, and as such has no existing reference book.

  20. Computational structural mechanics for engine structures

    Science.gov (United States)

    Chamis, C. C.

    1989-01-01

    The computational structural mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects of formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance, durability, and life of engine structures. It is structured mainly to supplement, complement, and, whenever possible, replace costly experimental efforts which are unavoidable during engineering research and development programs. Specific objectives include: investigating the unique advantages of parallel and multiprocessor computing for reformulating and solving structural mechanics problems and for formulating and solving multidisciplinary mechanics problems; and developing integrated structural system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods. Herein the CSM program is summarized with emphasis on the Engine Structures Computational Simulator (ESCS). Typical results obtained using ESCS are described to illustrate its versatility.

  1. Structural Identifiability of Dynamic Systems Biology Models.

    Science.gov (United States)

    Villaverde, Alejandro F; Barreiro, Antonio; Papachristodoulou, Antonis

    2016-10-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.
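
    The core computation can be sketched in a few lines of sympy: augment the state with the parameters (which have zero dynamics), take successive Lie derivatives of the output along the dynamics, and test the rank of their Jacobian. The logistic-growth model below is a stand-in example, not one of the paper's case studies.

```python
import sympy as sp

# Logistic growth with unknown parameters r and K, measured output y = x.
x, r, K = sp.symbols("x r K", positive=True)
states = [x]
params = [r, K]
f = [r * x * (1 - x / K)]      # state dynamics
h = x                          # output

# STRIKE-GOLDD-style augmentation: parameters become states with zero
# dynamics; structural identifiability is then observability of the
# augmented system, tested via the rank of the Lie-derivative Jacobian.
z = states + params
fz = f + [sp.Integer(0)] * len(params)

lies = [h]
for _ in range(len(z) - 1):
    prev = lies[-1]
    lies.append(sum(sp.diff(prev, zi) * fi for zi, fi in zip(z, fz)))

O = sp.Matrix(lies).jacobian(z)          # observability-identifiability matrix
print("rank:", O.rank(), "of", len(z))   # full rank => structurally identifiable
```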

  2. Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E

    2017-01-01

    Molecular dynamics (MD) simulations have evolved to an increasingly reliable and accessible technique and are today implemented in many areas of biomedical sciences. We present a generally applicable method to study dehydration of hydrates based on MD simulations and apply this approach...... to the dehydration of ampicillin trihydrate. The crystallographic unit cell of the trihydrate is used to construct the simulation cell containing 216 ampicillin and 648 water molecules. This system is dehydrated by removing water molecules during a 2200 ps simulation, and depending on the computational dehydration....... The structural changes could be followed in real time, and in addition, an intermediate amorphous phase was identified. The computationally identified dehydrated structure (anhydrate) was slightly different from the experimentally known anhydrate structure suggesting that the simulated computational structure...

  3. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  4. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics-based on finitely correlated or projected entangled pair states-to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.
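
    The elementary step of such measurement-based schemes, one-bit teleportation, can be simulated directly: entangling the input with |+> by a controlled-Z, measuring the first qubit in the X basis, and applying the outcome-dependent byproduct correction leaves the second qubit in H|psi>. A small numpy sketch (not the paper's general framework):

```python
import numpy as np

rng = np.random.default_rng(42)

psi = np.array([0.6, 0.8j])                    # arbitrary normalized input
plus = np.array([1, 1]) / np.sqrt(2)
CZ = np.diag([1, 1, 1, -1])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])

state = CZ @ np.kron(psi, plus)                # entangle the two qubits

# Measure qubit 1 in the X basis {|+>, |->}; randomness is later compensated.
basis = [np.array([1, 1]) / np.sqrt(2), np.array([1, -1]) / np.sqrt(2)]
probs = [np.linalg.norm(np.kron(b.conj(), np.eye(2)) @ state) ** 2 for b in basis]
m = rng.choice(2, p=probs)
out = np.kron(basis[m].conj(), np.eye(2)) @ state
out /= np.linalg.norm(out)

corrected = np.linalg.matrix_power(X, m) @ out # byproduct correction X^m
expected = H @ psi
print(np.allclose(np.abs(corrected @ expected.conj()), 1.0))  # True (up to phase)
```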

  5. Controlling Laboratory Processes From A Personal Computer

    Science.gov (United States)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to identify user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.
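
    A sketch of the command-dispatch idea in Python (the original used FORTRAN 77 and Pascal; the commands and device routines below are hypothetical): user-defined commands are looked up in a generated table and mapped to device-driving routines.

```python
# Hypothetical device-driving routines standing in for user-written drivers.
def open_valve():
    print("valve opened")

def read_temperature():
    print("temperature: 23.4 C")

# In the original system a Pascal program generated FORTRAN subroutines to
# bind commands to routines; here a dictionary plays the same role.
DISPATCH = {
    "OPEN VALVE": open_valve,
    "READ TEMPERATURE": read_temperature,
}

def run_command(line: str) -> None:
    """Look up a natural-language command and invoke its device routine."""
    action = DISPATCH.get(line.strip().upper())
    if action is None:
        print(f"unknown command: {line!r}")
    else:
        action()

run_command("open valve")
run_command("read temperature")
```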

  6. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

    Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification; Comp…
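
    The flavor of posterior computation in a partially identified model can be sketched with importance sampling: a prevalence observed through an imperfect test, where only the apparent prevalence is identified by the data and truncated priors on sensitivity and specificity carry the rest of the information. The numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
y, n = 300, 1000                      # observed test positives out of n

M = 200_000
pi = rng.uniform(0, 1, M)             # prevalence (partially identified)
s = rng.uniform(0.80, 0.99, M)        # truncated prior: sensitivity
c = rng.uniform(0.85, 0.99, M)        # truncated prior: specificity
p = pi * s + (1 - pi) * (1 - c)       # apparent prevalence (identified)

# Importance weights proportional to the binomial likelihood of y.
logw = y * np.log(p) + (n - y) * np.log1p(-p)
w = np.exp(logw - logw.max())
w /= w.sum()

mean = np.sum(w * pi)
idx = rng.choice(M, size=10_000, p=w)  # weighted resample for an interval
lo, hi = np.percentile(pi[idx], [2.5, 97.5])
print(f"posterior mean of pi: {mean:.3f}, 95% interval: ({lo:.3f}, {hi:.3f})")
```

    Note that the interval stays wide no matter how large n grows; that residual width is the signature of partial identification that the book analyses.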

  7. Radiological difficulty in identifying unicompartmental knee replacement dislocation

    Directory of Open Access Journals (Sweden)

    Mr Oruaro Adebayo Onibere, MBBS, MRCS

    2017-09-01

    Full Text Available Unicondylar knee replacement is a relatively common elective orthopedic procedure but is not often seen in the Emergency Department setting, so familiarity with its normal clinical and radiological appearances is difficult to gain. Dislocation of the mobile bearing component ("spacer") is a known complication of unicondylar knee replacements, and these patients will initially present to the Accident and Emergency Department. In this setting, an accurate and prompt diagnosis is necessary to appropriately manage the patient's condition. Identifying dislocated mobile bearings on plain radiographs is normally a radiological challenge, and these patients may need further imaging, such as a computed tomography scan, to identify the dislocated mobile bearing.

  8. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  9. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. The Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  10. SitesIdentify: a protein functional site prediction tool

    Directory of Open Access Journals (Sweden)

    Doig Andrew J

    2009-11-01

    Full Text Available Abstract Background The rate at which protein structures are being deposited in the Protein Data Bank surpasses the capacity to characterise them experimentally, and therefore computational methods to analyse these structures have become increasingly important. Identifying the region of the protein most likely to be involved in function is useful in order to gain information about its potential role. There are many available approaches to predict functional sites, but many are not made available via a publicly-accessible application. Results Here we present a functional site prediction tool (SitesIdentify), based on combining sequence conservation information with geometry-based cleft identification, that is freely available via a web-server. We have shown that SitesIdentify compares favourably to other functional site prediction tools in a comparison of seven methods on a non-redundant set of 237 enzymes with annotated active sites. Conclusion SitesIdentify produces accuracy in predicting functional sites comparable to its closest available counterpart, but in addition achieves improved accuracy for proteins with few characterised homologues. SitesIdentify is available via a webserver at http://www.manchester.ac.uk/bioinformatics/sitesidentify/
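
    The sequence-conservation component of such a predictor can be sketched as per-column entropy scoring over a multiple sequence alignment; this is a generic illustration, not SitesIdentify's actual scoring scheme, and the geometric cleft-identification half would come from the 3D structure.

```python
import math
from collections import Counter

def conservation_scores(alignment: list[str]) -> list[float]:
    """Per-column conservation from a multiple sequence alignment, scored as
    1 - normalized Shannon entropy (1.0 = fully conserved column)."""
    n_cols = len(alignment[0])
    scores = []
    for i in range(n_cols):
        column = [seq[i] for seq in alignment if seq[i] != "-"]
        counts = Counter(column)
        total = len(column)
        entropy = -sum((k / total) * math.log2(k / total)
                       for k in counts.values())
        scores.append(1 - entropy / math.log2(20))  # 20-letter alphabet
    return scores

aln = ["MKTAYIAK", "MKTGYIAR", "MKSAYLAK"]  # toy alignment
print([round(s, 2) for s in conservation_scores(aln)])
```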

  11. Computer self-efficacy - is there a gender gap in tertiary level introductory computing classes?

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available This paper explores the relationship between introductory computing students, self-efficacy, and gender. Since the use of computers has become more common, there has been speculation that the confidence and ability to use them differ between genders. Self-efficacy is an important and useful concept used to describe how a student may perceive their own ability or confidence in using and learning new technology. A survey of students in an introductory computing class has been completed intermittently since the late 1990s. Although some questions have been adapted to meet the changing technology, the aim of the survey has remained unchanged. In this study self-efficacy is measured using two self-rating questions: students are asked to rate their confidence using a computer and to give their perception of their computing knowledge. This paper examines these two aspects of a person's computer self-efficacy in order to identify any differences that may occur between genders in two introductory computing classes, one in 1999 and the other in 2012. Results from the 1999 survey are compared with those from the survey completed in 2012 and investigated to ascertain whether the perception that males were more likely to display higher computer self-efficacy levels than their female classmates does or did exist in a class of this type. Results indicate that while overall there has been a general increase in self-efficacy levels in 2012 compared with 1999, there is no significant gender gap.

  12. Cloud Computing Adoption in Organisations: Review of Empirical Literature

    Directory of Open Access Journals (Sweden)

    Hassan Haslinda

    2017-01-01

    Full Text Available This study reviews the literature on cloud computing adoption in organisations to identify its influential factors and their operationalisation in prior literature. We classify the factors that influence cloud computing adoption using the three contexts suggested by the Technology-Organisation-Environment (TOE) framework, namely technology, organisation, and environment. The findings suggest that the influences of these factors vary across studies and that most studies have operationalised cloud computing adoption using intention to adopt cloud computing or a binary variable, rather than actual use of the technology.

  13. Distributed Persistent Identifiers System Design

    Directory of Open Access Journals (Sweden)

    Pavel Golodoniuc

    2017-06-01

    Full Text Available The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementation, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have, by and large, catered for identifier uniqueness, integrity, and persistence, regardless of the identifier's application domain. Trustworthiness of these systems has been measured by the criteria first defined by Bütikofer (2009) and further elaborated by Golodoniuc et al. (2016) and Car et al. (2017). Since many PID systems have been largely conceived and developed by a single organisation, they have faced challenges for widespread adoption and, most importantly, for the ability to survive changes of technology. We believe that a cause of once-successful PID systems fading away is the centralisation of support infrastructure – both organisational and computing and data storage systems. In this paper, we propose a PID system design that implements the pillars of a trustworthy system – ensuring identifiers' independence of any particular technology or organisation, implementation of core PID system functions, separation from data delivery, and enabling the system to adapt to future change. We propose decentralisation at all levels — persistent identifier and information object registration, resolution, and data delivery — using Distributed Hash Tables and traditional peer-to-peer networks with information replication and caching mechanisms, thus eliminating the need for a central PID data store. This will increase overall system fault tolerance, thus ensuring its trustworthiness. We also discuss important aspects of the distributed system's governance, such as the notion of the authoritative source and data integrity
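
    A minimal sketch of the DHT-style resolution layer, assuming a consistent-hash ring with replication (the node names and identifier are hypothetical; a production system would add the governance and integrity mechanisms discussed above):

```python
import hashlib
from bisect import bisect

def h(key: str) -> int:
    """Position a key on the hash ring via SHA-256."""
    return int(hashlib.sha256(key.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes: list[str], replicas: int = 3):
        self.replicas = replicas  # store each PID on several nodes
        # Virtual nodes smooth the distribution of arcs on the ring.
        self.ring = sorted((h(f"{n}#{v}"), n) for n in nodes for v in range(50))

    def nodes_for(self, pid: str) -> list[str]:
        """Walk clockwise from the PID's hash, collecting distinct nodes."""
        i, found = bisect(self.ring, (h(pid), "")), []
        while len(found) < self.replicas:
            node = self.ring[i % len(self.ring)][1]
            if node not in found:
                found.append(node)
            i += 1
        return found

ring = HashRing(["node-a.example.org", "node-b.example.org", "node-c.example.org"])
print(ring.nodes_for("ark:/12345/abc987"))  # hypothetical identifier
```

    Because each identifier's placement depends only on its hash, nodes can join or leave with minimal re-assignment, which is what removes the need for a central registration and resolution authority.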

  14. Availability and Use of Telecommunication/Computer Information ...

    African Journals Online (AJOL)

    the study area, identify Information and Communication Technologies (ICTs) in telecommunication/computer .... recommended that extension should forge new links and create network for sharing knowledge and experience .... Food Security.

  15. Computer Vision System For Locating And Identifying Defects In Hardwood Lumber

    Science.gov (United States)

    Conners, Richard W.; Ng, Chong T.; Cho, Tai-Hoon; McMillin, Charles W.

    1989-03-01

    This paper describes research aimed at developing an automatic cutup system for use in the rough mills of the hardwood furniture and fixture industry. In particular, this paper describes attempts to create the vision system that will power this automatic cutup system. There are a number of factors that make the development of such a vision system a challenge. First, there is the innate variability of the wood material itself. No two species look exactly the same; in fact, appearance can differ significantly among species. Yet a truly robust vision system must be able to handle a variety of such species, preferably with no operator intervention required when changing from one species to another. Secondly, there is a good deal of variability in the definition of what constitutes a removable defect. The hardwood furniture and fixture industry is diverse in the nature of the products that it makes. The products range from hardwood flooring to fancy hardwood furniture, from simple mill work to kitchen cabinets. Thus, depending on the manufacturer, the product, and the quality of the product, the nature of what constitutes a removable defect can and does vary. The vision system must be such that it can be tailored to meet each of these unique needs, preferably without any additional program modifications. This paper will describe the vision system that has been developed. It will assess the current system capabilities, and it will discuss the directions for future research. It will be argued that artificial intelligence methods provide a natural mechanism for attacking this computer vision application.

  16. Automated computation of arbor densities: a step toward identifying neuronal cell types

    Directory of Open Access Journals (Sweden)

    Uygar eSümbül

    2014-11-01

    Full Text Available The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.
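
    As a hedged illustration of the arbor-density computation sketched above (not the authors' pipeline): bin the voxels of a reconstructed arbor along a registered global depth coordinate and normalise the histogram into a density profile. The array names, shapes, and the choice of 100 bins are assumptions made for this example.

```python
import numpy as np

def arbor_density_profile(arbor_mask: np.ndarray, depth: np.ndarray, n_bins: int = 100):
    """Distribution of arbor volume along a global depth coordinate.

    arbor_mask : 3-D boolean array, True where the reconstructed arbor is present.
    depth      : 3-D float array, registered depth of each voxel (e.g., relative
                 position within the inner plexiform layer, 0..1).
    Returns bin centers and a profile that sums to 1.
    """
    d = depth[arbor_mask]                        # depth of every arbor voxel
    hist, edges = np.histogram(d, bins=n_bins, range=(0.0, 1.0))
    profile = hist / hist.sum()                  # normalise to a density
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, profile

# Illustrative usage with a synthetic stack (assumed shapes, not real data).
mask = np.random.rand(64, 64, 64) > 0.995
depth = np.linspace(0, 1, 64)[None, None, :] * np.ones((64, 64, 64))
centers, profile = arbor_density_profile(mask, depth)
```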

  17. Office ergonomics: deficiencies in computer workstation design.

    Science.gov (United States)

    Shikdar, Ashraf A; Al-Kindi, Mahmoud A

    2007-01-01

    The objective of this research was to study and identify ergonomic deficiencies in computer workstation design in typical offices. Physical measurements and a questionnaire were used to study 40 workstations. Major ergonomic deficiencies were found in physical design and layout of the workstations, employee postures, work practices, and training. The consequences in terms of user health and other problems were significant. Forty-five percent of the employees used nonadjustable chairs, 48% of computers faced windows, 90% of the employees used computers more than 4 hrs/day, 45% of the employees adopted bent and unsupported back postures, and 20% used office tables for computers. Major problems reported were eyestrain (58%), shoulder pain (45%), back pain (43%), arm pain (35%), wrist pain (30%), and neck pain (30%). These results indicated serious ergonomic deficiencies in office computer workstation design, layout, and usage. Strategies to reduce or eliminate ergonomic deficiencies in computer workstation design were suggested.

  18. 14 CFR 417.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space. § 417.123 Computing systems and software. (a) A launch operator must document a system safety process that identifies the... (b) A launch operator must identify all safety-critical functions associated with...

  19. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and

  20. Learning to Identify Local Flora with Human Feedback (Author’s Manuscript)

    Science.gov (United States)

    2014-06-23

    cally tag images with species names of flora or fauna to support content-based retrieval [10]. Detecting and identifying species could help to infer... (Stefan Lee and David Crandall, School of Informatics and Computing, Indiana University) ...applications that use consumer photos to track the distribution of natural phenomena [8]. But flora identification is a very difficult problem, both

  1. From Computer Forensics to Forensic Computing: Investigators Investigate, Scientists Associate

    OpenAIRE

    Dewald, Andreas; Freiling, Felix C.

    2014-01-01

    This paper draws a comparison of fundamental theories in traditional forensic science and the state of the art in current computer forensics, thereby identifying a certain disproportion between the perception of central aspects in common theory and the digital forensics reality. We propose a separation of what is currently demanded of practitioners in digital forensics into a rigorous scientific part on the one hand, and a more general methodology of searching and seizing digital evidence an...

  2. Computational mechanics research at ONR

    International Nuclear Information System (INIS)

    Kushner, A.S.

    1986-01-01

    Computational mechanics is not an identified program at the Office of Naval Research (ONR), but rather plays a key role in the Solid Mechanics, Fluid Mechanics, Energy Conversion, and Materials Science programs. The basic philosophy of the Mechanics Division at ONR is to support fundamental research which expands the basis for understanding, predicting, and controlling the behavior of solid and fluid materials and systems at the physical and geometric scales appropriate to the phenomena of interest. It is shown in this paper that a strong commonality of computational mechanics drivers exists for the forefront research areas in both solid and fluid mechanics

  3. CLOUD COMPUTING SECURITY ISSUES

    Directory of Open Access Journals (Sweden)

    Florin OGIGAU-NEAMTIU

    2012-01-01

    Full Text Available The term “cloud computing” has been in the spotlight of IT specialists in recent years because of its potential to transform this industry. The promised benefits have led companies to invest great sums of money in researching and developing this domain, and great steps have been made towards implementing this technology. Managers have traditionally viewed IT as difficult and expensive, and the promise of cloud computing leads many to think that IT will now be easy and cheap. The reality is that cloud computing has simplified some technical aspects of building computer systems, but the myriad challenges facing the IT environment still remain. Organizations which consider adopting cloud-based services must also understand the many major problems of information policy, including issues of privacy, security, reliability, access, and regulation. The goal of this article is to identify the main security issues and to draw the attention of both decision makers and users to the potential risks of moving data into “the cloud”.

  4. Soft X-ray radio-sensitivities of pollens in several fruit species

    International Nuclear Information System (INIS)

    Hu Chungen; Deng Xiuxin

    1996-01-01

    Irradiated with different dosages of soft X-rays, pollen germination of Prunus baimang, pear Kieffer, trifoliate orange, and pummelo was investigated immediately after irradiation or several days later. The results revealed that the pollens of these fruit trees had different sensitivities to soft X-rays and various responses to storage duration. Therefore, even for the same kind of pollen, irradiation with different optimal exposure doses, as well as pollination at different times during storage, should be adopted according to the different aims and methods of breeding programs. (author)

  5. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of a land consolidation process. The Universal Soil Loss Equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. L and S factors are usually combined into one LS factor – the topographic factor. The single factors are determined from several sources, such as a DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such a computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable precision degradation; too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source’s precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying was done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. In such cases, we have proposed a new method using a two-step computation. The first step uses a bigger cell size and is designed to identify higher erosion spots. The second step then uses a smaller cell size but performs the computation only on the area identified in the previous step. This decomposition allows a
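
    To make the grid-based formulation concrete, here is a minimal NumPy sketch (assumed array names and illustrative factor values, not the authors' implementation) that evaluates G = R·K·LS·C·P cell by cell and flags high-erosion cells for the fine-resolution second pass described above.

```python
import numpy as np

# Each USLE factor is a separate grid layer of identical shape (illustrative values).
shape = (200, 200)                        # e.g., a 200 x 200 grid of 50 m cells
R  = np.full(shape, 45.0)                 # rainfall erosivity
K  = np.random.uniform(0.2, 0.5, shape)   # soil erodibility
LS = np.random.uniform(0.1, 4.0, shape)   # combined topographic factor
C  = np.full(shape, 0.12)                 # cropping management
P  = np.ones(shape)                       # erosion control management

# Grid-based USLE: annual soil loss per cell.
G = R * K * LS * C * P

# Two-step refinement: flag cells above a threshold for recomputation on a
# finer grid (the threshold value is an assumption made for illustration).
hotspots = G > 4.0
print(f"{hotspots.mean():.1%} of cells flagged for the fine-resolution pass")
```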

  6. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

    Full Text Available Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potential of cloud computing and its implementation on the market. The purpose of this research was to identify the individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies’ previous experiences with cloud computing services.

  7. Securing Cloud Computing from Different Attacks Using Intrusion Detection Systems

    Directory of Open Access Journals (Sweden)

    Omar Achbarou

    2017-03-01

    Full Text Available Cloud computing is a new way of integrating a set of old technologies to implement a new paradigm that creates an avenue for users to have access to shared and configurable resources through the internet on demand. This system has many characteristics in common with distributed systems; hence, cloud computing also uses the features of networking. Security is thus the biggest issue for this system, because cloud computing services are based on sharing. A cloud computing environment therefore requires intrusion detection systems (IDSs) to protect each machine against attacks. The aim of this work is to present a classification of attacks threatening the availability, confidentiality, and integrity of cloud resources and services. Furthermore, we provide a literature review of attacks related to the identified categories. Additionally, this paper also introduces related intrusion detection models to identify and prevent these types of attacks.

  8. Managing the Risks Associated with End-User Computing.

    Science.gov (United States)

    Alavi, Maryam; Weiss, Ira R.

    1986-01-01

    Identifies organizational risks of end-user computing (EUC) associated with different stages of the end-user applications life cycle (analysis, design, implementation). Generic controls are identified that address each of the risks enumerated in a manner that allows EUC management to select those most appropriate to their EUC environment. (5…

  9. Exploring Students Intentions to Study Computer Science and Identifying the Differences among ICT and Programming Based Courses

    Science.gov (United States)

    Giannakos, Michail N.

    2014-01-01

    Computer Science (CS) courses comprise both Programming and Information and Communication Technology (ICT) issues; however these two areas have substantial differences, inter alia the attitudes and beliefs of the students regarding the intended learning content. In this research, factors from the Social Cognitive Theory and Unified Theory of…

  10. Computer-based visual communication in aphasia.

    Science.gov (United States)

    Steele, R D; Weinrich, M; Wertz, R T; Kleczewska, M K; Carlson, G S

    1989-01-01

    The authors describe their recently developed Computer-aided VIsual Communication (C-VIC) system, and report results of single-subject experimental designs probing its use with five chronic, severely impaired aphasic individuals. Studies replicate earlier results obtained with a non-computerized system, demonstrate patient competence with the computer implementation, extend the system's utility, and identify promising areas of application. Results of the single-subject experimental designs clarify patients' learning, generalization, and retention patterns, and highlight areas of performance difficulties. Future directions for the project are indicated.

  11. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems
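
    As a rough, hedged illustration of ranking failure modes by their contribution to the system failure probability (a plain Monte Carlo over invented minimal cut sets, not the paper's genetic-algorithm search or matrix-based MSR method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 4-component system: per-component failure probabilities.
p_fail = np.array([0.05, 0.10, 0.08, 0.02])

# Assumed minimal cut sets: the system fails if every component
# in at least one cut set fails.
cut_sets = [(0, 1), (1, 2), (2, 3)]

n = 200_000
states = rng.random((n, 4)) < p_fail              # True = component failed
fired = {cs: np.all(states[:, list(cs)], axis=1) for cs in cut_sets}

system_fail = np.logical_or.reduce(list(fired.values()))
print(f"estimated system failure probability: {system_fail.mean():.5f}")

# Rank the failure modes by their estimated probability contribution.
for cs, hits in sorted(fired.items(), key=lambda kv: kv[1].mean(), reverse=True):
    print(f"cut set {cs}: P ~ {hits.mean():.5f}")
```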

  12. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  13. Can Tablet Computers Enhance Faculty Teaching?

    Science.gov (United States)

    Narayan, Aditee P; Whicker, Shari A; Benjamin, Robert W; Hawley, Jeffrey; McGann, Kathleen A

    2015-06-01

    Learner benefits of tablet computer use have been demonstrated, yet there is little evidence regarding faculty tablet use for teaching. Our study sought to determine if supplying faculty with tablet computers and peer mentoring provided benefits to learners and faculty beyond that of non-tablet-based teaching modalities. We provided faculty with tablet computers and three 2-hour peer-mentoring workshops on tablet-based teaching. Faculty used tablets to teach, in addition to their current, non-tablet-based methods. Presurveys, postsurveys, and monthly faculty surveys assessed feasibility, utilization, and comparisons to current modalities. Learner surveys assessed perceived effectiveness and comparisons to current modalities. All feedback received from open-ended questions was reviewed by the authors and organized into categories. Of 15 eligible faculty, 14 participated. Each participant attended at least 2 of the 3 workshops, with 10 to 12 participants at each workshop. All participants found the workshops useful, and reported that the new tablet-based teaching modality added value beyond that of current teaching methods. Respondents developed the following tablet-based outputs: presentations, photo galleries, evaluation tools, and online modules. Of the outputs, 60% were used in the ambulatory clinics, 33% in intensive care unit bedside teaching rounds, and 7% in inpatient medical unit bedside teaching rounds. Learners reported that common benefits of tablet computers were: improved access/convenience (41%), improved interactive learning (38%), and improved bedside teaching and patient care (13%). A common barrier faculty identified was inconsistent wireless access (14%), while no barriers were identified by the majority of learners. Providing faculty with tablet computers and having peer-mentoring workshops to discuss their use was feasible and added value.

  14. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  15. Promoter-enhancer interactions identified from Hi-C data using probabilistic models and hierarchical topological domains.

    Science.gov (United States)

    Ron, Gil; Globerson, Yuval; Moran, Dror; Kaplan, Tommy

    2017-12-21

    Proximity-ligation methods such as Hi-C allow us to map physical DNA-DNA interactions along the genome, and reveal its organization into topologically associating domains (TADs). As Hi-C data accumulate, computational methods have been developed for identifying domain borders in multiple cell types and organisms. Here, we present PSYCHIC, a computational approach for analyzing Hi-C data and identifying promoter-enhancer interactions. We use a unified probabilistic model to segment the genome into domains, which we then merge hierarchically and fit using a local background model, allowing us to identify over-represented DNA-DNA interactions across the genome. By analyzing the published Hi-C data sets in human and mouse, we identify hundreds of thousands of putative enhancers and their target genes, and compile an extensive genome-wide catalog of gene regulation in human and mouse. As we show, our predictions are highly enriched for ChIP-seq and DNA accessibility data, evolutionary conservation, eQTLs and other DNA-DNA interaction data.
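
    A hedged sketch of the kind of over-representation test described above (illustrative only; PSYCHIC's actual model fits a hierarchical domain background): given observed Hi-C contact counts and a fitted expected background, score each DNA-DNA pair by a Poisson upper-tail p-value.

```python
import numpy as np
from scipy.stats import poisson

def overrepresented_pairs(observed: np.ndarray, expected: np.ndarray, alpha: float = 1e-6):
    """Return indices of DNA-DNA pairs whose observed contact counts
    exceed the background expectation (Poisson upper tail)."""
    # sf(k - 1, mu) = P(X >= k) for X ~ Poisson(mu)
    pvals = poisson.sf(observed - 1, expected)
    return np.argwhere(pvals < alpha), pvals

# Illustrative 500 x 500 binned contact map (synthetic data, not Hi-C).
rng = np.random.default_rng(1)
expected = rng.uniform(1, 5, size=(500, 500))
observed = rng.poisson(expected)
observed[10, 200] = 60                    # plant one strong interaction
hits, pvals = overrepresented_pairs(observed, expected)
print(hits)
```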

  16. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    Science.gov (United States)

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  17. A Computer-Aided Writing Program for Learning Disabled Adolescents.

    Science.gov (United States)

    Fais, Laurie; Wanderman, Richard

    The paper describes the application of a computer-assisted writing program in a special high school for learning disabled and dyslexic students and reports on a study of the program's effectiveness. Particular advantages of the Macintosh Computer for such a program are identified including use of the mouse pointing tool, graphic icons to identify…

  18. Computer-related standards for the petroleum industry

    International Nuclear Information System (INIS)

    Winczewski, L.M.

    1992-01-01

    Rapid application of the computer to all areas of the petroleum industry is straining the capabilities of corporations and vendors to efficiently integrate computer tools into the work environment. Barriers to this integration arose from decades of competitive development of proprietary application formats, along with compilation of databases in isolation. Rapidly emerging industry-wide standards relating to computer applications and data management are poised to topple these barriers. This paper identifies the most active players within a rapidly evolving group of cooperative standardization activities sponsored by the petroleum industry. Summarized are their objectives, achievements, current activities, and relationships to each other. The trends of these activities are assessed and projected

  19. Critical services in the LHC computing

    International Nuclear Information System (INIS)

    Sciaba, A

    2010-01-01

    The LHC experiments (ALICE, ATLAS, CMS and LHCb) rely for the data acquisition, processing, distribution, analysis and simulation on complex computing systems, running using a variety of services, provided by the experiments, the Worldwide LHC Computing Grid and the different computing centres. These services range from the most basic (network, batch systems, file systems) to the mass storage services or the Grid information system, up to the different workload management systems, data catalogues and data transfer tools, often internally developed in the collaborations. In this contribution we review the status of the services most critical to the experiments by quantitatively measuring their readiness with respect to the start of the LHC operations. Shortcomings are identified and common recommendations are offered.

  20. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
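
    The abstract credits most of the roughly 50-fold speed-up to offloading basic linear algebra to the GPU. As a hedged, modern stand-in (CuPy here is an assumption for illustration, not the library the authors used), the sketch below times the same dense matrix product on CPU and GPU.

```python
import time
import numpy as np
import cupy as cp   # GPU array library exposing BLAS-backed operations (assumed stand-in)

n = 4000
a_cpu = np.random.rand(n, n)
b_cpu = np.random.rand(n, n)

t0 = time.perf_counter()
np.dot(a_cpu, b_cpu)                    # CPU BLAS matrix product
cpu_s = time.perf_counter() - t0

a_gpu, b_gpu = cp.asarray(a_cpu), cp.asarray(b_cpu)
cp.cuda.Stream.null.synchronize()
t0 = time.perf_counter()
cp.dot(a_gpu, b_gpu)                    # GPU BLAS matrix product
cp.cuda.Stream.null.synchronize()       # wait for the asynchronous kernel
gpu_s = time.perf_counter() - t0

print(f"CPU {cpu_s:.2f}s vs GPU {gpu_s:.2f}s, speed-up ~{cpu_s / gpu_s:.0f}x")
```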

  1. Intelligent computational systems for space applications

    Science.gov (United States)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as adaptive computational systems integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. Such a system will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within its data base and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions such as Space Station Freedom do not have the computational power to meet the challenges of advanced automation and robotics systems envisioned for the year 2000 era. Research issues which must be addressed to achieve greater than giga-flop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  2. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  3. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  4. Edge computing technologies for Internet of Things: a primer

    Directory of Open Access Journals (Sweden)

    Yuan Ai

    2018-04-01

    Full Text Available With the rapid development of mobile internet and Internet of Things applications, conventional centralized cloud computing is encountering severe challenges, such as high latency, low Spectral Efficiency (SE), and non-adaptive machine-type communication. Motivated to solve these challenges, a new technology is driving a trend that shifts the function of centralized cloud computing to the edge devices of networks. Several edge computing technologies originating from different backgrounds have been emerging to decrease latency, improve SE, and support massive machine-type communication. This paper presents a comprehensive tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing. In particular, the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared. From the viewpoint of the radio access network, the differences between mobile edge computing and fog computing are highlighted, and the characteristics of fog computing-based radio access networks are discussed. Finally, open issues and future research directions are identified as well. Keywords: Internet of Things (IoT), Mobile edge computing, Cloudlets, Fog computing

  5. Computational Aspects of Cooperative Game Theory

    CERN Document Server

    Chalkiadakis, Georgios; Wooldridge, Michael

    2011-01-01

    Cooperative game theory is a branch of (micro-)economics that studies the behavior of self-interested agents in strategic settings where binding agreements among agents are possible. Our aim in this book is to present a survey of work on the computational aspects of cooperative game theory. We begin by formally defining transferable utility games in characteristic function form, and introducing key solution concepts such as the core and the Shapley value. We then discuss two major issues that arise when considering such games from a computational perspective: identifying compact representation
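
    Since the abstract introduces the Shapley value, here is a brief illustrative sketch: a brute-force computation averaging marginal contributions over all player orderings, feasible only for small games. The 3-player characteristic function is invented for the example.

```python
from itertools import permutations

def shapley_values(players, v):
    """Exact Shapley values by averaging marginal contributions
    over all orderings of the players (O(n!), small games only)."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += v(with_p) - v(coalition)   # marginal contribution of p
            coalition = with_p
    return {p: x / len(orders) for p, x in phi.items()}

# Invented 3-player transferable-utility game in characteristic function form.
def v(S):
    S = frozenset(S)
    if len(S) >= 2:
        return 1.0 if S == {1, 2} or len(S) == 3 else 0.8
    return 0.0

print(shapley_values([1, 2, 3], v))
```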

  6. Software For Computer-Security Audits

    Science.gov (United States)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.

  7. On Identifying which Intermediate Nodes Should Code in Multicast Networks

    DEFF Research Database (Denmark)

    Pinto, Tiago; Roetter, Daniel Enrique Lucani; Médard, Muriel

    2013-01-01

    the data packets. Previous work has shown that in lossless wireline networks, the performance of tree-packing mechanisms is comparable to network coding, albeit with added complexity at the time of computing the trees. This means that most nodes in the network need not code. Thus, mechanisms that identify... intermediate nodes that do require coding are instrumental for the efficient operation of coded networks and can have a significant impact on overall energy consumption. We present a distributed, low-complexity algorithm that allows every node to identify whether it should code and, if so, through what output link

  8. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  9. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    Full Text Available We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking, and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of the three levels of correlation, and claim that the Kendall concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation measures (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality, and closeness centrality metrics, for a diverse collection of 50 real-world networks.
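
    A hedged sketch of this kind of comparison, using networkx and scipy on a synthetic graph; pairing degree (light) with betweenness (heavy) is one of the combinations the abstract describes, while the graph itself is invented.

```python
import networkx as nx
from scipy.stats import kendalltau, spearmanr, pearsonr

# Synthetic scale-free network standing in for a real-world network.
G = nx.barabasi_albert_graph(200, 3, seed=42)

light = nx.degree_centrality(G)          # computationally light metric
heavy = nx.betweenness_centrality(G)     # computationally heavy metric

nodes = list(G.nodes())
x = [light[v] for v in nodes]
y = [heavy[v] for v in nodes]

# Three levels of correlation, from strictest to most relaxed.
print("Kendall  (pair-wise concordance):", kendalltau(x, y)[0])
print("Spearman (network-wide ranking): ", spearmanr(x, y)[0])
print("Pearson  (linear regression):    ", pearsonr(x, y)[0])
```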

  10. Structural identifiability analysis of a cardiovascular system model.

    Science.gov (United States)

    Pironet, Antoine; Dauby, Pierre C; Chase, J Geoffrey; Docherty, Paul D; Revie, James A; Desaive, Thomas

    2016-05-01

    The six-chamber cardiovascular system model of Burkhoff and Tyberg has been used in several theoretical and experimental studies. However, this cardiovascular system model (and others derived from it) is not identifiable from just any output set. In this work, two such cases of structural non-identifiability are first presented. These cases occur when the model output set only contains a single type of information (pressure or volume). A specific output set is thus chosen, mixing pressure and volume information and containing only a limited number of clinically available measurements. Then, by manipulating the model equations involving these outputs, it is demonstrated that the six-chamber cardiovascular system model is structurally globally identifiable. A further simplification is made, assuming known cardiac valve resistances; this assumption is usual because of the poor practical identifiability of these four parameters. Under this hypothesis, the six-chamber cardiovascular system model is structurally identifiable from an even smaller dataset. As a consequence, parameter values computed from limited but well-chosen datasets are theoretically unique. This means that the parameter identification procedure can safely be performed on the model from such a well-chosen dataset. Thus, the model may be considered suitable for use in diagnosis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  11. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    Science.gov (United States)

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  12. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, ''Classified Computer Security,'' requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab

  13. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  14. GRID computing for experimental high energy physics

    International Nuclear Information System (INIS)

    Moloney, G.R.; Martin, L.; Seviour, E.; Taylor, G.N.; Moorhead, G.F.

    2002-01-01

    Full text: The Large Hadron Collider (LHC), to be completed at the CERN laboratory in 2006, will generate 11 petabytes of data per year. The processing of this large data stream requires a large, distributed computing infrastructure. A recent innovation in high performance distributed computing, the GRID, has been identified as an important tool in data analysis for the LHC. GRID computing has actual and potential application in many fields which require computationally intensive analysis of large, shared data sets. The Australian experimental High Energy Physics community has formed partnerships with the High Performance Computing community to establish a GRID node at the University of Melbourne. Through Australian membership of the ATLAS experiment at the LHC, Australian researchers have an opportunity to be involved in the European DataGRID project. This presentation will include an introduction to the GRID and its application to experimental High Energy Physics. We will present the results of our studies, including participation in the first LHC data challenge

  15. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
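
    As a hedged sketch of the temporal-modeling step (hmmlearn is an assumed stand-in library, and the cue encoding and sequences are invented for illustration, not the authors' data): fit one discrete-emission HMM per trust level to sequences of detected nonverbal cues, then classify a new interaction by comparing likelihoods.

```python
import numpy as np
from hmmlearn.hmm import CategoricalHMM   # assumed stand-in HMM library

# Invented encoding: 0=lean back, 1=face touch, 2=arm cross, 3=hand gesture.
high_trust_seqs = [np.array([[3, 3, 0, 3, 3, 0, 3]]).T,
                   np.array([[3, 0, 3, 3, 3]]).T]
low_trust_seqs  = [np.array([[1, 2, 1, 1, 2, 2]]).T,
                   np.array([[2, 2, 1, 2, 1]]).T]

def fit(seqs):
    """Fit a 2-state discrete-emission HMM to a set of cue sequences."""
    X = np.concatenate(seqs)
    lengths = [len(s) for s in seqs]
    m = CategoricalHMM(n_components=2, random_state=0, n_iter=50)
    return m.fit(X, lengths)

m_high, m_low = fit(high_trust_seqs), fit(low_trust_seqs)

# Classify a new cue sequence by log-likelihood under each model.
new_seq = np.array([[3, 3, 0, 3]]).T
label = "high" if m_high.score(new_seq) > m_low.score(new_seq) else "low"
print("predicted trust level:", label)
```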

  16. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  17. Three surgical planes identified in laparoscopic complete mesocolic excision for right-sided colon cancer.

    Science.gov (United States)

    Zhu, Da-Jian; Chen, Xiao-Wu; OuYang, Man-Zhao; Lu, Yan

    2016-01-12

    Complete mesocolic excision provides a correct anatomical plane for colon cancer surgery. However, how the surgical plane manifests during laparoscopic complete mesocolic excision versus in computed tomography images remains to be examined. Patients who underwent laparoscopic complete mesocolic excision for right-sided colon cancer received an abdominal computed tomography scan. The spatial relationships of the intraoperative surgical planes were examined, and then computed tomography reconstruction methods were applied. The resulting images were analyzed. In 44 right-sided colon cancer patients, the surgical plane for laparoscopic complete mesocolic excision was found to be composed of three surgical planes that were identified by computed tomography imaging with cross-sectional multiplanar reconstruction, maximum intensity projection, and volume reconstruction. For the operations performed, the mean bleeding volume was 73±32.3 ml and the mean number of harvested lymph nodes was 22±9.7. The follow-up period ranged from 6 to 40 months (mean 21.2), and only two patients had distant metastases. The laparoscopic complete mesocolic excision surgical plane for right-sided colon cancer is composed of three surgical planes. When these surgical planes were identified, laparoscopic complete mesocolic excision was a safe and effective procedure for the resection of colon cancer.

  18. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  19. CloudNeo: a cloud pipeline for identifying patient-specific tumor neoantigens.

    Science.gov (United States)

    Bais, Preeti; Namburi, Sandeep; Gatti, Daniel M; Zhang, Xinyu; Chuang, Jeffrey H

    2017-10-01

    We present CloudNeo, a cloud-based computational workflow for identifying patient-specific tumor neoantigens from next generation sequencing data. Tumor-specific mutant peptides can be detected by the immune system through their interactions with the human leukocyte antigen complex, and neoantigen presence has recently been shown to correlate with anti-tumor T-cell immunity and the efficacy of checkpoint inhibitor therapy. However, computing capabilities to identify neoantigens from genomic sequencing data are a limiting factor for understanding their role. This challenge has grown as cancer datasets become increasingly abundant, making them cumbersome to store and analyze on local servers. Our cloud-based pipeline provides scalable computation capabilities for neoantigen identification while eliminating the need to invest in local infrastructure for data transfer, storage or compute. The pipeline is a Common Workflow Language (CWL) implementation of human leukocyte antigen (HLA) typing using Polysolver or HLAminer, combined with custom scripts for mutant peptide identification and NetMHCpan for neoantigen prediction. We have demonstrated the efficacy of these pipelines on Amazon cloud instances through the Seven Bridges Genomics implementation of the NCI Cancer Genomics Cloud, which provides graphical interfaces for running and editing, infrastructure for workflow sharing and version tracking, and access to TCGA data. The CWL implementation is at: https://github.com/TheJacksonLaboratory/CloudNeo. For users who have obtained licenses for all internal software, integrated versions in CWL and on the Seven Bridges Cancer Genomics Cloud platform (https://cgc.sbgenomics.com/, recommended version) can be obtained by contacting the authors. jeff.chuang@jax.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
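
    To illustrate the mutant-peptide step described above (a generic sketch, not CloudNeo's actual scripts; the sequence and mutation are invented): enumerate the 9-mers overlapping a missense mutation, which are the candidate peptides subsequently scored against the patient's HLA alleles by a predictor such as NetMHCpan.

```python
def mutant_peptides(protein: str, pos: int, alt: str, k: int = 9):
    """Yield every k-mer of the mutated protein that overlaps position `pos`
    (0-based) carrying the substituted residue `alt`."""
    mutated = protein[:pos] + alt + protein[pos + 1:]
    for start in range(max(0, pos - k + 1), min(pos, len(mutated) - k) + 1):
        yield mutated[start:start + k]

# Invented example: G -> V substitution at position 12 of a toy sequence.
seq = "MTEYKLVVVGAGGVGKSALTIQLIQNHF"
for pep in mutant_peptides(seq, 12, "V"):
    print(pep)
```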

  20. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  1. Energy Conservation Using Dynamic Voltage Frequency Scaling for Computational Cloud

    Directory of Open Access Journals (Sweden)

    A. Paulin Florence

    2016-01-01

    Full Text Available Cloud computing is a new technology which supports resource sharing on a “Pay as you go” basis around the world. It provides various services such as SaaS, IaaS, and PaaS. Computation is a part of IaaS, and all computational requests are to be served efficiently with optimal power utilization in the cloud. Recently, various algorithms have been developed to reduce power consumption, and the Dynamic Voltage and Frequency Scaling (DVFS) scheme is also used in this perspective. In this paper we have devised a methodology which analyzes the behavior of a given cloud request and identifies the type of algorithm associated with it. Once the type of algorithm is identified, its time complexity is calculated using its asymptotic notation. Using a best-fit strategy, the appropriate host is identified and the incoming job is allocated to it. From the measured time complexity, the required clock frequency of the host is computed. Accordingly, the CPU frequency is scaled up or down using the DVFS scheme, enabling energy savings of up to 55% of total power consumption.
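
    A hedged sketch of the scaling idea (the complexity-to-operations mapping, the frequency steps, and the instructions-per-cycle figure are invented for illustration, not the paper's exact method): estimate the work of a request from its asymptotic complexity, then pick the lowest CPU frequency that still meets the deadline.

```python
import math

# Available P-states of the host, in GHz (illustrative values).
FREQ_STEPS_GHZ = [1.2, 1.6, 2.0, 2.4, 2.8]
OPS_PER_CYCLE = 4          # assumed instructions retired per cycle

def estimated_ops(n: int, complexity: str) -> float:
    """Map an asymptotic class to an operation-count estimate (assumption)."""
    return {"O(n)": n,
            "O(n log n)": n * math.log2(n),
            "O(n^2)": n ** 2}[complexity]

def pick_frequency(n: int, complexity: str, deadline_s: float) -> float:
    """Lowest frequency step that can finish the estimated work in time."""
    ops = estimated_ops(n, complexity)
    needed_hz = ops / (OPS_PER_CYCLE * deadline_s)
    for f in FREQ_STEPS_GHZ:                  # try lowest frequency first
        if f * 1e9 >= needed_hz:
            return f
    return FREQ_STEPS_GHZ[-1]                 # saturate at the top step

# A sort-like request over 10 million items with a 0.05 s deadline.
print(pick_frequency(10_000_000, "O(n log n)", 0.05), "GHz")
```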

  2. The Relationship between Individual and Computer Risk Factors and the Incidence of Computer Vision Syndrome

    OpenAIRE

    Azkadina, Amira; Julianti, Hari Peni; Pramono, Dodik

    2012-01-01

    Background: Computer usage could cause health complaints called Computer Vision Syndrome (CVS). This syndrome is influenced by individual and computer risk factors. The objective of the study is to identify and analyze individual and computer factors of Computer Vision Syndrome (CVS). Method: The study was an observational study using a case-control method, held in May-June 2012 at RSI Sultan Agung, RSUP dr. Kariadi, and Bank Jateng. The samples were 60 people who were chosen b...

  3. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  4. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  5. SPARQL-enabled identifier conversion with Identifiers.org.

    Science.gov (United States)

    Wimalaratne, Sarala M; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-06-01

    On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. © The Author 2015. Published by Oxford University Press.
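
    A hedged example of querying the conversion service mentioned above: the endpoint URL is taken from the abstract, but the query shape is an illustrative guess, since the service's exact conversion predicate is not given here (owl:sameAs is an assumption).

```python
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://identifiers.org/services/sparql")
# Illustrative query: ask which URIs the service considers equivalent
# to a given ChEBI identifier (owl:sameAs is an assumed predicate).
endpoint.setQuery("""
    PREFIX owl: <http://www.w3.org/2002/07/owl#>
    SELECT ?other WHERE {
        <http://identifiers.org/chebi/CHEBI:17234> owl:sameAs ?other .
    }
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["other"]["value"])
```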

  6. SPARQL-enabled identifier conversion with Identifiers.org

    Science.gov (United States)

    Wimalaratne, Sarala M.; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-01-01

    Motivation: On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. Results: We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. Availability and implementation: The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. Contact: sarala@ebi.ac.uk PMID:25638809

  7. Computational methods for protein identification from mass spectrometry data.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2008-02-01

    Full Text Available Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology.

  8. Identifying Ghanaian Pre-Service Teachers' Readiness for Computer Use: A Technology Acceptance Model Approach

    Science.gov (United States)

    Gyamfi, Stephen Adu

    2016-01-01

    This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…

  9. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  10. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    OpenAIRE

    Grover Kearns

    2010-01-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting stu...

  11. Review of Cloud Computing and existing Frameworks for Cloud adoption

    OpenAIRE

    Chang, Victor; Walters, Robert John; Wills, Gary

    2014-01-01

    This paper presents a selected review for Cloud Computing and explains the benefits and risks of adopting Cloud Computing in a business environment. Although all the risks identified may be associated with two major Cloud adoption challenges, a framework is required to support organisations as they begin to use Cloud and minimise risks of Cloud adoption. Eleven Cloud Computing frameworks are investigated and a comparison of their strengths and limitations is made; the result of the comparison...

  12. Computer aided approach for qualitative risk assessment of engineered systems

    International Nuclear Information System (INIS)

    Crowley, W.K.; Arendt, J.S.; Fussell, J.B.; Rooney, J.J.; Wagner, D.P.

    1978-01-01

    This paper outlines a computer aided methodology for determining the relative contributions of various subsystems and components to the total risk associated with an engineered system. Major contributors to overall task risk are identified through comparison of an expected frequency density function with an established risk criterion. Contributions that are inconsistently high are also identified. The results from this analysis are useful for directing efforts for improving system safety and performance. An analysis of uranium hexafluoride handling risk at a gaseous diffusion uranium enrichment plant using a preliminary version of the computer program EXCON is briefly described and illustrated

  13. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check the global identifiability of linear and nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that, even if the parameter estimation techniques do not fail, any numerical estimates obtained will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two other examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability, and to the computational complexity of the software. The main focus of this paper is not the description of the mathematical background of the algorithm, which has been presented elsewhere, but illustrating its use and some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
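
    For intuition, the kind of question DAISY answers can be illustrated with a toy Taylor-series identifiability check. DAISY itself uses differential-algebra methods implemented in REDUCE; the sympy sketch below only illustrates the concept and is not the DAISY algorithm.

    ```python
    import sympy as sp

    a, c, x0 = sp.symbols('a c x0', positive=True)

    # Toy one-compartment model: dx/dt = -a*x, y = c*x, with x(0) = x0 unknown.
    # Successive output derivatives evaluated at t = 0:
    y0 = c * x0
    y1 = -a * c * x0
    y2 = a**2 * c * x0

    J = sp.Matrix([y0, y1, y2]).jacobian([a, c, x0])
    print(J.rank())  # 2 < 3 unknowns: only a and the product c*x0 are identifiable
    ```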

  14. A new computed tomography method to identify meningitis-related cochlear ossification and fibrosis before cochlear implantation.

    Science.gov (United States)

    Ichikawa, Kazunori; Kashio, Akinori; Mori, Harushi; Ochi, Atushi; Karino, Shotaro; Sakamoto, Takashi; Kakigi, Akinobu; Yamasoba, Tatsuya

    2014-04-01

    To develop a new method to determine the presence of intracochlear ossification and/or fibrosis in cochlear implantation candidates with bilateral profound deafness following meningitis. Diagnostic test assessment. A university hospital. This study involved 15 ears from 13 patients with profound deafness following meningitis who underwent cochlear implantation. These ears showed normal structures, soft tissue, partial bony occlusion, and complete bony occlusion in 4, 3, 2, and 6 ears, respectively. We measured radiodensity in Hounsfield units (HU) using 0.5-mm-thick axial high-resolution computed tomography image slices at 3 different levels in the basal turn, the fenestration, and inferior and ascending segment sites, located along the electrode-insertion path. Pixel-level analysis on the DICOM viewer yielded actual computed tomography values of intracochlear soft tissues by eliminating the partial volume effect. The values were compared with the intraoperative findings. Values for ossification (n = 12) ranged from +547 HU to +1137 HU; for fibrosis (n = 11), from +154 HU to +574 HU; and for fluid (n = 22), from -49 HU to +255 HU. From these values, we developed 2 presets of window width (WW) and window level (WL): (1) WW: 1800, WL: 1100 (200 HU to 2000 HU) and (2) WW: 1500, WL: 1250 (500 HU to 2000 HU). The results using these 2 presets corresponded well to the intraoperative findings. Our new method is easy and feasible for preoperative determination of the presence of cochlear ossification and/or fibrosis that develops following meningitis.
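
    The two presets follow the usual radiology convention that a window of width WW centred at level WL displays HU values in [WL - WW/2, WL + WW/2]; the short check below reproduces the ranges quoted in the abstract.

    ```python
    def window_range_hu(ww, wl):
        """HU interval displayed by a CT window: [WL - WW/2, WL + WW/2]."""
        return wl - ww / 2, wl + ww / 2

    print(window_range_hu(1800, 1100))  # (200.0, 2000.0): covers fibrosis and ossification
    print(window_range_hu(1500, 1250))  # (500.0, 2000.0): isolates ossification from most fibrosis
    ```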

  15. Computational fluid dynamics simulations of light water reactor flows

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Weber, D.P.

    1999-01-01

    Advances in computational fluid dynamics (CFD), turbulence simulation, and parallel computing have made feasible the development of three-dimensional (3-D) single-phase and two-phase flow CFD codes that can simulate fluid flow and heat transfer in realistic reactor geometries with significantly reduced reliance, especially in single phase, on empirical correlations. The objective of this work was to assess the predictive power and computational efficiency of a CFD code in the analysis of a challenging single-phase light water reactor problem, as well as to identify areas where further improvements are needed

  16. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  17. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to further development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  18. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address these challenges and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held Scientific Grand Challenges Workshop Series.

  19. Array Manipulation And Matrix-Tree Method To Identify High Concentration Regions (HCRs)

    Directory of Open Access Journals (Sweden)

    Rachana Arora

    2015-08-01

    Full Text Available Abstract: Sequence alignment and analysis is one of the most important applications of bioinformatics. It involves aligning two or more sequences with each other and identifying a common pattern that would ultimately lead to conclusions of homology or dissimilarity. A number of algorithms that make use of dynamic programming to perform alignment between sequences are available. One of their main disadvantages is that they involve complicated computations and backtracking methods that are difficult to implement. This paper describes a much simpler method to identify common regions in two sequences and align them based on the density of the common regions identified.
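
    The paper's exact array-manipulation and matrix-tree procedure is not reproduced in the abstract, but the underlying density idea can be sketched as follows; the k-mer marking, window size, and threshold are illustrative choices only.

    ```python
    def high_density_regions(a, b, k=3, window=5, threshold=0.6):
        """Report windows of `a` that are dense in k-mers shared with `b` (sketch)."""
        kmers_b = {b[i:i + k] for i in range(len(b) - k + 1)}
        hits = [a[i:i + k] in kmers_b for i in range(len(a) - k + 1)]
        return [(s, s + window)
                for s in range(len(hits) - window + 1)
                if sum(hits[s:s + window]) / window >= threshold]

    # Overlapping windows flag the shared 'ACGTACG' core region of the two strings.
    print(high_density_regions("ACGTACGTTT", "TTACGTACGA"))
    ```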

  20. Computer literacy: Where are nurse educators on the continuum?

    Science.gov (United States)

    Hanley, Elizabeth

    2006-01-01

    Computers are becoming ubiquitous in health and education, and it is expected that nurses from undergraduate nursing programmes are computer literate when they enter the workforce. Similarly, nurse educators are expected to be computer literate to model the use of information technology in their workplace. They are expected to use email for communication and a range of computer applications for presentation of course materials and reports. Additionally, as more courses are delivered in flexible mode, educators require more comprehensive computing skills, including confidence and competence in a range of applications. A cohort of nurse educators from one tertiary institution was surveyed to assess their perceived computer literacy and how they attained it. A questionnaire covering seven domains of computer literacy was used. The results were illuminating and identified specific training needs for this group. The educators' perceived lack of skill with GroupWise email and the student database program is of concern, as these are essential tools for nurse educators at this polytechnic.

  1. Ultrasound compared with computed tomography and pancreatic arteriography in the detection of endocrine tumours of the pancreas

    International Nuclear Information System (INIS)

    Paeivaensalo, M.; Maekaeraeinen, H.; Siniluoto, T.; Staahlberg, M.; Jalovaara, P.; Oulu Univ. Central Hospital

    1989-01-01

    We have evaluated ultrasound, computed tomography and arteriographic findings in 15 patients with 17 endocrine pancreatic tumours having a mean diameter of 2.3 cm (range 1-7 cm). All patients underwent computed tomography, and all but one ultrasound and arteriography. Ultrasound was the initial investigation in 11 patients, and identified 10 of the 16 tumours present in 14 patients. Two tumours were found at ultrasound reexamination after having been identified by other radiological methods. Computed tomography revealed 8 out of 17 tumours, while arteriography identified 8 out of 16 tumours. Computed tomography was the initial investigation in 4 patients, and identified one tumour. In only 4 patients were tumours not detected by any of the imaging methods. The sensitivities of ultrasound, computed tomography and arteriography in the detection of pancreatic tumours were 62.5% (95% confidence interval 50.4-74.6%), 47.1% (95% confidence interval 35.0-59.2%), and 50.0% (95% confidence interval 37.5-62.5%), respectively. Ultrasound was thus more accurate than computed tomography or arteriography in detecting endocrine pancreatic tumours, and should be the initial radiological investigation. (orig.)

  2. Developing Digital Immigrants' Computer Literacy: The Case of Unemployed Women

    Science.gov (United States)

    Ktoridou, Despo; Eteokleous-Grigoriou, Nikleia

    2011-01-01

    Purpose: The purpose of this study is to evaluate the effectiveness of a 40-hour computer course for beginners provided to a group of unemployed women learners with no/minimum computer literacy skills who can be characterized as digital immigrants. The aim of the study is to identify participants' perceptions and experiences regarding technology,…

  3. POEM: Identifying joint additive effects on regulatory circuits

    Directory of Open Access Journals (Sweden)

    Maya Botzman

    2016-04-01

    Full Text Available Motivation: Expression Quantitative Trait Locus (eQTL) mapping tackles the problem of identifying variation in DNA sequence that has an effect on the transcriptional regulatory network. Major computational efforts are aimed at characterizing the joint effects of several eQTLs acting in concert to govern the expression of the same genes. Yet, progress towards a comprehensive prediction of such joint effects is limited. For example, existing eQTL methods commonly discover interacting loci affecting the expression levels of a module of co-regulated genes. Such ‘modularization’ approaches, however, are focused on epistatic relations and thus have limited utility for the case of additive (non-epistatic) effects. Results: Here we present POEM (Pairwise effect On Expression Modules), a methodology for identifying pairwise eQTL effects on gene modules. POEM is specifically designed to achieve high performance in the case of additive joint effects. We applied POEM to transcription profiles measured in bone marrow-derived dendritic cells across a population of genotyped mice. Our study reveals widespread additive, trans-acting pairwise effects on gene modules, characterizes their organizational principles, and highlights high-order interconnections between modules within the immune signaling network. These analyses elucidate the central role of additive pairwise effects in regulatory circuits, and provide computational tools for future investigations into the interplay between eQTLs. Availability: The software described in this article is available at csgi.tau.ac.il/POEM/.

  4. Future requirements and roles of computers in aerodynamics

    Science.gov (United States)

    Gregory, T. J.

    1978-01-01

    While faster computers will be needed to make solution of the Navier-Stokes equations practical and useful, almost all of the other aerodynamic solution techniques can also benefit from faster computers. Given the wide variety of computational and measurement techniques, the prospect of more powerful computers permits extension and enhancement across all aerodynamic methods, including wind-tunnel measurement. It is expected that, as in the past, a blend of methods will be used to predict aircraft aerodynamics in the future. These will include methods based on solution of the Navier-Stokes equations and the potential flow equations as well as those based on empirical and measured results. The primary flows of interest in aircraft aerodynamics are identified, the predictive methods currently in use and/or under development are reviewed, and two of these methods are analyzed in terms of the computational resources needed to improve their usefulness and practicality.

  5. Identification and ranking of the risk factors of cloud computing in State-Owned organizations

    Directory of Open Access Journals (Sweden)

    Noor Mohammad Yaghoubi

    2015-05-01

    Full Text Available The rapid development of processing and storage technologies and the success of the Internet have made computing resources cheaper, more powerful, and more available than before. This technological trend has enabled the realization of a new computing model called cloud computing. Recently, State-Owned organizations have begun to utilize cloud computing architectures, platforms, and applications to deliver services and meet constituents' needs. Despite all the advantages and opportunities of cloud computing technology, there are many risks that State-Owned organizations need to know about before migrating to a cloud environment. The purpose of this study is to identify and rank the risk factors of cloud computing in State-Owned organizations by making use of IT experts' opinions. First, by reviewing key articles, a comprehensive list of risk factors was extracted and classified into two categories: tangible and intangible. Then, six experts were interviewed about these risks and their classifications, and 10 risks were identified. The risks were then ranked with the help of 52 experts using the fuzzy analytic hierarchy process. The results show that the experts identified intangible risks as the most important risks in cloud computing usage by State-Owned organizations, with the "data confidentiality" risk ranking highest among all risks.
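
    For readers unfamiliar with the ranking step, the sketch below shows one common fuzzy AHP variant (Buckley's geometric-mean method with triangular fuzzy numbers). The 3x3 comparison matrix is an invented placeholder; the study's actual judgment matrices and aggregation details are not given in the abstract.

    ```python
    import numpy as np

    # Pairwise comparison of three hypothetical risks as triangular fuzzy
    # numbers (l, m, u); the matrix is reciprocal, as in classical AHP.
    M = [
        [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
        [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
        [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
    ]

    n = len(M)
    # Component-wise fuzzy geometric mean of each row (Buckley's method).
    g = [tuple(np.prod([M[i][j][k] for j in range(n)]) ** (1 / n) for k in range(3))
         for i in range(n)]
    tot = [sum(row[k] for row in g) for k in range(3)]
    # Fuzzy weight: geometric mean times the inverse total, which flips (l, m, u).
    w = [(gi[0] / tot[2], gi[1] / tot[1], gi[2] / tot[0]) for gi in g]
    # Defuzzify by the centroid and normalize to crisp ranking weights.
    crisp = [sum(t) / 3 for t in w]
    s = sum(crisp)
    print([round(c / s, 3) for c in crisp])  # largest weight = highest-ranked risk
    ```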

  6. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  7. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. The text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, the general-purpose input routine, and the efficient use of memory are also elaborated. This publication is inten

  8. A study of the advantages & disadvantages of mobile cloud computing versus native environment

    OpenAIRE

    Almrot, Emil; Andersson, Sebastian

    2013-01-01

    The advent of cloud computing has enabled the possibility of moving complex calculations and device operations to the “cloud” in an effective way. With cloud computing being applied to mobile devices, some of the primary constraints of mobile computing, such as battery life and hardware with less computational power, could be resolved by moving complex operations and computations to the cloud. This thesis aims to identify advantages and disadvantages associated with running cloud based applica...

  9. Enzyme (re)design: lessons from natural evolution and computation.

    Science.gov (United States)

    Gerlt, John A; Babbitt, Patricia C

    2009-02-01

    The (re)design of enzymes to catalyze 'new' reactions is a topic of considerable practical and intellectual interest. Directed evolution (random mutagenesis followed by screening/selection) has been used widely to identify novel biocatalysts. However, 'rational' approaches using either natural divergent evolution or computational predictions based on chemical principles have been less successful. This review summarizes recent progress in evolution-based and computation-based (re)design.

  10. Computing elastic anisotropy to discover gum-metal-like structural alloys

    Science.gov (United States)

    Winter, I. S.; de Jong, M.; Asta, M.; Chrzan, D. C.

    2017-08-01

    The computer aided discovery of structural alloys is a burgeoning but still challenging area of research. A primary challenge in the field is to identify computable screening parameters that embody key structural alloy properties. Here, an elastic anisotropy parameter that captures a material's susceptibility to solute solution strengthening is identified. The parameter has many applications in the discovery and optimization of structural materials. As a first example, the parameter is used to identify alloys that might display the super elasticity, super strength, and high ductility of the class of TiNb alloys known as gum metals. In addition, it is noted that the parameter can be used to screen candidate alloys for shape memory response, and potentially aid in the optimization of the mechanical properties of high-entropy alloys.

  11. Profiling an application for power consumption during execution on a compute node

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2013-09-17

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.
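
    A minimal sketch of the disclosed steps, with a hypothetical set of operation categories and wattages (the patent abstract does not specify data formats):

    ```python
    # Hypothetical per-operation power profile for the compute node (watts).
    hardware_profile = {"flop": 80.0, "memory": 45.0, "network": 30.0}

    def application_power_profile(op_seconds):
        """Combine the app's runtime per operation type with the node's power profile."""
        return {op: secs * hardware_profile[op] for op, secs in op_seconds.items()}

    app = {"flop": 12.0, "memory": 5.0, "network": 2.0}  # profiled phase durations (s)
    profile = application_power_profile(app)
    print(profile, "total energy (J):", sum(profile.values()))
    ```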

  12. COMPUTER MATHEMATICS SYSTEMS IN STUDENTS’ LEARNING OF "INFORMATICS"

    Directory of Open Access Journals (Sweden)

    Taras P. Kobylnyk

    2014-04-01

    Full Text Available The article describes the general characteristics of the most popular computer mathematics systems, both commercial (Maple, Mathematica, Matlab) and open source (Scilab, Maxima, GRAN, Sage), as well as the conditions for using these systems as a means of strengthening the fundamentals of the educational process for bachelors of informatics. The role of CMS in the training of bachelors of informatics is considered. Approaches to the pedagogical use of CMS in learning informatics, physics, and mathematics disciplines are identified. Some tasks are presented in which the «responses» received using a CMS must be treated with care. Promising directions for the development of computer mathematics systems in a high-tech environment are identified.

  13. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  14. Computing several eigenpairs of Hermitian problems by conjugate gradient iterations

    International Nuclear Information System (INIS)

    Ovtchinnikov, E.E.

    2008-01-01

    The paper is concerned with algorithms for computing several extreme eigenpairs of Hermitian problems based on the conjugate gradient method. We analyse computational strategies employed by various algorithms of this kind reported in the literature and identify their limitations. Our criticism is illustrated by numerical tests on a set of problems from electronic structure calculations and acoustics
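
    LOBPCG is one widely used conjugate-gradient-type method for computing several extreme eigenpairs of Hermitian problems; the SciPy example below illustrates the task, though it is not necessarily one of the specific algorithms analysed in the paper.

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import lobpcg

    n = 500
    A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n)).tocsr()  # 1-D Laplacian (Hermitian)
    rng = np.random.default_rng(0)
    X = rng.standard_normal((n, 4))  # block of 4 starting vectors

    eigvals, eigvecs = lobpcg(A, X, largest=False, tol=1e-8, maxiter=500)
    print(np.sort(eigvals))  # the four smallest eigenvalues
    ```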

  15. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    Science.gov (United States)

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  16. Inclusive vision for high performance computing at the CSIR

    CSIR Research Space (South Africa)

    Gazendam, A

    2006-02-01

    Full Text Available and computationally intensive applications. A number of different technologies and standards were identified as core to the open and distributed high-performance infrastructure envisaged...

  17. Security in Service Level Agreements for Cloud Computing

    OpenAIRE

    Bernsmed, Karin; JAATUN, Martin Gilje; Undheim, Astrid

    2011-01-01

    The Cloud computing paradigm promises reliable services, accessible from anywhere in the world, in an on-demand manner. Insufficient security has been identified as a major obstacle to adopting Cloud services. To deal with the risks associated with outsourcing data and applications to the Cloud, new methods for security assurance are urgently needed. This paper presents a framework for security in Service Level Agreements for Cloud computing. The purpose is twofold; to help potential Cloud cu...

  18. Two-phase computer codes for zero-gravity applications

    International Nuclear Information System (INIS)

    Krotiuk, W.J.

    1986-10-01

    This paper discusses the problems existing in the development of computer codes which can analyze the thermal-hydraulic behavior of two-phase fluids, especially in low-gravity nuclear reactors. The important phenomena affecting fluid flow and heat transfer in reduced gravity are discussed. The applicability of using existing computer codes for space applications is assessed. Recommendations regarding the use of existing earth-based fluid flow and heat transfer correlations are made, and deficiencies in these correlations are identified

  19. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  20. A Human/Computer Learning Network to Improve Biodiversity Conservation and Research

    OpenAIRE

    Kelling, Steve; Gerbracht, Jeff; Fink, Daniel; Lagoze, Carl; Wong, Weng-Keen; Yu, Jun; Damoulas, Theodoros; Gomes, Carla

    2012-01-01

    In this paper we describe eBird, a citizen-science project that takes advantage of the human observational capacity to identify birds to species, which is then used to accurately represent patterns of bird occurrences across broad spatial and temporal extents. eBird employs artificial intelligence techniques such as machine learning to improve data quality by taking advantage of the synergies between human computation and mechanical computation. We call this a Human-Computer Learning Network,...

  1. A Visualization Review of Cloud Computing Algorithms in the Last Decade

    OpenAIRE

    Junhu Ruan; Felix T. S. Chan; Fangwei Zhu; Xuping Wang; Jing Yang

    2016-01-01

    Cloud computing has competitive advantages—such as on-demand self-service, rapid computing, cost reduction, and almost unlimited storage—that have attracted extensive attention from both academia and industry in recent years. Some review works have been reported to summarize extant studies related to cloud computing, but few analyze these studies based on the citations. Co-citation analysis can provide scholars a strong support to identify the intellectual bases and leading edges of a specifi...

  2. DISTING: A web application for fast algorithmic computation of alternative indistinguishable linear compartmental models.

    Science.gov (United States)

    Davidson, Natalie R; Godfrey, Keith R; Alquaddoomi, Faisal; Nola, David; DiStefano, Joseph J

    2017-05-01

    We describe and illustrate use of DISTING, a novel web application for computing alternative structurally identifiable linear compartmental models that are input-output indistinguishable from a postulated linear compartmental model. Several computer packages are available for analysing the structural identifiability of such models, but DISTING is the first to be made available for assessing indistinguishability. The computational algorithms embedded in DISTING are based on advanced versions of established geometric and algebraic properties of linear compartmental models, embedded in a user-friendly graphic model user interface. Novel computational tools greatly speed up the overall procedure. These include algorithms for Jacobian matrix reduction, submatrix rank reduction, and parallelization of candidate rank computations in symbolic matrix analysis. The application of DISTING to three postulated models with respectively two, three and four compartments is given. The 2-compartment example is used to illustrate the indistinguishability problem; the original (unidentifiable) model is found to have two structurally identifiable models that are indistinguishable from it. The 3-compartment example has three structurally identifiable indistinguishable models. It is found from DISTING that the four-compartment example has five structurally identifiable models indistinguishable from the original postulated model. This example shows that care is needed when dealing with models that have two or more compartments which are neither perturbed nor observed, because the numbering of these compartments may be arbitrary. DISTING is universally and freely available via the Internet. It is easy to use and circumvents tedious and complicated algebraic analysis previously done by hand. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign the computing to a great number of distributed computers, rather than the local computer ...

  4. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    Full Text Available of research articles published in SACJ over the journal's first 40 volumes, using the ACM Computing Classification Scheme as a basis. In their analysis the authors divided the publications into three cycles of roughly six years in order to identify...

  5. The Effect of Computer Game-Based Learning on FL Vocabulary Transferability

    Science.gov (United States)

    Franciosi, Stephan J.

    2017-01-01

    In theory, computer game-based learning can support several vocabulary learning affordances that have been identified in the foreign language learning research. In the observable evidence, learning with computer games has been shown to improve performance on vocabulary recall tests. However, while simple recall can be a sign of learning,…

  6. Uses of Computer and its Relevance to Teaching and Learning in ...

    African Journals Online (AJOL)

    This paper examined the uses of the computer and its relevance to teaching and learning in Nigerian secondary schools. The need for computer education and its objectives in the Nigerian educational system were identified and discussed. The roles classroom teachers would play and the challenges they would have to face in ...

  7. A neural algorithm for a fundamental computing problem.

    Science.gov (United States)

    Dasgupta, Sanjoy; Stevens, Charles F; Navlakha, Saket

    2017-11-10

    Similarity search (for example, identifying similar images in a database or similar documents on the web) is a fundamental computing problem faced by large-scale information retrieval systems. We discovered that the fruit fly olfactory circuit solves this problem with a variant of a computer science algorithm (called locality-sensitive hashing). The fly circuit assigns similar neural activity patterns to similar odors, so that behaviors learned from one odor can be applied when a similar odor is experienced. The fly algorithm, however, uses three computational strategies that depart from traditional approaches. These strategies can be translated to improve the performance of computational similarity searches. This perspective helps illuminate the logic supporting an important sensory function and provides a conceptually new algorithm for solving a fundamental computational problem. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
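
    A hedged sketch of the fly-inspired scheme: a sparse binary random projection expands the input, and a winner-take-all step keeps only the top-k most active units as the hash tag. All dimensions below are illustrative, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    d, m, k = 50, 2000, 32  # input dim, expanded dim, tag size (illustrative)

    # Sparse binary projection: each expansion unit samples ~10% of the inputs.
    P = (rng.random((m, d)) < 0.1).astype(float)

    def fly_hash(x):
        activity = P @ x
        tag = np.zeros(m, dtype=bool)
        tag[np.argsort(activity)[-k:]] = True  # winner-take-all: keep the top-k units
        return tag

    a = rng.random(d)
    b = a + 0.01 * rng.random(d)  # a near-duplicate of a
    print(np.sum(fly_hash(a) & fly_hash(b)))  # similar inputs share most tag bits
    ```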

  8. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful to develop an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful on the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  9. Positron emission tomography/computed tomography scanning for ...

    African Journals Online (AJOL)

    Background: Although the site of nosocomial sepsis in the critically ill ventilated patient is usually identifiable, it may remain occult, despite numerous investigations. The rapid results and precise anatomical location of the septic source using positron emission tomography (PET) scanning, in combination with computed ...

  10. Integrating publicly-available data to generate computationally ...

    Science.gov (United States)

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrate that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic mar

  11. Computer Security: Competing Concepts

    OpenAIRE

    Nissenbaum, Helen; Friedman, Batya; Felten, Edward

    2001-01-01

    This paper focuses on a tension we discovered in the philosophical part of our multidisciplinary project on values in web-browser security. Our project draws on the methods and perspectives of empirical social science, computer science, and philosophy to identify values embodied in existing web-browser security and also to prescribe changes to existing systems (in particular, Mozilla) so that values relevant to web-browser systems are better served than presently they are. The tension, which ...

  12. Aggregating job exit statuses of a plurality of compute nodes executing a parallel application

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Mundy, Michael B.

    2015-07-21

    Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
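
    The claims describe a generic leader-based aggregation pattern. A minimal mpi4py sketch of the same pattern is shown below (run under mpiexec), with rank 0 standing in for the job leader; the status fields are hypothetical.

    ```python
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each compute node reports an exit status for its portion of the parallel job.
    my_status = {"rank": rank, "exit_code": 0, "message": "ok"}

    statuses = comm.gather(my_status, root=0)  # the job leader collects all statuses
    if rank == 0:
        worst = max(s["exit_code"] for s in statuses)
        print(f"aggregated exit status over {len(statuses)} nodes: {worst}")
    ```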

  13. Computational mechanics - Advances and trends; Proceedings of the Session - Future directions of Computational Mechanics of the ASME Winter Annual Meeting, Anaheim, CA, Dec. 7-12, 1986

    Science.gov (United States)

    Noor, Ahmed K. (Editor)

    1986-01-01

    The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.

  14. Computer Aided Solvent Selection and Design Framework

    DEFF Research Database (Denmark)

    Mitrofanov, Igor; Conte, Elisa; Abildskov, Jens

    and computer-aided tools and methods for property prediction and computer-aided molecular design (CAMD) principles. This framework is applicable for solvent selection and design in product design as well as process design. The first module of the framework is dedicated to the solvent selection and design...... in terms of: physical and chemical properties (solvent-pure properties); Environment, Health and Safety (EHS) characteristics (solvent-EHS properties); operational properties (solvent–solute properties). 3. Performing the search. The search step consists of two stages. The first is the generation and property...... identification of solvent candidates using the special software ProCAMD and ProPred, which are implementations of computer-aided molecular design techniques. The second consists of assigning the RS-indices following the reaction–solvent relationship and then consulting the known solvent database and identifying the set of solvents...

  15. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  16. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL]

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
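
    For reference, the Grid Convergence Index mentioned above is commonly computed with Roache's formula GCI = Fs |e| / (r^p - 1), where e is the relative change between grid solutions, r the refinement ratio, p the observed order of accuracy, and Fs a safety factor. The values in the sketch below are made-up placeholders, not results from this study.

    ```python
    def gci_fine(f_fine, f_coarse, r, p, fs=1.25):
        """Grid Convergence Index for the fine grid (Roache's formulation)."""
        e = abs((f_coarse - f_fine) / f_fine)  # relative change between grid solutions
        return fs * e / (r ** p - 1.0)

    # Placeholder values: outlet velocity from two meshes, ratio 2, observed order 2.
    print(gci_fine(f_fine=3.97, f_coarse=3.85, r=2.0, p=2.0))  # ~0.0126, i.e. ~1.3%
    ```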

  17. Cloudbus Toolkit for Market-Oriented Cloud Computing

    Science.gov (United States)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  18. Computer-aided drug design at Boehringer Ingelheim

    Science.gov (United States)

    Muegge, Ingo; Bergner, Andreas; Kriegl, Jan M.

    2017-03-01

    Computer-Aided Drug Design (CADD) is an integral part of the drug discovery endeavor at Boehringer Ingelheim (BI). CADD contributes to the evaluation of new therapeutic concepts, identifies small molecule starting points for drug discovery, and develops strategies for optimizing hit and lead compounds. The CADD scientists at BI benefit from the global use and development of both software platforms and computational services. A number of computational techniques developed in-house have significantly changed the way early drug discovery is carried out at BI. In particular, virtual screening in vast chemical spaces, which can be accessed by combinatorial chemistry, has added a new option for the identification of hits in many projects. Recently, a new framework has been implemented allowing fast, interactive predictions of relevant on and off target endpoints and other optimization parameters. In addition to the introduction of this new framework at BI, CADD has been focusing on the enablement of medicinal chemists to independently perform an increasing amount of molecular modeling and design work. This is made possible through the deployment of MOE as a global modeling platform, allowing computational and medicinal chemists to freely share ideas and modeling results. Furthermore, a central communication layer called the computational chemistry framework provides broad access to predictive models and other computational services.

  19. Fundamentals of universality in one-way quantum computation

    International Nuclear Information System (INIS)

    Nest, M van den; Duer, W; Miyake, A; Briegel, H J

    2007-01-01

    In this paper, we build a framework allowing for a systematic investigation of the fundamental issue: 'Which quantum states serve as universal resources for measurement-based (one-way) quantum computation?' We start our study by re-examining what is exactly meant by 'universality' in quantum computation, and what the implications are for universal one-way quantum computation. Given the framework of a measurement-based quantum computer, where quantum information is processed by local operations only, we find that the most general universal one-way quantum computer is one which is capable of accepting arbitrary classical inputs and producing arbitrary quantum outputs-we refer to this property as CQ-universality. We then show that a systematic study of CQ-universality in one-way quantum computation is possible by identifying entanglement features that are required to be present in every universal resource. In particular, we find that a large class of entanglement measures must reach its supremum on every universal resource. These insights are used to identify several families of states as being not universal, such as one-dimensional (1D) cluster states, Greenberger-Horne-Zeilinger (GHZ) states, W states, and ground states of non-critical 1D spin systems. Our criteria are strengthened by considering the efficiency of a quantum computation, and we find that entanglement measures must obey a certain scaling law with the system size for all efficient universal resources. This again leads to examples of non-universal resources, such as, e.g. ground states of critical 1D spin systems. On the other hand, we provide several examples of efficient universal resources, namely graph states corresponding to hexagonal, triangular and Kagome lattices. Finally, we consider the more general notion of encoded CQ-universality, where quantum outputs are allowed to be produced in an encoded form. Again we provide entanglement-based criteria for encoded universality. Moreover, we present a

  20. Computer-simulated images of icosahedral, pentagonal and decagonal clusters of atoms

    International Nuclear Information System (INIS)

    Peng JuLin; Bursill, L.A.

    1989-01-01

    The aim of this work was to assess, by computer simulation, the sensitivity of high-resolution electron microscopy (HREM) images for a set of icosahedral and decagonal clusters containing 50-400 atoms. An experimental study of both crystalline and quasi-crystalline alloys of Al(Si)Mn is presented, in which carefully-chosen electron optical conditions were established by computer simulation and then used to obtain high-quality images. It was concluded that while there is a very significant degree of model sensitivity available, direct inversion from image to structure is not a realistic possibility. A reasonable procedure would be to record experimental images of known complex icosahedral alloys in a crystalline phase, then use the computer simulations to identify fingerprint imaging conditions whereby certain structural elements could be identified in images of quasi-crystalline or amorphous specimens. 27 refs., 12 figs., 1 tab

  1. NWChem Meeting on Science Driven Petascale Computing and Capability Development at EMSL

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.

    2007-02-19

    On January 25, and 26, 2007, an NWChem meeting was held that was attended by 65 scientists from 29 institutions including 22 universities and 5 national laboratories. The goals of the meeting were to look at major scientific challenges that could be addressed by computational modeling in environmental molecular sciences, and to identify the associated capability development needs. In addition, insights were sought into petascale computing developments in computational chemistry. During the meeting common themes were identified that will drive the need for the development of new or improved capabilities in NWChem. Crucial areas of development that the developer's team will be focusing on are (1) modeling of dynamics and kinetics in chemical transformations, (2) modeling of chemistry at interfaces and in the condensed phase, and (3) spanning longer time scales in biological processes modeled with molecular dynamics. Various computational chemistry methodologies were discussed during the meeting, which will provide the basis for the capability developments in the near or long term future of NWChem.

  2. Research directions in computer engineering. Report of a workshop

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, H

    1982-09-01

    The results of a workshop held in November 1981 in Washington, DC, to outline research directions for computer engineering are reported upon. The purpose of the workshop was to provide guidance to government research funding agencies, as well as to universities and industry, as to the directions which computer engineering research should take for the next five to ten years. A select group of computer engineers was assembled, drawn from all over the United States and with expertise in virtually every aspect of today's computer technology. Industrial organisations and universities were represented in roughly equal numbers. The panel proceeded to provide a sharper definition of computer engineering than had been in popular use previously, to identify the social and national needs which provide the basis for encouraging research, to probe for obstacles to research and seek means of overcoming them, and to delineate high-priority areas in which computer engineering research should be fostered. These included experimental software engineering, architectures in support of programming style, computer graphics, pattern recognition, VLSI design tools, machine intelligence, programmable automation, architectures for speech and signal processing, computer architecture and robotics. 13 references.

  3. Computer vision syndrome: A review.

    Science.gov (United States)

    Gowrisankaran, Sowjanya; Sheedy, James E

    2015-01-01

    Computer vision syndrome (CVS) is a collection of symptoms related to prolonged work at a computer display. This article reviews the current knowledge about the symptoms, related factors and treatment modalities for CVS. Relevant literature on CVS published during the past 65 years was analyzed. Symptoms reported by computer users are classified into internal ocular symptoms (strain and ache), external ocular symptoms (dryness, irritation, burning), visual symptoms (blur, double vision) and musculoskeletal symptoms (neck and shoulder pain). The major factors associated with CVS are either environmental (improper lighting, display position and viewing distance) and/or dependent on the user's visual abilities (uncorrected refractive error, oculomotor disorders and tear film abnormalities). Although the factors associated with CVS have been identified, the physiological mechanisms that underlie CVS are not completely understood. Additionally, advances in technology have led to the increased use of hand-held devices, which might impose somewhat different visual challenges compared to desktop displays. Further research is required to better understand the physiological mechanisms underlying CVS and symptoms associated with the use of hand-held and stereoscopic displays.

  4. Computed tomography angiography and perfusion to assess coronary artery stenosis causing perfusion defects by single photon emission computed tomography

    DEFF Research Database (Denmark)

    Rochitte, Carlos E; George, Richard T; Chen, Marcus Y

    2014-01-01

    AIMS: To evaluate the diagnostic power of integrating the results of computed tomography angiography (CTA) and CT myocardial perfusion (CTP) to identify coronary artery disease (CAD), defined as a flow-limiting coronary artery stenosis causing a perfusion defect by single photon emission computed tomography (SPECT). METHODS AND RESULTS: We conducted a multicentre study to evaluate the accuracy of integrated CTA-CTP for the identification of patients with flow-limiting CAD defined by ≥50% stenosis by invasive coronary angiography (ICA) with a corresponding perfusion deficit on stress single photon emission computed tomography (SPECT/MPI). Sixteen centres enroled 381 patients who underwent combined CTA-CTP and SPECT/MPI prior to conventional coronary angiography. All four image modalities were analysed in blinded independent core laboratories. The prevalence of obstructive CAD defined by combined ICA...

  5. Computer Backgrounds of Soldiers in Army Units: FY00

    National Research Council Canada - National Science Library

    Fober, Gene

    2001-01-01

    .... Soldiers from four Army installations were given a survey that examined their experiences with computers, self-perceptions of their skill, and an objective test of their ability to identify Windows-based icons...

  6. Post-mortem computed tomography angiography utilizing barium sulfate to identify microvascular structures : a preliminary phantom model and case study

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Kuster, Lidy; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2016-01-01

    We investigated the use of computer tomography angiography (CTA) to visualize microvascular structures in a vessel-mimicking phantom and post-mortem (PM) bodies. A contrast agent was used based on 22% barium sulfate, 20% polyethylene glycol and 58% distilled water. A vessel-mimicking phantom

  7. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient for almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. Security issues and challenges in the implementation of cloud computing are analysed and identified. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  8. Locative media and data-driven computing experiments

    Directory of Open Access Journals (Sweden)

    Sung-Yueh Perng

    2016-06-01

    Full Text Available Over the past two decades urban social life has undergone a rapid and pervasive geocoding, becoming mediated, augmented and anticipated by location-sensitive technologies and services that generate and utilise big, personal, locative data. The production of these data has prompted the development of exploratory data-driven computing experiments that seek to find ways to extract value and insight from them. These projects often start from the data, rather than from a question or theory, and try to imagine and identify their potential utility. In this paper, we explore the desires and mechanics of data-driven computing experiments. We demonstrate how both locative media data and computing experiments are ‘staged’ to create new values and computing techniques, which in turn are used to try and derive possible futures that are ridden with unintended consequences. We argue that using computing experiments to imagine potential urban futures produces effects that often have little to do with creating new urban practices. Instead, these experiments promote Big Data science and the prospect that data produced for one purpose can be recast for another and act as alternative mechanisms of envisioning urban futures.

  9. Recommended documentation for computer users at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Heiberger, A.A.

    1992-04-01

    Recommended Documentation for Computer Users at ANL is for all users of the services available from the Argonne National Laboratory (ANL) Computing and Telecommunications Division (CTD). This document will guide you in selecting available documentation that will best fill your particular needs. Chapter 1 explains how to use this document to select documents and how to obtain them from the CTD Document Distribution Counter. Chapter 2 contains a table that categorizes available publications. Chapter 3 gives descriptions of the online DOCUMENT command for CMS, VAX, and the Sun workstation. DOCUMENT allows you to scan for and order documentation that interests you. Chapter 4 lists publications by subject. Categories I and IX cover publications of a general nature and publications on telecommunications and networks, respectively. Categories II, III, IV, V, VI, VII, VIII, and X cover publications on specific computer systems. Category XI covers publications on advanced scientific computing at Argonne. Chapter 5 contains abstracts for each publication, arranged alphabetically. Chapter 6 describes additional publications containing bibliographies and master indexes that the user may find useful. The appendix identifies available computer systems, applications, languages, and libraries.

  10. Towards Cloud Computing: A SWOT Analysis on its Adoption in SMEs

    OpenAIRE

    Ghaffari, Kimia; Delgosha, Mohammad Soltani; Abdolvand, Neda

    2014-01-01

    Over the past few years, the emergence of cloud computing has notably driven an evolution in the IT industry by putting forward an "everything as a service" idea. Cloud computing is of growing interest to companies throughout the world, but there are many barriers associated with its adoption which should be eliminated. This paper aims to investigate cloud computing and discusses the drivers and inhibitors of its adoption. Moreover, an attempt has been made to identify the key stakeholders of Cloud...

  11. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  12. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as 'the family of SHMS'. When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  13. Computer Assisted Language Learning (CALL) Software: Evaluation ...

    African Journals Online (AJOL)

    Evaluating the nature and extent of the influence of Computer Assisted Language Learning (CALL) on the quality of language learning is highly problematic. This is owing to the number and complexity of interacting variables involved in setting the items for teaching and learning languages. This paper identified and ...

  14. Meta-analysis of 375,000 individuals identifies 38 susceptibility loci for migraine.

    Science.gov (United States)

    Gormley, Padhraig; Anttila, Verneri; Winsvold, Bendik S; Palta, Priit; Esko, Tonu; Pers, Tune H; Farh, Kai-How; Cuenca-Leon, Ester; Muona, Mikko; Furlotte, Nicholas A; Kurth, Tobias; Ingason, Andres; McMahon, George; Ligthart, Lannie; Terwindt, Gisela M; Kallela, Mikko; Freilinger, Tobias M; Ran, Caroline; Gordon, Scott G; Stam, Anine H; Steinberg, Stacy; Borck, Guntram; Koiranen, Markku; Quaye, Lydia; Adams, Hieab H H; Lehtimäki, Terho; Sarin, Antti-Pekka; Wedenoja, Juho; Hinds, David A; Buring, Julie E; Schürks, Markus; Ridker, Paul M; Hrafnsdottir, Maria Gudlaug; Stefansson, Hreinn; Ring, Susan M; Hottenga, Jouke-Jan; Penninx, Brenda W J H; Färkkilä, Markus; Artto, Ville; Kaunisto, Mari; Vepsäläinen, Salli; Malik, Rainer; Heath, Andrew C; Madden, Pamela A F; Martin, Nicholas G; Montgomery, Grant W; Kurki, Mitja I; Kals, Mart; Mägi, Reedik; Pärn, Kalle; Hämäläinen, Eija; Huang, Hailiang; Byrnes, Andrea E; Franke, Lude; Huang, Jie; Stergiakouli, Evie; Lee, Phil H; Sandor, Cynthia; Webber, Caleb; Cader, Zameel; Muller-Myhsok, Bertram; Schreiber, Stefan; Meitinger, Thomas; Eriksson, Johan G; Salomaa, Veikko; Heikkilä, Kauko; Loehrer, Elizabeth; Uitterlinden, Andre G; Hofman, Albert; van Duijn, Cornelia M; Cherkas, Lynn; Pedersen, Linda M; Stubhaug, Audun; Nielsen, Christopher S; Männikkö, Minna; Mihailov, Evelin; Milani, Lili; Göbel, Hartmut; Esserlind, Ann-Louise; Christensen, Anne Francke; Hansen, Thomas Folkmann; Werge, Thomas; Kaprio, Jaakko; Aromaa, Arpo J; Raitakari, Olli; Ikram, M Arfan; Spector, Tim; Järvelin, Marjo-Riitta; Metspalu, Andres; Kubisch, Christian; Strachan, David P; Ferrari, Michel D; Belin, Andrea C; Dichgans, Martin; Wessman, Maija; van den Maagdenberg, Arn M J M; Zwart, John-Anker; Boomsma, Dorret I; Smith, George Davey; Stefansson, Kari; Eriksson, Nicholas; Daly, Mark J; Neale, Benjamin M; Olesen, Jes; Chasman, Daniel I; Nyholt, Dale R; Palotie, Aarno

    2016-08-01

    Migraine is a debilitating neurological disorder affecting around one in seven people worldwide, but its molecular mechanisms remain poorly understood. There is some debate about whether migraine is a disease of vascular dysfunction or a result of neuronal dysfunction with secondary vascular changes. Genome-wide association (GWA) studies have thus far identified 13 independent loci associated with migraine. To identify new susceptibility loci, we carried out a genetic study of migraine on 59,674 affected subjects and 316,078 controls from 22 GWA studies. We identified 44 independent single-nucleotide polymorphisms (SNPs) significantly associated with migraine risk (P < 5 × 10⁻⁸) that mapped to 38 distinct genomic loci, including 28 loci not previously reported and a locus that to our knowledge is the first to be identified on chromosome X. In subsequent computational analyses, the identified loci showed enrichment for genes expressed in vascular and smooth muscle tissues, consistent with a predominant theory of migraine that highlights vascular etiologies.
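
    The genome-wide significance threshold quoted above is, computationally, a simple filter on summary statistics; a minimal sketch (with a hypothetical table and column names, not the study's pipeline) looks like this:

        import pandas as pd

        # Hypothetical GWAS summary statistics: one row per SNP with a p-value.
        sumstats = pd.DataFrame({
            "snp":   ["rs1", "rs2", "rs3"],
            "chrom": ["1", "X", "7"],
            "pval":  [3.2e-9, 4.9e-8, 1.1e-5],
        })

        GENOME_WIDE = 5e-8  # conventional genome-wide significance threshold
        hits = sumstats[sumstats["pval"] < GENOME_WIDE]
        print(hits)  # rs1 and rs2 pass; distinct loci would then be formed by LD clumping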

  15. Computer-based teaching module design: principles derived from learning theories.

    Science.gov (United States)

    Lau, K H Vincent

    2014-03-01

    The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research in classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning') for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repeating), and the detail level (managing text, managing devices). This review examined the literature in the application of learning theories to CAL to develop a set of principles that guide CBTM design. Further research will enable educators to

  16. An investigation into the organisation and structural design of multi-computer process-control systems

    International Nuclear Information System (INIS)

    Gertenbach, W.P.

    1981-12-01

    A multi-computer system for the collection of data and control of distributed processes has been developed. The structure and organisation of this system, together with a study of the general theory of systems and of modularity, was used as a basis for an investigation into the organisation and structured design of multi-computer process-control systems. A multi-dimensional model of multi-computer process-control systems was developed. In this model a strict separation was made between organisational properties of multi-computer process-control systems and implementation-dependent properties. The model was based on the principles of hierarchical analysis and modularity. Several notions of hierarchy were found necessary to describe fully the organisation of multi-computer systems. A new concept, that of interconnection abstraction, was identified. This concept is an extrapolation of implementation techniques in the hardware implementation area to the software implementation area. A synthesis procedure which relies heavily on the above-described analysis of multi-computer process-control systems is proposed. The above-mentioned model, and a set of performance factors which depend on a set of identified design criteria, were used to constrain the set of possible solutions to the multi-computer process-control system synthesis procedure

  17. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies.

    Directory of Open Access Journals (Sweden)

    Asad Abdi

    Full Text Available Summarization is a process to select important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be important as a tool to improve comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess the students' summaries, and these tasks are very time-consuming. Thus, a computer-assisted assessment can be used to help teachers conduct this task more effectively. This paper aims to propose an algorithm based on the combination of semantic relations between words and their syntactic composition to identify summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm in an automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing.
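
    The evaluation metrics named in the abstract reduce to set overlap between the strategies the algorithm identifies and a human-annotated reference; a minimal sketch of our own, with hypothetical strategy labels:

        def prf(identified, reference):
            # Precision, recall and F-measure of identified strategies against
            # a human-annotated reference set (illustrative, not the paper's code).
            identified, reference = set(identified), set(reference)
            tp = len(identified & reference)
            precision = tp / len(identified) if identified else 0.0
            recall = tp / len(reference) if reference else 0.0
            f = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
            return precision, recall, f

        print(prf({"deletion", "copy-verbatim", "paraphrase"},
                  {"deletion", "paraphrase", "generalization"}))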

  18. Objective Model Selection for Identifying the Human Feedforward Response in Manual Control.

    Science.gov (United States)

    Drop, Frank M; Pool, Daan M; van Paassen, Marinus Rene M; Mulder, Max; Bulthoff, Heinrich H

    2018-01-01

    Realistic manual control tasks typically involve predictable target signals and random disturbances. The human controller (HC) is hypothesized to use a feedforward control strategy for target-following, in addition to feedback control for disturbance-rejection. Little is known about human feedforward control, partly because common system identification methods have difficulty in identifying whether, and (if so) how, the HC applies a feedforward strategy. In this paper, an identification procedure is presented that aims at an objective model selection for identifying the human feedforward response, using linear time-invariant autoregressive with exogenous input models. A new model selection criterion is proposed to decide on the model order (number of parameters) and the presence of feedforward in addition to feedback. For a range of typical control tasks, it is shown by means of Monte Carlo computer simulations that the classical Bayesian information criterion (BIC) leads to selecting models that contain a feedforward path from data generated by a pure feedback model: "false-positive" feedforward detection. To eliminate these false-positives, the modified BIC includes an additional penalty on model complexity. The appropriate weighting is found through computer simulations with a hypothesized HC model prior to performing a tracking experiment. Experimental human-in-the-loop data will be considered in future work. With appropriate weighting, the method correctly identifies the HC dynamics in a wide range of control tasks, without false-positive results.
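
    A minimal sketch of the kind of criterion described (our own reconstruction; the extra penalty weight alpha and the residual values are assumed, not taken from the paper):

        import numpy as np

        def bic(rss, n, k):
            # Classical Bayesian information criterion for a model with k
            # parameters fitted to n samples, given its residual sum of squares.
            return n * np.log(rss / n) + k * np.log(n)

        def modified_bic(rss, n, k, alpha):
            # BIC with an additional penalty alpha*k on model complexity, the
            # weighting being tuned beforehand via Monte Carlo simulation.
            return bic(rss, n, k) + alpha * k

        # Pick the model (feedback-only vs feedback+feedforward) that minimises
        # the criterion; RSS values and alpha below are purely illustrative.
        candidates = {("fb", 4): 12.3, ("fb+ff", 7): 11.9}
        best = min(candidates,
                   key=lambda m: modified_bic(candidates[m], n=1000, k=m[1], alpha=6.0))
        print(best)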

  19. Influence of intracanal post on apical periodontitis identified by cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Estrela, Carlos; Porto, Olavo Cesar Lyra; Rodrigues, Cleomar Donizeth [Federal University of Goias (UFG), Goiania, GO (Brazil). Dental School; Bueno, Mike Reis [University of Cuiaba (UNIC), MT (Brazil). Dental School; Pecora, Jesus Djalma, E-mail: estrela3@terra.com.b [University of Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Dental School

    2009-07-01

    The determination of the success of endodontic treatment has often been discussed based on outcomes obtained by periapical radiography. The aim of this study was to verify the influence of intracanal posts on apical periodontitis (AP) detected by cone-beam computed tomography (CBCT). A consecutive sample of 1020 images (periapical radiographs and CBCT scans) taken from 619 patients (245 men; mean age, 50.1 years) between February 2008 and September 2009 was used in this study. Presence and intracanal post length (short, medium and long) were associated with AP. The chi-square test was used for statistical analyses. The significance level was set at p<0.01. The kappa value was used to assess examiner variability. From a total of 591 intracanal posts, AP was observed in 15.06%, 18.78% and 7.95% using periapical radiographs for the different lengths, short, medium and long, respectively (p=0.466). Considering the same post lengths, AP was verified in 24.20%, 26.40% and 11.84% of cases by CBCT scans, respectively (p=0.154). From the total of 1020 teeth used in this study, AP was detected in 397 (38.92%) by periapical radiography and in 614 (60.19%) by CBCT scans (p<0.001). The distribution of intracanal posts in different dental groups showed the highest prevalence in maxillary anterior teeth (54.79%). Intracanal post length did not influence AP. AP was detected more frequently when the CBCT method was used. (author)

  20. Influence of intracanal post on apical periodontitis identified by cone-beam computed tomography

    International Nuclear Information System (INIS)

    Estrela, Carlos; Porto, Olavo Cesar Lyra; Rodrigues, Cleomar Donizeth; Bueno, Mike Reis; Pecora, Jesus Djalma

    2009-01-01

    The determination of the success of endodontic treatment has often been discussed based on outcomes obtained by periapical radiography. The aim of this study was to verify the influence of intracanal posts on apical periodontitis (AP) detected by cone-beam computed tomography (CBCT). A consecutive sample of 1020 images (periapical radiographs and CBCT scans) taken from 619 patients (245 men; mean age, 50.1 years) between February 2008 and September 2009 was used in this study. Presence and intracanal post length (short, medium and long) were associated with AP. The chi-square test was used for statistical analyses. The significance level was set at p<0.01. The kappa value was used to assess examiner variability. From a total of 591 intracanal posts, AP was observed in 15.06%, 18.78% and 7.95% using periapical radiographs for the different lengths, short, medium and long, respectively (p=0.466). Considering the same post lengths, AP was verified in 24.20%, 26.40% and 11.84% of cases by CBCT scans, respectively (p=0.154). From the total of 1020 teeth used in this study, AP was detected in 397 (38.92%) by periapical radiography and in 614 (60.19%) by CBCT scans (p<0.001). The distribution of intracanal posts in different dental groups showed the highest prevalence in maxillary anterior teeth (54.79%). Intracanal post length did not influence AP. AP was detected more frequently when the CBCT method was used. (author)
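
    The headline comparison (397/1020 vs 614/1020 detections) can be reproduced with a standard chi-square test; a sketch, noting that treating the two arms as independent is a simplification, since the same teeth were read by both methods:

        from scipy.stats import chi2_contingency

        # 2x2 table from the abstract: AP detected vs not, by imaging method.
        table = [[397, 1020 - 397],   # periapical radiography
                 [614, 1020 - 614]]   # CBCT
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p << 0.01, as reported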

  1. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  2. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    OpenAIRE

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents the unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better stora...

  3. 10 CFR 9.19 - Segregation of exempt information and deletion of identifying details.

    Science.gov (United States)

    2010-01-01

    10 Energy 1 (2010-01-01 edition): Freedom of Information Act Regulations, § 9.19 Segregation of exempt information and deletion of identifying details. (a) ... Where deletions are made from parts of the record by computer, the amount of information deleted will be indicated...

  4. A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology

    Science.gov (United States)

    Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra

    2008-01-01

    Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…

  5. X-ray Computed Tomography Image Quality Indicator (IQI) Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Phase one of the program is to identify suitable x-ray Computed Tomography (CT) Image Quality Indicator (IQI) design(s) that can be used to adequately capture CT...

  6. Mechanical and assembly units of viral capsids identified via quasi-rigid domain decomposition.

    Directory of Open Access Journals (Sweden)

    Guido Polles

    Full Text Available Key steps in a viral life-cycle, such as self-assembly of a protective protein container or in some cases also subsequent maturation events, are governed by the interplay of physico-chemical mechanisms involving various spatial and temporal scales. These salient aspects of a viral life cycle are hence well described and rationalised from a mesoscopic perspective. Accordingly, various experimental and computational efforts have been directed towards identifying the fundamental building blocks that are instrumental for the mechanical response, or constitute the assembly units, of a few specific viral shells. Motivated by these earlier studies we introduce and apply a general and efficient computational scheme for identifying the stable domains of a given viral capsid. The method is based on elastic network models and quasi-rigid domain decomposition. It is first applied to a heterogeneous set of well-characterized viruses (CCMV, MS2, STNV, STMV for which the known mechanical or assembly domains are correctly identified. The validated method is next applied to other viral particles such as L-A, Pariacoto and polyoma viruses, whose fundamental functional domains are still unknown or debated and for which we formulate verifiable predictions. The numerical code implementing the domain decomposition strategy is made freely available.
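
    A toy version of the strategy (our own minimal sketch, with all parameter values assumed; the authors' freely available code is more elaborate): build a Gaussian-network Laplacian from C-alpha coordinates, take its softest elastic modes, and group residues whose low-frequency displacements are similar.

        import numpy as np
        from scipy.spatial.distance import cdist
        from scipy.linalg import eigh
        from sklearn.cluster import KMeans

        def quasi_rigid_domains(coords, n_domains=3, cutoff=10.0, n_modes=6):
            # Contact map within the cutoff, excluding self-contacts.
            d = cdist(coords, coords)
            adj = ((d < cutoff) & (d > 0)).astype(float)
            # Kirchhoff (graph Laplacian) matrix of the elastic network.
            lap = np.diag(adj.sum(1)) - adj
            w, v = eigh(lap)
            modes = v[:, 1:1 + n_modes]   # softest modes, skipping the zero mode
            # Residues moving coherently in the soft modes form quasi-rigid domains.
            return KMeans(n_clusters=n_domains, n_init=10).fit_predict(modes)

        coords = np.random.rand(200, 3) * 50   # placeholder structure
        print(quasi_rigid_domains(coords)[:20])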

  7. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  8. [INVITED] Computational intelligence for smart laser materials processing

    Science.gov (United States)

    Casalino, Giuseppe

    2018-03-01

    Computational intelligence (CI) involves using a computer algorithm to capture hidden knowledge from data and to use it for training an 'intelligent machine' to make complex decisions without human intervention. As simulation is becoming more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art on the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have been proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligences employed in manufacturing, four sections, covering laser joining, machining, surface treatment, and additive manufacturing, review the most recent applications in the already extensive literature regarding CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  9. Online retrieval of patient information by asynchronous communication between general purpose computer and stand-alone personal computer

    International Nuclear Information System (INIS)

    Tsutsumi, Reiko; Takahashi, Kazuei; Sato, Toshiko; Komatani, Akio; Yamaguchi, Koichi

    1988-01-01

    Asynchronous communication was established between a host computer (FACOM M-340) and a personal computer (Olivetti S-2250) to retrieve the patient information required for RIA test registration. The retrieval system consists of a keyboard input of a six-digit numeric code (the patient's ID) and a real-time reply containing six parameters for the patient. The identified parameters are the patient's name, sex, date of birth (including area), department, and out- or inpatient status. By linking this program to the RIA registration program for the individual patient, the operator can then input the name of the requested RIA test. Our simple retrieval program created a useful data network between different types of host and stand-alone personal computers, and enabled accurate and labor-saving registration for RIA tests. (author)
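
    The retrieval loop described (six-digit ID in, six patient parameters back) is easy to picture as a request/reply exchange. The sketch below is illustrative only: the 1988 system used asynchronous serial communication between a mainframe and a stand-alone PC rather than TCP sockets, and the host name, port, and reply format are all assumptions.

        import socket

        HOST, PORT = "hostmachine", 5000   # hypothetical address of the host computer

        def fetch_patient(patient_id: str) -> dict:
            # Send a six-digit patient ID; parse a comma-separated reply with
            # six fields (name, sex, birth date, area, department, in/out status).
            assert len(patient_id) == 6 and patient_id.isdigit()
            with socket.create_connection((HOST, PORT), timeout=5) as s:
                s.sendall(patient_id.encode() + b"\r\n")
                reply = s.recv(1024).decode().strip()
            keys = ["name", "sex", "birth", "area", "department", "status"]
            return dict(zip(keys, reply.split(",")))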

  10. Cloud Computing Adoption Model for Universities to Increase ICT Proficiency

    Directory of Open Access Journals (Sweden)

    Safiya Okai

    2014-08-01

    Full Text Available Universities around the world, especially those in developing countries, are faced with the problem of delivering the level of information and communications technology (ICT) needed to facilitate the teaching, learning, research, and development activities ideal in a typical university, which is needed to meet educational needs in line with advancement in technology and the growing dependence on IT. This is mainly due to the high cost involved in providing and maintaining the needed hardware and software. A technology such as cloud computing, which delivers on-demand provisioning of IT resources on a pay-per-use basis, can be used to address this problem. Cloud computing promises better delivery of IT services as well as availability whenever and wherever needed, at reduced costs, with users paying only as much as they consume through the services of cloud service providers. The cloud technology reduces complexity while increasing the speed and quality of the IT services provided; however, despite these benefits, the challenges that come with its adoption have left many sectors, especially higher education, skeptical about committing to this technology. This article identifies the reasons for the slow rate of adoption of cloud computing at university level, discusses the challenges faced, and proposes a cloud computing adoption model that contains strategic guidelines to overcome the major challenges identified and a roadmap for the successful adoption of cloud computing by universities. The model was tested in one of the universities and found to be both useful and appropriate for adopting cloud computing at university level.

  11. Remote Viewing and Computer Communications--An Experiment.

    Science.gov (United States)

    Vallee, Jacques

    1988-01-01

    A series of remote viewing experiments were run with 12 participants who communicated through a computer conferencing network. The correct target sample was identified in 8 out of 33 cases. This represented more than double the pure chance expectation. Appendices present protocol, instructions, and results of the experiments. (Author/YP)

  12. An approach to identify urban groundwater recharge

    Directory of Open Access Journals (Sweden)

    E. Vázquez-Suñé

    2010-10-01

    Full Text Available Evaluating the proportion in which waters from different origins are mixed in a given water sample is relevant for many hydrogeological problems, such as quantifying total recharge, assessing groundwater pollution risks, or managing water resources. Our work is motivated by urban hydrogeology, where waters with different chemical signatures can be identified (losses from water supply and sewage networks, infiltration from surface runoff and other water bodies, lateral aquifer inflows, ....). The relative contribution of different sources to total recharge can be quantified by means of solute mass balances, but application is hindered by the large number of potential origins; hence the need to incorporate data from a large number of conservative species, the uncertainty in source concentrations, and measurement errors. We present a methodology to compute mixing ratios and end-member composition, which consists of (i) identification of potential recharge sources, (ii) selection of tracers, (iii) characterization of the hydrochemical composition of potential recharge sources and mixed water samples, and (iv) computation of mixing ratios and re-evaluation of end-members. The analysis performed on a data set of samples from the Barcelona city aquifers suggests that the main contributors to total recharge are the water supply network losses (22%), the sewage network losses (30%), rainfall, concentrated in the non-urbanized areas (17%), runoff infiltration (20%), and the Besòs River (11%). Regarding species, halogens (chloride, fluoride and bromide), sulfate, total nitrogen, and stable isotopes (18O, 2H, and 34S) behaved quite conservatively. Boron, residual alkalinity, EDTA and Zn did not. Yet, including these species in the computations did not significantly affect the proportion estimates.
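
    The solute-mass-balance step amounts to solving a small constrained least-squares problem for the mixing fractions; a minimal sketch with invented end-member concentrations (the real study used many more species and treated uncertainty explicitly):

        import numpy as np
        from scipy.optimize import nnls

        # Rows: conservative species; columns: end-member (source) concentrations.
        # All values are illustrative only.
        A = np.array([[120.,  90., 15.,  60.],     # chloride (mg/L)
                      [ 80., 110., 10.,  70.],     # sulfate (mg/L)
                      [ -6.5, -7.0, -5.2, -6.0]])  # delta-18O (per mil)
        sample = np.array([85., 75., -6.2])

        # Append a heavily weighted row of ones to enforce sum(x) = 1, then
        # solve with non-negativity, so the result reads as mixing fractions.
        w = 100.0
        A_aug = np.vstack([A, w * np.ones(A.shape[1])])
        b_aug = np.append(sample, w * 1.0)
        x, _ = nnls(A_aug, b_aug)
        print(dict(zip(["supply", "sewage", "rain/runoff", "river"], x.round(3))))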

  13. Fault tolerance in computational grids: perspectives, challenges, and issues.

    Science.gov (United States)

    Haider, Sajjad; Nazir, Babar

    2016-01-01

    Computational grids are established with the intention of providing shared access to hardware- and software-based resources with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids to understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related faults, have been identified that need to be handled on various layers of the computational grid. In this survey, an analysis and examination is also performed pertaining to fault tolerance and fault detection mechanisms. Our conclusion is that a dependable and reliable grid can only be established when more emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.

  14. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section, further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation, from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  15. Center for Advanced Computational Technology

    Science.gov (United States)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  16. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  17. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  18. Analysis and modeling of social influence in high performance computing workloads

    KAUST Repository

    Zheng, Shuai

    2011-01-01

    Social influence among users (e.g., collaboration on a project) creates bursty behavior in the underlying high performance computing (HPC) workloads. Using representative HPC and cluster workload logs, this paper identifies, analyzes, and quantifies the level of social influence across HPC users. We show the existence of a social graph that is characterized by a pattern of dominant users and followers. This pattern also follows a power-law distribution, which is consistent with those observed in mainstream social networks. Given its potential impact on HPC workloads prediction and scheduling, we propose a fast-converging, computationally-efficient online learning algorithm for identifying social groups. Extensive evaluation shows that our online algorithm can (1) quickly identify the social relationships by using a small portion of incoming jobs and (2) can efficiently track group evolution over time. © 2011 Springer-Verlag.
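
    A quick way to check the reported power-law pattern of dominant users and followers from a raw job log (a rank-frequency fit of our own devising, not the paper's estimator):

        import numpy as np
        from collections import Counter

        def powerlaw_exponent(jobs_per_user):
            # Rough power-law exponent from a log-log least-squares fit of the
            # rank-frequency curve of user activity.
            counts = np.sort(np.array(list(jobs_per_user.values())))[::-1]
            ranks = np.arange(1, counts.size + 1)
            slope, _ = np.polyfit(np.log(ranks), np.log(counts), 1)
            return -slope

        # Toy job log: one dominant user, a few followers.
        log = ["alice"] * 500 + ["bob"] * 120 + ["carol"] * 40 + ["dave"] * 10
        print(powerlaw_exponent(Counter(log)))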

  19. Indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced computed tomography: Assessment of the additional diagnostic value of contrast-enhanced ultrasound in the non-cirrhotic liver

    International Nuclear Information System (INIS)

    Quaia, Emilio; De Paoli, Luca; Angileri, Roberta; Cabibbo, Biagio; Cova, Maria Assunta

    2014-01-01

    Objective: To assess the additional diagnostic value of contrast-enhanced ultrasound (CEUS) in the characterization of indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced computed tomography (CT). Methods: Fifty-five solid hepatic lesions (1–4 cm in diameter) in 46 non-cirrhotic patients (26 female, 20 male; age ± SD, 55 ± 10 years) underwent CEUS after being detected on contrast-enhanced CT, which was considered non-diagnostic after on-site analysis. Two blinded independent readers assessed CT and CEUS scans and were asked to classify retrospectively each lesion as malignant or benign based on reference diagnostic criteria for the different hepatic lesion histotypes. Diagnostic accuracy and confidence (area under the ROC curve, Az) were assessed by using gadobenate dimeglumine-enhanced magnetic resonance (MR) imaging (n = 30 lesions), histology (n = 7 lesions), or US follow-up (n = 18 lesions) as the reference standards. Results: Final diagnoses included 29 hemangiomas, 3 focal nodular hyperplasias, 1 hepatocellular adenoma, and 22 metastases. The additional review of CEUS after CT images significantly (P < .05) improved diagnostic accuracy (before vs after CEUS review: 49% [20/55] vs 89% [49/55] for reader 1, and 43% [24/55] vs 92% [51/55] for reader 2) and confidence (Az, 95% confidence intervals, before vs after CEUS review: .773 [.652–.895] vs .997 [.987–1] for reader 1, and .831 [.724–.938] vs .998 [.992–1] for reader 2). Conclusions: CEUS improved the characterization of indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced CT by identifying some specific contrast enhancement patterns.

  20. Proxemics in Human-Computer Interaction

    OpenAIRE

    Greenberg, Saul; Honbaek, Kasper; Quigley, Aaron; Reiterer, Harald; Rädle, Roman

    2014-01-01

    In 1966, anthropologist Edward Hall coined the term "proxemics." Proxemics is an area of study that identifies the culturally dependent ways in which people use interpersonal distance to understand and mediate their interactions with others. Recent research has demonstrated the use of proxemics in human-computer interaction (HCI) for supporting users' explicit and implicit interactions in a range of uses, including remote office collaboration, home entertainment, and games. One promise of pro...

  1. A computational theory of the hippocampal cognitive map.

    Science.gov (United States)

    O'Keefe, J

    1990-01-01

    Evidence from single unit and lesion studies suggests that the hippocampal formation acts as a spatial or cognitive map (O'Keefe and Nadel, 1978). In this chapter, I summarise some of the unit recording data and then outline the most recent computational version of the cognitive map theory. The novel aspects of the present version of the theory are that it identifies two allocentric parameters, the centroid and the eccentricity, which can be calculated from the array of cues in an environment and which can serve as the bases for an allocentric polar co-ordinate system. Computations within this framework enable the animal to identify its location within an environment, to predict the location which will be reached as a result of any specific movement from that location, and conversely, to calculate the spatial transformation necessary to go from the current location to a desired location. Aspects of the model are identified with the information provided by cells in the hippocampus and dorsal presubiculum. The hippocampal place cells are involved in the calculation of the centroid and the presubicular direction cells in the calculation of the eccentricity.
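
    The two allocentric parameters can be computed directly from an array of cue positions. In the sketch below the centroid follows the usual definition, while the eccentricity measure is a stand-in of our own (dispersion of cue distances from the centroid), since the chapter's exact formula is not reproduced here:

        import numpy as np

        def centroid_and_eccentricity(cues):
            # Centroid of 2-D cue positions, plus a simple eccentricity proxy.
            cues = np.asarray(cues, dtype=float)
            centroid = cues.mean(axis=0)
            r = np.linalg.norm(cues - centroid, axis=1)
            return centroid, r.std() / r.mean()

        cues = [(0, 0), (4, 0), (4, 3), (0, 3)]
        print(centroid_and_eccentricity(cues))  # centre of the box, low eccentricity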

  2. A Visualization Review of Cloud Computing Algorithms in the Last Decade

    Directory of Open Access Journals (Sweden)

    Junhu Ruan

    2016-10-01

    Full Text Available Cloud computing has competitive advantages, such as on-demand self-service, rapid computing, cost reduction, and almost unlimited storage, that have attracted extensive attention from both academia and industry in recent years. Some review works have been reported that summarize extant studies related to cloud computing, but few analyze these studies based on citations. Co-citation analysis can provide scholars with strong support in identifying the intellectual bases and leading edges of a specific field. In addition, advanced algorithms, which can directly affect the availability, efficiency, and security of cloud computing, are the key to conducting computing across various clouds. Motivated by these observations, we conduct a specific visualization review of the studies related to cloud computing algorithms using one mainstream co-citation analysis tool, CiteSpace. The visualization results detect the most influential studies, journals, countries, institutions, and authors on cloud computing algorithms and reveal the intellectual bases and focuses of cloud computing algorithms in the literature, providing guidance for interested researchers to make further studies on cloud computing algorithms.
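
    Underlying any co-citation visualization is a simple pair-count over reference lists; a minimal sketch with invented citation keys (the raw matrix behind maps such as those drawn by CiteSpace):

        from itertools import combinations
        from collections import Counter

        def cocitation_counts(reference_lists):
            # Count how often each pair of works is cited together across a corpus.
            pairs = Counter()
            for refs in reference_lists:
                for a, b in combinations(sorted(set(refs)), 2):
                    pairs[(a, b)] += 1
            return pairs

        corpus = [["Armbrust2010", "Buyya2009", "Mell2011"],
                  ["Armbrust2010", "Mell2011"],
                  ["Buyya2009", "Mell2011"]]
        print(cocitation_counts(corpus).most_common(3))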

  3. Development and application of computer codes for multidimensional thermalhydraulic analyses of nuclear reactor components

    International Nuclear Information System (INIS)

    Carver, M.B.

    1983-01-01

    Components of reactor systems and related equipment are identified in which multidimensional computational thermal hydraulics can be used to advantage to assess and improve design. Models of single- and two-phase flow are reviewed, and the governing equations for multidimensional analysis are discussed. Suitable computational algorithms are introduced, and sample results from the application of particular multidimensional computer codes are given

  4. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

    Introduction to Computer Architecture: What is Computer Architecture?; Architecture vs. Implementation. Brief History of Computer Systems: The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation. Types of Computer Systems: Single Processor Systems; Parallel Processing Systems; Special Architectures. Quality of Computer Systems: Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability. Success and Failure of Computer Architectures and Implementations: Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi

  5. Computed Tomography characterization of the Green Fiber Bottle

    DEFF Research Database (Denmark)

    Saxena, Prateek; Bissacco, Giuliano

    The work carried out in this research aims at identifying suitable ways for thorough characterization of the quality of paper bottles. Industrial X-ray Computed Tomography (XCT) is particularly advantageous in determining the quality of paper bottles and thus correlating it with the production...

  6. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving different kinds of problems and provide promising solutions. Owing to their popularity, soft computing approaches have also been applied to healthcare data to diagnose diseases effectively and to obtain better results than traditional approaches. Soft computing approaches have the ability to adapt themselves to the problem domain. Another aspect is a good balance between exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable and efficient, and well suited to healthcare data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, it is found that a large number of soft computing approaches have been applied for effectively diagnosing and predicting diseases from healthcare data, among them particle swarm optimization, genetic algorithms, artificial neural networks, and support vector machines. A detailed discussion of these approaches is presented in the literature section. This work summarizes the various soft computing approaches used in the healthcare domain in the last decade. These approaches are categorized into five different categories based on methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems and case-based systems. Many techniques are discussed in the above-mentioned categories, and all discussed techniques are also summarized in tables. This work also focuses on the accuracy rates of soft computing techniques, and tabular information is provided for

  7. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper
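
    Peak location, the first of the steps listed, can be sketched on a synthetic Ge(Li)-like spectrum with an off-the-shelf peak finder (illustrative only; QLN1's own algorithms are more thorough and self-correcting):

        import numpy as np
        from scipy.signal import find_peaks

        # Synthetic spectrum: two Gaussian photopeaks on a falling background.
        ch = np.arange(4096)
        spectrum = 5000 * np.exp(-ch / 800.0)
        for c0, area in [(662, 4000), (1173, 2500)]:   # channel numbers, not keV
            spectrum += area * np.exp(-0.5 * ((ch - c0) / 3.0) ** 2)
        spectrum = np.random.poisson(spectrum)

        # Locate photopeaks; energies would then follow from the calibration.
        peaks, props = find_peaks(spectrum, prominence=300)
        print(peaks)   # ~[662, 1173]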

  8. Profiling an application for power consumption during execution on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Peters, Amanda E.; Ratterman, Joseph D.; Smith, Brian E.

    2012-08-21

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.
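
    The determination described reduces to folding per-operation energy costs from the hardware profile into the application's operation counts; a toy sketch with invented numbers (all figures and operation classes are assumptions, not from the patent):

        # Hypothetical per-operation energy costs for one compute node (joules),
        # standing in for the 'hardware power consumption profile'.
        hardware_profile = {"flop": 0.8e-9, "mem_access": 2.5e-9, "net_send": 40e-9}

        # Operation counts observed while the application runs on the node.
        app_counts = {"flop": 6.0e12, "mem_access": 8.0e11, "net_send": 2.0e7}

        # The application's power-consumption profile: energy attributed per class.
        profile = {op: app_counts[op] * hardware_profile[op] for op in app_counts}
        print(profile, "total J:", round(sum(profile.values()), 1))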

  9. Multidetector Computed Tomography and Neuroendocrine Pancreaticoduodenal Tumors

    International Nuclear Information System (INIS)

    Rappeport, E.D.; Palnaes Hansen, C.; Kjaer, A.; Knigge, U.

    2006-01-01

    Purpose: To investigate the accuracy of dedicated pancreatic multidetector computed tomography (MDCT) in the diagnosis of neuroendocrine pancreaticoduodenal tumors (NPTs). Material and Methods: MDCT and other imaging studies in patients with suspected NPTs were identified. Thirty dedicated MDCT studies were done in 23 patients. Fourteen patients (16 operations) subsequently had surgery. Imaging reports were reviewed and findings compared with surgical findings and findings in other imaging studies. Results: Patients with surgery: 19 NPTs (16 extrapancreatic gastrinomas and 3 pancreatic NPTs) were identified at surgery. MDCT identified 16 and somatostatin receptor scintigraphy (SRS) 11 out of 19 tumors. Endoscopic ultrasound detected 11 out of 14 NPTs. Patients without surgery: In 4 out of 9 patients, no NPTs were identified at MDCT. Conclusion: Dedicated MDCT of the pancreas can identify many NPTs, including small duodenal and periduodenal tumors, and the detection rate is better than reported in the older literature on CT

  10. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is the category of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and several approaches have been proposed to date. However, the current state-of-the-art methods face limitations: the function of enhancers is established, but their mechanism of action is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we comprehensively survey over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancer content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational...

  11. Films, Affective Computing and Aesthetic Experience: Identifying Emotional and Aesthetic Highlights from Multimodal Signals in a Social Setting

    OpenAIRE

    Kostoulas, Theodoros; Chanel, Guillaume; Muszynski, Michal; Lombardo, Patrizia; Pun, Thierry

    2017-01-01

    Over the last years, affective computing has been strengthening its ties with the humanities, exploring and building understanding of people’s responses to specific artistic multimedia stimuli. “Aesthetic experience” is acknowledged to be the subjective part of some artistic exposure, namely, the inner affective state of a person exposed to some artistic object. In this work, we describe ongoing research activities for studying the aesthetic experience of people when exposed to movie artistic...

  12. Computational analysis of the SRS Phase III salt disposition alternatives

    International Nuclear Information System (INIS)

    Dimenna, R.A.

    2000-01-01

    In late 1997, the In-Tank Precipitation (ITP) facility was shut down and an evaluation of alternative methods to process the liquid high-level waste stored in the Savannah River Site High-Level Waste storage tanks was begun. The objective was to determine whether another process might avoid the operational difficulties encountered with ITP at a lower cost than modifying the existing facility. A structured approach was used to evaluate the proposed alternatives on a common basis and identify the best one. Results from the computational analysis were a key part of the input used to select a primary and a secondary salt disposition alternative. This paper describes the process by which the computational needs were identified, addressed, and accomplished with a limited staff under stringent schedule constraints

  13. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Korfiati, Aigli; Theofilatos, Konstantinos A.; Likothanassis, Spiridon D.; Tsakalidis, Athanasios K.; Mavroudi, Seferina P.

    2013-01-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even if they work on just a single step. © 2013 Elsevier Inc.

  15. Identifying mechanisms in the control of quantum dynamics through Hamiltonian encoding

    International Nuclear Information System (INIS)

    Mitra, Abhra; Rabitz, Herschel

    2003-01-01

    A variety of means are now available to design control fields for manipulating the evolution of quantum systems. However, the underlying physical mechanisms often remain obscure, especially in the cases of strong fields and high quantum state congestion. This paper proposes a method to quantitatively determine the various pathways taken by a quantum system in going from the initial state to the final target. The mechanism is revealed by encoding a signal in the system Hamiltonian and decoding the resultant nonlinear distortion of the signal in the system time-evolution operator. The relevant interfering pathways determined by this analysis give insight into the physical mechanisms operative during the evolution of the quantum system. A hierarchy of mechanism identification algorithms with increasing ability to extract more detailed pathway information is presented. The mechanism identification concept is presented in the context of analyzing computer simulations of controlled dynamics. As illustrations of the concept, mechanisms are identified in the control of several simple, discrete-state quantum systems. The mechanism analysis tools reveal the roles of multiple interacting quantum pathways to maximally take advantage of constructive and destructive interference. Similar procedures may be applied directly in the laboratory to identify control mechanisms without resort to computer modeling, although this extension is not addressed in this paper

  16. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  17. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system proved useful for the geostatistical analysis process, compared with manipulating computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its rapid prototyping and the simplicity of incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
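
    As a small illustration of the core computation such tools perform (not the Delphi program itself), a sketch of the classical empirical semivariogram in Python; the lag spacing and tolerance are illustrative:

        import numpy as np

        def empirical_semivariogram(coords, values, lags, tol):
            """Classical estimator: gamma(h) = mean of squared differences / 2,
            over point pairs whose separation distance is within tol of lag h."""
            coords = np.asarray(coords, dtype=float)
            values = np.asarray(values, dtype=float)
            dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sqdiff = (values[:, None] - values[None, :]) ** 2
            gamma = []
            for h in lags:
                pairs = np.triu(np.abs(dist - h) <= tol, k=1)  # each pair counted once
                gamma.append(0.5 * sqdiff[pairs].mean() if pairs.any() else np.nan)
            return np.array(gamma)

        # Example: semivariance at lags 1..5 (coordinate units), tolerance 0.5:
        # gammas = empirical_semivariogram(xy, z, lags=range(1, 6), tol=0.5)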

  18. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  19. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  20. Identifying people from gait pattern with accelerometers

    Science.gov (United States)

    Ailisto, Heikki J.; Lindholm, Mikko; Mantyjarvi, Jani; Vildjiounaite, Elena; Makela, Satu-Marja

    2005-03-01

    Protecting portable devices is becoming more important, not only because of the value of the devices themselves, but also because of the value of the data in them and their capability for transactions, including m-commerce and m-banking. An unobtrusive and natural method for identifying the carrier of portable devices is presented. The method uses acceleration signals produced by sensors embedded in the portable device. When the user carries the device, the acceleration signal is compared with the stored template signal. The method consists of finding individual steps, normalizing and averaging them, aligning them with the template and computing cross-correlation, which is used as a measure of similarity. An Equal Error Rate of 6.4% is achieved in tentative experiments with 36 test subjects.
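
    A minimal sketch of the matching step as described (normalize the averaged step cycle, align it with the template by cross-correlation, and use the correlation peak as the similarity score); it assumes equal-length cycles, and the acceptance threshold is illustrative, to be tuned against the reported error rates:

        import numpy as np

        def gait_similarity(template, probe):
            """Z-score both averaged step cycles, align via cross-correlation,
            and return the peak normalized correlation as similarity."""
            t = (template - template.mean()) / template.std()
            p = (probe - probe.mean()) / probe.std()
            return np.correlate(t, p, mode="full").max() / len(t)

        def verify(template, probe, threshold=0.7):
            """Accept the carrier if the gait similarity clears the threshold."""
            return gait_similarity(template, probe) >= threshold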

  1. Method of identifying features in indexed data

    Science.gov (United States)

    Jarman, Kristin H [Richland, WA; Daly, Don Simone [Richland, WA; Anderson, Kevin K [Richland, WA; Wahl, Karen L [Richland, WA

    2001-06-26

    The present invention is a method of identifying features in indexed data, especially useful for distinguishing signal from noise in data provided as a plurality of ordered pairs. Each of the plurality of ordered pairs has an index and a response. The method has the steps of: (a) providing an index window having a first window end located on a first index and extending across a plurality of indices to a second window end; (b) selecting responses corresponding to the plurality of indices within the index window and computing a measure of dispersion of the responses; and (c) comparing the measure of dispersion to a dispersion critical value. Advantages of the present invention include minimizing signal to noise ratio, signal drift, varying baseline signal and combinations thereof.
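
    The three claimed steps translate almost directly into code. A minimal sketch, assuming equally spaced indices; the window width, the choice of standard deviation as the dispersion measure, and the critical value are all illustrative, since the patent leaves them to the practitioner:

        import numpy as np

        def find_features(index, response, window=5, critical=2.0):
            """Slide an index window over the ordered pairs, compute the dispersion
            of the responses inside it, and flag windows exceeding the critical value."""
            hits = []
            for start in range(len(index) - window + 1):
                dispersion = np.std(response[start:start + window])
                if dispersion > critical:
                    hits.append((index[start], index[start + window - 1]))
            return hits  # index ranges where signal likely stands out from noise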

  2. Computational models of airway branching morphogenesis.

    Science.gov (United States)

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    International Nuclear Information System (INIS)

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs

  4. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs.

  5. Effect of fast neutrons and gamma radiation on germination, pollen and ovule sterility and leaf variations in mung bean

    International Nuclear Information System (INIS)

    Avinash Chandra; Tewari, S.N.

    1978-01-01

    The seeds of mung bean (Phaseolus aureus Roxb.) varieties S-8 and Pusa Baisakhi were irradiated with 15, 30, 45 and 60 krad of gamma-rays and 500, 1000, 2000 and 3000 rad of fast neutrons. The results showed a gradual reduction in seed germination and in pollen and ovule fertility with increasing doses of both mutagens. The mutagens also caused leaf abnormalities such as unifoliate, bifoliate, trifoliate, tetrafoliate and pentafoliate leaves. Both tetrafoliate and pentafoliate leaves, observed on the same plant of the S-8 variety under fast neutron irradiation, appear to have been associated with enhanced luxuriance of the plant, resulting in satisfactory pod formation. (author)

  6. A STUDY ON WEED CONTROL IN SOYBEAN

    Directory of Open Access Journals (Sweden)

    S. TJITROSEMITO

    1991-01-01

    Full Text Available Two field experiments on weed control in soybeans were carried out at BIOTROP, Bogor, Indonesia from February to June 1989. The critical period for weed control was found to be between 20 and 40 days after planting of soybean (cv. Wilis) grown at a planting distance of 40 x 10 cm. It did not coincide with the fastest growth in terms of trifoliate leaf number. Further studies were suggested to understand the physiological growth of soybean in relation to weed control. Pendimethalin at 660-1320 g a.e./ha applied one day after sowing did not cause any phytotoxic effect on soybean and gave good weed control performance.

  7. Alfalfa (Medicago sativa L.).

    Science.gov (United States)

    Fu, Chunxiang; Hernandez, Timothy; Zhou, Chuanen; Wang, Zeng-Yu

    2015-01-01

    Alfalfa (Medicago sativa L.) is a high-quality forage crop widely grown throughout the world. This chapter describes an efficient protocol that allows for the generation of a large number of transgenic alfalfa plants by sonication-assisted Agrobacterium-mediated transformation. Binary vectors carrying different selectable marker genes that confer resistance to phosphinothricin (bar), kanamycin (npt II), or hygromycin (hph) were used to generate transgenic alfalfa plants. Intact trifoliates collected from clonally propagated plants in the greenhouse were sterilized with bleach and then inoculated with Agrobacterium strain EHA105. More than 80% of infected leaf pieces could produce rooted transgenic plants within 4-5 months after Agrobacterium-mediated transformation.

  8. Diversion Path Analysis handbook. Volume 3 (of 4 volumes). Computer Program 1

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 1 (DPACP-1), is used to assemble and tabulate the data for Specific Diversion Paths (SDPs) identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 255498 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-1 are used by the DPA team to assist in analyzing vulnerabilities in a plant's material control and material accounting subsystems to diversion of special nuclear material (SNM) by a knowledgeable insider. Based on this analysis, the DPA team can identify, and propose to plant management, modifications to the plant's safeguards system that would eliminate, or reduce the severity of, the identified vulnerabilities. The data are also used by plant supervision when investigating a potential diversion

  9. Computer Simulation of a Hardwood Processing Plant

    Science.gov (United States)

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  10. Immediate Realities: an anthropology of computer visualisation in archaeology

    Directory of Open Access Journals (Sweden)

    Jonathan Bateman

    2000-06-01

    Full Text Available The use of computer visualisation techniques is an increasing part of archaeology's illustrative repertoire - but how do these images relate to the wider visual language that archaeologists use? This article assesses computer visualisations in the light of a range of anthropological, art historical, and cultural critiques to place them and their production squarely within the broader spectrum of the discipline's output. Moving from identifying the shortcomings in the methods and scope of existing critiques of archaeological illustrations, a comprehensive approach to understanding the visual culture of archaeology is outlined. This approach is specifically applied to computer visualisations, and identifies both the sociology of their production, and the technological nature of their creation and reproduction as key elements influencing their readings as communicators of archaeological ideas. In order to develop useful understandings of how the visual languages we employ act within the discourse of the discipline, we must be inclusive in our critiques of those languages. Until we consider the cultural products of our discipline with the same sophistication with which we examine the products of other cultures (past and present), we will struggle to use them to their full potential.

  11. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  12. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of CI systems is to process and interpret data of various natures, so CI is strictly connected with the growth of available data as well as the capability to process them, two mutually supportive factors. Theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, query systems in banking, diagnostic systems, expert systems and many other practical implementations. This book consists of 16 chapters contributed by subject experts specialized in the various topics addressed in this book. The special chapters have been brought ...

  13. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, and emphasizes algorithmic advances that will allow re-application in other...

  14. Merging K-means with hierarchical clustering for identifying general-shaped groups.

    Science.gov (United States)

    Peterson, Anna D; Ghosh, Arka P; Maitra, Ranjan

    2018-01-01

    Clustering partitions a dataset such that observations placed together in a group are similar but different from those in other groups. Hierarchical and K -means clustering are two approaches but have different strengths and weaknesses. For instance, hierarchical clustering identifies groups in a tree-like structure but suffers from computational complexity in large datasets while K -means clustering is efficient but designed to identify homogeneous spherically-shaped clusters. We present a hybrid non-parametric clustering approach that amalgamates the two methods to identify general-shaped clusters and that can be applied to larger datasets. Specifically, we first partition the dataset into spherical groups using K -means. We next merge these groups using hierarchical methods with a data-driven distance measure as a stopping criterion. Our proposal has the potential to reveal groups with general shapes and structure in a dataset. We demonstrate good performance on several simulated and real datasets.
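
    A minimal sketch of the hybrid idea, using scikit-learn and SciPy: over-partition with K-means, then merge the resulting centers hierarchically. For brevity the sketch stops at a fixed number of groups rather than the paper's data-driven stopping criterion, and the parameter values are illustrative:

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.cluster import KMeans

        def hybrid_cluster(X, n_spherical=50, n_groups=3):
            """Step 1: carve the data into many small spherical K-means clusters.
            Step 2: merge the cluster centers with single-linkage hierarchical
            clustering, which can chain them into general-shaped groups."""
            km = KMeans(n_clusters=n_spherical, n_init=10).fit(X)
            Z = linkage(km.cluster_centers_, method="single")
            center_group = fcluster(Z, t=n_groups, criterion="maxclust")
            return center_group[km.labels_]  # each point inherits its center's group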

  15. Persist and cope: New Zealand women in computing

    Directory of Open Access Journals (Sweden)

    Alison Hunter

    Full Text Available New Zealand has a thriving computing industry but further growth is hampered by a skills shortage. A lack of women in the industry exacerbates this problem. Women are under-represented in the industry, and those who do take up computing careers experience conditions of discrimination and marginalisation. This paper reports on a qualitative study of the strategies used by women to cope with their marginalisation. Using multi-sited ethnographic methodology, data were collected using semi-structured interviews with twenty-nine computing professionals. Despite some women denying any marginalisation, all were found to employ some form of coping strategy. Seven different strategies were identified. The women interviewed were more inclined to join organisations directly relating to their roles rather than support initiatives which might improve conditions for women.

  16. ENVIRONMENTAL ANALYSIS BY AB INITIO QUANTUM MECHANICAL COMPUTATION AND GAS CHROMATOGRAPHY/FOURIER TRANSFORM INFRARED SPECTROMETRY.

    Science.gov (United States)

    Computational chemistry, in conjunction with gas chromatography/mass spectrometry/Fourier transform infrared spectrometry (GC/MS/FT-IR), was used to tentatively identify seven tetrachlorobutadiene (TCBD) isomers detected in an environmental sample. Computation of the TCBD infrare...

  17. Computer generated holographic microtags

    International Nuclear Information System (INIS)

    Sweatt, W.C.

    1998-01-01

    A microlithographic tag comprising an array of individual computer generated holographic patches having feature sizes between 250 and 75 nanometers is disclosed. The tag is a composite hologram made up of the individual holographic patches and contains identifying information when read out with a laser of the proper wavelength and at the proper angles of probing and reading. The patches are fabricated in a steep angle Littrow readout geometry to maximize returns in the -1 diffracted order. The tags are useful as anti-counterfeiting markers because of the extreme difficulty in reproducing them. 5 figs

  18. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    Science.gov (United States)

    Scogin, Stephen C.

    2016-01-01

    "PlantingScience" is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific…

  19. Analog series-based scaffolds: computational design and exploration of a new type of molecular scaffolds for medicinal chemistry

    Science.gov (United States)

    Dimova, Dilyana; Stumpfe, Dagmar; Hu, Ye; Bajorath, Jürgen

    2016-01-01

    Aim: Computational design of and systematic search for a new type of molecular scaffolds termed analog series-based scaffolds. Materials & methods: From currently available bioactive compounds, analog series were systematically extracted, key compounds identified and new scaffolds isolated from them. Results: Using our computational approach, more than 12,000 scaffolds were extracted from bioactive compounds. Conclusion: A new scaffold definition is introduced and a computational methodology developed to systematically identify such scaffolds, yielding a large freely available scaffold knowledge base. PMID:28116132

  20. Novel use of smart tablet computer for ophthalmology

    Directory of Open Access Journals (Sweden)

    Zhao-Tian Zhang

    2015-01-01

    Full Text Available AIM: To identify and categorize ophthalmology-relevant apps for the iPad tablet computer as a resource for ophthalmic practice on Apple's App Store. METHODS: Apple's App Store was searched for ophthalmology-relevant apps from January 2013 to August 2013. Eligible apps were identified, downloaded onto iPad tablet computers, and then categorized according to the apps' contents and our experience using them. How to use the iPad's built-in instant video call (FaceTime®) and automatic data storage (iCloud®) functions is also described together with the apps. The other operating systems, Microsoft's Windows Phone and Google's Android, were also searched for ophthalmology-relevant apps. RESULTS: The keywords for searching the App Store were “ophthalmology” and “eye”; we found 111 eligible apps with the former keyword and 452 with the latter. The integrated uses of the iPad tablet computer were then categorized into five aspects. Based on our clinical practice, we summarize the advantages and disadvantages of the iPad tablet computer for ophthalmic practice. Ophthalmology-relevant apps were, however, found to be very limited in number on the other two platforms. CONCLUSION: The integrated use of built-in apps and third-party apps can facilitate clinical work in examination, telemedicine, reference, patient education and literature searching. More studies are needed to verify their validity and reliability in professional fields, especially eye examinations.

  1. Computer-aided testing and operational aids for PARR-1 nuclear reactor

    International Nuclear Information System (INIS)

    Ansari, S.A.

    1990-01-01

    The utilization of the plant computer of Pakistan Research Reactor (PARR-1) for automatic periodic testing of nuclear instrumentation in the reactor is described. Computer algorithms have been developed for on-line acquisition and real-time processing of nuclear channel signals. The mean value, standard deviation, and probability distributions of nuclear channel signals are obtained in real time, and the computer generates a warning message if the signal error exceeds the maximum permissible error. In this way a faulty channel is automatically identified. Other real-time algorithms are also described that assist the operator in safe reactor operation by automatically computing approach-to-criticality during reactor start-up and the control rod worth determination

  2. Dry eye syndrome among computer users

    Science.gov (United States)

    Gajta, Aurora; Turkoanje, Daniela; Malaescu, Iosif; Marin, Catalin-Nicolae; Koos, Marie-Jeanne; Jelicic, Biljana; Milutinovic, Vuk

    2015-12-01

    Dry eye syndrome is characterized by eye irritation due to changes of the tear film. Symptoms include itching, foreign body sensations, mucous discharge and transitory vision blurring. Less frequent symptoms include photophobia and eye tiredness. The aim of the work was to determine the quality of the tear film and the potential risk of ocular dryness in persons who spend more than 8 hours daily using computers, and to find possible correlations between the severity of symptoms (dry eye symptom anamnesis) and clinical signs assessed by the Schirmer I test, TBUT (tear break-up time) and TFT (tear ferning test). The results show that computer users have significantly shorter TBUT (less than 5 s for 56% of subjects and less than 10 s for 37% of subjects), and TFT of type II/III in 50% of subjects and type III in 31% of subjects, compared with computer non-users (TFT of type I or II was present in 85.71% of subjects). Visual display terminal use for more than 8 hours daily has been identified as a significant risk factor for dry eye. All persons who spend substantial time using computers are advised to use artificial tear drops in order to minimize the symptoms of dry eye syndrome and prevent serious complications.

  3. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  4. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  5. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    Directory of Open Access Journals (Sweden)

    Dang Hung

    2017-07-01

    Full Text Available We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation efficiency, it is critical to keep trusted code bases lean, for large ones are unwieldy to vet and verify. In this paper, we advocate a simple approach wherein many basic algorithms (e.g., sorting) can be made privacy-preserving by adding a step that securely scrambles the data before feeding it to the original algorithms. We call this approach Scramble-then-Compute (StC), and give a sufficient condition whereby existing external memory algorithms can be made privacy-preserving via StC. This approach facilitates code-reuse, and its simplicity contributes to a smaller trusted code base. It is also general, allowing algorithm designers to leverage an extensive body of known efficient algorithms for better performance. Our experiments show that StC could offer up to 4.1× speedups over known, application-specific alternatives.
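
    A toy sketch of the StC composition in Python. In a real deployment the permutation would be a secure, oblivious shuffle performed inside the trusted environment, not random.shuffle; the point here is only that the original algorithm is reused unmodified after the scramble:

        import random

        def scramble_then_compute(records, compute):
            """Permute the input first, so the downstream algorithm's memory access
            pattern no longer reveals anything about the original input order."""
            scrambled = list(records)
            random.shuffle(scrambled)  # stand-in for a cryptographic scramble
            return compute(scrambled)  # the original algorithm, unchanged

        print(scramble_then_compute([5, 3, 9, 1], sorted))  # [1, 3, 5, 9]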

  6. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  7. Identifying online user reputation of user-object bipartite networks

    Science.gov (United States)

    Liu, Xiao-Lu; Liu, Jian-Guo; Yang, Kai; Guo, Qiang; Han, Jing-Ti

    2017-02-01

    Identifying online user reputation based on the rating information of the user-object bipartite networks is important for understanding online user collective behaviors. Based on the Bayesian analysis, we present a parameter-free algorithm for ranking online user reputation, where the user reputation is calculated based on the probability that their ratings are consistent with the main part of all user opinions. The experimental results show that the AUC values of the presented algorithm could reach 0.8929 and 0.8483 for the MovieLens and Netflix data sets, respectively, which is better than the results generated by the CR and IARR methods. Furthermore, the experimental results for different user groups indicate that the presented algorithm outperforms the iterative ranking methods in both ranking accuracy and computation complexity. Moreover, the results for the synthetic networks show that the computation complexity of the presented algorithm is a linear function of the network size, which suggests that the presented algorithm is very effective and efficient for the large scale dynamic online systems.
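
    A simplified stand-in for the core idea (agreement with the consensus opinion as reputation), not the paper's parameter-free Bayesian estimator; the ratings and data layout are illustrative:

        import numpy as np

        def user_reputation(ratings):
            """ratings: {user: {object: rating}}. Reputation is the fraction of a
            user's ratings that agree with each object's majority rating."""
            by_object = {}
            for user_ratings in ratings.values():
                for obj, r in user_ratings.items():
                    by_object.setdefault(obj, []).append(r)
            majority = {obj: max(set(rs), key=rs.count) for obj, rs in by_object.items()}
            return {user: float(np.mean([r == majority[obj] for obj, r in ur.items()]))
                    for user, ur in ratings.items()}

        print(user_reputation({"u1": {"a": 5, "b": 3}, "u2": {"a": 5, "b": 1},
                               "u3": {"a": 2, "b": 3}}))  # u1: 1.0, u2: 0.5, u3: 0.5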

  8. FACE DETECTION SYSTEM ON OPEN SOURCE PHYSICAL COMPUTING

    Directory of Open Access Journals (Sweden)

    Yupit Sudianto

    2014-01-01

    Full Text Available Face detection is an interesting research area. The majority of this research has been implemented on computers. Developing face detection on a computer requires significant investment: besides the cost of procuring the computer, there are operational costs such as electricity, because a computer draws considerable power. This research proposes building a face detection system using an Arduino. The system is autonomous; in other words, the role of the computer is replaced by the Arduino. The Arduino used is the Arduino Mega 2560, with an ATmega2560 microcontroller, a clock speed of 16 MHz, 256 KB of flash memory, 8 KB of SRAM and 4 KB of EEPROM. Consequently, not every face detection algorithm can be implemented on the Arduino. The Arduino's memory limitations are addressed by applying template matching, using facial features in the form of a template shaped like a mask. The detection rate achieved in this study is 80%-100%, where the Arduino's success in identifying a face is influenced by the distance between the camera and the face and by the subject's movement.
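
    To illustrate the template-matching principle (sketched here in Python/NumPy for readability; the actual system runs as microcontroller code on the Arduino), a naive normalized cross-correlation search with an illustrative threshold:

        import numpy as np

        def match_template(frame, template, threshold=0.8):
            """Slide the mask-shaped template over the frame; report positions whose
            normalized correlation with the template exceeds the threshold."""
            th, tw = template.shape
            t = (template - template.mean()) / (template.std() + 1e-9)
            hits = []
            for y in range(frame.shape[0] - th + 1):
                for x in range(frame.shape[1] - tw + 1):
                    patch = frame[y:y + th, x:x + tw]
                    p = (patch - patch.mean()) / (patch.std() + 1e-9)
                    if (t * p).mean() > threshold:
                        hits.append((y, x))  # likely face location
            return hits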

  9. Parallel algorithms for mapping pipelined and parallel computations

    Science.gov (United States)

    Nicol, David M.

    1988-01-01

    Many computational problems in image processing, signal processing, and scientific computing are naturally structured for either pipelined or parallel computation. When mapping such problems onto a parallel architecture it is often necessary to aggregate an obvious problem decomposition. Even in this context the general mapping problem is known to be computationally intractable, but recent advances have been made in identifying classes of problems and architectures for which optimal solutions can be found in polynomial time. Among these, the mapping of pipelined or parallel computations onto linear array, shared memory, and host-satellite systems figures prominently. This paper extends that work first by showing how to improve existing serial mapping algorithms. These improvements have significantly lower time and space complexities: in one case a published O(nm^3) time algorithm for mapping m modules onto n processors is reduced to an O(nm log m) time complexity, and its space requirements reduced from O(nm^2) to O(m). Run time complexity is further reduced with parallel mapping algorithms based on these improvements, which run on the architecture for which they create the mappings.

  10. Citrus rootstock experiments. III

    Directory of Open Access Journals (Sweden)

    S. Moreira

    1960-01-01

    Six rootstock experiments were planted at the Tietê Experiment Station in 1949. The scion varieties Eureka lemon, Baianinha (Navel) orange, Pêra orange, Hamlin orange, Maracanã orange and Mexerica mandarin were budded on: sour orange, caipira and pêra sweet oranges, Rangpur lime, Brazilian and Florida rough lemons, cravo and Cleopatra tangerines, red shaddock, trifoliate orange, sweet lime and Sampson tangelo. The planting was made in a shallow silt-loam soil. The tops of Baianinha, Hamlin and Maracanã oranges induced exocortis symptoms on trifoliate orange and Rangpur lime rootstocks. The Pêra orange top on Florida rough lemon and trifoliate orange showed bud-union-ring symptoms. A complete disharmony at the point of union of the Eureka lemon-trifoliate combination was observed, and the trees died after 5 years. The trunk diameters in 1949, 1954 and 1958, the height and circumference of the trees in 1958, and the annual yields from 1951 to 1959 are reported here. The statistical analysis of the yield data was made for two periods (1951-54 and 1955-58) and also for the 8-year period. In the first period the Brazilian rough lemon rootstock gave the biggest tree sizes and yields with all top varieties, but the caipira orange rootstock exceeded it in the last years. The trifoliate orange gave the smallest trees and yields. With some scion varieties the Rangpur lime had an outstanding position as to productiveness. The Florida rough lemon and tangerines had a medium position among the rootstocks tried. These experiments will be maintained under observation and additional data will be reported in future publications.

  11. Creation of a system of signs to identify the user on the dynamics of ...

    African Journals Online (AJOL)

    This paper describes a method of creating a system of features to identify a user for authentication, based on the dynamics of handwriting, using a multi-touch sensor. The result is a reduction in the computational complexity of classification when creating access control systems. Keywords: analysis and classification of signals, user ...

  12. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored

  13. Towards a global monitoring system for CMS computing operations

    CERN Multimedia

    CERN. Geneva; Bauerdick, Lothar A.T.

    2012-01-01

    The operation of the CMS computing system requires a complex monitoring system to cover all its aspects: central services, databases, the distributed computing infrastructure, production and analysis workflows, the global overview of the CMS computing activities and the related historical information. Several tools are available to provide this information, developed both inside and outside of the collaboration and often used in common with other experiments. Despite the fact that the current monitoring allowed CMS to successfully perform its computing operations, an evolution of the system is clearly required, to adapt to the recent changes in the data and workload management tools and models and to address some shortcomings that make its usage less than optimal. Therefore, a recent and ongoing coordinated effort was started in CMS, aiming at improving the entire monitoring system by identifying its weaknesses and the new requirements from the stakeholders, rationalise and streamline existing components and ...

  14. Diversion Path Analysis handbook. Volume 4 (of 4 volumes). Computer Program 2

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 2 (DPACP-2) is used to produce tables and statistics on modifications identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 259088 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-2 assist the DPA team in analyzing and evaluating modifications to the plant's safeguards system that would eliminate, or reduce the severity of, vulnerabilities identified by means of the DPA. These vulnerabilities relate to the capability of the plant's material control and material accounting subsystems to indicate diversion of special nuclear material (SNM) by a knowledgeable insider

  15. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  16. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  17. A Systematic Literature Review of Empirical Evidence on Computer Games and Serious Games

    Science.gov (United States)

    Connolly, Thomas M.; Boyle, Elizabeth A.; MacArthur, Ewan; Hainey, Thomas; Boyle, James M.

    2012-01-01

    This paper examines the literature on computer games and serious games in regard to the potential positive impacts of gaming on users aged 14 years or above, especially with respect to learning, skill enhancement and engagement. Search terms identified 129 papers reporting empirical evidence about the impacts and outcomes of computer games and…

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  19. How can computers support, enrich, and transform collaborative creativity

    DEFF Research Database (Denmark)

    Dalsgaard, Peter; Inie, Nanna; Hansen, Nicolai Brodersen

    2017-01-01

    The aim of the workshop is to examine and discuss how computers can support, enrich, and transform collaborative creative processes. By exploring and combining methodological, theoretical, and design-oriented perspectives, we wish to examine the implications, potentials, and limitations of different approaches to providing digital support for collaborative creativity. Participation in the workshop requires participants to actively document and identify salient themes in one or more examples of computer-supported collaborative creativity, and the resulting material will serve as the empirical...

  20. 3rd International Symposium on Big Data and Cloud Computing Challenges

    CERN Document Server

    Neelanarayanan, V

    2016-01-01

    This proceedings volume contains selected papers that were presented in the 3rd International Symposium on Big data and Cloud Computing Challenges, 2016 held at VIT University, India on March 10 and 11. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data and Cloud Computing are identified and presented throughout the book, which is intended for researchers, scholars, students, software developers and practitioners working at the forefront in their field. This book acts as a platform for exchanging ideas, setting questions for discussion, and sharing the experience in Big Data and Cloud Computing domain.

  1. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  2. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Later years brought new areas of interest in technical informatics related to soft computing, and more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  3. ESP and NOAH: computer programs for flood-risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Montague, D.F.; Rooney, J.J.; Fussell, J.B.; Baker, L.S.

    1982-06-01

    This report describes a computer program package that aids in assessing the impact of floods on risk from nuclear power plants. The package consists of two distinct computer programs: ESP and NOAH. The ESP program improves the efficiency of a flood analysis by screening accident sequences and identifying accident sequences that are potentially significant contributors to risk in the event of a flood. Input to ESP includes accident sequences from an existing risk assessment and flood screening criteria. The NOAH program provides detailed qualitative analysis of the plant systems identified by ESP. NOAH performs a qualitative flood simulation of the fault tree

  4. Xtalk: a path-based approach for identifying crosstalk between signaling pathways

    Science.gov (United States)

    Tegge, Allison N.; Sharp, Nicholas; Murali, T. M.

    2016-01-01

    Motivation: Cells communicate with their environment via signal transduction pathways. On occasion, the activation of one pathway can produce an effect downstream of another pathway, a phenomenon known as crosstalk. Existing computational methods to discover such pathway pairs rely on simple overlap statistics. Results: We present Xtalk, a path-based approach for identifying pairs of pathways that may crosstalk. Xtalk computes the statistical significance of the average length of multiple short paths that connect receptors in one pathway to the transcription factors in another. By design, Xtalk reports the precise interactions and mechanisms that support the identified crosstalk. We applied Xtalk to signaling pathways in the KEGG and NCI-PID databases. We manually curated a gold standard set of 132 crosstalking pathway pairs and a set of 140 pairs that did not crosstalk, for which Xtalk achieved an area under the receiver operator characteristic curve of 0.65, a 12% improvement over the closest competing approach. The area under the receiver operator characteristic curve varied with the pathway, suggesting that crosstalk should be evaluated on a pathway-by-pathway level. We also analyzed an extended set of 658 pathway pairs in KEGG and a set of more than 7000 pathway pairs in NCI-PID. For the top-ranking pairs, we found substantial support in the literature (81% for KEGG and 78% for NCI-PID). We provide examples of networks computed by Xtalk that accurately recovered known mechanisms of crosstalk. Availability and implementation: The XTALK software is available at http://bioinformatics.cs.vt.edu/~murali/software. Crosstalk networks are available at http://graphspace.org/graphs?tags=2015-bioinformatics-xtalk. Contact: ategge@vt.edu, murali@cs.vt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26400040
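    As a rough illustration of the path-based idea this abstract describes (score a pathway pair by the average length of short paths from one pathway's receptors to the other pathway's transcription factors, then assess significance), the following minimal Python sketch uses networkx on a toy network. The graph, node names, and the simple permutation test are illustrative assumptions, not the authors' exact procedure or the XTALK software's interface.

      import random
      import networkx as nx

      def mean_path_length(g, sources, targets):
          """Average shortest-path length over all reachable (source, target) pairs."""
          lengths = [nx.shortest_path_length(g, s, t)
                     for s in sources for t in targets if nx.has_path(g, s, t)]
          return sum(lengths) / len(lengths) if lengths else float("inf")

      def crosstalk_score(g, receptors, tfs, n_perm=200, seed=0):
          """Observed average path length plus an empirical p-value from random node sets."""
          rng = random.Random(seed)
          observed = mean_path_length(g, receptors, tfs)
          nodes = list(g.nodes)
          hits = sum(
              mean_path_length(g, rng.sample(nodes, len(receptors)),
                               rng.sample(nodes, len(tfs))) <= observed
              for _ in range(n_perm))
          return observed, hits / n_perm

      # Toy interaction network: receptors R1/R2 of pathway A reach TF1 of pathway B.
      g = nx.DiGraph([("R1", "K1"), ("K1", "TF1"), ("R2", "K1"), ("K2", "TF2")])
      print(crosstalk_score(g, ["R1", "R2"], ["TF1"]))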

  5. How can computers support, enrich, and transform collaborative creativity

    DEFF Research Database (Denmark)

    Dalsgaard, Peter; Inie, Nanna; Hansen, Nicolai Brodersen

    2017-01-01

    The aim of the workshop is to examine and discuss how computers can support, enrich, and transform collaborative creative processes. By exploring and combining methodological, theoretical, and design-oriented perspectives, we wish to examine the implications, potentials, and limitations of different approaches to providing digital support for collaborative creativity. Participation in the workshop requires participants to actively document and identify salient themes in one or more examples of computer-supported collaborative creativity, and the resulting material will serve as the empirical...

  6. Computation of periods of acoustical oscillations of the sun

    International Nuclear Information System (INIS)

    Vorontsov, S.V.; Zharkov, V.N.

    1977-01-01

    It is stated that regular pulsations of the Sun were first reported in 1975-76 by several investigators (see Nature 259:87 and 92 (1976)), and that these oscillations were difficult to identify. It was decided to compute the periods of some acoustical modes using experience gained in calculations of free oscillations of Jupiter and Saturn, employing some complete solar models for the interior, the convective zone and the solar atmosphere. The equations employed and the methods of computations are described, and the results are given. (U.K.)

  7. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education. IM makes it possible for students to engage in learning and collaborating in a smart virtual class learning (SVCL) environment using ubiquitous computing. However, models of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning

  8. Algorithmic mechanisms for reliable crowdsourcing computation under collusion.

    Science.gov (United States)

    Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A; Pareja, Daniel

    2015-01-01

    We consider a computing system where a master processor assigns a task for execution to worker processors that may collude. We model the workers' decision of whether to comply (compute the task) or not (return a bogus result to save the computation cost) as a game among workers. That is, we assume that workers are rational in a game-theoretic sense. We identify analytically the parameter conditions for a unique Nash Equilibrium where the master obtains the correct result. We also evaluate experimentally mixed equilibria aiming to attain better reliability-profit trade-offs. For a wide range of parameter values that may be used in practice, our simulations show that, in fact, both master and workers are better off using a pure equilibrium where no worker cheats, even under collusion, and even for colluding behaviors that involve deviating from the game.
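    The equilibrium analysis sketched in this abstract can be pictured with a toy payoff comparison. The parameter names and the audit/penalty scheme below are hypothetical simplifications (they ignore collusion and mixed strategies), intended only to show the kind of incentive condition the paper formalizes.

      def worker_prefers_to_comply(reward, cost, penalty, p_audit):
          """True if a rational worker's expected utility for computing honestly
          beats returning a bogus result under random auditing."""
          comply = reward - cost                              # always paid, pays the computation cost
          cheat = (1 - p_audit) * reward - p_audit * penalty  # paid unless audited, fined if caught
          return comply >= cheat

      # With a 20% audit probability, honesty is an equilibrium once the fine is large enough.
      print(worker_prefers_to_comply(reward=10, cost=3, penalty=20, p_audit=0.2))  # True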

  9. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Cloud computing represents a specific form of networking in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs and by the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of a computer program by its users. Because cloud computing is a virtualized network, the question of normal use of a computer program requires putting all aspects of permitted copying into the context of a specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act undertaken using the program. In other words, copyright applies in full in cloud computing, and so, in the case of this particular restriction, does the freedom of contract.

  10. "Life" and Education Policy: Intervention, Augmentation and Computation

    Science.gov (United States)

    Gulson, Kalervo N.; Webb, P. Taylor

    2018-01-01

    In this paper, we are interested in the notion of multiple ways of thinking, knowing and transforming life, namely an increasing capacity to intervene in "life" as a "molecular biopolitics," and the changing ways in which "life" can be understood computationally. We identify and speculate on the ways different ideas…

  11. Survey of operating experience from LERs to identify aging trends

    International Nuclear Information System (INIS)

    Murphy, G.A.

    1985-01-01

    The results of a study using the Oak Ridge National Laboratory's Nuclear Operations Analysis Center computer files of operating experience reports [licensee event reports (LERs), abnormal occurrences, etc.] are summarized. In this study, specific time-related degradation mechanisms are identified as possible causes of a reportable occurrence. Data collected on domestic commercial nuclear power plants covering 1969 to 1982 yielded over 5800 events attributable to possible age-related failures. Of these events, 2795 were attributable to instrument drift and are addressed separately in the report. The remaining 3098 events were reviewed, and data were collected for each event identifying the specific system, component, and subpart; the information included the age-related mechanism, the severity of the failure, and the method of detection. About two-thirds of the failures were judged to be degraded, with one-third listed as catastrophic

  12. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling ---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  13. Performance of 'Folha Murcha' orange on seven rootstocks in the northwest of Paraná, Brazil

    Directory of Open Access Journals (Sweden)

    Neusa Maria Colauto Stenzel

    2005-12-01

    This work evaluated, over a period of 14 years in Paranavaí, PR, Brazil, the performance of 'Folha Murcha' orange trees budded on the following rootstocks: 'Rangpur' lime (Citrus limonia), 'African' rough lemon (Citrus jambhiri), 'Volkamer' lemon (Citrus volkameriana), 'C-13' citrange (Citrus sinensis × Poncirus trifoliata), trifoliate orange (Poncirus trifoliata), 'Sunki' mandarin (Citrus sunki) and 'Cleopatra' mandarin (Citrus reshni). The experimental design was randomized blocks, with seven treatments (rootstocks) and four replications of three plants per plot. Canopy volumes of trees on 'Cleopatra' mandarin and 'African' rough lemon were significantly larger. Trees on 'Rangpur' lime showed the smallest difference between rootstock and scion trunk diameters. Cumulative yield was highest on 'African' rough lemon and 'Cleopatra' mandarin and lowest on trifoliate orange. Alternate bearing was not pronounced on any of the rootstocks evaluated. The total soluble solids content was significantly higher in fruit from trees budded on trifoliate orange and lowest in 'African' rough lemon. Juice quality was within the acceptable standards for orange scion varieties. 'Cleopatra' mandarin and 'African' rough lemon are promising rootstocks for 'Folha Murcha' orange under the conditions evaluated.

  14. Computer tomographic investigation of subcutaneous adipose tissue as an indicator of body composition

    DEFF Research Database (Denmark)

    McEvoy, Fintan; Madsen, Mads T.; Nielsen, Mai B.

    2009-01-01

    Background: Modern computed tomography (CT) equipment can be used to acquire whole-body data from large animals such as pigs in minutes or less. In some circumstances, computer-assisted analysis of the resulting image data can identify and measure anatomical features. The thickness of subcutaneous adipose tissue at a specific site, measured by ultrasound, is used in the pig industry to assess adiposity and inform management decisions that have an impact on reproduction, food conversion performance and sow longevity. The measurement site, called "P2", is used throughout the industry. We propose ... and expressed as a proportion of total volume (fat-index). A computer algorithm was used to determine 10,201 subcutaneous adipose thickness measurements in each pig for each scan. From these data, sites were selected where correlation with fat-index was optimal. Results: Image analysis correctly identified...

  15. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is one of Information and Communication Technology (ICT), and a number of research efforts on Cloud Computing and Mobile Cloud Computing are under way, covering security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  16. A comparison of visual and quantitative methods to identify interstitial lung abnormalities

    OpenAIRE

    Kliment, Corrine R.; Araki, Tetsuro; Doyle, Tracy J.; Gao, Wei; Dupuis, Josée; Latourelle, Jeanne C.; Zazueta, Oscar E.; Fernandez, Isis E.; Nishino, Mizuki; Okajima, Yuka; Ross, James C.; Estépar, Raúl San José; Diaz, Alejandro A.; Lederer, David J.; Schwartz, David A.

    2015-01-01

    Background: Evidence suggests that individuals with interstitial lung abnormalities (ILA) on a chest computed tomogram (CT) may have an increased risk of developing a clinically significant interstitial lung disease (ILD). Although methods used to identify individuals with ILA on chest CT have included both automated quantitative and qualitative visual inspection methods, there has been no direct comparison between these two methods. To investigate this relationship, we created lung density met...

  17. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  18. The development of bronchiectasis on chest computed tomography in children with cystic fibrosis: can pre-stages be identified?

    Energy Technology Data Exchange (ETDEWEB)

    Tepper, Leonie A. [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Radiology, Erasmus MC, Rotterdam (Netherlands)]; Caudri, Daan [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands)]; Perez Rovira, Adria [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Erasmus MC, Biomedical Imaging Group Rotterdam, Departments of Radiology and Medical Informatics, Rotterdam (Netherlands)]; Tiddens, Harm A.W.M. [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Radiology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Pediatric Pulmonology and Radiology, Erasmus Medical Center, Rotterdam (Netherlands)]; Bruijne, Marleen de [Erasmus MC, Biomedical Imaging Group Rotterdam, Departments of Radiology and Medical Informatics, Rotterdam (Netherlands); University of Copenhagen, Department of Computer Science, Copenhagen (Denmark)]

    2016-12-15

    Bronchiectasis is an important component of cystic fibrosis (CF) lung disease but little is known about its development. We aimed to study the development of bronchiectasis and identify determinants for rapid progression of bronchiectasis on chest CT. Forty-three patients with CF with at least four consecutive biennial volumetric CTs were included. Areas with bronchiectasis on the most recent CT were marked as regions of interest (ROIs). These ROIs were generated on all preceding CTs using deformable image registration. Observers indicated whether: bronchiectasis, mucus plugging, airway wall thickening, atelectasis/consolidation or normal airways were present in the ROIs. We identified 362 ROIs on the most recent CT. In 187 (51.7 %) ROIs bronchiectasis was present on all preceding CTs, while 175 ROIs showed development of bronchiectasis. In 139/175 (79.4 %) no pre-stages of bronchiectasis were identified. In 36/175 (20.6 %) bronchiectatic airways the following pre-stages were identified: mucus plugging (17.7 %), airway wall thickening (1.7 %) or atelectasis/consolidation (1.1 %). Pancreatic insufficiency was more prevalent in the rapid progressors compared to the slow progressors (p = 0.05). Most bronchiectatic airways developed within 2 years without visible pre-stages, underlining the treacherous nature of CF lung disease. Mucus plugging was the most frequent pre-stage. (orig.)

  19. The development of bronchiectasis on chest computed tomography in children with cystic fibrosis: can pre-stages be identified?

    International Nuclear Information System (INIS)

    Tepper, Leonie A.; Caudri, Daan; Perez Rovira, Adria; Tiddens, Harm A.W.M.; Bruijne, Marleen de

    2016-01-01

    Bronchiectasis is an important component of cystic fibrosis (CF) lung disease but little is known about its development. We aimed to study the development of bronchiectasis and identify determinants for rapid progression of bronchiectasis on chest CT. Forty-three patients with CF with at least four consecutive biennial volumetric CTs were included. Areas with bronchiectasis on the most recent CT were marked as regions of interest (ROIs). These ROIs were generated on all preceding CTs using deformable image registration. Observers indicated whether: bronchiectasis, mucus plugging, airway wall thickening, atelectasis/consolidation or normal airways were present in the ROIs. We identified 362 ROIs on the most recent CT. In 187 (51.7 %) ROIs bronchiectasis was present on all preceding CTs, while 175 ROIs showed development of bronchiectasis. In 139/175 (79.4 %) no pre-stages of bronchiectasis were identified. In 36/175 (20.6 %) bronchiectatic airways the following pre-stages were identified: mucus plugging (17.7 %), airway wall thickening (1.7 %) or atelectasis/consolidation (1.1 %). Pancreatic insufficiency was more prevalent in the rapid progressors compared to the slow progressors (p = 0.05). Most bronchiectatic airways developed within 2 years without visible pre-stages, underlining the treacherous nature of CF lung disease. Mucus plugging was the most frequent pre-stage. (orig.)

  20. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  1. Standards for collection of identifying information for health record keeping

    International Nuclear Information System (INIS)

    Carpenter, M.; Fair, M.E.; Lalonde, P.; Scott, T.

    1988-09-01

    A new recommended guideline for the standard data collection of individual identifying information has been developed and tested by Statistics Canada. The purpose of developing a standard method is to improve health record keeping in Canada, in particular for long-term medical follow-up studies of individuals exposed to potentially hazardous agents, for detection of possible health risks or delayed harm, e.g. individuals exposed to radiation through occupations, the environment, emergencies, or therapeutic practice. A data collection standard is also useful for epidemiological follow-up studies of other occupational groups such as chemical workers and miners, or for lifestyle, genetic and other studies. Statistics Canada, Health Division, Occupational and Environmental Health Research Unit (OEHRU), from their experience with long-term health studies using the Canadian Mortality Data Base, has prepared a 'Data Collection Package' that includes the developed and tested data collection guideline. It is anticipated this will help produce more thorough and comparable on-going record keeping while saving costs and time for many organizations, e.g. Atomic Energy Control Board licensees who report radiation doses to the National Dose Registry, as well as for other companies and organizations across the country where long-term medical follow-up studies are anticipated now or in the future. It may also allow for broader industrial, national and international comparisons. The guideline consists of a two-page Individual Identity Summary (IIS): the first page for completion by the individual/employee to give unique identifying information; the second page for the study organizer/employer to include essential additional information (work history etc.). A third optional page can be used by organizations wishing to collect data on children. The Data Collection Package also includes brief explanatory notes, a suggested file record layout and detailed computer coding advice for entering

  2. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  3. Assessing the use of computers in industrial occupational health departments.

    Science.gov (United States)

    Owen, J P

    1995-04-01

    Computers are widely used in business and industry, and the benefits of computerizing occupational health (OH) departments have been advocated by several authors. The requirements for successful computerization of an OH department are reviewed. Having identified the theoretical benefits, the author assesses the real picture in industry by surveying 52 firms with over 1000 employees in a large urban area. Only 15 (29%) of the companies reported having any OH service, of which six used computers in the OH department, reflecting the business priorities of most of the companies. The types of software systems used and their main uses are examined, along with perceived benefits and disadvantages. With the decreasing costs of computers and increasingly 'user-friendly' software, there is a real cost benefit to be gained from using computers in OH departments, although the concept may have to be 'sold' to management.

  4. The Influence of Computer-Mediated Communication Systems on Community

    Science.gov (United States)

    Rockinson-Szapkiw, Amanda J.

    2012-01-01

    As higher education institutions enter the intense competition of the rapidly growing global marketplace of online education, the leaders within these institutions are challenged to identify factors critical for developing and for maintaining effective online courses. Computer-mediated communication (CMC) systems are considered critical to…

  5. Design requirements for ubiquitous computing environments for healthcare professionals.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2004-01-01

    Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work; thus, it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.

  6. Computed tomography of human joints and radioactive waste drums

    International Nuclear Information System (INIS)

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-01-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed. (1) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A&PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity

  7. Image Visual Realism: From Human Perception to Machine Computation.

    Science.gov (United States)

    Fan, Shaojing; Ng, Tian-Tsong; Koenig, Bryan L; Herberg, Jonathan S; Jiang, Ming; Shen, Zhiqi; Zhao, Qi

    2017-08-30

    Visual realism is defined as the extent to which an image appears to people as a photo rather than computer generated. Assessing visual realism is important in applications like computer graphics rendering and photo retouching. However, current realism evaluation approaches use either labor-intensive human judgments or automated algorithms largely dependent on comparing renderings to reference images. We develop a reference-free computational framework for visual realism prediction to overcome these constraints. First, we construct a benchmark dataset of 2520 images with comprehensive human annotated attributes. From statistical modeling on this data, we identify image attributes most relevant for visual realism. We propose both empirically-based (guided by our statistical modeling of human data) and CNN-learned features to predict visual realism of images. Our framework has the following advantages: (1) it creates an interpretable and concise empirical model that characterizes human perception of visual realism; (2) it links computational features to latent factors of human image perception.

  8. DEVELOPMENT OF A COMPUTER-BASED E-LEARNING MODEL TO IMPROVE SENIOR HIGH SCHOOL (SMA) STUDENTS' HIGH ORDER MATHEMATICAL THINKING

    OpenAIRE

    Jarnawi Afgani Dahlan; Yaya Sukjaya Kusumah; Mr Heri Sutarno

    2011-01-01

    The focus of this research is the development of mathematics teaching and learning activities based on the application of computer software. The aims of the research are as follows: 1) to identify mathematics topics which are feasible to present through computer-based e-learning, 2) to design, develop, and implement computer-based e-learning on mathematics, and 3) to analyze the impact of computer-based e-learning on the enhancement of SMA students' high order mathematical thinking. All activ...

  9. Effect of Physical Education Teachers' Computer Literacy on Technology Use in Physical Education

    Science.gov (United States)

    Kretschmann, Rolf

    2015-01-01

    Teachers' computer literacy has been identified as a factor that determines their technology use in class. The aim of this study was to investigate the relationship between physical education (PE) teachers' computer literacy and their technology use in PE. The study group consisted of 57 high school level in-service PE teachers. A survey was used…

  10. Use of computed tomography to evaluate the intestinal tract of adult llamas

    International Nuclear Information System (INIS)

    Van Hoogmoed, L.; Roberts, G.; Snyder, J.R.; Yarbrough, T.; Haromon, F.

    1998-01-01

    In the llama, signs of colic are obscure and may be exhibited as persistent sternal recumbency and anorexia even in the presence of a surgical lesion. Diagnostic methods for evaluation of abdominal disorders are limited. As a result, surgical intervention may be prolonged and increase the risk of mortality and postoperative complications. The objective of this study was to determine the feasibility of computed tomography to evaluate the llama intestinal tract. Eighteen hours prior to the computed tomography scan, six llamas were given barium sulfate (15%) via an orogastric tube. Following induction of general anesthesia, the llamas were positioned in sternal recumbency, and 10 mm contiguous slices were obtained from the diaphragm to the tuber ischiadicum. Structures that were consistently identified included the first, second, and third compartments (C1, C2, and C3), small intestine, spiral colon, and ascending colon. C1 was easily identified in the cranial aspect of the abdomen due to its large size relative to the other compartments and characteristic saccules. C2 was located cranial, ventral, and to the right of C1, while C3 was visualized as a tubular structure to the right and ventral to C1 and C2. C3 was traced caudally until it turned dorsally and continued cranially to a dilated ampulla in the right cranial abdomen delineating the entrance to the small intestine. The spiral colon was identified consistently in the left ventral caudal abdomen. Structures that could not be conclusively identified included the cecum and mesenteric lymph nodes. Computed tomography allowed a consistent evaluation of the major intestinal structures associated with colic in the llama. Thus, computed tomography is a potentially valuable noninvasive diagnostic tool to effectively evaluate the abdominal cavity and differentiate medical from surgical lesions in the llama.

  11. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and more eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives, and traditionally they are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  12. Environmental computing compendium - background and motivation

    Science.gov (United States)

    Heikkurinen, Matti; Kranzlmüller, Dieter

    2017-04-01

    The emerging discipline of environmental computing brings together experts in applied, advanced environmental modelling. The application domains address several fundamental societal challenges, ranging from disaster risk reduction to sustainability issues (such as food security on the global scale). The community has used an intuitive, pragmatic approach when determining which initiatives are considered to "belong to the discipline". The community's growth is based on the sharing of experiences and tools, which provides opportunities for reusing solutions or applying knowledge in new settings. Thus, limiting possible synergies by applying an arbitrary, formal definition to exclude some of the sources of solutions and knowledge would be counterproductive. However, the number of individuals and initiatives involved has grown to the level where a survey of initiatives and the sub-themes they focus on is of interest. By surveying the project landscape, identifying common themes, and building a shared vocabulary to describe them, we can both communicate the relevance of the new discipline to the general public more easily and make it easier for new members of the community to find the most promising collaboration partners. This talk presents the methodology and initial findings of the initial survey of environmental computing initiatives and organisations, as well as approaches that could lead to an environmental computing compendium: a collaboratively maintained shared resource of the environmental computing community.

  13. Test of the Center for Automated Processing of Hardwoods' Auto-Image Detection and Computer-Based Grading and Cutup System

    Science.gov (United States)

    Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...

  14. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of the different approaches and discuss their implications for the effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
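    To make approach (iv) concrete, here is a purely illustrative Python fragment showing what embedding a simulation specification in the host language can look like. The class and method names are invented for this sketch; they are not the actual interface of OOMMF or of the domain specific language described in the abstract.

      class EnergyTerm:
          """One term of the micromagnetic energy, e.g. exchange or demagnetisation."""
          def __init__(self, name, **params):
              self.name, self.params = name, params

      class Simulation:
          """Collects a problem specification, then would hand it to a backend such as OOMMF."""
          def __init__(self, mesh, Ms):
              self.mesh, self.Ms, self.terms = mesh, Ms, []

          def add(self, term):
              self.terms.append(term)
              return self  # allow chaining, one convenience of an embedded DSL

          def to_backend_script(self):
              lines = [f"mesh={self.mesh} Ms={self.Ms}"]
              lines += [f"{t.name}: {t.params}" for t in self.terms]
              return "\n".join(lines)  # a real backend would emit an OOMMF problem file here

      sim = Simulation(mesh=(100e-9, 100e-9, 5e-9), Ms=8e5)
      sim.add(EnergyTerm("exchange", A=1.3e-11)).add(EnergyTerm("demag"))
      print(sim.to_backend_script())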

  15. An Integrated Bioinformatics and Computational Biology Approach Identifies New BH3-Only Protein Candidates.

    Science.gov (United States)

    Hawley, Robert G; Chen, Yuzhong; Riz, Irene; Zeng, Chen

    2012-05-04

    In this study, we utilized an integrated bioinformatics and computational biology approach in search of new BH3-only proteins belonging to the BCL2 family of apoptotic regulators. The BH3 (BCL2 homology 3) domain mediates specific binding interactions among various BCL2 family members. It is composed of an amphipathic α-helical region of approximately 13 residues that has only a few amino acids that are highly conserved across all members. Using a generalized motif, we performed a genome-wide search for novel BH3-containing proteins in the NCBI Consensus Coding Sequence (CCDS) database. In addition to known pro-apoptotic BH3-only proteins, 197 proteins were recovered that satisfied the search criteria. These were categorized according to α-helical content and predictive binding to BCL-xL (encoded by BCL2L1) and MCL-1, two representative anti-apoptotic BCL2 family members, using position-specific scoring matrix models. Notably, the list is enriched for proteins associated with autophagy as well as a broad spectrum of cellular stress responses such as endoplasmic reticulum stress, oxidative stress, antiviral defense, and the DNA damage response. Several potential novel BH3-containing proteins are highlighted. In particular, the analysis strongly suggests that the apoptosis inhibitor and DNA damage response regulator, AVEN, which was originally isolated as a BCL-xL-interacting protein, is a functional BH3-only protein representing a distinct subclass of BCL2 family members.
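    The motif-driven search described above can be pictured with a much-simplified scan. The regular expression below is a stand-in for the authors' generalized motif and PSSM scoring: it only encodes the frequently cited L-x-x-x-G-D/E core of the BH3 domain with a hydrophobic first position, so treat it as an assumption for illustration.

      import re

      # Simplified BH3-like core: hydrophobic residue, any three, then G and D/E.
      BH3_LIKE = re.compile(r"[LIVMF].{3}G[DE]")

      proteins = {
          "BIM_BH3_region": "DMRPEIWIAQELRRIGDEFNAYYARR",  # canonical BH3-only protein BIM
          "random_seq": "MKTAYIAKQRQISFVKSHFSRQLEERLG",
      }

      for name, seq in proteins.items():
          for m in BH3_LIKE.finditer(seq):
              print(f"{name}: candidate BH3 core at {m.start()}-{m.end()}: {m.group()}")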

  16. Computer Architecture for Energy Efficient SFQ

    Science.gov (United States)

    2014-08-27

    IBM Corporation (T.J. Watson Research Laboratory), Yorktown Heights, NY. ABSTRACT: ...accomplished during this ARO-sponsored project at IBM Research to identify and model an energy efficient SFQ-based computer architecture. The ... IBM Windsor Blue (WB), illustrated schematically in Figure 2. The basic building block of WB is a "tile" comprised of a 64-bit arithmetic logic unit

  17. Identifying the most infectious lesions in pulmonary tuberculosis by high-resolution multi-detector computed tomography

    International Nuclear Information System (INIS)

    Yeh, Jun Jun; Chen, Solomon Chih-Cheng; Teng, Wen-Bao; Chou, Chun-Hsiung; Hsieh, Shih-Peng; Lee, Tsung-Lung; Wu, Ming-Ting

    2010-01-01

    This study aimed to determine whether characteristics detected by multi-detector computed tomography (MDCT) were predictive of highly infectious, smear-positive, active pulmonary tuberculosis (PTB). Among 124 patients with active PTB, 84 had positive (group 1) and 40 had negative (group 2) smear results for acid-fast bacilli. Multiplanar MDCT, axial conventional CT and chest X-ray images were analysed retrospectively for morphology, number, and segmental (lobe) distribution of lesions. By multivariate analysis, consolidation over any segment of the upper, middle, or lingual lobes, cavitations, and clusters of nodules were associated with group 1, while centrilobular nodules were predictive of group 2. Using five independent variables associated with risk in group 1, a prediction model was created to distinguish between group 1 and group 2. ROC curve analysis showed an area under the curve of 0.951 ± 0.021 for this prediction model. With the ideal cutoff point score of 1, the sensitivity, specificity, and positive predictive values were 84.5%, 97.5%, and 98.0%, respectively. A model to predict smear-positive active PTB on the basis of findings from MDCT may be a useful tool for clinical decisions about isolating patients pending sputum smear results. (orig.)

  18. Identifying the most infectious lesions in pulmonary tuberculosis by high-resolution multi-detector computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, Jun Jun [Pingtung Christian Hospital, Pingtung (China); Mei-Ho Institute of Technology, Pingtung (China); China Medical University, Taichung (China); Chen, Solomon Chih-Cheng [Pingtung Christian Hospital, Pingtung (China); National Taiwan University, Institute of Occupational Medicine and Industrial Hygiene, College of Public Health, Taipei (China); Teng, Wen-Bao; Chou, Chun-Hsiung; Hsieh, Shih-Peng; Lee, Tsung-Lung [Pingtung Christian Hospital, Pingtung (China); Wu, Ming-Ting [National Yang Ming University, Faculty of Medicine, School of Medicine, Taipei (China); Kaohsiung Veterans General Hospital, Section of Thoracic and Circulation Imaging, Department of Radiology, Kaohsiung (China)

    2010-09-15

    This study aimed to determine whether characteristics detected by multi-detector computed tomography (MDCT) were predictive of highly infectious, smear-positive, active pulmonary tuberculosis (PTB). Among 124 patients with active PTB, 84 had positive (group 1) and 40 had negative (group 2) smear results for acid-fast bacilli. Multiplanar MDCT, axial conventional CT and chest X-ray images were analysed retrospectively for morphology, number, and segmental (lobe) distribution of lesions. By multivariate analysis, consolidation over any segment of the upper, middle, or lingual lobes, cavitations, and clusters of nodules were associated with group 1, while centrilobular nodules were predictive of group 2. Using five independent variables associated with risk in group 1, a prediction model was created to distinguish between group 1 and group 2. ROC curve analysis showed an area under the curve of 0.951 {+-} 0.021 for this prediction model. With the ideal cutoff point score of 1, the sensitivity, specificity, and positive predictive values were 84.5%, 97.5%, and 98.0%, respectively. A model to predict smear-positive active PTB on the basis of findings from MDCT may be a useful tool for clinical decisions about isolating patients pending sputum smear results. (orig.)

  19. Computational and experimental analysis identified 6-diazo-5-oxonorleucine as a potential agent for treating infection by Plasmodium falciparum.

    Science.gov (United States)

    Plaimas, Kitiporn; Wang, Yulin; Rotimi, Solomon O; Olasehinde, Grace; Fatumo, Segun; Lanzer, Michael; Adebiyi, Ezekiel; König, Rainer

    2013-12-01

    Plasmodium falciparum (PF) is the most severe malaria parasite. It is quickly developing resistance to existing drugs, making it indispensable to discover new drugs. Effective drugs have been discovered targeting metabolic enzymes of the parasite. In order to predict new drug targets, computational methods can be used employing database information on metabolism. Using these data, we recently performed a computational network analysis of the metabolism of PF. We analyzed the topology of the network to find reactions which are sensitive against perturbations, i.e., when a single enzyme is blocked by drugs. We now used a refined network comprising also the host enzymes, which led to a refined set of five targets: glutamyl-tRNA (gln) amidotransferase, hydroxyethylthiazole kinase, deoxyribose-phosphate aldolase, pseudouridylate synthase, and deoxyhypusine synthase. It was shown elsewhere that glutamyl-tRNA (gln) amidotransferase of other microorganisms can be inhibited by 6-diazo-5-oxonorleucine. Performing a half maximal inhibitory concentration (IC50) assay, we showed that 6-diazo-5-oxonorleucine also severely affects the viability of PF in blood plasma of the human host. We confirmed this by an in vivo study observing Plasmodium berghei infected mice. Copyright © 2013 Elsevier B.V. All rights reserved.
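    The IC50 readout mentioned in this abstract is conventionally obtained by fitting a dose-response curve. The following sketch fits a four-parameter logistic (Hill) model to synthetic viability data; the numbers are invented for illustration and are not the study's measurements for 6-diazo-5-oxonorleucine.

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, top, bottom, ic50, slope):
          """Four-parameter logistic dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

      conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])        # drug concentration (synthetic)
      viability = np.array([0.98, 0.92, 0.55, 0.12, 0.03])  # parasite viability (synthetic)

      params, _ = curve_fit(hill, conc, viability, p0=[1.0, 0.0, 1.0, 1.0])
      print(f"fitted IC50 ~ {params[2]:.2f} (same units as conc)")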

  20. Analysis and Modeling of Social Influence in High Performance Computing Workloads

    KAUST Repository

    Zheng, Shuai

    2011-06-01

    High Performance Computing (HPC) is becoming a common tool in many research areas. Social influence (e.g., project collaboration) among the growing population of HPC users creates bursty behavior in the underlying workloads. This bursty behavior is increasingly common with the advent of grid computing and cloud computing. Mining this bursty user behavior is important for HPC workload prediction and scheduling, which has a direct impact on overall HPC computing performance. A representative work in this area is the Mixed User Group Model (MUGM), which clusters users according to the resource demand features of their submissions, such as duration time and parallelism. However, MUGM has some difficulties when implemented in a real-world system. First, representing user behaviors by the features of their resource demand is usually difficult. Second, these features are not always available. Third, measuring the similarities among users is not a well-defined problem. In this work, we propose a Social Influence Model (SIM) to identify, analyze, and quantify the level of social influence across HPC users. The advantage of the SIM model is that it finds HPC communities by analyzing user job submission times, thereby avoiding the difficulties of MUGM. An offline algorithm and a fast-converging, computationally-efficient online learning algorithm for identifying social groups are proposed. Both offline and online algorithms are applied to several HPC and grid workloads, including Grid 5000, EGEE 2005 and 2007, and KAUST Supercomputing Lab (KSL) BGP data. From the experimental results, we show the existence of a social graph, which is characterized by a pattern of dominant users and followers. In order to evaluate the effectiveness of the identified user groups, we show that the pattern discovered by the offline algorithm follows a power-law distribution, which is consistent with those observed in mainstream social networks. We finally conclude the thesis and discuss future directions of our work.

  1. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would fit best to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  2. The Role of Computer-Aided Instruction in Science Courses and the Relevant Misconceptions of Pre-Service Teachers

    Science.gov (United States)

    Aksakalli, Ayhan; Turgut, Umit; Salar, Riza

    2016-01-01

    This research aims to investigate the ways in which pre-service physics teachers interact with computers, which, as an indispensable means of today's technology, are of major value in education and training, and to identify any misconceptions said teachers may have about computer-aided instruction. As part of the study, computer-based physics…

  3. 'I'm good, but not that good': digitally-skilled young people's identity in computing

    Science.gov (United States)

    Wong, Billy

    2016-12-01

    Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their views and aspirations in computing, with a focus on the identities and discourses that these youngsters articulate in relation to this field. Our findings suggest that, even among digitally skilled young people, traditional identities of computing as people who are clever but antisocial still prevail, which can be unattractive for youths, especially girls. Digitally skilled youths identify with computing in different ways and for different reasons. Most enjoy doing computing but few aspired to being a computer person. Implications of our findings for computing education are discussed especially the continued need to broaden identities in computing, even for the digitally skilled.

  4. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  5. Quantum computing with defects.

    Science.gov (United States)

    Weber, J R; Koehl, W F; Varley, J B; Janotti, A; Buckley, B B; Van de Walle, C G; Awschalom, D D

    2010-05-11

    Identifying and designing physical systems for use as qubits, the basic units of quantum information, are critical steps in the development of a quantum computer. Among the possibilities in the solid state, a defect in diamond known as the nitrogen-vacancy (NV(-1)) center stands out for its robustness--its quantum state can be initialized, manipulated, and measured with high fidelity at room temperature. Here we describe how to systematically identify other deep center defects with similar quantum-mechanical properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate defect systems. To illustrate these points in detail, we compare electronic structure calculations of the NV(-1) center in diamond with those of several deep centers in 4H silicon carbide (SiC). We then discuss the proposed criteria for similar defects in other tetrahedrally coordinated semiconductors.

  6. Positron emission computed tomography

    International Nuclear Information System (INIS)

    Grover, M.; Schelbert, H.R.

    1985-01-01

    Regional myocardial blood flow and substrate metabolism can be non-invasively evaluated and quantified with positron emission computed tomography (Positron-CT). Tracers of exogenous glucose utilization and fatty acid metabolism are available and have been extensively tested. Specific tracer kinetic models have been developed or are being tested so that glucose and fatty acid metabolism can be measured quantitatively by Positron-CT. Tracers of amino acid and oxygen metabolism are utilized in Positron-CT studies of the brain, and the development of such tracers for cardiac studies is in progress. Methods to quantify regional myocardial blood flow are also being developed. Previous studies have demonstrated the ability of Positron-CT to document myocardial infarction. Experimental and clinical studies have begun to identify metabolic markers of reversibly ischemic myocardium. The potential of Positron-CT to reliably detect potentially salvageable myocardium and, hence, to identify appropriate therapeutic interventions is one of the most exciting applications of the technique.

  7. Regional Platform on Personal Computer Electronic Waste in Latin ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Regional Platform on Personal Computer Electronic Waste in Latin America and the Caribbean. Donation of ... This project aims to identify environmentally responsible and sustainable solutions to the problem of e-waste.

  8. Management and Valorization of Electronic and Computer Wastes in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine the issue of electronic and computer waste and its management, and endeavor to identify feasible and sustainable strategies for ...

  9. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can give rise to numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithms, to design different autonomous systems that can adapt and respond to environmental conditions.

  10. Matrine Is Identified as a Novel Macropinocytosis Inducer by a Network Target Approach

    Directory of Open Access Journals (Sweden)

    Bo Zhang

    2018-01-01

    Comprehensively understanding the pharmacological functions of natural products is a key issue to be addressed for the discovery of new drugs. Unlike some single-target drugs, natural products always exert diverse therapeutic effects through acting on a "network" that consists of multiple targets, making it necessary to develop a systematic approach, e.g., network pharmacology, to reveal the pharmacological functions of natural products and infer their mechanisms of action. In this work, to identify the "network target" of a natural product, we perform a functional analysis of matrine, a drug marketed in China and extracted from the medicinal herb Ku-Shen (Radix Sophorae Flavescentis). Here, the network target of matrine was first predicted by drugCIPHER, a genome-wide target prediction method. Based on the network target of matrine, we performed a functional gene set enrichment analysis to computationally identify the potential pharmacological functions of matrine, most of which are supported by literature evidence, including the neurotoxicity and neuropharmacological activities of matrine. Furthermore, computational results demonstrated that matrine has the potential to induce macropinocytosis and to regulate ATP metabolism. Our experimental data revealed that the large vesicles induced by matrine are consistent with the typical characteristics of macropinosomes. Our verification results also suggested that matrine could decrease cellular ATP levels. These findings demonstrate the availability and effectiveness of the network target strategy for identifying the comprehensive pharmacological functions of natural products.

  11. Explorations in computing an introduction to computer science

    CERN Document Server

    Conery, John S

    2010-01-01

    Introduction: Computation; The Limits of Computation; Algorithms; A Laboratory for Computational Experiments. The Ruby Workbench (introducing Ruby and the RubyLabs environment for computational experiments): Interactive Ruby; Numbers; Variables; Methods; RubyLabs. The Sieve of Eratosthenes (an algorithm for finding prime numbers): The Sieve Algorithm; The mod Operator; Containers; Iterators; Boolean Values and the delete_if Method; Exploring the Algorithm; The sieve Method; A Better Sieve; Experiments with the Sieve. A Journey of a Thousand Miles (iteration as a strategy for solving computational problems): Searching and Sortin
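
    The sieve chapter's central idea translates directly into a few lines of code. The sketch below is in Python rather than the book's Ruby, using a list comprehension where the book uses delete_if with the mod operator.

      def sieve(n):
          """Return all primes <= n with the Sieve of Eratosthenes."""
          candidates = list(range(2, n + 1))
          primes = []
          while candidates:
              p = candidates[0]                  # smallest survivor is prime
              primes.append(p)
              # remove every multiple of p, as the book does with delete_if
              candidates = [x for x in candidates if x % p != 0]
          return primes

      print(sieve(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]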

  12. Performance Measurements in a High Throughput Computing Environment

    CERN Document Server

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need for computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where the utilisation efficiency needs to be improved. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics by executing benchmark applications on computing resources. The model-based approach, instead, involves the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...
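
    As a toy illustration of the measurement-based approach, the sketch below times a benchmark workload over several repetitions and reports summary statistics; the workload and repetition count are placeholders, not the benchmarks used at CERN.

      import statistics
      import time

      def benchmark(workload, repetitions=5):
          """Execute a workload several times and summarise its wall-clock time."""
          samples = []
          for _ in range(repetitions):
              start = time.perf_counter()
              workload()
              samples.append(time.perf_counter() - start)
          return statistics.mean(samples), statistics.stdev(samples)

      # placeholder CPU-bound workload standing in for a real benchmark application
      mean, stdev = benchmark(lambda: sum(i * i for i in range(1_000_000)))
      print(f"mean {mean:.4f}s, stdev {stdev:.4f}s")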

  13. EXPERIENCE OF USING CLOUD COMPUTING IN NETWORK PRODUCTS FOR SCHOOL EDUCATION

    Directory of Open Access Journals (Sweden)

    L. Sokolova

    2011-05-01

    Full Text Available We study data on the use of educational sites in the middle and secondary school grades, and their influence on the formation of students' information culture and their level of training. The sites use Google's "cloud computing" technology: they are accessible from any internet-connected computer and do not require the use of resources of the computer itself. The sites are free of advertising and do not require periodic backup, protection, or routine operation by a system administrator. This simplifies their use in the educational process for schools of different levels. A statistical analysis of site usage was carried out, and the main trends in their use were identified.

  14. Examining Student Opinions on Computer Use Based on the Learning Styles in Mathematics Education

    Science.gov (United States)

    Ozgen, Kemal; Bindak, Recep

    2012-01-01

    The purpose of this study is to identify the opinions of high school students, who have different learning styles, related to computer use in mathematics education. High school students' opinions on computer use in mathematics education were collected with both qualitative and quantitative approaches in the study conducted with a survey model. For…

  15. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: absolute stable region, valid region, and invalid region. Secondly, while identifying the chatter stability lobes, the three different regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to obtain the exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method. It takes only about 10 minutes to obtain the exact chatter stability lobes. Since the proposed method is based on the discretization method, it can be used for different immersion cutting conditions, including low-immersion cutting, and can therefore be directly implemented in the workshop to improve the efficiency of machining parameter selection.
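
    The coarse-to-fine idea can be illustrated with a simplified sketch: classify cheaply where possible and spend fine-grained effort only near the stability boundary. The bisection-in-depth scheme and the toy is_stable test below are stand-ins; the paper's actual three-region classification over the full discretization method is more elaborate.

      import numpy as np

      def is_stable(speed, depth):
          """Placeholder stability test; a real one solves the delayed
          milling dynamics via the discretization method."""
          return depth < 1.0 + 0.5 * np.sin(speed / 500.0)   # toy lobe shape

      def boundary(speeds, depth_lo=0.0, depth_hi=3.0, tol=1e-3):
          """For each spindle speed, bisect in cutting depth to locate the
          stability limit, instead of testing a full fine grid of depths."""
          limits = []
          for s in speeds:
              lo, hi = depth_lo, depth_hi
              while hi - lo > tol:
                  mid = 0.5 * (lo + hi)
                  lo, hi = (mid, hi) if is_stable(s, mid) else (lo, mid)
              limits.append(lo)
          return limits

      print(boundary(np.linspace(2000, 10000, 5)))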

  16. Reducing uncertainty at minimal cost: a method to identify important input parameters and prioritize data collection

    NARCIS (Netherlands)

    Uwizeye, U.A.; Groen, E.A.; Gerber, P.J.; Schulte, Rogier P.O.; Boer, de I.J.M.

    2016-01-01

    The study aims to illustrate a method to identify important input parameters that explain most of the output variance of environmental assessment models. The method is tested for the computation of life-cycle nitrogen (N) use efficiency indicators among mixed dairy production systems in Rwanda. We
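
    One standard way to carry out such an importance ranking is via squared standardized regression coefficients (SRC^2) computed from a Monte Carlo sample, as sketched below; the three-input linear model is a hypothetical stand-in for the nitrogen-use-efficiency model in the study.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      X = rng.normal(size=(n, 3))          # three uncertain input parameters
      # hypothetical model: input 0 drives most of the output variance
      y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2] \
          + rng.normal(scale=0.1, size=n)

      # linear regression with an intercept column
      beta, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(n)]), y, rcond=None)

      # SRC_i = b_i * std(x_i) / std(y); SRC_i^2 approximates the share of
      # output variance explained by input i (for near-linear models)
      src2 = (beta[:3] * X.std(axis=0) / y.std()) ** 2
      print(dict(enumerate(np.round(src2, 3))))   # input 0 dominates

    Inputs with the largest SRC^2 are the ones where better data collection buys the biggest reduction in output uncertainty, which is the prioritization logic the abstract describes.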

  17. Computer Networking Laboratory for Undergraduate Computer Technology Program

    National Research Council Canada - National Science Library

    Naghedolfeizi, Masoud

    2000-01-01

    ...) To improve the quality of education in the existing courses related to computer networks and data communications, as well as other computer science courses such as programming languages and computer...

  18. Mathematics, Physics and Computer Sciences The computation of ...

    African Journals Online (AJOL)

    Mathematics, Physics and Computer Sciences: The computation of system matrices for biquadratic square finite elements. Global Journal of Pure and Applied Sciences.

  19. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  20. An Exploratory Study of the Implementation of Computer Technology in an American Islamic Private School

    Science.gov (United States)

    Saleem, Mohammed M.

    2009-01-01

    This exploratory study of the implementation of computer technology in an American Islamic private school leveraged the case study methodology and ethnographic methods informed by symbolic interactionism and the framework of the Muslim Diaspora. The study focused on describing the implementation of computer technology and identifying the…