WorldWideScience

Sample records for computationally identified trifoliate

  1. MiR-RACE, a new efficient approach to determine the precise sequences of computationally identified trifoliate orange (Poncirus trifoliata) microRNAs.

    Directory of Open Access Journals (Sweden)

    Changnian Song

    Full Text Available BACKGROUND: Among the hundreds of reported miRNA-encoding genes in plants, many more have been predicted by computational methods. However, unlike protein-coding genes, which are defined by start and stop codons, miRNA molecules have no terminal features that define the mature miRNA exactly, so computational prediction methods often cannot locate the mature miRNA within a precursor with nucleotide-level precision. To our knowledge, no comprehensive strategy for determining the precise sequences, especially the two termini, of these miRNAs has been reported. METHODS: In this study, we report an efficient method to determine the precise sequences of computationally predicted microRNAs (miRNAs) that combines miRNA-enriched library preparation, two specific 5' and 3' miRNA RACE (miR-RACE) PCR reactions, and sequence-directed cloning, in which the most challenging step is the design of the two gene-specific primers for the two RACE reactions. Detection of miRNA-mediated mRNA cleavage by RLM-5' RACE and sequencing were carried out to validate the miRNAs detected. Real-time PCR was used to analyze the expression of each miRNA. RESULTS: The efficiency of this newly developed method was validated using nine trifoliate orange (Poncirus trifoliata) miRNAs predicted computationally. The computationally identified miRNAs were validated by miR-RACE and sequencing. Quantitative analysis showed that their expression varied. Eight target genes were experimentally verified by detection of miRNA-mediated mRNA cleavage in Poncirus trifoliata. CONCLUSION: The efficient and powerful approach developed herein can be successfully used to validate the sequences of miRNAs, especially their termini, which complete the miRNA sequence within the computationally predicted precursor.

  2. Identifying Causal Effects with Computer Algebra

    CERN Document Server

    García-Puente, Luis David; Sullivant, Seth

    2010-01-01

    The long-standing identification problem for causal effects in graphical models has many partial results but lacks a systematic study. We show how computer algebra can be used to prove that a causal effect is identifiable, generically identifiable, or not generically identifiable. We report on the results of our computations for linear structural equation models, where we determine precisely which causal effects are generically identifiable for all graphs on three and four vertices.
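
    The abstract above gives no code; as a hedged illustration of the computer-algebra idea for linear structural equation models, the sketch below (Python with sympy) solves the covariance equations of the classic instrumental-variable graph z -> x -> y with latent confounding between x and y, and recovers the causal coefficient generically. The model, symbols, and equations are illustrative assumptions, not the authors' software.

```python
import sympy as sp

# Path coefficients: b (z -> x), c (x -> y); e is the latent covariance
# between the error terms of x and y (confounding). Unit-variance errors
# and a unit-variance z are assumed for simplicity.
b, c, e = sp.symbols('b c e')
s_zx, s_zy, s_xy = sp.symbols('sigma_zx sigma_zy sigma_xy')

# Covariance equations implied by the model
eqs = [sp.Eq(s_zx, b),
       sp.Eq(s_zy, b * c),
       sp.Eq(s_xy, c * (b**2 + 1) + e)]

sol = sp.solve(eqs, [b, c, e], dict=True)
print(sol[0][c])   # sigma_zy/sigma_zx: c is generically identifiable (b != 0)
```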

  3. Effect of delayed harvesting and pre-treatment methods on the antinutritional contents of trifoliate yam flour.

    Science.gov (United States)

    Abiodun, Olufunmilola Adunni; Akinoso, Rahman

    2014-03-01

    Effects of delayed harvesting and pre-treatment methods on the anti-nutritional contents of trifoliate yam flour were examined. Trifoliate yam tubers were washed, peeled, sliced and subjected to pre-treatment methods such as soaking, pre-cooking and blanching/soaking. The phenol, phytate, oxalate, tannin and alkaloid profiles of the flours were evaluated; the phenol, tannin, oxalate and phytate contents were 0.02-0.32, 0.04-0.53, 0.11-4.32 and 0.20-1.05 mg/100 g, respectively. The predominant alkaloids in trifoliate yam flour were dioscorine and dihydrodioscorine. The white trifoliate yam flour had higher levels of anti-nutrients than the yellow trifoliate yam flour. Alkaloid contents of trifoliate yam flour increased slightly with delayed harvesting periods. The blanching/soaking method reduced the anti-nutrient contents of trifoliate yam flour more drastically than the other methods. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Identification of flowering-related genes between early flowering trifoliate orange mutant and wild-type trifoliate orange (Poncirus trifoliata L. Raf.) by suppression subtraction hybridization (SSH) and macroarray.

    Science.gov (United States)

    Zhang, Jin-Zhi; Li, Zhi-Min; Yao, Jia-Ling; Hu, Chun-Gen

    2009-02-01

    To gain a better understanding of gene expression in the early flowering trifoliate orange mutant (precocious trifoliate orange, Poncirus trifoliata L. Raf.), we performed suppression subtractive hybridization, which allowed identification of flowering-related genes in the mutant and the wild type in the juvenile phase. Using macroarray analysis, we identified 125 and 149 non-redundant expressed sequence tags (ESTs) in the forward-subtracted and reverse-subtracted libraries, respectively. These cDNAs covered a broad repertoire of flowering-development-related genes, providing helpful information for understanding the genetic mechanisms underlying the signaling and regulation of the transition from the vegetative to the reproductive phase. We investigated the temporal and spatial expression patterns of some SSH-enriched flowering-related genes in the mutant and the wild type. Of these, three genes (BARELY ANY MERISTEM, FLOWERING LOCUS T and TERMINAL FLOWER1) encoding proteins previously reported to be associated with, or involved in, developmental processes in other species were identified and further investigated by in situ hybridization. Specific spatial and/or temporal patterns were detected, and differences were observed between the mutant and the wild type during flower development. Meanwhile, the temporal expression of these genes was further examined by real-time PCR; the results showed that FT and BAM transcripts accumulated to higher levels and TFL1 transcripts to lower levels in mutant juvenile tissues relative to wild-type juvenile tissues. In the adult stage, FT, BAM and TFL1 expression patterns were closely correlated with flowering development, suggesting that these three genes may play a critical role in the early flowering process of precocious trifoliate orange.

  5. Mycorrhiza alters the profile of root hairs in trifoliate orange.

    Science.gov (United States)

    Wu, Qiang-Sheng; Liu, Chun-Yan; Zhang, De-Jian; Zou, Ying-Ning; He, Xin-Hua; Wu, Qing-Hua

    2016-04-01

    Root hairs and arbuscular mycorrhiza (AM) coexist in root systems for nutrient and water absorption, but the relation between AM and root hairs is poorly known. A pot study was performed to evaluate the effects of four different AM fungi (AMF), namely, Claroideoglomus etunicatum, Diversispora versiformis, Funneliformis mosseae, and Rhizophagus intraradices on root hair development in trifoliate orange (Poncirus trifoliata) seedlings grown in sand. Mycorrhizal seedlings showed significantly higher root hair density than non-mycorrhizal seedlings, irrespective of AMF species. AMF inoculation generally significantly decreased root hair length in the first- and second-order lateral roots but increased it in the third- and fourth-order lateral roots. AMF colonization induced diverse responses in root hair diameter of different order lateral roots. Considerably greater concentrations of phosphorus (P), nitric oxide (NO), glucose, sucrose, indole-3-acetic acid (IAA), and methyl jasmonate (MeJA) were found in roots of AM seedlings than in non-AM seedlings. Levels of P, NO, carbohydrates, IAA, and MeJA in roots were correlated with AM formation and root hair development. These results suggest that AMF could alter the profile of root hairs in trifoliate orange through modulation of physiological activities. F. mosseae, which had the greatest positive effects, could represent an efficient AM fungus for increasing fruit yields or decreasing fertilizer inputs in citrus production.

  6. Effect of harvesting periods on the chemical and pasting properties of trifoliate yam flour.

    Science.gov (United States)

    Abiodun, O A; Akinoso, R

    2014-01-01

    The effects of delayed harvesting on the chemical and pasting properties of trifoliate yam flour were studied. The tubers were harvested at 7, 8, 9, 10 and 11 months after maturity and were processed into flours. Chemical and pasting properties of the flours were determined. White trifoliate yam flour at 11 months was significantly different (p < 0.05) from yellow trifoliate yam flour at 11 months. Amylose and starch contents decreased while sugar contents increased with harvesting period. Yellow trifoliate yam flour had higher amylose at 10 months, while white trifoliate yam flour had higher starch at 9 months and higher sugar contents at 11 months. Potassium and sodium were the major minerals found in the yam, with higher values in yellow trifoliate yam flours. Peak viscosity and breakdown decreased while holding strength and final viscosities increased with harvesting period. Harvesting trifoliate yam tubers at 7-9 months produced high-quality flour and prevented post-harvest losses. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Deep sequencing discovery of novel and conserved microRNAs in trifoliate orange (Citrus trifoliata)

    Directory of Open Access Journals (Sweden)

    Yu Huaping

    2010-07-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) play a critical role in post-transcriptional gene regulation and have been shown to control many genes involved in various biological and metabolic processes. There have been extensive studies to discover miRNAs and analyze their functions in model plant species, such as Arabidopsis and rice. Deep sequencing technologies have facilitated identification of species-specific or lowly expressed as well as conserved or highly expressed miRNAs in plants. Results In this research, we used Solexa sequencing to discover new microRNAs in trifoliate orange (Citrus trifoliata), an important rootstock of citrus. A total of 13,106,753 reads representing 4,876,395 distinct sequences were obtained from a short RNA library generated from small RNA extracted from C. trifoliata flower and fruit tissues. Based on sequence similarity and hairpin structure prediction, we found that 156,639 reads, representing 63 sequences from 42 highly conserved miRNA families, have perfect matches to known miRNAs. We also identified 10 novel miRNA candidates whose precursors were all potentially generated from citrus ESTs. In addition, five miRNA* sequences were also sequenced. These sequences had not been described earlier in other plant species, and accumulation of the 10 novel miRNAs was confirmed by qRT-PCR analysis. Potential target genes were predicted for most conserved and novel miRNAs. Moreover, four target genes, including one encoding an IRX12 copper ion binding/oxidoreductase and three encoding NB-LRR disease resistance proteins, have been experimentally verified by detection of the miRNA-mediated mRNA cleavage in C. trifoliata. Conclusion Deep sequencing of short RNAs from C. trifoliata flowers and fruits identified 10 new potential miRNAs and 42 highly conserved miRNA families, indicating that specific miRNAs exist in C. trifoliata. These results show that regulatory miRNAs exist in this agronomically important trifoliate orange.

  8. Genome-wide screening and characterization of long non-coding RNAs involved in flowering development of trifoliate orange (Poncirus trifoliata L. Raf.)

    Science.gov (United States)

    Wang, Chen-Yang; Liu, Sheng-Rui; Zhang, Xiao-Yu; Ma, Yu-Jiao; Hu, Chun-Gen; Zhang, Jin-Zhi

    2017-01-01

    Long non-coding RNAs (lncRNAs) have been demonstrated to play critical regulatory roles in post-transcriptional and transcriptional regulation in Arabidopsis. However, lncRNAs and their functional roles remain poorly characterized in woody plants, including citrus. To identify lncRNAs and investigate their role in citrus flowering, paired-end strand-specific RNA sequencing was performed for precocious trifoliate orange and its wild-type counterpart. A total of 6,584 potential lncRNAs were identified, 51.6% of which were from intergenic regions. Additionally, 555 lncRNAs were significantly up-regulated and 276 lncRNAs were down-regulated in precocious trifoliate orange, indicating that lncRNAs could be involved in the regulation of trifoliate orange flowering. Comparisons between lncRNAs and coding genes indicated that lncRNAs tend to have shorter transcripts and lower expression levels and that they display significant expression specificity. More importantly, 59 and 7 lncRNAs were identified as putative targets and target mimics of citrus miRNAs, respectively. In addition, the targets of Pt-miR156 and Pt-miR396 were confirmed using the regional amplification reverse-transcription polymerase chain reaction method. Furthermore, overexpression of Pt-miR156a1 in Arabidopsis resulted in an extended juvenile phase, short siliques, and smaller leaves in transgenic plants compared with control plants. These findings provide important insights regarding citrus lncRNAs, enabling in-depth functional analyses. PMID:28233798

  9. Textural and sensory properties of trifoliate yam (Dioscorea dumetorum) flour and stiff dough 'amala'.

    Science.gov (United States)

    Abiodun, O A; Akinoso, R

    2015-05-01

    The use of trifoliate yam (Dioscorea dumetorum) flour for stiff dough 'amala' production is one of the ways to curb under-utilization of the tuber. This study evaluated the textural and sensory properties of trifoliate yam flour and stiff dough. Freshly harvested trifoliate yam tubers were peeled, washed, sliced and blanched (60 °C for 10 min). The yam slices were soaked in water for 12 h, dried and milled into flour. Pasting viscosities, functional properties, brown index and sensory attributes of the flour and stiff dough were analyzed. Peak, holding strength and final viscosities ranged from 84.09 to 213.33 RVU, 81.25 to 157.00 RVU and 127.58 to 236.17 RVU, respectively. White raw flour had higher viscosity than the yellow flours. The swelling index, water absorption capacity and bulk density ranged from 1.46 to 2.28, 2.11 to 2.92 ml H2O/g and 0.71 to 0.88 g/cm³, respectively. The blanching method employed improved the swelling index and water absorption capacity of the flour. The brown index values of the flour and stiff dough ranged from 6.73 to 18.36 and 14.63 to 46.72, respectively. Sensory evaluation revealed significant differences in the colour, odour and general acceptability of the product when compared with stiff dough from white yam.

  10. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set, embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and individually testing the potential problem nodes and the links to them.
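
    The patent abstract specifies the control flow but not the formula for the test value, so the Python sketch below fills that in with an illustrative stand-in (measured I/O-node throughput scaled by its expected performance, times the average test-node throughput); only the threshold branching mirrors the abstract.

```python
# Hedged sketch: the test-value formula is an invented placeholder, not the
# patented metric; only the below/not-below branching follows the abstract.
def tree_test_value(io_node_perf, test_node_perfs, expected_io_perf):
    """Combine measured I/O-node and test-node performance into one score."""
    avg_test = sum(test_node_perfs) / len(test_node_perfs)
    return (io_node_perf / expected_io_perf) * avg_test

def check_processing_set(io_node_perf, test_node_perfs, expected_io_perf, threshold):
    value = tree_test_value(io_node_perf, test_node_perfs, expected_io_perf)
    if value < threshold:
        return "select another set of test compute nodes"
    return "test potential problem nodes and their links individually"

print(check_processing_set(0.8, [0.9, 0.7, 0.95], expected_io_perf=1.0,
                           threshold=0.75))
```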

  11. Effects of Exogenous Putrescine on Mycorrhiza, Root System Architecture, and Physiological Traits of Glomus mosseae-Colonized Trifoliate Orange Seedlings

    Directory of Open Access Journals (Sweden)

    Qiang-Sheng WU

    2012-11-01

    Full Text Available Putrescine (Put), one of the important polyamines (PAs), has been identified as a regulator of mycorrhizal development in citrus plants. The present study screened for an efficient concentration of Put application in the range of 0.05-1 mM on trifoliate orange (Poncirus trifoliata) seedlings colonized by Glomus mosseae, in terms of growth, root system architecture, and chlorophyll and carbohydrate contents. Compared with the non-Put treatment, all the Put treatments, especially 0.05 mM Put, significantly increased mycorrhizal colonization of the tap root as well as of the first-, second-, and third-order lateral roots. The mycorrhizal seedlings treated with 0.05, 0.1, and 1 mM Put showed greater growth (stem diameter, height, leaf number, and fresh mass), better root morphological properties (tap root length, projected and surface areas, and volume), and higher numbers of first-, second-, and third-order lateral roots. Chlorophyll a, total chlorophyll, and carotenoid contents of the seedlings were significantly increased by the Put treatments at 0.05-1 mM. Exogenous Put application in the range of 0.05-1 mM significantly decreased sucrose contents but increased glucose contents of leaves and roots. This study suggests that exogenous Put can significantly improve growth performance and root system architecture, besides changing physiological traits, of AMF-colonized seedlings, with the 0.05 mM concentration of Put showing the best effects.

  12. Computer Competency: A 7-Year Study to Identify Gaps in Student Computer Skills

    Science.gov (United States)

    Shuster, George F.; Pearl, Mona

    2011-01-01

    Computer competency is crucial to student success in higher education. Assessment of student knowledge related to specific computer competencies can provide faculty with important information about the strengths and weaknesses of their students' computer competency skills. The purpose of this study was to identify the competency level of two…

  13. Identifying and relating nurses' attitudes toward computer use.

    Science.gov (United States)

    Burkes, M

    1991-01-01

    The purpose of this study was to measure nurses' attitudes toward computer use based on an adaptation of Vroom's expectancy theory and to identify variables that may correlate with these attitudes. Content validity and reliability for internal consistency were determined for the attitude questionnaire developed. Nurses' individual characteristics and computer-use satisfaction, beliefs, and motivation were correlated. Data analysis revealed that nurses' attitudes were significantly related (satisfaction to beliefs, r = 0.783, p < 0.001; satisfaction to motivation, r = 0.598, p < 0.001; and beliefs to motivation, r = 0.651, p < 0.001), supporting the model based on Vroom's expectancy theory. Computer knowledge was significantly related to computer-use beliefs (r = 0.229, p < 0.05). Length of computer experience (r = -0.265, p < 0.05) and nursing experience (r = -0.239, p < 0.05) were negatively related to nurses' computer-use satisfaction.

  14. Identifying barriers for implementation of computer based nursing documentation.

    Science.gov (United States)

    Vollmer, Anna-Maria; Prokosch, Hans-Ulrich; Bürkle, Thomas

    2014-01-01

    This study was undertaken in the planning phase for the introduction of a comprehensive computer-based nursing documentation system at Erlangen University Hospital. There, we expected a wide range of difficult organizational changes, because the nurses neither used computer-based nursing documentation nor closely followed the nursing process model within their paper-based documentation. We were therefore eager to recognize potential pitfalls early and to identify potential barriers to digital nursing documentation. In a questionnaire study, we surveyed all German university hospitals for their experience with the implementation of computer-based nursing documentation. We received answers from 11 of the 23 hospitals. Furthermore, we performed a questionnaire study about expectations and fears among the nurses of four pilot wards of our hospital. Most respondents stated a positive attitude towards nursing process documentation, but many noted technical barriers (e.g., poor performance of the software) and organizational barriers (e.g., lack of time).

  15. Technological Properties of Wheat/Trifoliate Yam (Dioscorea dumetorum) Hardened Tubers Composite Flours

    Directory of Open Access Journals (Sweden)

    Véronique Josette Essa’a

    2015-01-01

    Full Text Available The ability of trifoliate hardened-yam flours to partially substitute for wheat flour in food formulations was assessed. Three varieties of hardened-yam flour were incorporated into wheat flour in proportions of 0, 10, 20, 30, 40, and 50% (w/w). Samples were evaluated for protein content, Zeleny sedimentation index, Hagberg falling number, functional properties (WAC, WSI, and OAC), and some rheological properties, including dough rupture pressure (P), extensibility (L), stability (P/L), and deformation energy (W). Results showed that trifoliate hardened-yam flours do not have acceptable baking properties, as reflected by the low Zeleny sedimentation index and the low Hagberg falling number. The protein quality (Zeleny index, 31) of wheat flour helped to compensate for the gluten deficit of the yam flours, but the amylasic activity determined by the Hagberg falling number could not be adjusted, which resulted in a loss of extensibility (L) of the paste at 10% substitution. Multivariate analysis of the experimental data grouped wheat flour and all wheat/kanwa-treated hardened-yam composite flours into one homogeneous cluster. Although the wheat/kanwa-treated hardened-yam composite flours had physicochemical and functional properties similar to wheat, their inadequate diastasic activity makes them inappropriate for bread making, marking the strong influence of that parameter.

  16. Effects of Common Mycorrhizal Network on Plant Carbohydrates and Soil Properties in Trifoliate Orange-White Clover Association.

    Directory of Open Access Journals (Sweden)

    Ze-Zhi Zhang

    Full Text Available Common mycorrhizal network (CMN) allows nutrients and signals to pass between two or more plants. In this study, trifoliate orange (Poncirus trifoliata) and white clover (Trifolium repens) were planted in a two-compartmented rootbox, separated by a 37-μm nylon mesh, and then inoculated with an arbuscular mycorrhizal fungus (AMF), Diversispora spurca. Inoculation with D. spurca resulted in formation of a CMN between trifoliate orange and white clover, whilst the best AM colonization occurred in the donor trifoliate orange-receptor white clover association. In the trifoliate orange-white clover association, the mycorrhizal colonization of the receptor plant by extraradical hyphae originating from the donor plant significantly increased shoot and root fresh weight and chlorophyll concentration of the receptor plant. Enzymatic activity of soil β-glucoside hydrolase, protease, acid and neutral phosphatase, water-stable aggregate percentage at 2-4 and 0.5-1 mm size, and mean weight diameter in the rhizosphere of the receptor plant also increased. The hyphae of the CMN released more easily-extractable glomalin-related soil protein and total glomalin-related soil protein into the receptor rhizosphere, which showed a significantly positive correlation with aggregate stability. AMF inoculation exhibited diverse changes in leaf and root sucrose concentration in the donor plant, and AM colonization by the CMN conferred a significant increase of root glucose in the receptor plant. These results suggested that a CMN formed in the trifoliate orange-white clover association and that root AM colonization by the CMN promoted plant growth, root glucose accumulation, and rhizospheric soil properties in the receptor plant.

  17. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Kristan D.; Faraj, Daniel

    2016-05-03

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.

  18. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-03-01

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.
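
    The two records above describe the plane-building procedure in claim language. Below is a minimal Python sketch of the same idea on a 2-D grid, treating a logical plane as an axis-aligned rectangle whose nodes all belong to the subcommunicator; the coordinates and membership set are illustrative, not the patented embodiment.

```python
# Enumerate logical planes containing `node`: scan the first dimension in its
# positive and negative directions, and grow each plane in the positive
# direction of the second dimension, per the abstract.
def logical_planes(node, members):
    """Return rectangles (as opposite-corner pairs) containing `node`."""
    x0, y0 = node
    planes = []
    for dx in (1, -1):                  # positive, then negative direction of dim 1
        w = 0
        while (x0 + dx * (w + 1), y0) in members:
            w += 1
        for width in range(1, w + 1):
            h = 0
            # grow in the positive direction of dim 2 while every node exists
            while all((x0 + dx * i, y0 + h + 1) in members
                      for i in range(width + 1)):
                h += 1
            for height in range(1, h + 1):
                lo_x = min(x0, x0 + dx * width)
                planes.append(((lo_x, y0), (lo_x + width, y0 + height)))
    return planes

members = {(x, y) for x in range(3) for y in range(3)} - {(2, 2)}
print(logical_planes((0, 0), members))
```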

  19. A comparison of computational methods for identifying virulence factors.

    Directory of Open Access Journals (Sweden)

    Lu-Lu Zheng

    Full Text Available Bacterial pathogens continue to threaten public health worldwide. Identification of bacterial virulence factors can help to find novel drug/vaccine targets against pathogenicity. It can also help to reveal the mechanisms of the related diseases at the molecular level. With the explosive growth in protein sequences generated in the postgenomic age, it is highly desirable to develop computational methods for rapidly and effectively identifying virulence factors from sequence information alone. In this study, based on the protein-protein interaction networks from the STRING database, a novel network-based method was proposed for identifying the virulence factors in the proteomes of UPEC 536, UPEC CFT073, P. aeruginosa PAO1, L. pneumophila Philadelphia 1, C. jejuni NCTC 11168 and M. tuberculosis H37Rv. Evaluated on the same benchmark datasets derived from the aforementioned species, the identification accuracies achieved by the network-based method were around 0.9, significantly higher than those of sequence-based methods such as BLAST, feature selection and VirulentPred. Further analysis showed that functional associations such as gene neighborhood and co-occurrence were the primary associations between these virulence factors in the STRING database. The high success rates indicate that the network-based method is quite promising: because it can easily be extended to other bacterial species as long as the relevant statistical data are available, it holds high potential for identifying virulence factors in many other organisms as well.
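
    As a hedged sketch of such a network-based predictor (the abstract does not give the paper's exact scoring rule), the snippet below scores proteins by confidence-weighted guilt-by-association over a STRING-style weighted interaction graph using networkx; the toy graph, weights, and seed set are invented.

```python
import networkx as nx

def virulence_scores(graph, known_virulent):
    """Score each protein by the confidence-weighted fraction of its
    neighbors that are known virulence factors (guilt by association)."""
    scores = {}
    for protein in graph.nodes:
        total, hits = 0.0, 0.0
        for nbr in graph.neighbors(protein):
            w = graph[protein][nbr].get('weight', 1.0)
            total += w
            if nbr in known_virulent:
                hits += w
        scores[protein] = hits / total if total else 0.0
    return scores

G = nx.Graph()
G.add_weighted_edges_from([('A', 'B', 0.9), ('B', 'C', 0.4), ('A', 'C', 0.7)])
print(virulence_scores(G, known_virulent={'A'}))
```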

  20. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at every position angle, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
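
    A proof-of-concept of the second tier (circle detection) can be written with OpenCV's circle Hough transform; the parameter values below are illustrative and would need per-survey tuning, and the first tier (pre-selecting massive galaxies with multiple blue companions) is assumed to have produced the cutout.

```python
import cv2
import numpy as np

def find_ring_candidates(cutout_gray):
    """Return (x, y, r) circles detected in a grayscale galaxy cutout."""
    blurred = cv2.GaussianBlur(cutout_gray, (5, 5), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=20, param1=100, param2=30,
                               minRadius=5, maxRadius=60)
    return [] if circles is None else np.round(circles[0]).astype(int)

img = np.zeros((128, 128), dtype=np.uint8)
cv2.circle(img, (64, 64), 30, 255, 2)        # synthetic "Einstein ring"
print(find_ring_candidates(img))
```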

  1. Identifying Nursing Computer Training Requirements using Web-based Assessment

    Directory of Open Access Journals (Sweden)

    Naser Ghazi

    2011-12-01

    Full Text Available Our work addresses inefficiency and ineffectiveness in training nurses in computer literacy by developing an adaptive questionnaire system. The system identifies the most effective training modules by evaluating applicants before and after training. Our system, the Systems Knowledge Assessment Tool (SKAT), aims to increase training proficiency, decrease training time, and reduce training costs by identifying, for each individual, the areas in which training is required and those in which it is not. Based on the project's requirements, a number of HTML documents were designed to be used as templates in the implementation stage. During this stage, the milestone principle was used, in which a series of coding and testing was performed to generate an error-free product. The decision-making process and its components, together with the priority of each attribute in the application, determine the required training for each applicant; the decision-making process is thus an essential aspect of system design and greatly affects an applicant's training results. The SKAT system has been evaluated to ensure that it meets the project's requirements. The evaluation stage was an important part of the project and required a number of nurses with different roles to evaluate the system. Based on their feedback, changes were made.

  2. Identifying Key Challenges in Performance Issues in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ashraf Zia

    2012-10-01

    Full Text Available Cloud computing is a harbinger of a new era in computing, in which distributed and centralized services are used in a unique way. In cloud computing, the computational resources of different vendors and IT service providers are managed to provide an enormous, scalable computing platform that offers efficient data processing coupled with better QoS at a lower cost. On-demand, dynamic, and scalable resource allocation is the main motive behind the development and deployment of cloud computing. The potential growth in this area and the presence of some dominant organizations with abundant resources (like Google, Amazon, Salesforce, Rackspace, Azure, and GoGrid) make the field of cloud computing all the more fascinating. All cloud computing processes need to act in unison to deliver better QoS, i.e., to provide better software functionality, meet tenants' requirements for their desired processing power, and exploit elevated bandwidth. However, several technical and functional issues, e.g., pervasive access to resources, dynamic discovery, and on-the-fly access and composition of resources, pose serious challenges for cloud computing. In this study, the performance issues in cloud computing are discussed. A number of schemes pertaining to QoS issues are critically analyzed to point out their strengths and weaknesses. Some of the performance parameters at the three basic layers of the cloud — Infrastructure as a Service, Platform as a Service, and Software as a Service — are also discussed in this paper.

  3. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    Science.gov (United States)

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications are as major instructional tools to increase undergraduates' motivation at school. In the recreation field usage of, computer and the internet based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  4. Evolutionary Computational Methods for Identifying Emergent Behavior in Autonomous Systems

    Science.gov (United States)

    Terrile, Richard J.; Guillaume, Alexandre

    2011-01-01

    A technique based on Evolutionary Computational Methods (ECMs) was developed that allows for the automated optimization of complex computationally modeled systems, such as autonomous systems. The primary technology, which enables the ECM to find optimal solutions in complex search spaces, derives from evolutionary algorithms such as the genetic algorithm and differential evolution. These methods are based on biological processes, particularly genetics, and define an iterative process that evolves parameter sets into an optimum. Evolutionary computation is a method that operates on a population of existing computational-based engineering models (or simulators) and competes them using biologically inspired genetic operators on large parallel cluster computers. The result is the ability to automatically find design optimizations and trades, and thereby greatly amplify the role of the system engineer.
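
    As a toy illustration of the evolve-parameters-to-an-optimum loop described above, the snippet below uses SciPy's differential evolution on a stand-in objective; the real ECM work competes full engineering simulators on parallel clusters, which this sketch does not attempt.

```python
from scipy.optimize import differential_evolution

def model_cost(params):
    """Hypothetical simulator score: distance from an unknown optimum."""
    x, y = params
    return (x - 1.2) ** 2 + (y + 0.7) ** 2

# Evolve a population of parameter sets within the given bounds
result = differential_evolution(model_cost, bounds=[(-5, 5), (-5, 5)], seed=0)
print(result.x, result.fun)   # parameters evolved toward the optimum
```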

  5. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2011-06-24

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Access to Confidential Business Information by Computer Sciences Corporation and Its Identified... contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access...

  6. Mycorrhizas alter sucrose and proline metabolism in trifoliate orange exposed to drought stress

    Science.gov (United States)

    Wu, Hui-Hui; Zou, Ying-Ning; Rahman, Mohammed Mahabubur; Ni, Qiu-Dan; Wu, Qiang-Sheng

    2017-01-01

    Arbuscular mycorrhizal fungi (AMF) can enhance drought tolerance in plants, but little is known about the AMF contribution to sucrose and proline metabolism under drought stress (DS). In this study, Funneliformis mosseae and Paraglomus occultum were inoculated into trifoliate orange (Poncirus trifoliata) under well-watered (WW) and DS conditions. Although the 71-day DS notably (P < 0.05) inhibited mycorrhizal colonization, AMF seedlings showed significantly (P < 0.05) better plant growth performance and higher leaf relative water content, regardless of soil water status. AMF inoculation significantly (P < 0.05) increased leaf sucrose, glucose, and fructose concentrations under DS, accompanied by a significant increase in leaf sucrose phosphate synthase, neutral invertase, and net activity of sucrose-metabolizing enzymes and a decrease in leaf acid invertase and sucrose synthase activity. AMF inoculation produced no change in leaf ornithine-δ-aminotransferase activity, but significantly (P < 0.05) increased leaf proline dehydrogenase activity and significantly (P < 0.05) decreased both leaf Δ1-pyrroline-5-carboxylate reductase and Δ1-pyrroline-5-carboxylate synthetase activity, resulting in lower proline accumulation in AMF plants under DS. Our results therefore suggest that AMF strongly altered leaf sucrose and proline metabolism by regulating the activities of sucrose- and proline-metabolizing enzymes, which is important for osmotic adjustment of the host plant. PMID:28181575

  7. Mycorrhizal trifoliate orange has greater root adaptation of morphology and phytohormones in response to drought stress

    Science.gov (United States)

    Zou, Ying-Ning; Wang, Peng; Liu, Chun-Yan; Ni, Qiu-Dan; Zhang, De-Jian; Wu, Qiang-Sheng

    2017-01-01

    Plant roots are the first parts of plants to face drought stress (DS), and thus root modification is important for plants to adapt to drought. We hypothesized that the roots of arbuscular mycorrhizal (AM) plants exhibit better adaptation in terms of morphology and phytohormones under DS. Trifoliate orange seedlings inoculated with Diversispora versiformis were subjected to well-watered (WW) and DS conditions for 6 weeks. AM seedlings exhibited better growth performance and significantly greater numbers of 1st-, 2nd-, and 3rd-order lateral roots and greater root length, area, average diameter, volume, tips, forks, and crossings than non-AM seedlings under both WW and DS conditions. AM fungal inoculation considerably increased root hair density under both WW and DS and root hair length under DS, while dramatically decreasing root hair length under WW; root hair diameter was unchanged. AM plants had greater concentrations of indole-3-acetic acid, methyl jasmonate, nitric oxide, and calmodulin in roots, which were significantly correlated with changes in root morphology. These results support the hypothesis that AM plants show superior adaptation of root morphology under DS that is potentially associated with indole-3-acetic acid, methyl jasmonate, nitric oxide, and calmodulin levels. PMID:28106141

  8. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    Full Text Available The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, and other special skills. In this study, we compare task-performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  9. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having genomic signatures different from the rest of the host genome, and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
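
    One PAI-associated feature named above, an atypical genomic signature, can be approximated with a simple sliding-window GC-content scan; the sketch below flags windows whose GC fraction deviates strongly from the genome average. The window size, step, and threshold are illustrative, and real PAI finders combine several signals (codon usage, mobility genes, tRNA flanking).

```python
import statistics

def gc_anomaly_windows(genome, window=5000, step=1000, z_cut=2.0):
    """Flag windows whose GC fraction deviates > z_cut SDs from the mean."""
    gc = lambda s: (s.count('G') + s.count('C')) / len(s)
    mean_gc = gc(genome)
    windows = [(i, gc(genome[i:i + window]))
               for i in range(0, len(genome) - window + 1, step)]
    sd = statistics.stdev(w for _, w in windows)
    return [(i, round(w, 3)) for i, w in windows
            if abs(w - mean_gc) / sd > z_cut]

genome = 'AT' * 20000 + 'GC' * 3000 + 'AT' * 20000   # synthetic GC island
print(gc_anomaly_windows(genome)[:3])
```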

  10. 75 FR 70672 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2010-11-18

    ... AGENCY Access to Confidential Business Information by Computer Sciences Corporation and Its Identified... contractor, Computer Sciences Corporation (CSC) of Chantilly, VA and Its Identified Subcontractors, to access... required to support OPPT computer applications; OPPT staff; and their development staff. Specific types of...

  11. Computer-Based Procedures for Field Workers - Identified Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The Idaho National Laboratory (INL) computer-based procedure (CBP) research team is exploring how best to design a CBP system that will deliver the intended benefits of increased efficiency and improved human performance. It is important to note that no “off-the-shelf” technology exists for the type of CBP system that is investigated and developed by the INL researchers. As more technology is integrated into the procedure process, the importance of an appropriate and methodological approach to the design of the procedure system increases. Technological advancements offer great opportunities for efficiency and safety gains, but if the system is not designed correctly there is a large risk of unintentionally introducing new opportunities for human error. The INL research team is breaking new ground in the area of CBPs with the prototype they have developed. Current electronic procedure systems are most commonly electronic versions of the paper-based procedures with hyperlinks to other procedures, limited user input functionality, and the ability to mark steps completed. These systems do not fully exploit the advantages of digital technology. It is part of the INL researchers’ role to develop and validate new CBP technologies that greatly increase the benefits of a CBP system to the nuclear industry.

  12. Effects of arbuscular mycorrhizal fungi on leaf solutes and root absorption areas of trifoliate orange seedlings under water stress conditions

    Institute of Scientific and Technical Information of China (English)

    WU Qiangsheng; XIA Renxue

    2006-01-01

    The effects of the arbuscular mycorrhizal (AM) fungus Glomus mosseae on plant growth, leaf solutes and root absorption area of trifoliate orange (Poncirus trifoliata (L.) Raf.) seedlings were studied in potted culture under water stress conditions. Inoculation with G. mosseae increased plant height, stem diameter, leaf area, shoot dry weight, root dry weight and plant dry weight when the soil water content was 20%, 16% and 12%. AM inoculation also promoted the active and total absorption areas of the root system and the absorption of phosphorus from the rhizosphere, enhanced the content of soluble sugar in leaves and roots, and reduced proline content in leaves. AM seedlings had higher plant water use efficiency and higher drought tolerance than non-AM seedlings. The effects of G. mosseae inoculation on trifoliate orange seedlings at 20% and 16% soil water content were more significant than at 12% soil water content; AM infection was severely restrained at 12% soil water content. Thus, the effects of AM fungi on plants were probably positively related to the extent of root colonization by AM fungi. The mechanism by which AM fungi enhance the drought resistance of host plants was ascribed to the greater osmotic adjustment and greater root absorption area conferred by AM colonization.

  13. Comparative 28-day repeated oral toxicity of Longdan Xieganwan, Akebia trifoliate (Thunb.) koidz., Akebia quinata (Thunb.) Decne. and Caulis aristolochiae manshuriensis in mice.

    Science.gov (United States)

    Xue, Xiang; Xiao, Ying; Gong, Likun; Guan, Shuhong; Liu, Yongzhen; Lu, Henglei; Qi, Xinming; Zhang, Yunhai; Li, Yan; Wu, Xiongfei; Ren, Jin

    2008-09-02

    Longdan Xieganwan, which contains Aristolochia species, is a traditional Chinese prescription. It has been used for thousands of years to "enhance liver". However, many cases of Longdan Xieganwan-induced nephropathy have been reported recently. This study was designed to compare the possible toxic effects of Longdan Xieganwan and three different Aristolochia species, i.e. Akebia trifoliate (Thunb.) koidz. (Akebia trifoliate), Akebia quinata (Thunb.) Decne. (Akebia quinata) and Caulis aristolochiae manshuriensis (Aristolochia manshuriensis). Mice were orally administered these drugs for 28 days. Clinical signs, body weights, serum biochemistry, organ weights and histopathology were examined. Significantly decreased body weights and obvious nephropathy were noticed in the Aristolochia manshuriensis groups at doses higher than 0.24 g/kg/d. A few endothelial cell degenerations in the renal glomeruli were observed in the Akebia trifoliate group at a high dose of 2.00 g/kg/d. No significant changes were observed in the other groups. The no-observed-adverse-effect levels (NOAELs) for Aristolochia manshuriensis, Akebia trifoliate, Akebia quinata and Longdan Xieganwan in this study for mice were 0.06 g/kg/d, 0.40 g/kg/d, higher than 3.00 g/kg/d and higher than 10.00 g/kg/d, equivalent to 0.25, 5, 25 and 10 times the normal human dose in clinical prescription, respectively.

  14. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    Science.gov (United States)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper presents the collective work of specialists in medicine, applied mathematics and computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  15. Possible involvement of locus-specific methylation on expression regulation of leafy homologous gene (CiLFY) during precocious trifoliate orange phase change process.

    Directory of Open Access Journals (Sweden)

    Jin-Zhi Zhang

    Full Text Available DNA methylation plays an essential role in regulating plant development. Here, we describe an early flowering trifoliate orange (precocious trifoliate orange, Poncirus trifoliata L. Raf.) that was treated with 5-azacytidine (5-AzaC) and displayed a number of phenotypic and developmental abnormalities. These observations suggested that DNA methylation might play an important role in regulating many developmental pathways, including the early flowering trait, so the expression levels of five key or integrator citrus flowering genes were analyzed. Our results showed that the relative expression level of flowering locus T (CiFT) increased with increasing concentrations of 5-AzaC, whereas leafy (CiLFY), APETALA1 (CiAP1), terminal flower1 (CiTFL1), and flowering locus C (CiFLC) showed their highest relative expression levels at the 250 µM treatment and decreased sharply at higher concentrations. To further confirm that DNA methylation affects the expression of these genes, their full-length sequences were isolated by the genome-walker method and analyzed using bioinformatics tools. However, only one locus-specific methylation site was observed, in the CiLFY sequence. Therefore, the DNA methylation level of CiLFY was investigated at both the juvenile and adult stages of precocious trifoliate orange by bisulfite sequencing PCR, which showed that the level of DNA methylation was altered during the phase change. In addition, the spatial and temporal expression patterns of the CiLFY promoter and a series of 5' deletions were investigated by driving the expression of a β-glucuronidase reporter gene in Arabidopsis. Exogenous GA3 treatment of transgenic Arabidopsis revealed that GA3 might be involved in the developmental regulation of CiLFY during the flowering process of precocious trifoliate orange. These results provide insights into the molecular regulation of CiLFY expression, which will be helpful for studying citrus flowering.

  16. High Volume Throughput Computing: Identifying and Characterizing Throughput Oriented Workloads in Data Centers

    CERN Document Server

    Zhan, Jianfeng; Sun, Ninghui; Wang, Lei; Jia, Zhen; Luo, Chunjie

    2012-01-01

    For the first time, this paper systematically identifies three categories of throughput-oriented workloads in data centers: services, data processing applications, and interactive real-time applications, whose targets are to increase the volume of throughput in terms of processed requests, processed data, or the maximum number of simultaneous subscribers supported, respectively. We coin a new term, high volume throughput computing (HVC for short), to describe these workloads and the data center systems designed for them. We characterize and compare HVC with other computing paradigms, e.g., high throughput computing, warehouse-scale computing, and cloud computing, in terms of levels, workloads, metrics, coupling degree, data scales, and number of jobs or service instances. We also preliminarily report our ongoing work on the metrics and benchmarks for HVC systems, which is the foundation of designing innovative data center systems for HVC workloads.

  17. A new computational strategy for identifying essential proteins based on network topological properties and biological information.

    Science.gov (United States)

    Qin, Chao; Sun, Yongqi; Dong, Yadong

    2017-01-01

    Essential proteins are the proteins that are indispensable to the survival and development of an organism. Deleting a single essential protein will cause lethality or infertility. Identifying and analysing essential proteins are key to understanding the molecular mechanisms of living cells. There are two types of methods for predicting essential proteins: experimental methods, which require considerable time and resources, and computational methods, which overcome the shortcomings of experimental methods. However, the prediction accuracy of computational methods for essential proteins requires further improvement. In this paper, we propose a new computational strategy named CoTB for identifying essential proteins based on a combination of topological properties, subcellular localization information and orthologous protein information. First, we introduce several topological properties of the protein-protein interaction (PPI) network. Second, we propose new methods for measuring orthologous information and subcellular localization and a new computational strategy that uses a random forest prediction model to obtain a probability score for the proteins being essential. Finally, we conduct experiments on four different Saccharomyces cerevisiae datasets. The experimental results demonstrate that our strategy for identifying essential proteins outperforms traditional computational methods and the most recently developed method, SON. In particular, our strategy improves the prediction accuracy to 89, 78, 79, and 85 percent on the YDIP, YMIPS, YMBD and YHQ datasets at the top 100 level, respectively.
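
    A hedged sketch of the CoTB-style final step (a random forest turning topological plus biological features into an essentiality probability) is shown below with scikit-learn; the feature columns follow the abstract's categories, but all values are fabricated placeholders, not the paper's yeast benchmark data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Columns: degree centrality, betweenness, subcellular-localization score,
# orthology conservation score (all values invented for illustration)
X_train = np.array([[0.9, 0.7, 0.8, 0.9],
                    [0.2, 0.1, 0.3, 0.2],
                    [0.8, 0.6, 0.7, 0.8],
                    [0.1, 0.2, 0.2, 0.1]])
y_train = np.array([1, 0, 1, 0])          # 1 = essential

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
candidate = np.array([[0.7, 0.5, 0.9, 0.6]])
print(clf.predict_proba(candidate)[0, 1])  # probability of being essential
```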

  18. Global identifiability of linear compartmental models--a computer algebra algorithm.

    Science.gov (United States)

    Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C

    1998-01-01

    A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is thus a prerequisite for parameter estimation of biological dynamic models. Global identifiability is, however, difficult to test, since it requires solving a system of algebraic nonlinear equations whose degree of nonlinearity and number of terms and unknowns increase with model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability), is presented, which combines the topological transfer function method with the Buchberger algorithm to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general-structure compartmental models from general multi-input multi-output experiments. Examples of the use of GLOBI to analyze the a priori global identifiability of some complex biological compartmental models are provided.
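
    A small worked example of the algebraic test GLOBI automates: equate model-derived coefficients (the exhaustive summary) to observed quantities, compute a Groebner basis, and count the solutions. The two-parameter system below is hypothetical; its two symmetric solutions illustrate a model that is locally but not globally identifiable.

```python
import sympy as sp

k1, k2 = sp.symbols('k1 k2', positive=True)
c1, c2 = sp.symbols('c1 c2')   # observable coefficients (hypothetical summary)

# Polynomial system: model-derived coefficients equated to observed ones
system = [k1 + k2 - c1, k1 * k2 - c2]
G = sp.groebner(system, k1, k2, order='lex')
print(G)                            # triangularized system
print(sp.solve(system, [k1, k2]))   # two symmetric solutions -> only locally
                                    # (not globally) identifiable
```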

  19. Alleviation of drought stress by mycorrhizas is related to increased root H2O2 efflux in trifoliate orange

    Science.gov (United States)

    Huang, Yong-Ming; Zou, Ying-Ning; Wu, Qiang-Sheng

    2017-01-01

    The Non-invasive Micro-test Technique (NMT) is used to measure dynamic changes of specific ions/molecules non-invasively, but NMT-based information about hydrogen peroxide (H2O2) fluxes in different classes of roots under mycorrhization is scarce. The effects of Funneliformis mosseae on plant growth; on H2O2, superoxide radical (O2·−), and malondialdehyde (MDA) concentrations; and on H2O2 fluxes in the taproot (TR) and lateral roots (LRs) of trifoliate orange seedlings under well-watered (WW) and drought stress (DS) conditions were studied. DS strongly inhibited mycorrhizal colonization in the TR and LRs, whereas mycorrhizal inoculation significantly promoted plant growth and biomass production. H2O2, O2·−, and MDA concentrations in leaves and roots were dramatically lower in mycorrhizal seedlings than in non-mycorrhizal seedlings under DS. Compared with non-mycorrhizal seedlings, mycorrhizal seedlings had relatively higher net root H2O2 effluxes in the TR and LRs, especially under WW, as well as significantly higher total root H2O2 effluxes in the TR and LRs under both WW and DS. Total root H2O2 effluxes were significantly positively correlated with root colonization but negatively correlated with root H2O2 and MDA concentrations. This suggests that mycorrhizas induce greater H2O2 efflux from the TR and LRs, thus alleviating the oxidative damage of DS in the host plant. PMID:28176859

  20. Computational Approaches for Mining GRO-Seq Data to Identify and Characterize Active Enhancers.

    Science.gov (United States)

    Nagari, Anusha; Murakami, Shino; Malladi, Venkat S; Kraus, W Lee

    2017-01-01

    Transcriptional enhancers are DNA regulatory elements that are bound by transcription factors and act to positively regulate the expression of nearby or distally located target genes. Enhancers have many features that have been discovered using genomic analyses. Recent studies have shown that active enhancers recruit RNA polymerase II (Pol II) and are transcribed, producing enhancer RNAs (eRNAs). GRO-seq, a method for identifying the location and orientation of all actively transcribing RNA polymerases across the genome, is a powerful approach for monitoring nascent enhancer transcription. Furthermore, the unique pattern of enhancer transcription can be used to identify enhancers in the absence of any information about the underlying transcription factors. Here, we describe the computational approaches required to identify and analyze active enhancers using GRO-seq data, including data pre-processing, alignment, and transcript calling. In addition, we describe protocols and computational pipelines for mining GRO-seq data to identify active enhancers, as well as known transcription factor binding sites that are transcribed. Furthermore, we discuss approaches for integrating GRO-seq-based enhancer data with other genomic data, including target gene expression and function. Finally, we describe molecular biology assays that can be used to confirm and explore further the function of enhancers that have been identified using genomic assays. Together, these approaches should allow the user to identify and explore the features and biological functions of new cell type-specific enhancers.
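
    A schematic of the enhancer-calling idea described above, reduced to its core pattern test: short, divergent (bidirectional) nascent-transcript pairs lying outside annotated promoters. The positions, window sizes, and interval logic below are simplified illustrations; production pipelines operate on aligned reads and called transcripts genome-wide.

```python
# Candidate enhancers = divergent plus/minus transcript pairs that are distal
# from gene TSSs. Coordinates are illustrative single-chromosome positions.
def call_enhancers(plus_tss, minus_tss, gene_tss, pair_window=500, tss_buffer=1000):
    """plus_tss / minus_tss: 5' start positions of transcripts on each strand."""
    enhancers = []
    for p in plus_tss:
        for m in minus_tss:
            divergent = m < p and p - m <= pair_window   # minus start just upstream
            distal = all(abs(p - t) > tss_buffer for t in gene_tss)
            if divergent and distal:
                enhancers.append(((m + p) // 2, 'candidate eRNA pair'))
    return enhancers

print(call_enhancers(plus_tss=[12400, 50200], minus_tss=[12100, 50190],
                     gene_tss=[50000]))
```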

  1. Identifying a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Science.gov (United States)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-07-12

    In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.
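
    For illustration, the gather-and-select step can be sketched with mpi4py (a minimal sketch; plane enumeration is elided and the per-rank plane areas are toy values):

        # Run with: mpiexec -n 4 python largest_plane.py  (filename illustrative)
        from mpi4py import MPI

        comm = MPI.COMM_WORLD          # stands in for the subcommunicator
        rank = comm.Get_rank()

        # Each node's locally identified planes: (plane_id, area) pairs.
        my_areas = [(f"plane-{rank}-{i}", (rank + 1) * (i + 1)) for i in range(3)]

        # Root gathers every node's calculated areas as contribution data ...
        all_areas = comm.gather(my_areas, root=0)

        # ... and identifies the logical plane with the greatest area.
        if rank == 0:
            flat = [pair for node in all_areas for pair in node]
            largest = max(flat, key=lambda p: p[1])
            print("largest logical plane:", largest)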

  2. Identifying a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-07-12

    In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.

  3. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words

    Directory of Open Access Journals (Sweden)

    Bingkun Wang

    2015-01-01

    Full Text Available With the rapid growth of user-generated content on the web, sentiment analysis has become a very active research area in data mining and natural language processing. As the most important indicators of sentiment, sentiment words, which convey positive or negative polarity, are instrumental for sentiment analysis. However, most existing methods for identifying the polarity of sentiment words consider only crisp positive/negative membership (a classical Cantor set) and pay no attention to the fuzziness of the polarity intensity of sentiment words. To improve performance, we propose in this paper a fuzzy computing model to identify the polarity of Chinese sentiment words. The paper makes three major contributions. First, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Second, we construct a fuzzy sentiment classifier and propose two different methods to compute its parameter. Third, we conduct extensive experiments on four sentiment-word datasets and three review datasets; the experimental results indicate that our model performs better than the state-of-the-art methods.
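
    For illustration, the morpheme-to-word intensity computation and the fuzzy (rather than crisp) classification can be sketched as follows (a minimal sketch; the morpheme lexicon, sigmoid membership function and steepness value are illustrative, not the paper's model):

        import math

        # Toy morpheme lexicon: intensity in [-1, 1] (negative .. positive).
        MORPHEME_INTENSITY = {"好": 0.8, "美": 0.7, "坏": -0.8, "差": -0.6, "不": -0.9}

        def word_intensity(word):
            """Average the intensities of known morphemes in the word."""
            vals = [MORPHEME_INTENSITY[ch] for ch in word if ch in MORPHEME_INTENSITY]
            return sum(vals) / len(vals) if vals else 0.0

        def membership_positive(x, steepness=4.0):
            """Fuzzy membership of the positive class; 0.5 is the crisp boundary."""
            return 1.0 / (1.0 + math.exp(-steepness * x))

        for w in ["美好", "不好", "差"]:
            x = word_intensity(w)
            mu = membership_positive(x)
            label = "positive" if mu >= 0.5 else "negative"
            print(w, round(x, 2), round(mu, 2), label)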

  4. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words.

    Science.gov (United States)

    Wang, Bingkun; Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the rapid growth of user-generated content on the web, sentiment analysis has become a very active research area in data mining and natural language processing. As the most important indicators of sentiment, sentiment words, which convey positive or negative polarity, are instrumental for sentiment analysis. However, most existing methods for identifying the polarity of sentiment words consider only crisp positive/negative membership (a classical Cantor set) and pay no attention to the fuzziness of the polarity intensity of sentiment words. To improve performance, we propose in this paper a fuzzy computing model to identify the polarity of Chinese sentiment words. The paper makes three major contributions. First, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Second, we construct a fuzzy sentiment classifier and propose two different methods to compute its parameter. Third, we conduct extensive experiments on four sentiment-word datasets and three review datasets; the experimental results indicate that our model performs better than the state-of-the-art methods.

  5. A Computational Procedure for Identifying Bilinear Representations of Nonlinear Systems Using Volterra Kernels

    Science.gov (United States)

    Kvaternik, Raymond G.; Silva, Walter A.

    2008-01-01

    A computational procedure for identifying the state-space matrices corresponding to discrete bilinear representations of nonlinear systems is presented. A key feature of the method is the use of first- and second-order Volterra kernels (first- and second-order pulse responses) to characterize the system. The present method is based on an extension of a continuous-time bilinear system identification procedure given in a 1971 paper by Bruni, di Pillo, and Koch. The analytical and computational considerations that underlie the original procedure and its extension to the title problem are presented and described, pertinent numerical considerations associated with the process are discussed, and results obtained from the application of the method to a variety of nonlinear problems from the literature are presented. The results of these exploratory numerical studies are decidedly promising and provide sufficient credibility for further examination of the applicability of the method.
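
    For illustration, the pulse-response characterization of the kernels can be sketched as follows (a minimal scalar sketch with an invented toy bilinear system; the recovery of the state-space matrices from these kernels is not reproduced):

        import numpy as np

        def simulate(u, N=40):
            """Toy discrete bilinear system x[k+1] = A x + Nb x u + b u, y = c x."""
            A, Nb, b, c = 0.8, 0.3, 1.0, 1.0     # scalar state for brevity
            x, y = 0.0, np.zeros(N)
            for k in range(N):
                y[k] = c * x
                x = A * x + Nb * x * u[k] + b * u[k]
            return y

        N = 40
        pulse = np.zeros(N); pulse[0] = 1.0
        h1 = simulate(pulse)                      # first-order kernel (pulse response)

        # One slice of the second-order kernel: pulses at times 0 and d; the
        # combination y_ab - y_a - y_b isolates the second-order interaction.
        d = 3
        u_a = np.zeros(N); u_a[0] = 1.0
        u_b = np.zeros(N); u_b[d] = 1.0
        h2_slice = simulate(u_a + u_b) - simulate(u_a) - simulate(u_b)
        print(h1[:6])
        print(h2_slice[:10])                      # nonzero only after the second pulse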

  6. Identifiability and self-presentation: computer-mediated communication and intergroup interaction.

    Science.gov (United States)

    Douglas, K M; McGarty, C

    2001-09-01

    This research investigated the intergroup properties of hostile 'flaming' behaviour in computer-mediated communication and how flaming language is affected by Internet identifiability, or identifiability by name and e-mail address/geographical location, as is common in Internet communication. According to the Social Identity Model of Deindividuation Effects (SIDE; e.g. Reicher, Spears, & Postmes, 1995), there may be strategic reasons for identifiable group members to act in a more group-normative manner in the presence of an audience: to gain acceptance from the in-group, to avoid punishment from the out-group, or to assert their identity to the out-group. For these reasons, it was predicted that communicators would produce more stereotype-consistent (group-normative) descriptions of out-group members' behaviours when their descriptions were identifiable to an audience. In one archival and three experimental studies, it was found that identifiability to an in-group audience was associated with higher levels of stereotype-consistent language when communicators described anonymous out-group targets. These results extend SIDE and suggest the importance of an in-group audience for the expression of stereotypical views.

  7. Real time method and computer system for identifying radioactive materials from HPGe gamma-ray spectroscopy

    Science.gov (United States)

    Rowland, Mark S.; Howard, Douglas E.; Wong, James L.; Jessup, James L.; Bianchini, Greg M.; Miller, Wayne O.

    2007-10-23

    A real-time method and computer system for identifying radioactive materials which collects gamma count rates from a HPGe gamma-radiation detector to produce a high-resolution gamma-ray energy spectrum. A library of nuclear material definitions ("library definitions") is provided, with each uniquely associated with a nuclide or isotope material and each comprising at least one logic condition associated with a spectral parameter of a gamma-ray energy spectrum. The method determines whether the spectral parameters of said high-resolution gamma-ray energy spectrum satisfy all the logic conditions of any one of the library definitions, and subsequently uniquely identifies the material type as that nuclide or isotope material associated with the satisfied library definition. The method is iteratively repeated to update the spectrum and identification in real time.
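
    For illustration, the library-definition matching can be sketched as follows (a minimal sketch; the definitions, tolerances and count thresholds are invented, though 661.7 keV and 1332.5 keV are the familiar Cs-137 and Co-60 photopeak energies):

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        Spectrum = Dict[str, float]   # named spectral parameters extracted upstream

        @dataclass
        class LibraryDefinition:
            material: str
            conditions: List[Callable[[Spectrum], bool]]

            def matches(self, spec: Spectrum) -> bool:
                # A material is identified only if ALL logic conditions hold.
                return all(cond(spec) for cond in self.conditions)

        LIBRARY = [
            LibraryDefinition("Cs-137", [
                lambda s: abs(s["peak_keV"] - 661.7) < 1.0,   # photopeak position
                lambda s: s["peak_counts"] > 100.0,           # statistical floor
            ]),
            LibraryDefinition("Co-60", [
                lambda s: abs(s["peak_keV"] - 1332.5) < 1.5,
                lambda s: s["peak_counts"] > 100.0,
            ]),
        ]

        def identify(spec: Spectrum) -> List[str]:
            return [d.material for d in LIBRARY if d.matches(spec)]

        print(identify({"peak_keV": 661.5, "peak_counts": 2400.0}))  # ['Cs-137']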

  8. Computational Approach to Identify Enzymes That Are Potential Therapeutic Candidates for Psoriasis

    Directory of Open Access Journals (Sweden)

    Daeui Park

    2011-01-01

    Full Text Available Psoriasis is well known as a chronic inflammatory dermatosis. The disease affects persons of all ages and is a burden worldwide. Psoriasis is associated with various diseases such as arthritis. The disease is characterized by well-demarcated lesions on the skin of the elbows and knees. Various genetic and environmental factors are related to the pathogenesis of psoriasis. In order to identify enzymes that are potential therapeutic targets for psoriasis, we utilized a computational approach, combining microarray analysis and protein interaction prediction. We found 6,437 genes (3,264 upregulated and 3,173 downregulated that have significant differences in expression between regions with and without lesions in psoriasis patients. We identified potential candidates through protein-protein interaction predictions made using various protein interaction resources. By analyzing the hub protein of the networks with metrics such as degree and centrality, we detected 32 potential therapeutic candidates. After filtering these candidates through the ENZYME nomenclature database, we selected 5 enzymes: DNA helicase (RUVBL2, proteasome endopeptidase complex (PSMA2, nonspecific protein-tyrosine kinase (ZAP70, I-kappa-B kinase (IKBKE, and receptor protein-tyrosine kinase (EGFR. We adopted a computational approach to detect potential therapeutic targets; this approach may become an effective strategy for the discovery of new drug targets for psoriasis.
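
    For illustration, the hub-ranking step can be sketched with networkx (a minimal sketch; the toy edge list merely reuses gene names from the abstract and is not the study's interaction network):

        import networkx as nx

        edges = [("EGFR", "ZAP70"), ("EGFR", "IKBKE"), ("EGFR", "PSMA2"),
                 ("ZAP70", "IKBKE"), ("PSMA2", "RUVBL2"), ("RUVBL2", "EGFR"),
                 ("IKBKE", "RUVBL2"), ("ZAP70", "PSMA2")]
        G = nx.Graph(edges)

        degree = dict(G.degree())
        betweenness = nx.betweenness_centrality(G)

        # Combine the two metrics into one simple rank score (weight illustrative).
        score = {n: degree[n] + 4.0 * betweenness[n] for n in G}
        hubs = sorted(score, key=score.get, reverse=True)[:3]
        print(hubs)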

  9. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites, with singular value decomposition (SVD) of the inferred motif activities to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities were downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
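
    For illustration, the decomposition step can be sketched as follows (a minimal sketch; the motif-by-sample activity matrix is random placeholder data rather than MARA output):

        import numpy as np

        rng = np.random.default_rng(0)
        activities = rng.normal(size=(50, 12))        # 50 motifs, 12 conditions

        # Center each motif's activity profile before decomposing.
        A = activities - activities.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(A, full_matrices=False)

        var_explained = s**2 / np.sum(s**2)
        print("variance explained by first 3 components:", var_explained[:3].round(3))

        # Columns of U: motif loadings per regulatory program; rows of Vt:
        # program activity across conditions. Program 1 for the first 5 motifs:
        print(U[:5, 0].round(3))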

  10. Subclinical coronary atherosclerosis identified by coronary computed tomographic angiography in asymptomatic morbidly obese patients

    Directory of Open Access Journals (Sweden)

    Peter A. McCullough

    2010-09-01

    Full Text Available Obesity is a common public health problem, and obese individuals in particular have a disproportionate incidence of acute coronary events. This study was undertaken to identify coronary artery lesions as well as associated clinical features, risk factors and demographics in patients with a body mass index (BMI) >40 kg/m2 without known coronary artery disease (CAD). Morbidly obese subjects were prospectively recruited to undergo coronary computed tomographic angiography (CCTA) using a dual-source computed tomography (CT) system. CAD was defined as the presence of any atherosclerotic lesion in any one coronary artery segment. The presence, location, and severity of atherosclerosis were related to patient characteristics. Forty-one patients (28 women; mean age 50.4±10.0 years; mean BMI 43.8±4.8 kg/m2) served as the study population. Of these, 25 patients (61%) had at least one coronary stenosis. All but 2 patients within the CAD cohort had coronary artery calcium (CAC) scores >0, and most plaques identified (75.4%) were non-calcified. There was a predilection for calcified and non-calcified atherosclerosis involving the left anterior descending (LAD) coronary artery compared with other coronary segments. Univariate predictors of CAD included older age, dyslipidemia, and diabetes. In this preliminary study of young morbidly obese patients, CCTA detected a high prevalence of calcified and non-calcified CAD, although the latter predominated.

  11. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    Full Text Available The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing, the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10's activity in multiple rodent models, which is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to those of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  12. An Experimentally Based Computer Search Identifies Unstructured Membrane-binding Sites in Proteins

    Science.gov (United States)

    Brzeska, Hanna; Guag, Jake; Remmert, Kirsten; Chacko, Susan; Korn, Edward D.

    2010-01-01

    Programs exist for searching protein sequences for potential membrane-penetrating segments (hydrophobic regions) and for lipid-binding sites with highly defined tertiary structures, such as PH, FERM, C2, ENTH, and other domains. However, a rapidly growing number of membrane-associated proteins (including cytoskeletal proteins, kinases, GTP-binding proteins, and their effectors) bind lipids through less structured regions. Here, we describe the development and testing of a simple computer search program that identifies unstructured potential membrane-binding sites. Initially, we found that both basic and hydrophobic amino acids, irrespective of sequence, contribute to the binding of synthetic peptides corresponding to the putative membrane-binding domains of Acanthamoeba class I myosins to acidic phospholipid vesicles. Based on these results, we modified a hydrophobicity scale, giving Arg and Lys positive, rather than negative, values. Using this basic-and-hydrophobic scale with a standard search algorithm, we successfully identified previously determined unstructured membrane-binding sites in all 16 proteins tested. Importantly, basic-and-hydrophobic searches identified previously unknown potential membrane-binding sites in class I myosins, PAKs and CARMIL (capping protein, Arp2/3, myosin I linker; a membrane-associated cytoskeletal scaffold protein), and synthetic peptides and protein domains containing these newly identified sites bound to acidic phospholipids in vitro. PMID:20018884
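
    For illustration, a sliding-window search with such a modified scale can be sketched as follows (a minimal sketch; the values are Kyte-Doolittle hydrophobicities with the Arg and Lys signs flipped to positive, and the window length, threshold and demo sequence are illustrative, not the paper's exact settings):

        KD = {  # Kyte-Doolittle values, with R and K flipped to positive
            "A": 1.8, "C": 2.5, "D": -3.5, "E": -3.5, "F": 2.8, "G": -0.4,
            "H": -3.2, "I": 4.5, "K": 3.9, "L": 3.8, "M": 1.9, "N": -3.5,
            "P": -1.6, "Q": -3.5, "R": 4.5, "S": -0.8, "T": -0.7, "V": 4.2,
            "W": -0.9, "Y": -1.3,
        }

        def candidate_sites(seq, window=19, threshold=1.5):
            """Yield (start, mean_score, segment) for high-scoring windows."""
            for i in range(len(seq) - window + 1):
                seg = seq[i:i + window]
                score = sum(KD[aa] for aa in seg) / window
                if score >= threshold:
                    yield i, round(score, 2), seg

        demo = "MGDEESKKLLKRIRKKLRAIVLAAGGSDDEEDE" + "KKRRLLVVII" * 2
        for hit in candidate_sites(demo):
            print(hit)   # basic+hydrophobic stretches score high on this scale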

  13. Integration of experimental and computational methods for identifying geometric, thermal and diffusive properties of biomaterials

    Science.gov (United States)

    Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz

    2016-04-01

    Knowledge of the physical properties of biomaterials is important in understanding and designing agri-food and wood processing operations. In the study presented in this paper, computational methods were developed and combined with experiments to enhance the identification of agri-food and forest product properties and to predict heat and water transport in such products. They were based on a finite element model of heat and water transport and were supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties, using the estimated values to predict the examined processes, and comparing the predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content, and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as input for simulation of the examined processes, enabled a reduction in the uncertainty associated with the predictions.
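
    For illustration, the inverse-identification idea can be sketched as follows (a minimal sketch fitting one property, the thermal diffusivity of a 1D sample, to synthetic data; the paper's 3D finite element system is not reproduced):

        import numpy as np
        from scipy.optimize import least_squares

        L, nx, nt, dt = 0.02, 21, 400, 0.05          # 2 cm sample, grid, time steps
        dx = L / (nx - 1)

        def direct_model(alpha):
            """Explicit finite-difference solution; boundaries held at 80 C."""
            T = np.full(nx, 20.0)
            T[0] = T[-1] = 80.0
            for _ in range(nt):
                T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            return T

        true_alpha = 1.4e-7                          # m^2/s, plausible magnitude
        measured = direct_model(true_alpha) \
            + np.random.default_rng(1).normal(0, 0.1, nx)   # synthetic "experiment"

        # Inverse problem: find the diffusivity whose predicted temperature
        # profile best matches the measurements, in the least-squares sense.
        fit = least_squares(lambda a: direct_model(a[0]) - measured,
                            x0=[1.0e-7], bounds=(1e-9, 1e-5))
        print("identified diffusivity:", fit.x[0])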

  14. Influence of Cultivar on the Postharvest Hardening of Trifoliate Yam (Dioscorea dumetorum) Tubers

    Directory of Open Access Journals (Sweden)

    Christian Siadjeu

    2016-01-01

    Full Text Available The influence of cultivar on the postharvest hardening of Dioscorea dumetorum tubers was assessed. Thirty-two cultivars of D. dumetorum tubers were planted in April 2014, harvested at physiological maturity, and stored under prevailing tropical ambient conditions (19–28°C, 60–85% RH) for 0, 5, 14, 21, and 28 days. Samples were evaluated for cooked hardness. Results showed that one cultivar, Ibo sweet 3, was not affected by the hardening phenomenon; the remaining 31 were all subject to it to varying degrees. Cooked hardness increased more rapidly in cultivars with many roots on the tuber surface than in cultivars with few roots on the tuber surface. When the characteristics flesh colour and number of roots on the tuber surface were considered together, cooked hardness in cultivars with yellow flesh and many roots increased more rapidly than in cultivars with white flesh and many roots, whereas cooked hardness in cultivars with yellow flesh and few roots increased more slowly than in cultivars with white flesh and few roots. Accessions collected at high altitude hardened more rapidly than accessions collected at low altitude. The cultivar Ibo sweet 3 identified in this study could provide important information for breeding programs aimed at protecting D. dumetorum against the postharvest hardening phenomenon.

  15. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu, E-mail: nkpark@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Joo, E-mail: kyoungjoo@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Hong, E-mail: kyounghong@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Suh, Jung-Min, E-mail: jmsuh@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)

    2013-02-15

    Highlights: ► An identification method for the optimal stiffness matrix of a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed-form solution of the problem is derived. ► The method can be expanded to systems with a limited number of modes. ► Identification error due to a perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismically induced vibration behaviors. The computational time can be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a model, based on an optimal design method, is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The verification of the method is completed by comparing test results and simulation results. This paper shows that the simplified model's dynamic behaviors are quite similar to experimental results and that the suggested method is suitable for identifying a reliable mathematical model for fuel assemblies.
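
    For illustration, the mode-matching estimation can be sketched as follows (a minimal sketch; the spring-chain matrices are toy stand-ins for a condensed fuel-assembly model, and the closed form shown, which assumes mass-normalized modes, is one symmetric solution of K Phi = M Phi Lambda rather than the paper's exact derivation):

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(2)
        n, m = 6, 3                                   # 6 DOFs, 3 retained modes

        M = np.diag(rng.uniform(1.0, 2.0, n))         # lumped masses
        K_true = np.diag([2000.0] * n)                # spring chain stiffness
        for i in range(n - 1):
            K_true[i, i + 1] = K_true[i + 1, i] = -500.0

        # "Given" data: lowest m generalized modes of the detailed model.
        # scipy returns mass-normalized vectors: Phi.T @ M @ Phi = I.
        lam, phi = eigh(K_true, M)
        Phi, Lam = phi[:, :m], np.diag(lam[:m])

        # Symmetric stiffness reproducing the retained modal equations exactly.
        K_est = M @ Phi @ Lam @ Phi.T @ M
        print(np.linalg.norm(K_est @ Phi - M @ Phi @ Lam))   # ~0: modes match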

  16. Identifying Learning Trajectories While Playing a Learning-to-Learn Computer Game in Different Children and Instruction Types

    NARCIS (Netherlands)

    de Koning-Veenstra, Baukje; Timmerman, Marieke; van Geert, Paul; van der Meulen, Bieuwe

    2014-01-01

    This research focuses on identifying learning trajectories expressed among children playing a learning-to-learn computer game and examining the relationships between the learning trajectories and individual characteristics such as developmental age, prior knowledge, and instruction type (adult- and/

  18. Identification and comparative profiling of miRNAs in an early flowering mutant of trifoliate orange and its wild type by genome-wide deep sequencing.

    Directory of Open Access Journals (Sweden)

    Lei-Ming Sun

    Full Text Available MicroRNAs (miRNAs) are a class of small, endogenous RNAs that play regulatory roles in various biological and metabolic processes by negatively affecting gene expression at the post-transcriptional level. While the number of known Arabidopsis and rice miRNAs is continuously increasing, information regarding miRNAs from woody plants such as citrus remains limited. In this study, Solexa sequencing was performed at different developmental stages on both an early flowering mutant of trifoliate orange (precocious trifoliate orange, Poncirus trifoliata L. Raf.) and its wild type, yielding 141 known miRNAs belonging to 99 families and 75 novel miRNAs across four libraries. A total of 317 potential target genes were predicted for the 51 novel miRNA families. GO and KEGG annotation revealed that the high-ranked miRNA target genes are implicated in diverse cellular processes in plants, including development, transcription, protein degradation and cross adaptation. To characterize the miRNAs expressed at the juvenile and adult developmental stages of the mutant and its wild type, the expression profiles of several miRNAs were further analyzed by real-time PCR. The results revealed that most miRNAs were down-regulated at the adult stage compared with the juvenile stage for both the mutant and its wild type. These results indicate that both conserved and novel miRNAs may play important roles in citrus growth and development, stress responses and other physiological processes.

  19. Identifying the Factors that Influence Computer Use in the Early Childhood Classroom

    Science.gov (United States)

    Edwards, Suzy

    2005-01-01

    Computers have become an increasingly accepted learning tool in the early childhood classroom. Despite initial concerns regarding the effect of computers on children's development, past research has indicated that computer use by young children can support their learning and developmental outcomes (Siraj-Blatchford & Whitebread, 2003; Yelland,…

  20. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    Directory of Open Access Journals (Sweden)

    Kumar Parijat Tripathi

    Full Text Available RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting these extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery). It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction-related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA (ncRNA) by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software tool for both NGS and array data. It helps users to characterize de-novo assembled reads obtained from NGS experiments for non-referenced organisms, and it also performs functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature and provides an opportunity to add new plugins in the future. The web application is freely available.

  1. New drug candidates for liposomal delivery identified by computer modeling of liposomes' remote loading and leakage.

    Science.gov (United States)

    Cern, Ahuva; Marcus, David; Tropsha, Alexander; Barenholz, Yechezkel; Goldblum, Amiram

    2017-02-16

    Remote drug loading into nano-liposomes is in most cases the best method for achieving high concentrations of active pharmaceutical ingredients (API) per nano-liposome that enable therapeutically viable API-loaded nano-liposomes, referred to as nano-drugs. This approach also enables controlled drug release. Recently, we constructed computational models to identify APIs that can achieve the desired high concentrations in nano-liposomes by remote loading. While those previous models included a broad spectrum of experimental conditions and dealt only with loading, here we reduced the scope to the molecular characteristics alone. We model and predict API suitability for nano-liposomal delivery by fixing the main experimental conditions: liposome lipid composition and size to be similar to those of Doxil® liposomes. On that basis, we add a prediction of drug leakage from the nano-liposomes during storage. The latter is critical for having pharmaceutically viable nano-drugs. The "load and leak" models were used to screen two large molecular databases in search of candidate APIs for delivery by nano-liposomes. The distribution of positive instances in both loading and leakage models was similar in the two databases screened. The screening process identified 667 molecules that were positives by both loading and leakage models (i.e., both high-loading and stable). Among them, 318 molecules received a high score in both properties and of these, 67 are FDA-approved drugs. This group of molecules, having diverse pharmacological activities, may be the basis for future liposomal drug development.
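
    For illustration, the two-model screening logic can be sketched as follows (a minimal sketch; the features, labels and random-forest choice are placeholders, not the published "load and leak" models):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(3)
        X_train = rng.normal(size=(200, 8))           # descriptor matrix (toy)
        y_load = (X_train[:, 0] > 0).astype(int)      # toy "high loading" labels
        y_leak = (X_train[:, 1] > 0).astype(int)      # toy "stable (low leak)" labels

        load_model = RandomForestClassifier(random_state=0).fit(X_train, y_load)
        leak_model = RandomForestClassifier(random_state=0).fit(X_train, y_leak)

        # Screen a database: keep only molecules positive in BOTH models,
        # i.e. predicted loadable AND retained during storage.
        X_screen = rng.normal(size=(1000, 8))
        keep = (load_model.predict(X_screen) == 1) & \
               (leak_model.predict(X_screen) == 1)
        print("candidates passing both models:", int(keep.sum()), "of", len(X_screen))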

  2. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA

    Science.gov (United States)

    Zuccaro, Antonio; Guarracino, Mario Rosario

    2015-01-01

    RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting these extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery). It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction-related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA (ncRNA) by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software tool for both NGS and array data. It helps users to characterize de-novo assembled reads obtained from NGS experiments for non-referenced organisms, and it also performs functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature and provides an opportunity to add new plugins in the future. The web application is freely available.

  3. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    Energy Technology Data Exchange (ETDEWEB)

    Coupaud, Sylvie [University of Glasgow, Centre for Rehabilitation Engineering, Department of Mechanical Engineering, Glasgow (United Kingdom); Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom); McLean, Alan N.; Allan, David B. [Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom)

    2009-10-15

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments of the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with the lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury, at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than subjects with thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than male subjects and those with thoracic SCI, respectively. (orig.)

  4. A computational method to help identify and measure metal lines in high resolution QSO spectra

    Institute of Scientific and Technical Information of China (English)

    Xi-Heng Shi; David Tytler; Jin-Liang Hou; David Kirkman; Jeffery Lee; Benjamin Ou

    2011-01-01

    A computational code is developed to help identify metal absorption lines in high resolution QSO spectra, especially in the Lyα forest. The input to the code includes a list of line central wavelengths, column densities and Doppler widths. The code then searches for candidate metal absorption systems and assesses the probability that each system could be real. The framework of the strategy we employ is described in detail, and we discuss how to estimate the errors in line profile fitting that are essential to identification. A series of artificial spectra is constructed to calibrate the performance of the code. Due to the effects of blending and noise on Voigt profile fitting, the completeness of the identification depends on the column density of the absorbers. For intermediate and strong artificial metal absorbers, more than 90% could be confirmed by the code. The results of applying the code to the real spectra of QSOs HS0757+5218 and Q0100+1300 are also presented.
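
    For illustration, one common identification heuristic, redshift-consistent doublet matching, can be sketched as follows (a minimal sketch; the tolerance and toy line list are illustrative, though the C IV and Mg II rest wavelengths are standard values):

        REST = {"CIV": (1548.195, 1550.770), "MgII": (2796.352, 2803.531)}

        def find_doublets(lines, tol=2e-4):
            """lines: observed central wavelengths (Angstrom). Yield candidates
            whose wavelength ratio matches a doublet's rest-frame ratio."""
            lines = sorted(lines)
            for i, w1 in enumerate(lines):
                for w2 in lines[i + 1:]:
                    for name, (r1, r2) in REST.items():
                        if abs(w2 / w1 - r2 / r1) < tol:
                            z = w1 / r1 - 1.0      # shared redshift of the pair
                            yield name, round(z, 4), (w1, w2)

        # Toy line list containing a C IV doublet at z = 2.1.
        observed = [4300.0, 4799.40, 4807.39, 5100.0]
        print(list(find_doublets(observed)))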

  5. [Key effect genes responding to nerve injury identified by gene ontology and computer pattern recognition].

    Science.gov (United States)

    Pan, Qian; Peng, Jin; Zhou, Xue; Yang, Hao; Zhang, Wei

    2012-07-01

    In order to screen important genes from the large volumes of gene microarray data generated after nerve injury, we combined a gene ontology (GO) method with computer pattern recognition technology to find key genes responding to nerve injury, and then verified one of the screened-out genes. Data mining and gene ontology analysis of the gene chip dataset GSE26350 were carried out using MATLAB. Cd44 was selected from the screened-out key gene molecular spectrum by comparing the genes' GO terms and their positions on the principal component score map. Functional interference was employed to disturb the normal binding of Cd44 to one of its ligands, chondroitin sulfate C (CSC), and neurite extension was observed. Gene ontology analysis showed that the top-ranked genes on the score map (marked by red *) were mainly distributed among the molecular transducer activity, receptor activity, protein binding and other molecular-function GO terms. Cd44 is one of six effector protein genes and attracted our attention because of its functional diversity. After different reagents were added to the medium to interfere with the normal binding of CSC and Cd44, varying degrees of relief from CSC's inhibition of neurite extension were observed. CSC can inhibit neurite extension through binding Cd44 on the neuronal membrane. This verifies that important genes in given physiological processes can be identified by gene ontology analysis of gene chip data.

  6. Identifying human disease genes: advances in molecular genetics and computational approaches.

    Science.gov (United States)

    Bakhtiar, S M; Ali, A; Baig, S M; Barh, D; Miyoshi, A; Azevedo, V

    2014-07-04

    The human genome project is one of the significant achievements that have provided detailed insight into our genetic legacy. During the last two decades, biomedical investigations have gathered a considerable body of evidence by detecting more than 2000 disease genes. Despite these imperative advances in the genetic understanding of various diseases, the pathogenesis of many others remains obscure. With recent advances, the laborious methodologies previously used to identify DNA variations have been replaced by direct sequencing of genomic DNA to detect genetic changes. The ability to perform such studies depends equally on the development of high-throughput and economical genotyping methods. Currently, for essentially every disease whose origin is still unknown, genetic approaches are available, which may be pedigree-dependent or -independent, with the capacity to elucidate fundamental disease mechanisms. Computer algorithms and programs for linkage analysis have formed the foundation of many disease gene detection projects; similarly, databases of clinical findings have been widely used to support diagnostic decisions in dysmorphology and general human disease. For every disease type, genome sequence variations, particularly single nucleotide polymorphisms, are mapped by comparing the genetic makeup of case and control groups. Methods that predict the effects of polymorphisms on protein stability are useful for the identification of possible disease associations, whereas structural effects can be assessed using methods that predict stability changes in proteins from sequence and/or structural information.

  7. Computed tomography vs magnetic resonance imaging for identifying acute lesions in pediatric traumatic brain injury.

    Science.gov (United States)

    Buttram, Sandra D W; Garcia-Filion, Pamela; Miller, Jeffrey; Youssfi, Mostafa; Brown, S Danielle; Dalton, Heidi J; Adelson, P David

    2015-02-01

    Pediatric traumatic brain injury (TBI) is a leading cause of morbidity and mortality in children. Computed tomography (CT) is the modality of choice to screen for brain injuries; MRI may provide more clinically relevant information. The purpose of this study was to compare lesion detection between CT and MRI after TBI. This was a retrospective cohort of children (0-21 years) with TBI between 2008 and 2010 at a Level 1 pediatric trauma center, with a head CT scan on the day of injury and a brain MRI scan within 2 weeks of injury. Agreement between CT and MRI was determined by the κ statistic and stratified by injury mechanism. One hundred five children were studied. Of these, 78% had mild TBI. The MRI scan was obtained a median of 1 day (interquartile range, 1-2) after CT. Overall, CT and MRI demonstrated poor agreement (κ=-0.083; P=.18). MRI detected a greater number of intraparenchymal lesions (n=36; 34%) compared with CT (n=16; 15%) (P<.001). Among patients with abusive head trauma, MRI detected intraparenchymal lesions in 16 (43%), compared with only 4 (11%) detected by CT (P=.03). Of the 8 subjects with a normal CT scan, 6 had abnormal lesions on MRI. Compared with CT, MRI identified significantly more intraparenchymal lesions in pediatric TBI, particularly in children with abusive head trauma. The prognostic value of identifying intraparenchymal lesions by MRI is unknown but warrants additional inquiry. The risks and benefits of early MRI (including sedation and time, but lack of radiation exposure) compared with CT should be weighed by clinicians. Copyright © 2015 by the American Academy of Pediatrics.

  8. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently, we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to only pay a cloud computing provider for what is used. Moreover, as well as financial efficiency, cloud computing is an ecologically-friendly technology, it enables efficient data-sharing and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
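
    For illustration, flagging probes that contain runs of guanine can be sketched as follows (a minimal sketch; the run-length threshold of four and the toy probe sequences are illustrative):

        import re

        G_RUN = re.compile(r"G{4,}")          # four or more consecutive guanines

        probes = {
            "probe_1": "ATCGGGGTACGTTAGCATCGATCGA",
            "probe_2": "ATCGAGCTACGTTAGCATCGATCGA",
            "probe_3": "TTGGGGGGCATGCATCGATCGGGGA",
        }

        for name, seq in probes.items():
            runs = G_RUN.findall(seq)
            if runs:
                print(f"{name}: potential quadruplex-forming g-runs {runs}")
            else:
                print(f"{name}: no g-run found")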

  9. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.

  10. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    Science.gov (United States)

    Beltran, Adriana; Salvador, James

    1997-01-01

    In this paper, we show how methods developed for solving a theoretical computer problem of graph isomorphism are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.

  11. Identifying Students' Reasons for Selecting a Computer-Mediated or Lecture Class

    Science.gov (United States)

    Kinney, D. Patrick; Robertson, Douglas F.

    2005-01-01

    Students in this study were enrolled in either an Introductory Algebra or Intermediate Algebra class taught through computer-mediated instruction or lecture. In the first year of the study, students were asked what they believed helped them learn mathematics in the instructional format in which they were enrolled. They were also asked what they…

  12. Precision of identifying cephalometric landmarks with cone beam computed tomography in vivo

    NARCIS (Netherlands)

    Hassan, B.; Nijkamp, P.; Verheij, H.; Tairie, J.; Vink, C.; van der Stelt, P.; van Beek, H.

    2013-01-01

    The study aims were to assess the precision and time required to conduct cephalometric analysis with cone-beam computed tomography (CBCT) in vivo on both three-dimensional (3D) surface models and multi-planar reformations (MPR) images. Datasets from 10 patients scanned with CBCT were used to create

  13. A Computer Vision System for Locating and Identifying Internal Log Defects Using CT Imagery

    Science.gov (United States)

    Dongping Zhu; Richard W. Conners; Frederick Lamb; Philip A. Araman

    1991-01-01

    A number of researchers have shown the ability of magnetic resonance imaging (MRI) and computed tomography (CT) imaging to detect internal defects in logs. However, if these devices are ever to play a role in the forest products industry, automatic methods for analyzing data from these devices must be developed. This paper reports research aimed at developing a...

  14. Numerical research on heat transfer and flow resistance performance of twisted trifoliate tube

    Institute of Scientific and Technical Information of China (English)

    王定标; 王宏斌; 梁珍祥

    2012-01-01

    Twisted tube heat exchangers are high-efficiency, self-supporting heat exchangers developed from the traditional shell-and-tube design: twisted tubes form a self-supporting structure on the shell side that replaces baffles. In this study, a new type of twisted trifoliate tube, whose cross-section consists of three half-ovals joined by transitional arcs, was developed based on the heat transfer enhancement mechanism of the twisted elliptic tube. The standard k-ω model was used for numerical calculation of turbulent flow in a circular tube and a twisted elliptic tube, and the relative error was acceptable for engineering applications. A numerical study of the heat transfer and flow resistance of the twisted trifoliate tube was then carried out with this turbulence model. Comparison of Nusselt numbers showed that heat transfer in the twisted trifoliate tube was enhanced relative to the twisted elliptic tube. Although the pressure drop increased considerably, the comprehensive performance of the twisted trifoliate tube was about 13% higher than that of the twisted elliptic tube. Owing to the special shape of the cross-section, the helical flow in the twisted trifoliate tube was more complicated than that in the twisted elliptic tube, and the synergy between the velocity field and the temperature gradient field was better. The effect of Reynolds number in the range 4000-20000 was also studied: as the Reynolds number increased, the Nusselt number and pressure drop increased, but the comprehensive performance worsened. The heat transfer enhancement of the twisted trifoliate tube was evident, especially at low Reynolds numbers. The influence of the inscribed circle diameter on the Nusselt number and pressure drop was greater than that of the transitional arc diameter, and the comprehensive performance of the twisted trifoliate tube was better at smaller inscribed circle and transitional arc diameters.

  15. Use of cone beam computed tomography in identifying postmenopausal women with osteoporosis.

    Science.gov (United States)

    Brasileiro, C B; Chalub, L L F H; Abreu, M H N G; Barreiros, I D; Amaral, T M P; Kakehasi, A M; Mesquita, R A

    2017-12-01

    The aim of this study was to correlate radiometric indices from cone beam computed tomography (CBCT) images with bone mineral density (BMD) in postmenopausal women. Quantitative CBCT indices can be used to screen for women with low BMD. Osteoporosis is a disease characterized by the deterioration of bone tissue and the consequent decrease in BMD and increase in bone fragility. Several studies have assessed radiometric indices in panoramic images as predictors of low BMD. Sixty postmenopausal women with indications for dental implants and CBCT evaluation were selected. Dual-energy X-ray absorptiometry (DXA) was performed, and the patients were divided into normal, osteopenia, and osteoporosis groups, according to the World Health Organization (WHO) criteria. Cross-sectional images were used to evaluate the computed tomography mandibular index (CTMI), the computed tomography index (inferior) (CTI (I)) and the computed tomography index (superior) (CTI (S)). Student's t test was used to compare the differences between the indices of the groups, and the intraclass correlation coefficient (ICC) was used to assess observer agreement. Statistical analysis showed a high degree of interobserver and intraobserver agreement for all measurements (ICC > 0.80). The mean values of CTMI, CTI (S), and CTI (I) were lower in the osteoporosis group than in the osteopenia and normal groups (p < 0.05). In comparing normal patients and women with osteopenia, there was no statistically significant difference in the mean value of CTI (I) (p = 0.075). Quantitative CBCT indices may help dentists to screen for women with low spinal and femoral bone mineral density, so that they can refer postmenopausal women for bone densitometry.

  16. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work concerning the development of a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of a process-based approach. As a result of this research and programming work, an overall functional and IT concept of software was developed for the identification and analysis of logistics costs in agricultural enterprises.

  17. Identifying a few foot-and-mouth disease virus signature nucleotide strings for computational genotyping

    Directory of Open Access Journals (Sweden)

    Xu Lizhe

    2008-06-01

    Full Text Available Abstract Background Serotypes of the foot-and-mouth disease viruses (FMDVs) have generally been determined by biological experiments. Computational genotyping is not well studied, even with the availability of whole viral genomes, due to uneven evolution among genes as well as frequent genetic recombination. Naively using sequence comparison for genotyping achieves only limited success. Results We used 129 FMDV strains with known serotype as training strains to select as many as 140 of the most serotype-specific nucleotide strings. We then constructed a linear-kernel Support Vector Machine classifier using these 140 strings. Under the leave-one-out cross validation scheme, this classifier was able to assign the correct serotype to 127 of these 129 strains, achieving 98.45% accuracy. It also assigned serotypes correctly to an independent test set of 83 other FMDV strains downloaded separately from NCBI GenBank. Conclusion Computational genotyping is much faster and much cheaper than wet-lab based biological experiments, once detailed molecular sequences are available. The high accuracy of our proposed method suggests the potential of utilizing a few signature nucleotide strings instead of whole genomes to determine the serotypes of novel FMDV strains.
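
    For illustration, the signature-string classification scheme can be sketched with scikit-learn (a minimal sketch; the signature strings, sequences and serotype labels are toy data, not the paper's 140 strings):

        import numpy as np
        from sklearn.svm import LinearSVC

        SIGNATURES = ["ACGTAC", "GGTACC", "TTGACA", "CCGGAA"]   # toy signatures

        def featurize(seq):
            # Presence/absence of each signature string in the genome.
            return [1 if s in seq else 0 for s in SIGNATURES]

        train_seqs = ["AACGTACGGT", "ACGTACTTGACA", "GGTACCCCGGAA", "CCGGAATTTT"]
        train_labels = ["O", "O", "A", "A"]                     # toy serotypes

        X = np.array([featurize(s) for s in train_seqs])
        clf = LinearSVC().fit(X, train_labels)                  # linear kernel

        print(clf.predict([featurize("GGGGTACCAACCGGAA")]))     # -> ['A']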

  18. Identifying Ghanaian Pre-Service Teachers' Readiness for Computer Use: A Technology Acceptance Model Approach

    Science.gov (United States)

    Gyamfi, Stephen Adu

    2016-01-01

    This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…

  19. SABER: a computational method for identifying active sites for new reactions.

    Science.gov (United States)

    Nosrati, Geoffrey R; Houk, K N

    2012-05-01

    A software suite, SABER (Selection of Active/Binding sites for Enzyme Redesign), has been developed for the analysis of atomic geometries in protein structures, using a geometric hashing algorithm (Barker and Thornton, Bioinformatics 2003;19:1644-1649). SABER is used to explore the Protein Data Bank (PDB) to locate proteins with a specific 3D arrangement of catalytic groups to identify active sites that might be redesigned to catalyze new reactions. As a proof-of-principle test, SABER was used to identify enzymes that have the same catalytic group arrangement present in o-succinyl benzoate synthase (OSBS). Among the highest-scoring scaffolds identified by the SABER search for enzymes with the same catalytic group arrangement as OSBS were L-Ala D/L-Glu epimerase (AEE) and muconate lactonizing enzyme II (MLE), both of which have been redesigned to become effective OSBS catalysts, demonstrated by experiments. Next, we used SABER to search for naturally existing active sites in the PDB with catalytic groups similar to those present in the designed Kemp elimination enzyme KE07. From over 2000 geometric matches to the KE07 active site, SABER identified 23 matches that corresponded to residues from known active sites. The best of these matches, with a 0.28 Å catalytic atom RMSD to KE07, was then redesigned to be compatible with the Kemp elimination using RosettaDesign. We also used SABER to search for potential Kemp eliminases using a theozyme predicted to provide a greater rate acceleration than the active site of KE07, and used Rosetta to create a design based on the proteins identified.
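
    For illustration, the geometric comparison at the core of such a search can be sketched as follows (a minimal sketch; the Kabsch superposition and RMSD are standard, but the three-atom coordinates are invented):

        import numpy as np

        def kabsch_rmsd(P, Q):
            """Optimal-superposition RMSD between two (n, 3) coordinate arrays."""
            P = P - P.mean(axis=0)
            Q = Q - Q.mean(axis=0)
            V, S, Wt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(V @ Wt))     # avoid improper rotation
            R = V @ np.diag([1.0, 1.0, d]) @ Wt    # rotation minimizing |PR - Q|
            return float(np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1))))

        # Template arrangement of three catalytic atoms vs. a rotated, noisy query.
        template = np.array([[0.0, 0.0, 0.0], [2.4, 0.0, 0.0], [1.2, 2.1, 0.0]])
        theta = 0.7
        rot = np.array([[np.cos(theta), -np.sin(theta), 0],
                        [np.sin(theta),  np.cos(theta), 0],
                        [0, 0, 1]])
        query = template @ rot.T + np.random.default_rng(4).normal(0, 0.05, (3, 3))
        print(round(kabsch_rmsd(query, template), 3))   # small -> candidate match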

  20. An Integrated Bioinformatics and Computational Biology Approach Identifies New BH3-Only Protein Candidates.

    Science.gov (United States)

    Hawley, Robert G; Chen, Yuzhong; Riz, Irene; Zeng, Chen

    2012-05-04

    In this study, we utilized an integrated bioinformatics and computational biology approach in search of new BH3-only proteins belonging to the BCL2 family of apoptotic regulators. The BH3 (BCL2 homology 3) domain mediates specific binding interactions among various BCL2 family members. It is composed of an amphipathic α-helical region of approximately 13 residues that has only a few amino acids that are highly conserved across all members. Using a generalized motif, we performed a genome-wide search for novel BH3-containing proteins in the NCBI Consensus Coding Sequence (CCDS) database. In addition to known pro-apoptotic BH3-only proteins, 197 proteins were recovered that satisfied the search criteria. These were categorized according to α-helical content and predictive binding to BCL-xL (encoded by BCL2L1) and MCL-1, two representative anti-apoptotic BCL2 family members, using position-specific scoring matrix models. Notably, the list is enriched for proteins associated with autophagy as well as a broad spectrum of cellular stress responses such as endoplasmic reticulum stress, oxidative stress, antiviral defense, and the DNA damage response. Several potential novel BH3-containing proteins are highlighted. In particular, the analysis strongly suggests that the apoptosis inhibitor and DNA damage response regulator, AVEN, which was originally isolated as a BCL-xL-interacting protein, is a functional BH3-only protein representing a distinct subclass of BCL2 family members.
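
    For illustration, PSSM scoring of candidate 13-residue windows can be sketched as follows (a minimal sketch; the matrix is a random stub with a toy Leu/Asp signal, not a model built from aligned BH3 domains):

        import numpy as np

        AAS = "ACDEFGHIKLMNPQRSTVWY"
        WIN = 13
        rng = np.random.default_rng(5)

        # Placeholder PSSM (13 positions x 20 residues); a real one would be
        # derived from aligned BH3 sequences. As a toy signal, favor Leu at
        # position 6 and Asp at position 9 of the LxxxxD-like BH3 core.
        pssm = rng.normal(0, 0.2, size=(WIN, len(AAS)))
        pssm[5, AAS.index("L")] = 2.0
        pssm[8, AAS.index("D")] = 2.0

        def score_windows(seq):
            idx = {a: i for i, a in enumerate(AAS)}
            for i in range(len(seq) - WIN + 1):
                window = seq[i:i + WIN]
                yield i, sum(pssm[j, idx[a]] for j, a in enumerate(window))

        seq = "MSSSSQLGGDKKKKKKKKK"    # toy sequence; L and D align at start 1
        best = max(score_windows(seq), key=lambda t: t[1])
        print("best window start:", best[0], "score:", round(best[1], 2))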

  1. Automated computation of arbor densities: a step toward identifying neuronal cell types.

    Science.gov (United States)

    Sümbül, Uygar; Zlateski, Aleksandar; Vishwanathan, Ashwin; Masland, Richard H; Seung, H Sebastian

    2014-01-01

    The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.
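
    The arbor-density computation itself reduces to binning arbor mass along a normalized depth coordinate. The following sketch illustrates that step under simplifying assumptions (made-up node coordinates, registration to starburst fiducials omitted):

```python
# Sketch of an arbor density profile along a normalized depth axis
# (0 and 1 = the two laminar boundaries). Not the authors' pipeline.
import numpy as np

def depth_density(nodes: np.ndarray, depth_min: float, depth_max: float,
                  n_bins: int = 20) -> np.ndarray:
    """Histogram of arbor nodes along the normalized depth (z) coordinate."""
    depth = (nodes[:, 2] - depth_min) / (depth_max - depth_min)
    hist, _ = np.histogram(depth, bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()  # fraction of arbor per depth bin

# Hypothetical arbor: 500 nodes stratifying around 60% depth.
rng = np.random.default_rng(1)
nodes = np.column_stack([rng.uniform(0, 100, 500), rng.uniform(0, 100, 500),
                         rng.normal(30.0, 2.0, 500)])
profile = depth_density(nodes, depth_min=0.0, depth_max=50.0)
print("peak depth bin:", profile.argmax())
```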

  2. Multistep greedy algorithm identifies community structure in real-world and computer-generated networks

    CERN Document Server

    Schuetz, Philipp

    2008-01-01

    We have recently introduced a multistep extension of the greedy algorithm for modularity optimization. The extension is based on the idea that merging l pairs of communities (l>1) at each iteration prevents premature condensation into few large communities. Here, an empirical formula is presented for the choice of the step width l that generates partitions with (close to) optimal modularity for 17 real-world and 1100 computer-generated networks. Furthermore, an in-depth analysis of the communities of two real-world networks (the metabolic network of the bacterium E. coli and the graph of coappearing words in the titles of papers coauthored by Martin Karplus) provides evidence that the partition obtained by the multistep greedy algorithm is superior to the one generated by the original greedy algorithm not only with respect to modularity but also according to objective criteria. In other words, the multistep extension of the greedy algorithm reduces the danger of getting trapped in local optima of modularity a...
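
    The multistep idea can be sketched directly: at each iteration, rank all connected community pairs by modularity gain and merge the top l disjoint pairs, so that l = 1 recovers the original greedy algorithm. The toy implementation below uses networkx only to evaluate modularity and is not the authors' code:

```python
# Toy multistep greedy modularity optimization: merge the l best disjoint
# community pairs per iteration instead of just the single best pair.
import networkx as nx
from networkx.algorithms.community import modularity

def multistep_greedy(G, l=2):
    parts = [{n} for n in G]
    while len(parts) > 1:
        q0 = modularity(G, parts)
        gains = []
        for i in range(len(parts)):
            for j in range(i + 1, len(parts)):
                if any(G.has_edge(u, v) for u in parts[i] for v in parts[j]):
                    merged = [p for k, p in enumerate(parts) if k not in (i, j)]
                    merged.append(parts[i] | parts[j])
                    gains.append((modularity(G, merged) - q0, i, j))
        gains.sort(reverse=True)
        used, merges = set(), []
        for dq, i, j in gains:              # take the best l disjoint merges
            if dq <= 0 or len(merges) == l:
                break
            if i not in used and j not in used:
                merges.append((i, j)); used.update((i, j))
        if not merges:
            break
        for i, j in merges:
            parts[i] = parts[i] | parts[j]
        parts = [p for k, p in enumerate(parts) if k not in {j for _, j in merges}]
    return parts

G = nx.karate_club_graph()
parts = multistep_greedy(G, l=2)
print(len(parts), "communities, Q =", round(modularity(G, parts), 3))
```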

  3. Automated computation of arbor densities: a step toward identifying neuronal cell types

    Directory of Open Access Journals (Sweden)

    Uygar eSümbül

    2014-11-01

    The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.

  4. Impact of Hybrid Intelligent Computing in Identifying Constructive Weather Parameters for Modeling Effective Rainfall Prediction

    Directory of Open Access Journals (Sweden)

    M. Sudha

    2015-12-01

    Uncertain atmosphere is a prevalent factor affecting the existing prediction approaches. Rough set and fuzzy set theories, as proposed by Pawlak and Zadeh, have become an effective tool for handling vagueness and fuzziness in real-world scenarios. This research work describes the impact of a Hybrid Intelligent System (HIS) for strategic decision support in meteorology. In this research a novel exhaustive-search-based Rough set reduct Selection using Genetic Algorithm (RSGA) is introduced to identify the significant input feature subset. The proposed model could identify the most effective weather parameters more efficiently than other existing input techniques. In the model evaluation phase two adaptive techniques were constructed and investigated. The proposed Artificial Neural Network based on Back Propagation learning (ANN-BP) and Adaptive Neuro Fuzzy Inference System (ANFIS) were compared with the existing Fuzzy Unordered Rule Induction Algorithm (FURIA), Structural Learning Algorithm on Vague Environment (SLAVE) and Particle Swarm Optimization (PSO). The proposed rainfall prediction models performed best when trained with the input generated using RSGA. A meticulous comparison of the performance indicates the ANN-BP model as a suitable HIS for effective rainfall prediction. The ANN-BP achieved 97.46% accuracy with a nominal misclassification rate of 0.0254.
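
    The feature-selection step can be pictured as a genetic algorithm evolving binary feature masks. In the toy sketch below, the rough-set reduct criterion of RSGA is replaced by cross-validated classifier accuracy on synthetic data, purely for illustration:

```python
# Toy GA-based feature-subset selection (illustrative stand-in for RSGA).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=300, n_features=12, n_informative=4,
                           random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a classifier on the selected features."""
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]           # truncation selection
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, X.shape[1])             # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.05          # bit-flip mutation
        children.append(child ^ flip)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```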

  5. Influence of intracanal post on apical periodontitis identified by cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Estrela, Carlos; Porto, Olavo Cesar Lyra; Rodrigues, Cleomar Donizeth [Federal University of Goias (UFG), Goiania, GO (Brazil). Dental School; Bueno, Mike Reis [University of Cuiaba (UNIC), MT (Brazil). Dental School; Pecora, Jesus Djalma, E-mail: estrela3@terra.com.b [University of Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Dental School

    2009-07-01

    The determination of the success of endodontic treatment has often been discussed based on outcomes obtained by periapical radiography. The aim of this study was to verify the influence of intracanal posts on apical periodontitis detected by cone-beam computed tomography (CBCT). A consecutive sample of 1,020 images (periapical radiographs and CBCT scans) taken from 619 patients (245 men; mean age, 50.1 years) between February 2008 and September 2009 was used in this study. Presence and intracanal post length (short, medium and long) were associated with apical periodontitis (AP). The chi-square test was used for statistical analyses. The significance level was set at p<0.01. The kappa value was used to assess examiner variability. From a total of 591 intracanal posts, AP was observed on periapical radiographs in 15.06%, 18.78% and 7.95% of the short, medium and long lengths, respectively (p=0.466). For the same post lengths, AP was observed on CBCT scans in 24.20%, 26.40% and 11.84%, respectively (p=0.154). From the total of 1,020 teeth used in this study, AP was detected in 397 (38.92%) by periapical radiography and in 614 (60.19%) by CBCT scans (p<0.001). The distribution of intracanal posts in the different dental groups showed the highest prevalence in maxillary anterior teeth (54.79%). Intracanal post length did not influence AP. AP was detected more frequently when the CBCT method was used. (author)

  6. A computational study of the Warburg effect identifies metabolic targets inhibiting cancer migration.

    Science.gov (United States)

    Yizhak, Keren; Le Dévédec, Sylvia E; Rogkoti, Vasiliki Maria; Baenke, Franziska; de Boer, Vincent C; Frezza, Christian; Schulze, Almut; van de Water, Bob; Ruppin, Eytan

    2014-08-01

    Over the last decade, the field of cancer metabolism has mainly focused on studying the role of tumorigenic metabolic rewiring in supporting cancer proliferation. Here, we perform the first genome-scale computational study of the metabolic underpinnings of cancer migration. We build genome-scale metabolic models of the NCI-60 cell lines that capture the Warburg effect (aerobic glycolysis) typically occurring in cancer cells. The extent of the Warburg effect in each of these cell line models is quantified by the ratio of glycolytic to oxidative ATP flux (AFR), which is found to be highly positively associated with cancer cell migration. We hence predicted that targeting genes that mitigate the Warburg effect by reducing the AFR may specifically inhibit cancer migration. By testing the anti-migratory effects of silencing the 17 top predicted genes in four breast and lung cancer cell lines, we find that up to 13 of these novel predictions significantly attenuate cell migration, either in all cell lines or in one cell line only, while having almost no effect on cell proliferation. Furthermore, in accordance with the predictions, a significant reduction is observed in the ratio between experimentally measured ECAR and OCR levels following these perturbations. Inhibiting anti-migratory targets is a promising future avenue in treating cancer, since it may decrease the cytotoxicity-related side effects that plague current anti-proliferative treatments. Furthermore, it may reduce cytotoxicity-related clonal selection of more aggressive cancer cells and the likelihood of emerging resistance. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.

  7. Identifying Mechanisms Behind the Tullio Phenomenon: a Computational Study Based on First Principles.

    Science.gov (United States)

    Grieser, Bernhard J; Kleiser, Leonhard; Obrist, Dominik

    2016-04-01

    Patients with superior canal dehiscence (SCD) suffer from events of dizziness and vertigo in response to sound, also known as Tullio phenomenon (TP). The present work seeks to explain the fluid-dynamical mechanisms behind TP. In accordance with the so-called third window theory, we developed a computational model for the vestibular signal pathway between stapes and SCD. It is based on first principles and accounts for fluid-structure interactions arising between endolymph, perilymph, and membranous labyrinth. The simulation results reveal a wave propagation phenomenon in the membranous canal, leading to two flow phenomena within the endolymph which are in close interaction. First, the periodic deformation of the membranous labyrinth causes oscillating endolymph flow which forces the cupula to oscillate in phase with the sound stimulus. Second, these primary oscillations of the endolymph induce a steady flow component by a phenomenon known as steady streaming. We find that this steady flow of the endolymph is typically in the ampullofugal direction. This flow leads to a quasi-steady deflection of the cupula which increases until the driving forces of the steady streaming are balanced by the elastic reaction forces of the cupula, such that the cupula attains a constant deflection amplitude which lasts as long as the sound stimulus. Both response types have been observed in the literature. In a sensitivity study, we obtain an analytical fit that matches our simulation results very well in a relevant parameter range. Finally, we correlate the corresponding eye response (vestibulo-ocular reflex) with the fluid dynamics by a simplified model of lumped system constants. The results reveal a "sweet spot" for TP within the audible sound spectrum. We find that the underlying mechanisms which lead to TP originate primarily from Reynolds stresses in the fluid, which are weaker at lower sound frequencies.

  8. Computational intelligent gait-phase detection system to identify pathological gait.

    Science.gov (United States)

    Senanayake, Chathuri M; Senanayake, S M N Arosha

    2010-09-01

    An intelligent gait-phase detection algorithm based on kinematic and kinetic parameters is presented in this paper. The gait parameters do not vary distinctly for each gait phase; therefore, it is complex to differentiate gait phases with respect to a threshold value. To overcome this intricacy, the concept of fuzzy logic was applied to detect gait phases with respect to fuzzy membership values. A real-time data-acquisition system was developed consisting of four force-sensitive resistors and two inertial sensors to obtain foot-pressure patterns and knee flexion/extension angle, respectively. The detected gait phases can be further analyzed to identify abnormality occurrences, and hence the system is applicable for determining accurate timing for feedback. The large amount of data required for quality gait analysis necessitates the utilization of information technology to store, manage, and extract required information. Therefore, a software application was developed for real-time acquisition of sensor data, data processing, database management, and a user-friendly graphical user interface as a tool to simplify the task of clinicians. The experiments carried out to validate the proposed system are presented along with the analysis of results for normal and pathological walking patterns.
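
    The fuzzy-logic step can be illustrated with triangular membership functions over foot pressure and knee angle and a handful of rules; the membership breakpoints and rules below are hypothetical, not the calibrated ones from the paper:

```python
# Toy fuzzy gait-phase detector: fuzzify sensor readings, evaluate a small
# rule base with min/max operators, pick the phase with highest activation.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def gait_phase(heel_pressure, toe_pressure, knee_angle_deg):
    high_heel = tri(heel_pressure, 0.4, 1.0, 1.6)
    high_toe = tri(toe_pressure, 0.4, 1.0, 1.6)
    flexed = tri(knee_angle_deg, 30, 60, 90)
    extended = tri(knee_angle_deg, -10, 0, 30)
    rules = {
        "heel-strike": min(high_heel, extended),
        "mid-stance": min(high_heel, high_toe),
        "toe-off": min(high_toe, flexed),
        "swing": min(1 - high_heel, 1 - high_toe, flexed),
    }
    return max(rules, key=rules.get), rules

phase, activations = gait_phase(heel_pressure=0.9, toe_pressure=0.1,
                                knee_angle_deg=5)
print(phase, {k: round(v, 2) for k, v in activations.items()})
```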

  9. Identifying and establishing consensus on the most important safety features of GP computer systems: e-Delphi study

    Directory of Open Access Journals (Sweden)

    Anthony Avery

    2005-03-01

    Our objective was to identify and establish consensus on the most important safety features of GP computer systems, with a particular emphasis on medicines management. We used a two-round electronic Delphi survey, completed by a 21-member multidisciplinary expert panel, all from the UK. The main outcome measure was the percentage agreement of the panel members on the importance of the presence of a number of different safety features (presented as clinical statements) on GP computer systems. We found 90% or greater agreement on the importance of 32 (58%) statements. These statements, indicating issues considered to be of considerable importance (rated as important or very important), related to: computerised alerts; the need to avoid spurious alerts; making it difficult to override critical alerts; having audit trails of such overrides; support for safe repeat prescribing; an effective computer-user interface; the importance of call and recall management; and the need to be able to run safety reports. The high level of agreement among the expert panel members indicates clear themes and priorities that need to be addressed in any further improvement of safety features in primary care computing systems.

  10. Light-induced systemic regulation of photosynthesis in primary and trifoliate leaves of Phaseolus vulgaris: effects of photosynthetic photon flux density (PPFD) versus spectrum.

    Science.gov (United States)

    Murakami, K; Matsuda, R; Fujiwara, K

    2014-01-01

    The objectives of this work using Phaseolus vulgaris were to examine whether the light spectrum incident on mature primary leaves (PLs) plays a role in the leaf-to-leaf systemic regulation of photosynthetic characteristics in developing trifoliate leaves (TLs), and to investigate the relative importance of spectrum and photosynthetic photon flux density (PPFD) in light-induced systemic regulation. Systemic regulation was induced by altering the PPFD and the spectrum of light incident on PLs using a shading treatment and lighting treatments with white, blue, green or red light-emitting diodes (LEDs). Photosynthetic characteristics were evaluated by measuring the light-limited and light-saturated net photosynthetic rates and the amounts of nitrogen (N), chlorophyll (Chl) and ribulose-1,5-bisphosphate carboxylase/oxygenase (Rubisco; EC 4.1.1.39). Shading of PLs decreased the amounts of N, Chl and Rubisco in TLs and tended to decrease the photosynthetic rates. However, we observed no systemic effects induced by the light spectrum on PLs in this study, except that a higher amount of Rubisco in TLs was observed when the PLs were irradiated with blue LEDs. Our results imply that photoreceptors in mature leaves have little influence on the photosynthetic rates and the amounts of N and Chl of developing leaves through systemic regulation, although the possibility of an effect of blue light irradiation on the amount of Rubisco cannot be ruled out. Based on these results, we concluded that the light spectrum incident on mature leaves has little systemic effect on the photosynthetic characteristics of developing leaves and that light-induced systemic regulation is largely accounted for by PPFD.

  11. Arbuscular mycorrhiza mediates glomalin-related soil protein production and soil enzyme activities in the rhizosphere of trifoliate orange grown under different P levels.

    Science.gov (United States)

    Wu, Qiang-Sheng; Li, Yan; Zou, Ying-Ning; He, Xin-Hua

    2015-02-01

    Glomalin-related soil protein (GRSP) is beneficial to soil and plants and is affected by various factors. To address whether mycorrhizal-induced GRSP and relevant soil enzymes depend on external P levels, a pot study evaluated effects of the arbuscular mycorrhizal fungus (AMF) Funneliformis mosseae on GRSP production and soil enzyme activities. Three GRSP categories, as easily-extractable GRSP (EE-GRSP), difficultly-extractable GRSP (DE-GRSP), and total (EE-GRSP + DE-GRSP) GRSP (T-GRSP), were analyzed, together with five enzyme activities (β-glucosidase, catalase, peroxidase, phosphatase, polyphenol oxidase) in the rhizosphere of trifoliate orange (Poncirus trifoliata) grown under 0, 3, and 30 mM KH2PO4 in a sand substrate. After 4 months, root AM colonization and substrate hyphal length decreased with increasing P levels. Shoot, root, and total biomass production was significantly increased by AM colonization, regardless of P levels, but more profound under 0 mM P than under 30 mM KH2PO4. In general, production of these three GRSP categories under 0 or 30 mM KH2PO4 was similar in non-mycorrhizosphere but decreased in mycorrhizosphere. Mycorrhization significantly increased the production of EE-GRSP, DE-GRSP and T-GRSP, soil organic carbon (SOC), and activity of substrate β-glucosidase, catalase, peroxidase, and phosphatase, but decreased polyphenol oxidase activity, irrespective of P levels. Production of EE-GRSP, DE-GRSP, and T-GRSP significantly positively correlated with SOC and β-glucosidase, catalase, and peroxidase activity, negatively with polyphenol oxidase activity, but not with hyphal length or phosphatase activity. These results indicate that AM-mediated production of GRSP and relevant soil enzyme activities may not depend on external P concentrations.

  12. Disruption of mycorrhizal extraradical mycelium and changes in leaf water status and soil aggregate stability in rootbox-grown trifoliate orange

    Directory of Open Access Journals (Sweden)

    Ying-Ning eZou

    2015-03-01

    Arbuscular mycorrhizas possess a well-developed extraradical mycelium (ERM) network that extends into the surrounding soil for better acquisition of water and nutrients, besides aiding soil aggregation. Distinctions in ERM functioning were studied in a rootbox system, which consisted of root+hyphae and root-free hyphae compartments separated by 37-μm nylon mesh with an air gap. Trifoliate orange (Poncirus trifoliata) seedlings were inoculated with Funneliformis mosseae in the root+hyphae compartment, and the ERM network was established between the two compartments. The ERM network of the air gap was disrupted 8 h before harvest (one-time disruption) or repeatedly during seedling acclimation (multiple disruptions). Our results showed that mycorrhizal inoculation induced a significant increase in growth (plant height, stem diameter, and leaf, stem, and root biomass) and physiological characters (leaf relative water content, leaf water potential, and transpiration rate), irrespective of ERM status. Easily-extractable glomalin-related soil protein (EE-GRSP) and total GRSP (T-GRSP) concentrations and mean weight diameter (MWD, an indicator of soil aggregate stability) were significantly higher in the mycorrhizosphere of the root+hyphae and root-free hyphae compartments than in the non-mycorrhizosphere. One-time disruption of the ERM network did not influence plant growth and soil properties but notably decreased leaf water status. Periodic disruption of the ERM network at weekly intervals markedly inhibited the mycorrhizal effects on plant growth, leaf water, GRSP production, and MWD in the root+hyphae and hyphae chambers. EE-GRSP was the GRSP fraction most responsive to changes in leaf water and MWD under root+hyphae and hyphae conditions. This suggests that periodic disruption of the ERM network was more impactful than one-time disruption with regard to leaf water, plant growth, and aggregate stability responses, implying that the ERM network aids the host plant metabolically.

  13. A novel computational method identifies intra- and inter-species recombination events in Staphylococcus aureus and Streptococcus pneumoniae.

    Directory of Open Access Journals (Sweden)

    Lisa Sanguinetti

    Advances in high-throughput DNA sequencing technologies have determined an explosion in the number of sequenced bacterial genomes. Comparative sequence analysis frequently reveals evidence of homologous recombination occurring with different mechanisms and rates in different species, but the large-scale use of computational methods to identify recombination events is hampered by their high computational costs. Here, we propose a new method to identify recombination events in large datasets of whole genome sequences. Using a filtering procedure on the gene conservation profiles of a test genome against a panel of strains, this algorithm identifies sets of contiguous genes acquired by homologous recombination. The locations of the recombination breakpoints are determined using a statistical test that is able to account for the differences in the natural rate of evolution between different genes. The algorithm was tested on a dataset of 75 genomes of Staphylococcus aureus and 50 genomes comprising different streptococcal species, and was able to detect intra-species recombination events in S. aureus and in Streptococcus pneumoniae. Furthermore, we found evidence of an inter-species exchange of genetic material between S. pneumoniae and Streptococcus mitis, a closely related commensal species that colonizes the same ecological niche. The method has been implemented in an R package, Reco, which is freely available from the supplementary material, and provides a rapid screening tool to investigate recombination on a genome-wide scale from sequence data.

  14. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
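
    The window-based detection idea can be sketched in a few lines: smooth per-probe log2 ratios with a sliding window, then flag the top 10% of each chromosome by absolute deviation. The fixed-quantile rule below is a simplification of swatCGH's adaptive thresholds, shown on simulated data:

```python
# Sketch of sliding-window CNA flagging on simulated log2 ratios
# (simplified threshold rule, not swatCGH's grid-distributed pipeline).
import numpy as np

def flag_windows(log2_ratios: np.ndarray, window: int = 5,
                 fraction: float = 0.10) -> np.ndarray:
    """Boolean mask of probes in the top `fraction` by |smoothed ratio|."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(log2_ratios, kernel, mode="same")
    cutoff = np.quantile(np.abs(smoothed), 1.0 - fraction)
    return np.abs(smoothed) >= cutoff

rng = np.random.default_rng(3)
ratios = rng.normal(0.0, 0.15, 200)   # one simulated chromosome
ratios[80:100] += 0.8                 # simulated single-copy gain
print("flagged probes:", np.flatnonzero(flag_windows(ratios)))
```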

  15. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Science.gov (United States)

    Roberts, Ian; Carter, Stephanie A.; Scarpini, Cinzia G.; Karagavriilidou, Konstantina; Barna, Jenny C. J.; Calleja, Mark; Coleman, Nicholas

    2012-01-01

    Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest. PMID:23008709

  16. Computational methods using genome-wide association studies to predict radiotherapy complications and to identify correlative molecular processes

    Science.gov (United States)

    Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.

    2017-02-01

    The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method results in better predictive performance compared with existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis are used to identify key biological processes and proteins that were plausible based on other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.
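
    The screening-then-forest idea can be sketched as below. Note this is only loosely analogous to PRFR, which pre-conditions the outcome variable rather than merely pre-screening SNPs; the data here are synthetic:

```python
# Loose sketch of a polygenic risk model: univariate SNP screening followed
# by a random forest on the retained SNPs (not the PRFR algorithm itself).
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
n_patients, n_snps = 368, 5000
X = rng.integers(0, 3, size=(n_patients, n_snps)).astype(float)  # allele counts
risk = X[:, :30].sum(axis=1) * 0.05 + rng.normal(0, 1, n_patients)

# Univariate screen: keep the SNPs with the smallest association p-values.
pvals = np.array([pearsonr(X[:, j], risk)[1] for j in range(n_snps)])
keep = np.argsort(pvals)[:300]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:, keep], risk)
print("training R^2 on retained SNPs:", round(model.score(X[:, keep], risk), 2))
```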

  17. Computational sensitivity analysis to identify muscles that can mechanically contribute to shoulder deformity following brachial plexus birth palsy.

    Science.gov (United States)

    Crouch, Dustin L; Plate, Johannes F; Li, Zhongyu; Saul, Katherine R

    2014-02-01

    Two mechanisms, strength imbalance or impaired longitudinal muscle growth, potentially cause osseous and postural shoulder deformity in children with brachial plexus birth palsy. Our objective was to determine which muscles, via either deformity mechanism, were mechanically capable of producing forces that could promote shoulder deformity. In an upper limb computational musculoskeletal model, we simulated strength imbalance by allowing each muscle crossing the shoulder to produce 30% of its maximum force. To simulate impaired longitudinal muscle growth, the functional length of each muscle crossing the shoulder was reduced by 30%. We performed a sensitivity analysis to identify muscles that, through either simulated deformity mechanism, increased the posteriorly directed, compressive glenohumeral joint force consistent with osseous deformity or reduced the shoulder external rotation or abduction range of motion consistent with postural deformity. Most of the increase in the posterior glenohumeral joint force by the strength imbalance mechanism was caused by the subscapularis, latissimus dorsi, and infraspinatus. Posterior glenohumeral joint force increased the most owing to impaired growth of the infraspinatus, subscapularis, and long head of biceps. Through the strength imbalance mechanism, the subscapularis, anterior deltoid, and pectoralis major muscles reduced external shoulder rotation by 28°, 17°, and 10°, respectively. Shoulder motion was reduced by 40° to 56° owing to impaired growth of the anterior deltoid, subscapularis, and long head of triceps. The infraspinatus, subscapularis, latissimus dorsi, long head of biceps, anterior deltoid, pectoralis major, and long head of triceps were identified in this computational study as being the most capable of producing shoulder forces that may contribute to shoulder deformity following brachial plexus birth palsy. The muscles mechanically capable of producing deforming shoulder forces should be the focus of

  18. Identifying shared genetic structure patterns among Pacific Northwest forest taxa: insights from use of visualization tools and computer simulations.

    Directory of Open Access Journals (Sweden)

    Mark P Miller

    BACKGROUND: Identifying causal relationships in phylogeographic and landscape genetic investigations is notoriously difficult, but can be facilitated by use of multispecies comparisons. METHODOLOGY/PRINCIPAL FINDINGS: We used data visualizations to identify common spatial patterns within single lineages of four taxa inhabiting Pacific Northwest forests (northern spotted owl: Strix occidentalis caurina; red tree vole: Arborimus longicaudus; southern torrent salamander: Rhyacotriton variegatus; and western white pine: Pinus monticola). Visualizations suggested that, despite occupying the same geographical region and habitats, species responded differently to prevailing historical processes. S. o. caurina and P. monticola demonstrated directional patterns of spatial genetic structure where genetic distances and diversity were greater in southern versus northern locales. A. longicaudus and R. variegatus displayed opposite patterns where genetic distances were greater in northern versus southern regions. Statistical analyses of directional patterns subsequently confirmed observations from visualizations. Based upon regional climatological history, we hypothesized that observed latitudinal patterns may have been produced by range expansions. Subsequent computer simulations confirmed that directional patterns can be produced by expansion events. CONCLUSIONS/SIGNIFICANCE: We discuss phylogeographic hypotheses regarding historical processes that may have produced observed patterns. Inferential methods used here may become increasingly powerful as detailed simulations of organisms and historical scenarios become plausible. We further suggest that inter-specific comparisons of historical patterns take place prior to drawing conclusions regarding effects of current anthropogenic change within landscapes.

  19. New breast cancer prognostic factors identified by computer-aided image analysis of HE stained histopathology images.

    Science.gov (United States)

    Chen, Jia-Mei; Qu, Ai-Ping; Wang, Lin-Wei; Yuan, Jing-Ping; Yang, Fang; Xiang, Qing-Ming; Maskey, Ninu; Yang, Gui-Fang; Liu, Juan; Li, Yan

    2015-05-29

    Computer-aided image analysis (CAI) can help objectively quantify morphologic features of hematoxylin-eosin (HE) histopathology images and provide potentially useful prognostic information on breast cancer. We performed a CAI workflow on 1,150 HE images from 230 patients with invasive ductal carcinoma (IDC) of the breast. We used a pixel-wise support vector machine classifier for tumor nests (TNs)-stroma segmentation, and a marker-controlled watershed algorithm for nuclei segmentation. 730 morphologic parameters were extracted after segmentation, and 12 parameters identified by Kaplan-Meier analysis were significantly associated with 8-year disease free survival (P < 0.05 for all). Moreover, four image features including TNs feature (HR 1.327, 95%CI [1.001-1.759], P = 0.049), TNs cell nuclei feature (HR 0.729, 95%CI [0.537-0.989], P = 0.042), TNs cell density (HR 1.625, 95%CI [1.177-2.244], P = 0.003), and stromal cell structure feature (HR 1.596, 95%CI [1.142-2.229], P = 0.006) were identified by multivariate Cox proportional hazards model to be new independent prognostic factors. The results indicated that CAI can assist the pathologist in extracting prognostic information from HE histopathology images for IDC. The TNs feature, TNs cell nuclei feature, TNs cell density, and stromal cell structure feature could be new prognostic factors.

  20. Advanced computational biology methods identify molecular switches for malignancy in an EGF mouse model of liver cancer.

    Directory of Open Access Journals (Sweden)

    Philip Stegmaier

    The molecular causes by which the epidermal growth factor receptor tyrosine kinase induces malignant transformation are largely unknown. To better understand EGF's transforming capacity, whole-genome scans were applied to a transgenic mouse model of liver cancer and subjected to advanced methods of computational analysis to construct de novo gene regulatory networks based on a combination of sequence analysis and entrained graph-topological algorithms. Here we identified transcription factors, processes, key nodes and molecules to connect as yet unknown interacting partners at the level of protein-DNA interaction. Many of these could be confirmed by electromobility band shift assay at recognition sites of gene-specific promoters and by western blotting of nuclear proteins. A novel cellular regulatory circuitry could therefore be proposed that connects cell cycle regulated genes with components of the EGF signaling pathway. Promoter analysis of differentially expressed genes suggested that the majority of regulated transcription factors display specificity to either the pre-tumor or the tumor state. A subsequent search for signal transduction key nodes upstream of the identified transcription factors and their targets suggested that the insulin-like growth factor pathway renders the tumor cells independent of EGF receptor activity. Notably, expression of IGF2, in addition to many components of this pathway, was highly upregulated in tumors. Together, we propose a switch in autocrine signaling to foster tumor growth that was initially triggered by EGF, and demonstrate the knowledge gained from promoter analysis combined with upstream key node identification.

  1. Parotid Incidentaloma Identified by Positron Emission/Computed Tomography: When to Consider Diagnoses Other than Warthin Tumor

    Directory of Open Access Journals (Sweden)

    Bothe, Carolina

    2015-01-01

    Introduction Parotid gland incidentalomas (PGIs) are unexpected hypermetabolic foci in the parotid region that can be found when scanning with whole-body positron emission/computed tomography (PET/CT). These deposits are most commonly due to benign lesions such as Warthin tumor. Objective The aim of this study was to determine the prevalence of PGIs identified in PET/CT scans and to assess the role of smoking in their etiology. Methods We retrospectively reviewed all PET/CT scans performed at our center in search of PGIs and identified smoking status and standardized uptake value (SUVmax) in each case. We also analyzed the database of parotidectomies performed in our department in the previous 10 years and focused on the pathologic diagnosis and the presence or absence of smoking in each case. Results Sixteen cases of PGIs were found in 4,250 PET/CT scans, accounting for 0.4%. The average SUVmax was 6.5 (range 2.8 to 16). Cytology was performed in five patients; it was benign in four cases and inconclusive in one case. Thirteen patients had a history of smoking. Of the parotidectomies performed in our center with a diagnosis of Warthin tumor, we identified a history of smoking in 93.8% of those patients. Conclusions The prevalence of PGIs on PET/CT was similar to that reported by other authors. Warthin tumor is frequently diagnosed among PGIs on PET/CT, and it has a strong relationship with smoking. We suggest that a diagnosis other than Warthin tumor should be considered for PGIs in nonsmokers.

  2. Incidence of anatomical variations and disease of the maxillary sinuses as identified by cone beam computed tomography: a systematic review.

    Science.gov (United States)

    Vogiatzi, Theodosia; Kloukos, Dimitrios; Scarfe, William C; Bornstein, Michael M

    2014-01-01

    To analyze available evidence on the incidence of anatomical variations or disease of the maxillary sinuses as identified by cone beam computed tomography (CBCT) in dentistry. A focused question was developed to search the electronic databases MEDLINE, EMBASE, the Cochrane Oral Health Group Trials Register, and CENTRAL and identify all relevant papers published between 1980 and January 19, 2013. Unpublished literature at ClinicalTrials.gov, in the National Research Register, and in the Pro-Quest Dissertation Abstracts and Thesis database was also included. Studies were included irrespective of language. These results were supplemented by hand and gray literature searches. Twenty-two studies were identified. Twenty were retrospective cohort studies, one was a prospective cohort study, and one was a case control study. The main indication for CBCT was dental implant treatment planning, and the majority of studies used a small field of view for imaging. The most common anatomical variations included increased thickness of the sinus membrane, the presence of sinus septa, and pneumatization. Reported sinus disease frequency varied widely, ranging from 14.3% to 82%. There was a wide range in the reported prevalence of mucosal thickening related to apical pathology, the degree of lumenal opacification, features of sinusitis, and the presence of retention cysts and polyps. More pathologic findings in the maxillary sinus were reported in men than in women, and the medial wall and sinus floor were most frequently affected. CBCT is used primarily to evaluate bony anatomy and to screen for overt pathology of the maxillary sinuses prior to dental implant treatment. Differences in the classification of mucosal findings are problematic in the consistent and valid assessment of health and disease of the maxillary sinus.

  3. Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: a review of the research

    NARCIS (Netherlands)

    Kreijns, K.; Kirschner, P.A.; Jochems, W.

    2003-01-01

    Computer-mediated world-wide networks have enabled a shift from contiguous learning groups to asynchronous distributed learning groups utilizing computer-supported collaborative learning environments. Although these environments can support communication and collaboration, both research and field ob

  4. Identifying Trainees' Computer Self-Efficacy in Relation to Some Variables: The Case of Turkish EFL Trainees

    Science.gov (United States)

    Inal, Sevim

    2015-01-01

    The purpose of this study was to define the self-efficacy perception of Turkish ELT students and examine the relationship between their self-efficacy and such variables as grade level, computer ownership, first time computer use, and frequency of internet and computer use. The participants are 305 Turkish ELT trainees at Dokuz Eylul University,…

  5. Data Mining Techniques for Identifying Students at Risk of Failing a Computer Proficiency Test Required for Graduation

    Science.gov (United States)

    Tsai, Chih-Fong; Tsai, Ching-Tzu; Hung, Chia-Sheng; Hwang, Po-Sen

    2011-01-01

    Enabling undergraduate students to develop basic computing skills is an important issue in higher education. As a result, some universities have developed computer proficiency tests, which aim to assess students' computer literacy. Generally, students are required to pass such tests in order to prove that they have a certain level of computer…

  6. A computationally identified compound antagonizes excess FGF-23 signaling in renal tubules and a mouse model of hypophosphatemia.

    Science.gov (United States)

    Xiao, Zhousheng; Riccardi, Demian; Velazquez, Hector A; Chin, Ai L; Yates, Charles R; Carrick, Jesse D; Smith, Jeremy C; Baudry, Jerome; Quarles, L Darryl

    2016-11-22

    Fibroblast growth factor-23 (FGF-23) interacts with a binary receptor complex composed of α-Klotho (α-KL) and FGF receptors (FGFRs) to regulate phosphate and vitamin D metabolism in the kidney. Excess FGF-23 production, which causes hypophosphatemia, is genetically inherited or occurs with chronic kidney disease. Among other symptoms, hypophosphatemia causes vitamin D deficiency and the bone-softening disorder rickets. Current therapeutics that target the receptor complex have limited utility clinically. Using a computationally driven, structure-based, ensemble docking and virtual high-throughput screening approach, we identified four novel compounds predicted to selectively inhibit FGF-23-induced activation of the FGFR/α-KL complex. Additional modeling and functional analysis found that Zinc13407541 bound to FGF-23 and disrupted its interaction with the FGFR1/α-KL complex; experiments in a heterologous cell expression system showed that Zinc13407541 selectively inhibited α-KL-dependent FGF-23 signaling. Zinc13407541 also inhibited FGF-23 signaling in isolated renal tubules ex vivo and partially reversed the hypophosphatemic effects of excess FGF-23 in a mouse model. These chemical probes provide a platform to develop lead compounds to treat disorders caused by excess FGF-23.

  7. A computational approach for identifying the chemical factors involved in the glycosaminoglycans-mediated acceleration of amyloid fibril formation.

    Directory of Open Access Journals (Sweden)

    Elodie Monsellier

    BACKGROUND: Amyloid fibril formation is the hallmark of many human diseases, including Alzheimer's disease, type II diabetes and amyloidosis. Amyloid fibrils deposit in the extracellular space and generally co-localize with the glycosaminoglycans (GAGs) of the basement membrane. GAGs have been shown to accelerate the formation of amyloid fibrils in vitro for a number of protein systems. The large amount of data accumulated so far has created the grounds for the construction of a database on the effects of a number of GAGs on different proteins. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we have constructed such a database and have used a computational approach that combines single-parameter and multivariate analyses to identify the main chemical factors that determine the GAG-induced acceleration of amyloid formation. We show that the GAG accelerating effect is mainly governed by three parameters that account for three-fourths of the observed experimental variability: the GAG sulfation state, the solute molarity, and the ratio of protein and GAG molar concentrations. We then combined these three parameters into a single equation that predicts, with reasonable accuracy, the acceleration provided by a given GAG in a given condition. CONCLUSIONS/SIGNIFICANCE: In addition to shedding light on the chemical determinants of the protein:GAG interaction and to providing a novel mathematical predictive tool, our findings highlight the possibility that GAGs may not have such an accelerating effect on protein aggregation under the conditions existing in the basement membrane, given the values of salt molarity and protein:GAG molar ratio existing under such conditions.
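
    A predictor of this general shape can be illustrated by regressing acceleration on the three factors. The sketch below fits a linear model on made-up data; the functional form, values and coefficients are illustrative, not the published equation:

```python
# Illustrative three-parameter predictor of GAG-induced acceleration,
# fitted on synthetic data (not the equation from the paper).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
n = 60
sulfation = rng.integers(0, 4, n).astype(float)  # sulfates per disaccharide
molarity = rng.uniform(0.05, 0.5, n)             # solute molarity (M)
ratio = rng.uniform(0.1, 10.0, n)                # protein:GAG molar ratio
X = np.column_stack([sulfation, np.log(molarity), np.log(ratio)])

# Synthetic response with invented effect sizes, plus noise.
accel = (1.0 + 0.8 * sulfation - 0.5 * np.log(molarity)
         - 0.3 * np.log(ratio) + rng.normal(0, 0.2, n))

model = LinearRegression().fit(X, accel)
print("R^2:", round(model.score(X, accel), 2))
print("predicted acceleration:",
      model.predict([[3.0, np.log(0.15), np.log(1.0)]]))
```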

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  9. Identifying unknown minerals and compounds from X-ray diffraction patterns using the Johnson and Vand FORTRAN 4 computer program

    Science.gov (United States)

    Kyte, F. T.

    1976-01-01

    Automated computer identification of minerals and compounds from unknown samples is provided along with detailed instructions and worked examples for use in graduate level courses in mineralogy and X-ray analysis applications.

  10. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Science.gov (United States)

    Durstewitz, Daniel

    2017-06-01

    The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover relevant aspects
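
    The generative model class discussed can be sketched directly: a latent piecewise-linear RNN with a ReLU nonlinearity driving Gaussian observations. The simulation below illustrates the model equations only, with hypothetical parameter values; the paper's contribution, the EM-based estimation, is not reproduced here:

```python
# Minimal PLRNN-style generative model:
#   z_{t+1} = A z_t + W phi(z_t) + h + latent noise, phi(z) = max(z, 0)
#   y_t     = B z_t + observation noise
import numpy as np

rng = np.random.default_rng(0)
dz, dy, T = 3, 5, 200
A = np.diag(rng.uniform(0.6, 0.9, dz))      # diagonal auto-regression
W = 0.2 * rng.normal(size=(dz, dz))
np.fill_diagonal(W, 0.0)                    # off-diagonal piecewise coupling
h = rng.normal(scale=0.1, size=dz)
B = rng.normal(size=(dy, dz))

z = np.zeros((T, dz))
y = np.zeros((T, dy))
for t in range(T - 1):
    phi = np.maximum(z[t], 0.0)             # ReLU nonlinearity
    z[t + 1] = A @ z[t] + W @ phi + h + rng.normal(scale=0.05, size=dz)
    y[t + 1] = B @ z[t + 1] + rng.normal(scale=0.1, size=dy)

print("latent trajectory range:", z.min().round(2), "to", z.max().round(2))
```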

  11. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Directory of Open Access Journals (Sweden)

    Daniel Durstewitz

    2017-06-01

    The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover

  12. Computer Breakdown as a Stress Factor during Task Completion under Time Pressure: Identifying Gender Differences Based on Skin Conductance

    Directory of Open Access Journals (Sweden)

    René Riedl

    2013-01-01

    In today's society, as computers, the Internet, and mobile phones pervade almost every corner of life, the impact of Information and Communication Technologies (ICT) on humans is dramatic. The use of ICT, however, may also have a negative side. Human interaction with technology may lead to notable stress perceptions, a phenomenon referred to as technostress. An investigation of the literature reveals that computer users' gender has largely been ignored in technostress research, treating users as "gender-neutral." To close this significant research gap, we conducted a laboratory experiment in which we investigated users' physiological reaction to the malfunctioning of technology. Based on theories which explain that men, in contrast to women, are more sensitive to "achievement stress," we predicted that male users would exhibit higher levels of stress than women in cases of system breakdown during the execution of a human-computer interaction task under time pressure, if compared to a breakdown situation without time pressure. Using skin conductance as a stress indicator, the hypothesis was confirmed. Thus, this study shows that user gender is crucial to better understanding the influence of stress factors such as computer malfunctions on physiological stress reactions.

  13. Binary Logistic Regression Analysis in Assessment and Identifying Factors That Influence Students' Academic Achievement: The Case of College of Natural and Computational Science, Wolaita Sodo University, Ethiopia

    Science.gov (United States)

    Zewude, Bereket Tessema; Ashine, Kidus Meskele

    2016-01-01

    An attempt has been made to assess and identify the major variables that influence student academic achievement at college of natural and computational science of Wolaita Sodo University in Ethiopia. Study time, peer influence, securing first choice of department, arranging study time outside class, amount of money received from family, good life…
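
    The statistical setup can be illustrated with a small simulation: a binary pass/fail outcome regressed on a few hypothetical covariates. The variable names and effect sizes below are invented, not the study's:

```python
# Toy binary logistic regression on made-up student covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 200
study_hours = rng.uniform(0, 30, n)
peer_score = rng.uniform(0, 10, n)
first_choice = rng.integers(0, 2, n)
X = np.column_stack([study_hours, peer_score, first_choice])

# Synthetic outcome drawn from an invented logit model.
logit = -3.0 + 0.15 * study_hours + 0.1 * peer_score + 0.8 * first_choice
passed = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, passed)
print("odds ratios per covariate:", np.exp(model.coef_).round(2))
```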

  14. Performance of citrandarins and other trifoliate hybrid rootstocks for citrus in São Paulo, Brazil (Avaliação de citrandarins e outros híbridos de trifoliata como porta-enxertos para citros em São Paulo)

    Directory of Open Access Journals (Sweden)

    Silvia Blumer

    2005-08-01

    Valencia sweet orange trees budded onto citrandarins and other trifoliate hybrid rootstocks from the USDA Horticultural Research Laboratory, Fort Pierce, Florida, were planted in 1988 on a sandy-textured Oxisol in Itirapina, São Paulo State, Brazil, and managed without irrigation. Tristeza and blight diseases are endemic in this area. Trees on Sunki x English (1.628), Cleopatra x Rubidoux (1.660), Cleopatra x English (710), Cleopatra x Swingle (715) and Rangpur lime x Carrizo citrange (717) produced the highest cumulative yields in the first five crops (1991-1995), and the first three of these rootstocks induced the highest yields in the last three years. Troyer and Carrizo citranges were significantly inferior to the citrandarins Sunki x English (1.628), Cleopatra x Rubidoux (1.660) and Cleopatra x English (710) in all years except 1994. None of the trees showed symptoms of susceptibility to tristeza or blight. Rootstock seedlings differed in the lesion area produced by inoculation with Phytophthora parasitica: the citrandarins Cleopatra x Swingle (1.587), Cleopatra x Trifoliata (1.574), Cleopatra x Rubidoux (1.600), Clementine x Trifoliata (1.615) and Rangpur lime x Carrizo citrange (717) were significantly more resistant than Cleopatra x Christian (712), Sunki x English (1.628), Cleopatra x Swingle (715) and Cleopatra x English (710).

  15. Computational and experimental analysis identified 6-diazo-5-oxonorleucine as a potential agent for treating infection by Plasmodium falciparum.

    Science.gov (United States)

    Plaimas, Kitiporn; Wang, Yulin; Rotimi, Solomon O; Olasehinde, Grace; Fatumo, Segun; Lanzer, Michael; Adebiyi, Ezekiel; König, Rainer

    2013-12-01

    Plasmodium falciparum (PF) is the most severe malaria parasite. It is quickly developing resistance to existing drugs, making it indispensable to discover new drugs. Effective drugs have been discovered targeting metabolic enzymes of the parasite. In order to predict new drug targets, computational methods employing database information on metabolism can be used. Using these data, we recently performed a computational network analysis of the metabolism of PF. We analyzed the topology of the network to find reactions which are sensitive against perturbations, i.e., when a single enzyme is blocked by drugs. We have now used a refined network comprising also the host enzymes, which led to a refined set of five targets: glutamyl-tRNA (gln) amidotransferase, hydroxyethylthiazole kinase, deoxyribose-phosphate aldolase, pseudouridylate synthase, and deoxyhypusine synthase. It was shown elsewhere that glutamyl-tRNA (gln) amidotransferase of other microorganisms can be inhibited by 6-diazo-5-oxonorleucine. Performing a half maximal inhibitory concentration (IC50) assay, we showed that 6-diazo-5-oxonorleucine also severely affects the viability of PF in blood plasma of the human host. We confirmed this by an in vivo study observing Plasmodium berghei-infected mice.
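
    An IC50 assay of this kind is typically summarized by fitting a four-parameter logistic curve to dose-response data. The sketch below shows that standard fit on invented data points, not the study's measurements:

```python
# Standard four-parameter logistic (Hill) fit for IC50 estimation,
# demonstrated on hypothetical dose-response data.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(dose, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (dose / ic50) ** hill)

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])  # uM, invented
viability = np.array([98, 96, 90, 75, 52, 28, 12, 5], dtype=float)  # percent

params, _ = curve_fit(four_pl, dose, viability,
                      p0=[0.0, 100.0, 1.0, 1.0], maxfev=10000)
print(f"estimated IC50: {params[2]:.2f} uM (Hill slope {params[3]:.2f})")
```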

  16. The development of bronchiectasis on chest computed tomography in children with cystic fibrosis: can pre-stages be identified?

    Energy Technology Data Exchange (ETDEWEB)

    Tepper, Leonie A. [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Radiology, Erasmus MC, Rotterdam (Netherlands); Caudri, Daan [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Perez Rovira, Adria [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Erasmus MC, Biomedical Imaging Group Rotterdam, Departments of Radiology and Medical Informatics, Rotterdam (Netherlands); Tiddens, Harm A.W.M. [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Radiology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Pediatric Pulmonology and Radiology, Erasmus Medical Center, Rotterdam (Netherlands); Bruijne, Marleen de [Erasmus MC, Biomedical Imaging Group Rotterdam, Departments of Radiology and Medical Informatics, Rotterdam (Netherlands); University of Copenhagen, Department of Computer Science, Copenhagen (Denmark)

    2016-12-15

    Bronchiectasis is an important component of cystic fibrosis (CF) lung disease but little is known about its development. We aimed to study the development of bronchiectasis and identify determinants for rapid progression of bronchiectasis on chest CT. Forty-three patients with CF with at least four consecutive biennial volumetric CTs were included. Areas with bronchiectasis on the most recent CT were marked as regions of interest (ROIs). These ROIs were generated on all preceding CTs using deformable image registration. Observers indicated whether: bronchiectasis, mucus plugging, airway wall thickening, atelectasis/consolidation or normal airways were present in the ROIs. We identified 362 ROIs on the most recent CT. In 187 (51.7 %) ROIs bronchiectasis was present on all preceding CTs, while 175 ROIs showed development of bronchiectasis. In 139/175 (79.4 %) no pre-stages of bronchiectasis were identified. In 36/175 (20.6 %) bronchiectatic airways the following pre-stages were identified: mucus plugging (17.7 %), airway wall thickening (1.7 %) or atelectasis/consolidation (1.1 %). Pancreatic insufficiency was more prevalent in the rapid progressors compared to the slow progressors (p = 0.05). Most bronchiectatic airways developed within 2 years without visible pre-stages, underlining the treacherous nature of CF lung disease. Mucus plugging was the most frequent pre-stage. (orig.)
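
    As a rough illustration of how ROIs can be propagated between serial CTs with deformable image registration, the sketch below uses SimpleITK's demons filter. It is a hedged stand-in, not the registration method used in the study, and the file names are hypothetical.

        import SimpleITK as sitk

        # Preceding CT as the fixed image, most recent CT as the moving image.
        fixed = sitk.ReadImage("ct_preceding.nii.gz", sitk.sitkFloat32)
        moving = sitk.ReadImage("ct_most_recent.nii.gz", sitk.sitkFloat32)

        demons = sitk.DemonsRegistrationFilter()
        demons.SetNumberOfIterations(100)
        demons.SetStandardDeviations(1.5)  # Gaussian smoothing of the field
        displacement = demons.Execute(fixed, moving)
        transform = sitk.DisplacementFieldTransform(displacement)

        # Pull the ROI mask drawn on the most recent CT back onto the grid of
        # the preceding CT (nearest neighbour keeps the labels intact).
        roi = sitk.ReadImage("roi_most_recent.nii.gz", sitk.sitkUInt8)
        roi_on_preceding = sitk.Resample(roi, fixed, transform,
                                         sitk.sitkNearestNeighbor, 0)
        sitk.WriteImage(roi_on_preceding, "roi_on_preceding.nii.gz")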

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  18. Can computed tomography classifications of chronic obstructive pulmonary disease be identified using Bayesian networks and clinical data?

    Science.gov (United States)

    Thomsen, Lars P; Weinreich, Ulla M; Karbing, Dan S; Helbo Jensen, Vanja G; Vuust, Morten; Frøkjær, Jens B; Rees, Stephen E

    2013-06-01

    Diagnosis and classification of chronic obstructive pulmonary disease (COPD) can be difficult. Causal reasoning can be used to relate clinical measurements to the radiological representation of the COPD phenotypes airways disease and emphysema. In this paper a causal probabilistic network was constructed that uses clinically available measurements to classify patients suffering from COPD into the main phenotypes airways disease and emphysema. The network grades the severity of disease and, for emphysematous COPD, the type of bullae and their location (central or peripheral). In four patient cases the network reached the same conclusion as was gained from the patients' high-resolution computed tomography (HRCT) scans. These were: airways disease, emphysema with central small bullae, emphysema with central large bullae, and emphysema with peripheral bullae. The approach may be promising for targeting HRCT in COPD patients, assessing phenotypes of the disease and monitoring its progression using clinical data.
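
    A toy version of such a causal probabilistic network can be written with pgmpy; the structure, variables and probabilities below are invented for illustration and are not the network from the paper.

        from pgmpy.models import BayesianNetwork  # pgmpy >= 0.1.13
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Toy structure: the COPD phenotype drives two clinical findings.
        model = BayesianNetwork([("Phenotype", "FEV1_low"),
                                 ("Phenotype", "DLCO_low")])
        # Phenotype: 0 = airways disease, 1 = emphysema (illustrative CPDs).
        model.add_cpds(
            TabularCPD("Phenotype", 2, [[0.5], [0.5]]),
            TabularCPD("FEV1_low", 2, [[0.3, 0.2], [0.7, 0.8]],
                       evidence=["Phenotype"], evidence_card=[2]),
            TabularCPD("DLCO_low", 2, [[0.8, 0.2], [0.2, 0.8]],
                       evidence=["Phenotype"], evidence_card=[2]),
        )

        inference = VariableElimination(model)
        # Posterior over the phenotype given two observed findings.
        print(inference.query(["Phenotype"],
                              evidence={"FEV1_low": 1, "DLCO_low": 1}))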

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  1. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  2. Method and apparatus for analyzing error conditions in a massively parallel computer system by identifying anomalous nodes within a communicator set

    Science.gov (United States)

    Gooding, Thomas Michael

    2011-04-19

    An analytical mechanism for a massively parallel computer system automatically analyzes data retrieved from the system and identifies nodes which exhibit anomalous behavior in comparison to their immediate neighbors. Preferably, anomalous behavior is determined by comparing call-return stack tracebacks for each node, grouping like nodes together, and identifying neighboring nodes which do not themselves belong to the group. A node that is not itself in the group but has a large number of neighbors in the group is a likely locality of error. The analyzer preferably presents this information to the user by sorting the neighbors according to the number of adjoining members of the group.
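
    The grouping-and-neighbour logic in the abstract reduces to a few lines of Python; the data layout below (dicts keyed by node id) is an assumption for illustration, not the patented implementation.

        from collections import Counter

        def likely_error_localities(tracebacks, neighbors):
            """tracebacks: node -> hashable call-return stack summary;
            neighbors: node -> list of adjacent nodes in the interconnect."""
            # Nodes sharing the most common traceback form the "normal" group.
            majority, _ = Counter(tracebacks.values()).most_common(1)[0]
            suspects = []
            for node, tb in tracebacks.items():
                if tb == majority:
                    continue
                # Count how many of this outlier's neighbors behave normally.
                in_group = sum(1 for n in neighbors[node]
                               if tracebacks.get(n) == majority)
                suspects.append((node, in_group))
            # Most adjoining group members first: the likeliest error locality.
            return sorted(suspects, key=lambda kv: kv[1], reverse=True)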

  3. Insecticidal activity of Nerium indicum, Derris trifoliata and Sapium sebiferum against Homocentridia picta

    Institute of Scientific and Technical Information of China (English)

    林同; 黎荣彬; 陆宁将

    2006-01-01

    To effectively control damage by Homocentridia picta Hampson to Schima superba Gardn. et Champ., ethanol extracts of Nerium indicum Mill. leaves, Derris trifoliata Lour. leaves and Sapium sebiferum (L.) Roxb. bark were prepared by Soxhlet extraction, and the control efficacy of the three plant extracts against third-instar larvae of H. picta was assayed separately in the laboratory. The results and analysis of variance showed that all three extracts were effective against H. picta: 3.5 days after treatment, the corrected mortality of the test insects exceeded 60% in every case. The extracts of N. indicum and S. sebiferum had the most pronounced effect, with corrected mortality exceeding 90% after 6 days. The insecticidal activity of the three plants against H. picta, ranked from strongest to weakest, was S. sebiferum, N. indicum and D. trifoliata.

  4. Coronary plaque quantification and fractional flow reserve by coronary computed tomography angiography identify ischaemia-causing lesions

    DEFF Research Database (Denmark)

    Gaur, Sara; Øvrehus, Kristian Altern; Dey, Damini

    2016-01-01

    … tomography angiography (CTA)-derived fractional flow reserve (FFRCT), and lesion-specific ischaemia identified by FFR in a substudy of the NXT trial (Analysis of Coronary Blood Flow Using CT Angiography: Next Steps). METHODS AND RESULTS: Coronary CTA stenosis, plaque volumes, FFRCT, and FFR were assessed … the receiver-operating characteristics curve (AUC) analysis. Ischaemia was defined by FFR or FFRCT ≤0.80. Plaque volumes were inversely related to FFR irrespective of stenosis severity. Relative risk (95% confidence interval) for prediction of ischaemia for stenosis >50%, NCP ≥185 mm³, LD-NCP ≥30 mm³, CP …

  5. In search of Leonardo: computer-based facial image analysis of Renaissance artworks for identifying Leonardo as subject

    Science.gov (United States)

    Tyler, Christopher W.; Smith, William A. P.; Stork, David G.

    2012-03-01

    One of the enduring mysteries in the history of the Renaissance is the adult appearance of the archetypal "Renaissance Man," Leonardo da Vinci. His only acknowledged self-portrait is from an advanced age, and various candidate images of younger men are difficult to assess given the absence of documentary evidence. One clue about Leonardo's appearance comes from the remark of the contemporary historian Vasari that the sculpture of David by Leonardo's master, Andrea del Verrocchio, was based on the appearance of Leonardo when he was an apprentice. Taking a cue from this statement, we suggest that the more mature sculpture of St. Thomas, also by Verrocchio, might likewise have been a portrait of Leonardo. We tested the possibility that Leonardo was the subject of Verrocchio's sculpture with a novel computational technique for the comparison of three-dimensional facial configurations. Based on quantitative measures of similarity, we also assess whether another pair of candidate two-dimensional images is plausibly attributable as portraits of Leonardo as a young adult. Our results are consistent with the claim that Leonardo is indeed the subject of these works, but comparisons with a larger corpus of candidate artworks are needed before the results achieve statistical significance.
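
    The paper's comparison technique is its own; a generic way to compare two landmark configurations after removing position, scale and rotation is Procrustes analysis, sketched here with SciPy on made-up 3-D landmark sets.

        import numpy as np
        from scipy.spatial import procrustes

        rng = np.random.default_rng(0)
        # Hypothetical 3-D facial landmarks (n_landmarks x 3), e.g. digitised
        # from the David sculpture and from a candidate portrait.
        landmarks_a = rng.random((12, 3))
        landmarks_b = landmarks_a + rng.normal(scale=0.02, size=(12, 3))

        # Procrustes removes translation, scale and rotation; the residual
        # "disparity" quantifies how (dis)similar the configurations are.
        _, _, disparity = procrustes(landmarks_a, landmarks_b)
        print(f"disparity = {disparity:.4f}")  # smaller -> more similar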

  6. Disruptions in a cluster of computationally identified enhancers near FOXC1 and GMDS may influence brain development.

    Science.gov (United States)

    Haliburton, Genevieve D E; McKinsey, Gabriel L; Pollard, Katherine S

    2016-01-01

    Regulatory elements are more evolutionarily conserved and provide a larger mutational target than coding regions of the human genome, suggesting that mutations in non-coding regions contribute significantly to development and disease. Using a computational approach to predict gene regulatory enhancers, we found that many known and predicted embryonic enhancers cluster in genomic loci harboring development-associated genes. One of the densest clusters of predicted enhancers in the human genome is near the genes GMDS and FOXC1. GMDS encodes a short-chain mannose dehydrogenase enzyme involved in the regulation of hindbrain neural migration, and FOXC1 encodes a developmental transcription factor required for brain, heart, and eye development. We experimentally validate four novel enhancers in this locus and demonstrate that these enhancers show consistent activity during embryonic development in domains that overlap with the expression of FOXC1 and GMDS. These four enhancers contain binding motifs for several transcription factors, including the ZIC family of transcription factors. Removal of the ZIC binding sites significantly alters enhancer activity in three of these enhancers, reducing expression in the eye, hindbrain, and limb, suggesting a mechanism whereby ZIC family members may transcriptionally regulate FOXC1 and/or GMDS expression. Our findings uncover novel enhancer regions that may control transcription in a topological domain important for embryonic development.

  7. Computer-assisted versus oral-and-written family history taking for identifying people with elevated risk of type 2 diabetes mellitus.

    Science.gov (United States)

    Pappas, Yannis; Wei, Igor; Car, Josip; Majeed, Azeem; Sheikh, Aziz

    2011-12-07

    Diabetes is a chronic illness characterised by insulin resistance or deficiency, resulting in elevated glycosylated haemoglobin A1c (HbA1c) levels. Because diabetes tends to run in families, the collection of family history data is an important tool for identifying people at elevated risk of type 2 diabetes. Traditionally, oral-and-written data collection methods are employed, but computer-assisted history taking systems (CAHTS) are increasingly used. Although CAHTS were first described in the 1960s, there remains uncertainty about the impact of these methods on family history taking, clinical care and patient outcomes such as health-related quality of life. To assess the effectiveness of computer-assisted versus oral-and-written family history taking for identifying people with elevated risk of developing type 2 diabetes mellitus, we searched The Cochrane Library (issue 6, 2011), MEDLINE (January 1985 to June 2011), EMBASE (January 1980 to June 2011) and CINAHL (January 1981 to June 2011). Reference lists of retrieved articles were also searched, and no limits were imposed on language or publication status. We considered randomised controlled trials of computer-assisted versus oral-and-written history taking in adult participants (16 years and older). Two authors independently scanned the title and abstract of retrieved articles, and potentially relevant articles were investigated as full text. Studies that met the inclusion criteria were abstracted for relevant population and intervention characteristics, with any disagreements resolved by discussion or by a third party. Risk of bias was similarly assessed independently. We found no controlled trials of computer-assisted versus oral-and-written family history taking for identifying people with elevated risk of type 2 diabetes mellitus. There is a need to develop an evidence base to support the effective development and use of computer-assisted history taking systems in this area of practice. In the absence of evidence on effectiveness

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. Computational spatiotemporal analysis identifies WAVE2 and cofilin as joint regulators of costimulation-mediated T cell actin dynamics.

    Science.gov (United States)

    Roybal, Kole T; Buck, Taráz E; Ruan, Xiongtao; Cho, Baek Hwan; Clark, Danielle J; Ambler, Rachel; Tunbridge, Helen M; Zhang, Jianwei; Verkade, Paul; Wülfing, Christoph; Murphy, Robert F

    2016-04-19

    Fluorescence microscopy is one of the most important tools in cell biology research because it provides spatial and temporal information to investigate regulatory systems inside cells. This technique can generate data in the form of signal intensities at thousands of positions resolved inside individual live cells. However, given extensive cell-to-cell variation, these data cannot be readily assembled into three- or four-dimensional maps of protein concentration that can be compared across different cells and conditions. We have developed a method to enable comparison of imaging data from many cells and applied it to investigate actin dynamics in T cell activation. Antigen recognition in T cells by the T cell receptor (TCR) is amplified by engagement of the costimulatory receptor CD28. We imaged actin and eight core actin regulators to generate over a thousand movies of T cells under conditions in which CD28 was either engaged or blocked in the context of a strong TCR signal. Our computational analysis showed that the primary effect of costimulation blockade was to decrease recruitment of the activator of actin nucleation WAVE2 (Wiskott-Aldrich syndrome protein family verprolin-homologous protein 2) and the actin-severing protein cofilin to F-actin. Reconstitution of WAVE2 and cofilin activity restored the defect in actin signaling dynamics caused by costimulation blockade. Thus, we have developed and validated an approach to quantify protein distributions in time and space for the analysis of complex regulatory systems. Copyright © 2016, American Association for the Advancement of Science.

  10. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of our work and personal lives. At this point the computer is so common we hardly notice it in our view. It is difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati...

  11. Computational Analysis of Breast Cancer GWAS Loci Identifies the Putative Deleterious Effect of STXBP4 and ZNF404 Gene Variants.

    Science.gov (United States)

    Masoodi, Tariq Ahmad; Banaganapalli, Babajan; Vaidyanathan, Venkatesh; Talluri, Venkateswar R; Shaik, Noor A

    2017-04-19

    Genome-wide association studies (GWAS) have enabled the identification of different breast cancer (BC) susceptibility loci. However, the majority of these are non-coding variants with no annotated biological function. We investigated 78 such non-coding genome-wide associated SNPs of BC and expanded the list to 2,162 variants in strong linkage disequilibrium (LD, r² ≥0.8) with them. Using multiple publicly available algorithms such as CADD, GWAVA, and FATHMM, we classified all these variants into deleterious, damaging, or benign categories. Out of the total of 2,241 variants, 23 (1.02%) were extremely deleterious (rank 1), 70 (3.12%) were deleterious (rank 2), and 1,937 (86.43%) were benign (rank 3). The results show that 14% of lead or associated variants are under strong negative selection (GERP++ RS ≥2) and ~22% are under balancing selection (Tajima's D score >2) in the CEU population of the 1000 Genomes Project, the regions having been positively selected (GERP++ RS <0) in mammalian evolution. Expression quantitative trait loci of the most deleteriously ranked genes were tested in relevant adipose and breast tissues, and the results were extended to protein expression in breast tissue. From the concordance analysis of the GWAVA, CADD, and FATHMM ranking systems, eQTLs and protein expression, we identified deleterious SNPs localized in the STXBP4 and ZNF404 genes that might play a role in BC development by dysregulating their expression. This simple approach is easy to implement for prioritizing large-scale GWAS data for a variety of diseases and linking variants to potentially unrecognized functional roles of genes. J. Cell. Biochem. 9999: 1-12, 2017. © 2017 Wiley Periodicals, Inc.

  12. Identifying potential selective fluorescent probes for cancer-associated protein carbonic anhydrase IX using a computational approach.

    Science.gov (United States)

    Kamstra, Rhiannon L; Floriano, Wely B

    2014-11-01

    Carbonic anhydrase IX (CAIX) is a biomarker for tumor hypoxia. Fluorescent inhibitors of CAIX have been used to study hypoxic tumor cell lines. However, these inhibitor-based fluorescent probes may have a therapeutic effect that is not appropriate for monitoring treatment efficacy. In the search for novel fluorescent probes that are not based on known inhibitors, a database of 20,860 fluorescent compounds was virtually screened against CAIX using hierarchical virtual ligand screening (HierVLS). The screening database contained 14,862 compounds tagged with the ATTO680 fluorophore plus an additional 5998 intrinsically fluorescent compounds. Overall ranking of compounds to identify hit molecular probe candidates utilized a principal component analysis (PCA) approach. Four potential binding sites, including the catalytic site, were identified within the structure of the protein and targeted for virtual screening. Available sequence information for 23 carbonic anhydrase isoforms was used to prioritize the four sites based on the estimated "uniqueness" of each site in CAIX relative to the other isoforms. A database of 32 known inhibitors and 478 decoy compounds was used to validate the methodology. A receiver-operating characteristic (ROC) analysis using the first principal component (PC1) as predictive score for the validation database yielded an area under the curve (AUC) of 0.92. AUC is interpreted as the probability that a binder will have a better score than a non-binder. The use of first component analysis of binding energies for multiple sites is a novel approach for hit selection. The very high prediction power for this approach increases confidence in the outcome from the fluorescent library screening. Ten of the top scoring candidates for isoform-selective putative binding sites are suggested for future testing as fluorescent molecular probe candidates.
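
    The PC1-as-score validation can be reproduced in outline with scikit-learn. The matrix below is a random placeholder standing in for the per-site docking energies of the 32 binders and 478 decoys; only the mechanics of the ranking and the ROC analysis are shown.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        energies = rng.normal(size=(510, 4))  # compounds x binding sites
        labels = np.zeros(510)
        labels[:32] = 1                       # 32 known inhibitors, 478 decoys

        # First principal component of the per-site energies as the score.
        pc1 = PCA(n_components=1).fit_transform(energies).ravel()

        # The sign of a principal component is arbitrary, so orient the AUC.
        auc = roc_auc_score(labels, pc1)
        auc = max(auc, 1 - auc)
        print(f"AUC = {auc:.2f}")  # the paper reports 0.92 on its own data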

  13. Bioinformatic analysis of computational identified differentially expressed genes in tumor stoma of pregnancy‑associated breast cancer.

    Science.gov (United States)

    Zhou, Qian; Sun, Erhu; Ling, Lijun; Liu, Xiaofeng; Zhang, Min; Yin, Hong; Lu, Cheng

    2017-09-01

    The present study aimed to screen the differentially expressed genes (DEGs) in the tumor-associated stroma of pregnancy-associated breast cancer (PABC). By analyzing Affymetrix microarray data (GSE31192) from the Gene Expression Omnibus database, DEGs between tumor-associated stromal cells and normal stromal cells in PABC were identified. Gene Ontology (GO) function and pathway enrichment analyses for the DEGs were then performed, followed by construction of a protein-protein interaction (PPI) network. A total of 94 upregulated and 386 downregulated DEGs were identified between tumor-associated stromal cells and normal stromal cells in patients with PABC. The upregulated DEGs were primarily enriched in the cytokine-cytokine receptor interaction pathway and in GO terms associated with the immune response, which included the DEGs interleukin 18 (IL18) and cluster of differentiation 274 (CD274). The downregulated DEGs were primarily involved in GO terms associated with cell surface receptor-linked signal transduction and in the focal adhesion and pathways-in-cancer pathways. In the PPI network, the nodes jun proto-oncogene (JUN), FBJ murine osteosarcoma viral oncogene homolog (FOS), V-myc avian myelocytomatosis viral oncogene homolog (MYC) and alpha-smooth muscle actin (ACTA2) had the highest degrees. The hub genes JUN, FOS, MYC and ACTA2, as well as the DEGs IL18 and CD274 that were associated with the immune response in GO terms, may exert important functions in the molecular mechanisms of PABC. These genes may be used as new molecular targets in the treatment of this disease.
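
    A bare-bones version of the DEG screening step (upstream of the GO and PPI analyses) might look like the sketch below; the expression-matrix layout is assumed, and a real microarray analysis would use moderated statistics such as limma.

        import numpy as np
        from scipy import stats

        def screen_degs(expr, k, lfc_cut=1.0, p_cut=0.05):
            """expr: genes x samples (log2 scale); the first k columns are
            tumor-associated stroma, the rest normal stroma (assumed layout)."""
            tumor, normal = expr[:, :k], expr[:, k:]
            _, p = stats.ttest_ind(tumor, normal, axis=1)
            lfc = tumor.mean(axis=1) - normal.mean(axis=1)
            up = np.where((p < p_cut) & (lfc > lfc_cut))[0]
            down = np.where((p < p_cut) & (lfc < -lfc_cut))[0]
            return up, down  # row indices of up-/downregulated genes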

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  16. Use of manual alveolar recruitment maneuvers to eliminate atelectasis artifacts identified during thoracic computed tomography of healthy neonatal foals.

    Science.gov (United States)

    Lascola, Kara M; Clark-Price, Stuart C; Joslyn, Stephen K; Mitchell, Mark A; O'Brien, Robert T; Hartman, Susan K; Kline, Kevin H

    2016-11-01

    OBJECTIVE To evaluate use of single manual alveolar recruitment maneuvers (ARMs) to eliminate atelectasis during CT of anesthetized foals. ANIMALS 6 neonatal Standardbred foals. PROCEDURES Thoracic CT was performed on spontaneously breathing anesthetized foals positioned in sternal (n = 3) or dorsal (3) recumbency when foals were 24 to 36 hours old (time 1), 4 days old (time 2), 7 days old (time 3), and 10 days old (time 4). The CT images were collected without ARMs (all times) and during ARMs with an internal airway pressure of 10, 20, and 30 cm H2O (times 2 and 3). Quantitative analysis of CT images measured whole lung and regional changes in attenuation or volume with ARMs. RESULTS Increased attenuation and an alveolar pattern were most prominent in the dependent portion of the lungs. Subjectively, ARMs did not eliminate atelectasis; however, they did incrementally reduce attenuation, particularly in the nondependent portion of the lungs. Quantitative differences in lung attenuation attributable to position of foal were not identified. Lung attenuation decreased significantly (times 2 and 3) and lung volume increased significantly (times 2 and 3) after ARMs. Changes in attenuation and volume were most pronounced in the nondependent portion of the lungs and at ARMs of 20 and 30 cm H2O. CONCLUSIONS AND CLINICAL RELEVANCE Manual ARMs did not eliminate atelectasis but reduced attenuation in nondependent portions of the lungs. Positioning of foals in dorsal recumbency for CT may be appropriate when pathological changes in the ventral portion of the lungs are suspected.

  17. Soft computing model for optimized siRNA design by identifying off target possibilities using artificial neural network model.

    Science.gov (United States)

    Murali, Reena; John, Philips George; Peter S, David

    2015-05-15

    The ability of small interfering RNA (siRNA) to perform posttranscriptional gene regulation by knocking down targeted genes is an important research topic in functional genomics, biomedical research and cancer therapeutics. Many tools have been developed to design exogenous siRNAs with high experimental inhibition. Even though a considerable amount of work has been done on designing exogenous siRNAs, the design of effective siRNA sequences is still challenging because the target mRNAs must be selected such that the corresponding siRNAs are likely to be efficient against that target and unlikely to accidentally silence other transcripts due to sequence similarity. In some cases siRNAs may tolerate mismatches with the target mRNA, but knockdown of genes other than the intended target could have serious consequences. Hence, two important concepts must be considered in siRNA design: the ability to knock down target genes and the possibility of off-target effects on nontarget genes. Before performing gene silencing with siRNAs, it is therefore essential to analyze their off-target effects in addition to their inhibition efficacy against a particular target. Only a few methods have been developed that consider both the efficacy and the off-target possibility of an siRNA against a gene. In this paper we present a new neural network model incorporating whole stacking energy (ΔG) that identifies the efficacy and off-target effects of siRNAs against target genes. The tool lists all siRNAs against a particular target with their inhibition efficacy and the number of matches or sequence similarity with other genes in the database. We achieved an excellent performance of Pearson correlation coefficient (R = 0.74) and area under the curve (AUC = 0.906) when the threshold of whole stacking energy is ≥ -34.6 kcal/mol. To the best of the authors' knowledge, this is one of the best scores obtained while considering the combined efficacy and off-target possibility of siRNAs for silencing a gene. The proposed model
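
    The two reported figures of merit, the Pearson correlation and the AUC, are computed in the standard way; the sketch below uses synthetic placeholder data and an assumed 50% inhibition threshold for calling an siRNA "efficient".

        import numpy as np
        from scipy.stats import pearsonr
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        predicted = rng.random(100)                         # model scores
        measured = predicted * 80 + rng.normal(0, 10, 100)  # inhibition (%)

        r, _ = pearsonr(predicted, measured)    # the paper reports R = 0.74
        efficient = (measured >= 50).astype(int)
        auc = roc_auc_score(efficient, predicted)  # the paper reports 0.906
        print(f"R = {r:.2f}, AUC = {auc:.3f}")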

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  5. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    Science.gov (United States)

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  6. An In Vivo Method to Identify microRNA Targets Not Predicted by Computation Algorithms: p21 Targeting by miR-92a in Cancer.

    Science.gov (United States)

    Su, Xiaoping; Wang, Huaming; Ge, Wei; Yang, Mingjin; Hou, Jin; Chen, Taoyong; Li, Nan; Cao, Xuetao

    2015-07-15

    microRNA (miRNA) dysregulation is involved in the development and progression of various human cancers, including hepatocellular carcinoma (HCC). However, identifying the miRNAs that target a specific mRNA in cells is a significant challenge because of the complexity of the interactions and the limited knowledge of the rules governing these processes; some miRNAs are not predictable by currently available computer algorithms. Here, using p21 mRNA as the target, we established a new method, called miRNA in vivo precipitation (miRIP), to identify which miRNAs actually bind a specific mRNA in cells. Several unpredictable miRNAs that bound p21 mRNA in HepG2 and PC-3 cells were identified by the miRIP method. Among these, miR-92a was found and confirmed to interact robustly with p21 mRNA in both HepG2 and PC-3 cells. miR-92a was remarkably increased in HCC tissues, and higher expression of miR-92a significantly correlated with lower expression of p21, which is related to poor survival of HCC patients. Moreover, inhibition of miR-92a could significantly suppress HCC growth in vitro and in vivo by upregulating p21. Together, miR-92a, identified by miRIP, is functionally shown to act as an oncogenic miRNA associated with HCC growth by inhibiting expression of its target gene p21. In addition, several unpredictable miRNAs that target STAT3 mRNA were also identified by the miRIP method in HepG2 cells. Our results demonstrate that the miRIP approach can effectively identify unpredictable but intracellularly existing miRNAs that target a specific mRNA in vivo.

  7. ISD97, a computer program to analyze data from a series of in situ measurements on a grid and identify potential localized areas of elevated activity

    Energy Technology Data Exchange (ETDEWEB)

    Reginatto, M.; Shebell, P.; Miller, K.M.

    1997-10-01

    A computer program, ISD97, was developed to analyze data from a series of in situ measurements on a grid and identify potential localized areas of elevated activity. The ISD97 code operates using a two-step process. A deconvolution of the data is carried out using the maximum entropy method, and a map of activity on the ground that fits the data within experimental error is generated. This maximum entropy map is then analyzed to determine the locations and magnitudes of potential areas of elevated activity that are consistent with the data. New deconvolutions are then carried out for each potential area of elevated activity identified by the code. Properties of the algorithm are demonstrated using data from actual field measurements.
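
    ISD97's maximum-entropy deconvolution is not reproduced here, but the flavour of recovering a gridded activity map from detector-blurred counts can be conveyed with a simple iterative stand-in (a Richardson-Lucy-type update rather than the maximum entropy method itself):

        import numpy as np

        def deconvolve_grid(measured, response, n_iter=200):
            """Iteratively estimate activity on a 1-D grid from counts
            blurred by a normalised detector response function."""
            activity = np.full_like(measured, measured.mean(), dtype=float)
            for _ in range(n_iter):
                predicted = np.convolve(activity, response, mode="same")
                ratio = measured / np.maximum(predicted, 1e-12)
                activity *= np.convolve(ratio, response[::-1], mode="same")
            return activity

        # Hypothetical example: a hot spot at grid point 10, blurred by the
        # detector's field of view, is sharpened back into a localized peak.
        response = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
        truth = np.zeros(21); truth[10] = 100.0
        measured = np.convolve(truth, response, mode="same")
        print(deconvolve_grid(measured, response).round(1))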

  8. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  14. Field performance of "Marsh Seedless" grapefruit on trifoliate orange inoculated with viroids in Brazil

    Directory of Open Access Journals (Sweden)

    Eduardo Sanches Stuchi

    2007-12-01

    Full Text Available Some viroids reduce citrus tree growth and may be used for tree size control, aiming at the establishment of orchards with close tree spacing that may provide higher productivity than conventional ones. To study the effects of citrus viroid inoculation on vegetative growth, yield and fruit quality of 'Marsh Seedless' grapefruit (Citrus paradisi Macf.) grafted on trifoliate orange [Poncirus trifoliata (L.) Raf.], an experiment was set up in January 1991 in Bebedouro, São Paulo State, Brazil. The experimental design was randomized blocks with four treatments and two plants per plot: viroid isolate Citrus exocortis viroid (CEVd) + Hop stunt viroid (HSVd, CVd-II, a non-cachexia variant) + Citrus viroid III (CVd-III); isolate Hop stunt viroid (HSVd, CVd-II, a non-cachexia variant) + Citrus viroid III (CVd-III); and controls: two healthy buds (control) and no grafting (absolute control). Inoculation was done in the field, six months after planting, by bud grafting. Both isolates reduced tree growth (trunk diameter, plant height, canopy diameter and volume). Trees not inoculated yielded better (average of eleven harvests) than inoculated ones, but productivity was the same after 150 months. Fruit quality was affected by viroid inoculation, but not in a restrictive way. The use of such severe dwarfing isolates for high-density plantings of grapefruit on trifoliate orange rootstock is not recommended.

  15. An experimentally based computer search identifies unstructured membrane-binding sites in proteins: application to class I myosins, PAKS, and CARMIL.

    Science.gov (United States)

    Brzeska, Hanna; Guag, Jake; Remmert, Kirsten; Chacko, Susan; Korn, Edward D

    2010-02-19

    Programs exist for searching protein sequences for potential membrane-penetrating segments (hydrophobic regions) and for lipid-binding sites with highly defined tertiary structures, such as PH, FERM, C2, ENTH, and other domains. However, a rapidly growing number of membrane-associated proteins (including cytoskeletal proteins, kinases, GTP-binding proteins, and their effectors) bind lipids through less structured regions. Here, we describe the development and testing of a simple computer search program that identifies unstructured potential membrane-binding sites. Initially, we found that both basic and hydrophobic amino acids, irrespective of sequence, contribute to the binding to acidic phospholipid vesicles of synthetic peptides that correspond to the putative membrane-binding domains of Acanthamoeba class I myosins. Based on these results, we modified a hydrophobicity scale, giving Arg and Lys positive, rather than negative, values. Using this basic-and-hydrophobic scale with a standard search algorithm, we successfully identified previously determined unstructured membrane-binding sites in all 16 proteins tested. Importantly, basic-and-hydrophobic searches also identified previously unknown potential membrane-binding sites in class I myosins, PAKs and CARMIL (capping protein, Arp2/3, myosin I linker; a membrane-associated cytoskeletal scaffold protein), and synthetic peptides and protein domains containing these newly identified sites bound to acidic phospholipids in vitro.
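
    A minimal sketch of such a search: a sliding window scored with a hydrophobicity scale modified so that Arg and Lys count as positive. The scale values below are illustrative (a Kyte-Doolittle-like scale with R/K flipped to +2.5), not the authors' actual parameters.

        # Illustrative "basic and hydrophobic" (BH) scale; R and K positive.
        BH_SCALE = {
            "I": 4.5, "V": 4.2, "L": 3.8, "F": 2.8, "C": 2.5, "M": 1.9,
            "A": 1.8, "R": 2.5, "K": 2.5, "G": -0.4, "T": -0.7, "S": -0.8,
            "W": -0.9, "Y": -1.3, "P": -1.6, "H": -3.2, "E": -3.5,
            "Q": -3.5, "D": -3.5, "N": -3.5,
        }

        def scan(sequence, window=19, cutoff=1.0):
            """Return (start, mean score) for windows scoring >= cutoff."""
            hits = []
            for i in range(len(sequence) - window + 1):
                score = sum(BH_SCALE[aa] for aa in sequence[i:i + window])
                if score / window >= cutoff:
                    hits.append((i, round(score / window, 2)))
            return hits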

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  17. A computational module assembled from different protease family motifs identifies PI PLC from Bacillus cereus as a putative prolyl peptidase with a serine protease scaffold.

    Science.gov (United States)

    Rendón-Ramírez, Adela; Shukla, Manish; Oda, Masataka; Chakraborty, Sandeep; Minda, Renu; Dandekar, Abhaya M; Ásgeirsson, Bjarni; Goñi, Félix M; Rao, Basuthkar J

    2013-01-01

    Proteolytic enzymes have evolved several mechanisms to cleave peptide bonds. These distinct types have been systematically categorized in the MEROPS database. While a BLAST search on these proteases identifies homologous proteins, sequence alignment methods often fail to identify relationships arising from convergent evolution, exon shuffling, and modular reuse of catalytic units. We have previously established a computational method to detect functions in proteins based on the spatial and electrostatic properties of the catalytic residues (CLASP). CLASP identified a promiscuous serine protease scaffold in alkaline phosphatases (AP) and a scaffold recognizing a β-lactam (imipenem) in a cold-active Vibrio AP. Subsequently, we defined a methodology to quantify promiscuous activities in a wide range of proteins. Here, we assemble a module which encapsulates the multifarious motifs used by protease families listed in the MEROPS database. Since APs and proteases are an integral component of outer membrane vesicles (OMV), we sought to query other OMV proteins, like phospholipase C (PLC), using this search module. Our analysis indicated that phosphoinositide-specific PLC from Bacillus cereus is a serine protease. This was validated by protease assays, mass spectrometry and by inhibition of the native phospholipase activity of PI-PLC by the well-known serine protease inhibitor AEBSF (IC50 = 0.018 mM). Edman degradation analysis linked the specificity of the protease activity to a proline in the amino terminal, suggesting that the PI-PLC is a prolyl peptidase. Thus, we propose a computational method of extending protein families based on the spatial and electrostatic congruence of active site residues.
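
    The spatial half of a CLASP-style test reduces to comparing pairwise distances among candidate catalytic atoms against a template motif; a sketch follows (electrostatics omitted, tolerance value assumed).

        import numpy as np

        def spatially_congruent(candidate_xyz, template_xyz, tol=1.0):
            """True if all pairwise distances among the candidate's putative
            catalytic atoms match the template (e.g. a Ser-His-Asp triad)
            within tol angstroms. CLASP additionally compares electrostatic
            potentials at these residues; only geometry is checked here."""
            def pairwise(xyz):
                d = xyz[:, None, :] - xyz[None, :, :]
                return np.sqrt((d ** 2).sum(-1))
            diff = np.abs(pairwise(np.asarray(candidate_xyz, float)) -
                          pairwise(np.asarray(template_xyz, float)))
            return bool((diff <= tol).all())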

  18. Computational drugs repositioning identifies inhibitors of oncogenic PI3K/AKT/P70S6K-dependent pathways among FDA-approved compounds

    Science.gov (United States)

    Carrella, Diego; Manni, Isabella; Tumaini, Barbara; Dattilo, Rosanna; Papaccio, Federica; Mutarelli, Margherita; Sirci, Francesco; Amoreo, Carla A.; Mottolese, Marcella; Iezzi, Manuela; Ciolli, Laura; Aria, Valentina; Bosotti, Roberta; Isacchi, Antonella; Loreni, Fabrizio; Bardelli, Alberto; Avvedimento, Vittorio E.; di Bernardo, Diego; Cardone, Luca

    2016-01-01

    The discovery of inhibitors for oncogenic signalling pathways remains a key focus in modern oncology, based on personalized and targeted therapeutics. Computational drug repurposing via the analysis of FDA-approved drug networks is becoming a very effective approach to identify therapeutic opportunities in cancer and other human diseases. Given that gene expression signatures can be associated with specific oncogenic mutations, we tested whether a “reverse” oncogene-specific signature might assist in the computational repositioning of inhibitors of oncogenic pathways. As a proof of principle, we focused on oncogenic PI3K-dependent signalling, a molecular pathway frequently driving cancer progression as well as conferring resistance to targeted anticancer therapies. We show that implementation of “reverse” oncogenic PI3K-dependent transcriptional signatures combined with interrogation of drug networks identified inhibitors of PI3K-dependent signalling among FDA-approved compounds. This led to repositioning of Niclosamide (Niclo) and Pyrvinium Pamoate (PP), two anthelmintic drugs, as inhibitors of oncogenic PI3K-dependent signalling. Niclo inhibited phosphorylation of P70S6K, while PP inhibited phosphorylation of AKT and P70S6K, which are downstream targets of PI3K. Anthelmintics inhibited oncogenic PI3K-dependent gene expression and showed a cytostatic effect in vitro and in the mouse mammary gland. Lastly, PP inhibited the growth of breast cancer cells harbouring PI3K mutations. Our data indicate that drug repositioning by network analysis of oncogene-specific transcriptional signatures is an efficient strategy for identifying oncogenic pathway inhibitors among FDA-approved compounds. We propose that PP and Niclo should be further investigated as potential therapeutics for the treatment of tumors or diseases carrying the constitutive activation of the PI3K/P70S6K signalling axis. PMID:27542212
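
    A minimal sketch of the signature-reversal idea (in the spirit of Connectivity-Map-style scoring, not the authors' exact pipeline): rank drugs by how strongly their expression profiles anti-correlate with the oncogene-induced signature. All profiles and drug names below are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical oncogenic PI3K signature: up/down regulation over 50 genes.
    oncogenic_signature = rng.normal(size=50)

    # Hypothetical drug-induced expression profiles.
    drugs = {f"drug{j}": rng.normal(size=50) for j in range(5)}
    drugs["candidate"] = -oncogenic_signature + rng.normal(scale=0.1, size=50)

    def reversal_score(profile, signature):
        """Pearson correlation; strongly negative = profile 'reverses' the signature."""
        return np.corrcoef(profile, signature)[0, 1]

    ranked = sorted(drugs, key=lambda d: reversal_score(drugs[d], oncogenic_signature))
    print(ranked[0])  # the drug that best reverses the oncogenic signature
    ```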

  19. A computational module assembled from different protease family motifs identifies PI PLC from Bacillus cereus as a putative prolyl peptidase with a serine protease scaffold.

    Directory of Open Access Journals (Sweden)

    Adela Rendón-Ramírez

    Full Text Available Proteolytic enzymes have evolved several mechanisms to cleave peptide bonds. These distinct types have been systematically categorized in the MEROPS database. While a BLAST search on these proteases identifies homologous proteins, sequence alignment methods often fail to identify relationships arising from convergent evolution, exon shuffling, and modular reuse of catalytic units. We have previously established a computational method to detect functions in proteins based on the spatial and electrostatic properties of the catalytic residues (CLASP). CLASP identified a promiscuous serine protease scaffold in alkaline phosphatases (AP) and a scaffold recognizing a β-lactam (imipenem) in a cold-active Vibrio AP. Subsequently, we defined a methodology to quantify promiscuous activities in a wide range of proteins. Here, we assemble a module which encapsulates the multifarious motifs used by protease families listed in the MEROPS database. Since APs and proteases are an integral component of outer membrane vesicles (OMV), we sought to query other OMV proteins, like phospholipase C (PLC), using this search module. Our analysis indicated that phosphoinositide-specific PLC from Bacillus cereus is a serine protease. This was validated by protease assays, mass spectrometry, and inhibition of the native phospholipase activity of PI-PLC by the well-known serine protease inhibitor AEBSF (IC50 = 0.018 mM). Edman degradation analysis linked the specificity of the protease activity to a proline at the amino terminus, suggesting that the PI-PLC is a prolyl peptidase. Thus, we propose a computational method of extending protein families based on the spatial and electrostatic congruence of active site residues.

  20. Sepsis reconsidered: Identifying novel metrics for behavioral landscape characterization with a high-performance computing implementation of an agent-based model.

    Science.gov (United States)

    Cockrell, Chase; An, Gary

    2017-10-07

    Sepsis affects nearly 1 million people in the United States per year, has a mortality rate of 28-50% and accounts for more than $20 billion a year in hospital costs. Over a quarter century of research has not yielded a single reliable diagnostic test or a directed therapeutic agent for sepsis. Central to this insufficiency is the fact that sepsis remains a clinical/physiological diagnosis representing a multitude of molecularly heterogeneous pathological trajectories. Advances in computational capabilities offered by High Performance Computing (HPC) platforms call for an evolution in the investigation of sepsis to attempt to define the boundaries of traditional research (bench, clinical and computational) through the use of computational proxy models. We present a novel investigatory and analytical approach, derived from how HPC resources and simulation are used in the physical sciences, to identify the epistemic boundary conditions of the study of clinical sepsis via the use of a proxy agent-based model of systemic inflammation. Current predictive models for sepsis use correlative methods that are limited by patient heterogeneity and data sparseness. We address this issue by using an HPC version of a system-level validated agent-based model of sepsis, the Innate Immune Response Agent-Based Model (IIRABM), as a proxy system in order to identify boundary conditions for the possible behavioral space for sepsis. We then apply advanced analysis derived from the study of Random Dynamical Systems (RDS) to identify novel means for characterizing system behavior and providing insight into the tractability of traditional investigatory methods. The behavior space of the IIRABM was examined by simulating over 70 million sepsis patients for up to 90 days in a sweep across the following parameters: cardio-respiratory-metabolic resilience; microbial invasiveness; microbial toxigenesis; and degree of nosocomial exposure. In addition to using established methods for describing parameter space, we
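
    The brute-force sweep over the model's four parameters can be pictured with a toy harness like the one below; the simulation stub, grids, and outcome rule are placeholders, not the published IIRABM.

    ```python
    import itertools
    from multiprocessing import Pool

    def run_sepsis_abm(params):
        """Placeholder for one agent-based simulation run; returns a mock outcome."""
        resilience, invasiveness, toxigenesis, nosocomial = params
        score = resilience - (invasiveness + toxigenesis + nosocomial)
        return params, ("survives" if score > 0 else "dies")

    grid = itertools.product(
        [0.5, 1.0, 1.5],   # cardio-respiratory-metabolic resilience
        [0.1, 0.5, 0.9],   # microbial invasiveness
        [0.1, 0.5, 0.9],   # microbial toxigenesis
        [0.0, 0.3, 0.6],   # degree of nosocomial exposure
    )

    if __name__ == "__main__":
        with Pool() as pool:  # local stand-in for an HPC job array
            for params, outcome in pool.map(run_sepsis_abm, grid):
                print(params, outcome)
    ```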

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  2. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Full Text Available Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.

  3. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Science.gov (United States)

    McBride, Sebastian; Huelse, Martin; Lee, Mark

    2013-01-01

    Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
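
    A toy sketch of requirements 4-6 above: top-down relevance (an excitation/inhibition ratio) and bottom-up salience converge into a single priority map, and a saccade is triggered where the map exceeds a threshold. Array sizes, the combination rule, and the threshold are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    bottom_up = rng.random((32, 32))    # salience from image features
    excitation = rng.random((32, 32))   # task-relevant feature evidence
    inhibition = rng.random((32, 32))   # task-irrelevant feature evidence

    # Requirement 6: task relevance as a ratio of excitation to inhibition.
    top_down = excitation / (inhibition + 1e-6)

    # Requirement 4: converge both streams into one priority map.
    priority = bottom_up * top_down

    # Requirement 5: threshold function to elicit a saccade.
    THRESHOLD = np.percentile(priority, 99)
    if priority.max() > THRESHOLD:
        target = np.unravel_index(priority.argmax(), priority.shape)
        print("saccade to", target)
    ```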

  4. Development of computational fluid dynamics--habitat suitability (CFD-HSI) models to identify potential passage-challenge zones for migratory fishes in the Penobscot River

    Science.gov (United States)

    Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael

    2012-01-01

    A two-dimensional computational fluid dynamics-habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream of two existing dams, both under current conditions and under the proposed future removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.
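
    The classification step of a CFD-HSI workflow can be sketched as follows: given per-cell depth and velocity from the hydraulic model, flag cells falling below a species' minimum passable depth or above its maximum sustainable velocity. The thresholds and grids here are invented for illustration.

    ```python
    import numpy as np

    depth = np.array([[0.2, 0.8, 1.5], [0.1, 0.6, 2.0]])     # meters, from CFD model
    velocity = np.array([[0.5, 1.9, 2.4], [0.3, 2.6, 1.0]])  # m/s, from CFD model

    species = {  # hypothetical passage limits per species
        "shad":     {"min_depth": 0.30, "max_velocity": 2.0},
        "sturgeon": {"min_depth": 0.60, "max_velocity": 1.5},
    }

    for name, limits in species.items():
        depth_challenge = depth < limits["min_depth"]
        velocity_challenge = velocity > limits["max_velocity"]
        print(name,
              "depth-challenge cells:", int(depth_challenge.sum()),
              "velocity-challenge cells:", int(velocity_challenge.sum()))
    ```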

  5. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    Full Text Available The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism in response to ZKV. Here, we develop the rationale and strategy for a new approach to developing cytotoxic T lymphocyte (CTL) vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DNV (Dengue) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open source Protein Data File (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.
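
    A minimal sketch of the class I epitope triage described above: slide a 9-mer window over a protein sequence and keep peptides whose predicted MHC binding affinity passes the conventional 500 nM cutoff. The predictor below is a stub (real pipelines call dedicated MHC binding predictors), and the sequence fragment is illustrative only.

    ```python
    def predicted_ic50(peptide):
        """Stub for an MHC class I binding predictor (returns IC50 in nM)."""
        return 100.0 if peptide.count("L") >= 2 else 5000.0  # toy rule

    def candidate_epitopes(protein_seq, k=9, cutoff_nm=500.0):
        peptides = (protein_seq[i:i + k] for i in range(len(protein_seq) - k + 1))
        return [p for p in peptides if predicted_ic50(p) < cutoff_nm]

    zikv_fragment = "MKNPKKKSGGFRIVNMLKRGVARVSPFGGLKRLPAGLLL"  # illustrative only
    print(candidate_epitopes(zikv_fragment))
    ```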

  6. Cloning and Analysis of a Phytocyanin-related Early Nodulin-like Gene, PtBCP1, from Poncirus trifoliata

    Institute of Scientific and Technical Information of China (English)

    姚利晓; 马园园; 李凤龙; 许兰珍; 雷天刚; 彭爱红; 何永睿; 陈善春

    2012-01-01

    [Objective] The aim was to clone the phytocyanin-related early nodulin-like gene PtBCP1 from Poncirus trifoliata and analyze its sequence. [Method] In silico cloning was performed using the C28 EST from a Poncirus trifoliata subtractive library as the seed sequence, and PCR primers were designed accordingly. The full-length cDNA and DNA sequences of PtBCP1 were then obtained by PCR, and the resulting sequences were analyzed with bioinformatics methods. [Result] The PtBCP1 gene consists of two exons and one intron and is expressed 140-fold more strongly in the roots of Poncirus trifoliata seedlings than in the leaves. The gene encodes a polypeptide of 131 amino acid residues with a predicted molecular weight of 14.0 kD and a theoretical isoelectric point of 8.75. The PtBCP1 protein is an early nodulin-like protein, with a signal peptide at the N-terminus and a plastocyanin-like domain (Pfam: PF02298) in the C-terminal region, but without complete copper-binding sites. [Conclusion] This work lays a foundation for further research on the biological function of the PtBCP1 gene.
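
    The reported protein parameters (131 residues, ~14.0 kD, theoretical pI 8.75) are the kind of values obtainable with Biopython's ProtParam module; a sketch with a placeholder sequence (the actual PtBCP1 sequence is not reproduced here):

    ```python
    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    # Placeholder peptide, not the actual PtBCP1 sequence.
    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ"
    protein = ProteinAnalysis(seq)
    print("length:", len(seq))
    print("molecular weight (Da): %.1f" % protein.molecular_weight())
    print("theoretical pI: %.2f" % protein.isoelectric_point())
    ```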

  7. Correlation between Abdominal Perforator Vessels Identified with Preoperative Computed Tomography Angiography and Intraoperative Fluorescent Angiography in the Microsurgical Breast Reconstruction Patient.

    Science.gov (United States)

    Pestana, Ivo A; Crantford, J Clayton; Zenn, Michael R

    2014-05-06

    Background Computed tomography angiography (CTA) has become a reliable method of perforator vessel identification. Indocyanine green laser-assisted fluorescent angiography (ICGLA) produces a real-time image of large and small caliber blood vessels. The aim of this prospective study was to compare ICGLA with CTA, evaluating the reliability of ICGLA for vessel identification and its correlation with perforator vessel size and number determined preoperatively by CTA. The effect of both imaging techniques on flap design or intraoperative plan was also evaluated. Methods Over a 1-year period, patients presenting for free-tissue transfer breast reconstruction underwent preoperative CTA mapping of abdominal perforators followed by intraoperative ICGLA. Using visualization software, scaling factors were calculated so that CTA and ICGLA data could be compared. Results A total of 18 patients (24 breast reconstructions) were included. Larger CTA perforator size was associated with larger actual size (p = 0.04). The largest CTA perforator or largest actual perforator was used 78% of the time. Increasing body mass index was not associated with larger CTA perforator size (p = 0.67) or more intense ICGLA blushes (p = 0.13). No significant correlation was found between CTA perforator location and ICGLA skin blush location, size, or intensity. CTA- or ICGLA-guided intraoperative procedure adjustments were done in 72% of the patients. ICGLA identified poor soft tissue perfusion and guided flap resection in 46% of the patients. Conclusions ICGLA skin blush location, size, and intensity do not correlate with CTA-identified perforating vessel location or actual perforating vessel size. Despite this, the ICGLA information was useful for evaluating soft tissue perfusion and flap design.

  8. A retrospective comparative study of cone-beam computed tomography versus rendered panoramic images in identifying the presence, types, and characteristics of dens invaginatus in a Turkish population.

    Science.gov (United States)

    Capar, Ismail Davut; Ertas, Huseyin; Arslan, Hakan; Tarim Ertas, Elif

    2015-04-01

    This study assessed the presence, characteristics, and type of dens invaginatus (DI) by using cone-beam computed tomography (CBCT) and panoramic images rendered from CBCT images. In addition, the findings of the imaging techniques were compared. We evaluated 300 CBCT images to determine the type of DI, the presence of an impacted tooth near the DI, and the presence of apical pathosis. The McNemar test was used to compare the prevalence of DI according to CBCT and panoramic images rendered from CBCT images. The prevalence of DI was lower on panoramic images rendered from CBCT images (3% of the patients) than on CBCT images (10.7% of the patients). Affected teeth included mesiodens (9%), maxillary canines (2.3%), and mandibular canines (2.3%). Type I DI was the most commonly observed type of invagination (65.9%), followed by type II (29.5%) and type III (4.6%). All patients with type III DI and 25% of the patients with type II DI had apical pathosis at the time of referral, but periapical lesions were not observed in teeth with type I DI. In total, 13.6% of DI cases had impacted teeth. CBCT can be recommended as an effective diagnostic device for identifying DI because it provides an accurate representation of the external and internal dental anatomy. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
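
    The McNemar comparison of paired detection rates (DI seen on CBCT versus on rendered panoramic images from the same patients) can be run as below; the 2x2 counts are invented for illustration.

    ```python
    from statsmodels.stats.contingency_tables import mcnemar

    # Paired counts (hypothetical): rows = CBCT (DI yes/no), cols = panoramic (yes/no).
    table = [[9, 23],   # DI on CBCT: also seen / missed on panoramic
             [0, 268]]  # DI absent on CBCT: seen / absent on panoramic
    result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
    print("p-value:", result.pvalue)
    ```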

  9. Ptcorp gene induced by cold stress was identified by proteomic analysis in leaves of Poncirus trifoliata (L.) Raf.

    Science.gov (United States)

    Long, Guiyou; Song, Jinyu; Deng, Ziniu; Liu, Jie; Rao, Liqun

    2012-05-01

    A proteomic approach was employed to investigate cold stress-responsive proteins in trifoliate orange (Poncirus trifoliata (L.) Raf.), a well-known cold-tolerant citrus relative widely used as a rootstock in China. Two-year-old potted seedlings were exposed to a freezing temperature (-6°C) for 50 min (nonlethal) or 80 min (lethal), and total proteins were isolated from leaves of the treated plants. Nine differentially accumulated proteins with over 2-fold changes in abundance were identified by two-dimensional gel electrophoresis and mass spectrometry. Among these proteins, a resistance protein induced by the nonlethal cold treatment (protein spot #2 from P. trifoliata) was selected as the target sequence for degenerate primer design. Using the designed primers, a PCR product of about 700 bp was amplified from P. trifoliata genomic DNA, which was further cloned and sequenced. A nucleotide sequence of 676 bp was obtained and named Ptcorp. A BLAST search showed that Ptcorp shared 88% homology with an EST from a cold-acclimated 'Bluecrop' (Vaccinium corymbosum) library (accession number CF811080), indicating that Ptcorp is associated with cold acclimation. Semiquantitative RT-PCR analysis demonstrated that the Ptcorp gene was up-regulated by cold stress, consistent with the protein expression profile described above. As this resistance protein (NBS-LRR disease resistance protein family) gene was up-regulated by cold stress in trifoliate orange and satsuma mandarin, NBS-LRR genes may be associated with cold resistance in citrus.

  10. Computer-based malnutrition risk calculation may enhance the ability to identify pediatric patients at malnutrition-related risk for unfavorable outcome.

    Science.gov (United States)

    Karagiozoglou-Lampoudi, Thomais; Daskalou, Efstratia; Lampoudis, Dimitrios; Apostolou, Aggeliki; Agakidis, Charalampos

    2015-05-01

    The study aimed to test the hypothesis that computer-based calculation of malnutrition risk may enhance the ability to identify pediatric patients at malnutrition-related risk for an unfavorable outcome. The Pediatric Digital Scaled MAlnutrition Risk screening Tool (PeDiSMART), incorporating the World Health Organization (WHO) growth reference data and malnutrition-related parameters, was used. This was a prospective cohort study of 500 pediatric patients aged 1 month to 17 years. Upon admission, the PeDiSMART score was calculated and anthropometry was performed. Pediatric Yorkhill Malnutrition Score (PYMS), Screening Tool Risk on Nutritional Status and Growth (STRONGkids), and Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP) malnutrition screening tools were also applied. PeDiSMART's association with the clinical outcome measures (weight loss/nutrition support and hospitalization duration) was assessed and compared with the other screening tools. The PeDiSMART score was inversely correlated with anthropometry and bioelectrical impedance phase angle (BIA PhA). The score's grading scale was based on BIA PhA quartiles. Weight loss/nutrition support during hospitalization was significantly and independently associated with the malnutrition risk group allocation on admission, after controlling for anthropometric parameters and age. Receiver operating characteristic curve analysis showed a sensitivity of 87%, a specificity of 75%, and a significant area under the curve, which differed significantly from those of STRONGkids and STAMP. In the subgroups of patients with PeDiSMART-based risk allocation different from that based on the other tools, PeDiSMART allocation was more closely related to outcome measures. PeDiSMART, applicable to the full age range of patients hospitalized in pediatric departments, graded according to BIA PhA, and embeddable in medical electronic records, enhances efficacy and reproducibility in identifying pediatric patients at
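
    The reported discrimination metrics (AUC, 87% sensitivity, 75% specificity) correspond to a standard ROC workup, sketched here with scikit-learn on made-up scores and outcomes:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(42)
    outcome = rng.integers(0, 2, size=200)        # 1 = weight loss/nutrition support
    score = outcome * 2.0 + rng.normal(size=200)  # hypothetical risk scores

    print("AUC:", roc_auc_score(outcome, score))
    fpr, tpr, thresholds = roc_curve(outcome, score)
    best = np.argmax(tpr - fpr)                   # Youden's J picks a cutoff
    print("sensitivity: %.2f  specificity: %.2f" % (tpr[best], 1 - fpr[best]))
    ```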

  11. Building And Using A Data Base To Identify Parameters To Further Improve Diagnostic Performance On The Toshiba Computed Radiography System Model 201

    Science.gov (United States)

    Seeley, George W.; Roehrig, Hans; Mockbee, Brent; Hunter, Tim B.; Ovitt, Theron; Claypool, H. R.; Bielland, John C.; Scott, Anne; Yang, Peter; Dallas, William J.

    1987-01-01

    The digital imaging group at the University of Arizona Health Sciences Center Radiology Department is vigorously pursuing the development of a total digital radiology department (TDRD). One avenue of research being conducted is to define the needed resolutions and capabilities of TDRD systems. Parts of that effort are described in these proceedings and elsewhere. One of these investigations is to assess the general application of computed radiography (CR) in clinical imaging. Specifically we are comparing images produced by the Toshiba computed radiography system (Model 201) to those produced by conventional imaging techniques. This paper describes one aspect of that work.

  12. Tips from the Classroom: Introducing the Friendly and Useful Computer; Using Annotations to Identify Composition Errors; Building a Scaffold with Video Clips; Movie Karaoke; Gotcha.

    Science.gov (United States)

    Dudley, Albert P.; And Others

    1997-01-01

    Presents various tips that are useful in the classroom for teaching second languages. These tips focus on teaching basic computer operations; using annotations to foster error corrections in language; using video clips as a part of a U.S. history or culture-based English-as-a-Second-Language lesson; using karaoke to speak with less inhibition; and…

  13. Identifying Computer-Supported Collaborative Learning (CSCL) Research in Selected Journals Published from 2003 to 2012: A Content Analysis of Research Topics and Issues

    Science.gov (United States)

    Zheng, Lanqin; Huang, Ronghuai; Yu, Junhui

    2014-01-01

    This study aims to identify the emerging research trends in the field of computer-supported collaborative learning (CSCL) so as to provide researchers and educators with insights into research topics and issues for further exploration. This paper analyzed the research topics, methods and technology adoption of CSCL from 2003 to 2012. A total of 706…

  14. A computational psychiatry approach identifies how alpha-2A noradrenergic agonist Guanfacine affects feature-based reinforcement learning in the macaque

    Science.gov (United States)

    Hassani, S. A.; Oemisch, M.; Balcarras, M.; Westendorff, S.; Ardid, S.; van der Meer, M. A.; Tiesinga, P.; Womelsdorf, T.

    2017-01-01

    Noradrenaline is believed to support cognitive flexibility through the alpha 2A noradrenergic receptor (a2A-NAR) acting in prefrontal cortex. Enhanced flexibility has been inferred from improved working memory with the a2A-NA agonist Guanfacine. But it has been unclear whether Guanfacine improves specific attention and learning mechanisms beyond working memory, and whether the drug effects can be formalized computationally to allow single subject predictions. We tested and confirmed these suggestions in a case study with a healthy nonhuman primate performing a feature-based reversal learning task evaluating performance using Bayesian and Reinforcement learning models. In an initial dose-testing phase we found a Guanfacine dose that increased performance accuracy, decreased distractibility and improved learning. In a second experimental phase using only that dose we examined the faster feature-based reversal learning with Guanfacine with single-subject computational modeling. Parameter estimation suggested that improved learning is not accounted for by varying a single reinforcement learning mechanism, but by changing the set of parameter values to higher learning rates and stronger suppression of non-chosen over chosen feature information. These findings provide an important starting point for developing nonhuman primate models to discern the synaptic mechanisms of attention and learning functions within the context of a computational neuropsychiatry framework. PMID:28091572
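
    The modeling result, higher learning rates plus stronger suppression of non-chosen feature values, can be illustrated with a minimal feature-based Rescorla-Wagner-style update; parameter names and values below are illustrative, not the fitted ones.

    ```python
    def update_feature_values(values, chosen, reward, alpha=0.5, decay=0.3):
        """values: dict feature -> learned value; chosen: features of the
        selected stimulus. Non-chosen features are decayed toward zero."""
        for f in values:
            if f in chosen:
                values[f] += alpha * (reward - values[f])  # prediction-error update
            else:
                values[f] *= (1.0 - decay)                 # suppression of non-chosen
        return values

    values = {"red": 0.2, "green": 0.5, "striped": 0.1}
    values = update_feature_values(values, chosen={"red", "striped"}, reward=1.0)
    print(values)
    ```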

  15. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2013-06-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the methodologies list and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even if they work on just a single step. © 2013 Elsevier Inc.
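
    One of the simplest target-prediction heuristics such surveys cover is seed matching: search a 3' UTR for the reverse complement of the miRNA seed (nucleotides 2-8). A minimal sketch with a toy UTR:

    ```python
    def seed_sites(mirna, utr):
        """Return positions in the 3' UTR matching the reverse complement
        of the miRNA seed (positions 2-8, 1-based)."""
        comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
        seed = mirna[1:8]                                # nucleotides 2-8
        site = "".join(comp[b] for b in reversed(seed))  # reverse complement
        return [i for i in range(len(utr) - len(site) + 1)
                if utr[i:i + len(site)] == site]

    mirna = "UGAGGUAGUAGGUUGUAUAGUU"    # let-7 as a familiar example
    utr = "AAACUACCUCAGGGAACUACCUCAAA"  # toy UTR with two seed sites
    print(seed_sites(mirna, utr))      # -> [3, 16]
    ```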

  16. A computational psychiatry approach identifies how alpha-2A noradrenergic agonist Guanfacine affects feature-based reinforcement learning in the macaque.

    Science.gov (United States)

    Hassani, S A; Oemisch, M; Balcarras, M; Westendorff, S; Ardid, S; van der Meer, M A; Tiesinga, P; Womelsdorf, T

    2017-01-16

    Noradrenaline is believed to support cognitive flexibility through the alpha 2A noradrenergic receptor (a2A-NAR) acting in prefrontal cortex. Enhanced flexibility has been inferred from improved working memory with the a2A-NA agonist Guanfacine. But it has been unclear whether Guanfacine improves specific attention and learning mechanisms beyond working memory, and whether the drug effects can be formalized computationally to allow single subject predictions. We tested and confirmed these suggestions in a case study with a healthy nonhuman primate performing a feature-based reversal learning task evaluating performance using Bayesian and Reinforcement learning models. In an initial dose-testing phase we found a Guanfacine dose that increased performance accuracy, decreased distractibility and improved learning. In a second experimental phase using only that dose we examined the faster feature-based reversal learning with Guanfacine with single-subject computational modeling. Parameter estimation suggested that improved learning is not accounted for by varying a single reinforcement learning mechanism, but by changing the set of parameter values to higher learning rates and stronger suppression of non-chosen over chosen feature information. These findings provide an important starting point for developing nonhuman primate models to discern the synaptic mechanisms of attention and learning functions within the context of a computational neuropsychiatry framework.

  17. Selection of personalized patient therapy through the use of knowledge-based computational models that identify tumor-driving signal transduction pathways

    NARCIS (Netherlands)

    Verhaegh, Wim; van Ooijen, Henk; Inda, Márcia A; Hatzis, Pantelis; Versteeg, Rogier; Smid, Marcel; Martens, John; Foekens, John; van de Wiel, Paul; Clevers, Hans; van de Stolpe, Anja

    2014-01-01

    Increasing knowledge about signal transduction pathways as drivers of cancer growth has elicited the development of "targeted drugs," which inhibit aberrant signaling pathways. They require a companion diagnostic test that identifies the tumor-driving pathway; however, currently available tests like

  18. Positron emission tomography/computed tomography with 18F-fluorodeoxyglucose identifies tumor growth or thrombosis in the portal vein with hepatocellular carcinoma

    Institute of Scientific and Technical Information of China (English)

    Long Sun; Hua Wu; Wei-Ming Pan; Yong-Song Guan

    2007-01-01

    Patients suffering from hepatocellular carcinoma (HCC) with tumor thrombus in the portal vein generally have a poor prognosis. Portal vein tumor thrombus must be distinguished from portal vein blood thrombus, and this identification plays a very important role in the management of HCC. Conventional imaging modalities have limitations in discriminating portal vein tumor thrombus. The application of positron emission tomography (PET) with 18F-fluorodeoxyglucose (18F-FDG) for discrimination between tumor extension and blood thrombus has been reported in a few cases of HCC, while portal tumor thrombosis and portal vein clot identified by 18F-FDG PET/CT in HCC patients have not been reported so far. We present two HCC cases, one with portal vein tumor thrombus and one with thrombosis, both identified with 18F-FDG PET/CT. This report illustrates the complementary value of combining morphological and functional imaging in achieving a correct diagnosis in such clinical situations.

  19. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process.

    Science.gov (United States)

    Bruni, Roberto; Costantino, Angela; Tritarelli, Elena; Marcantonio, Cinzia; Ciccozzi, Massimo; Rapicetta, Maria; El Sawaf, Gamal; Giuliani, Alessandro; Ciccaglione, Anna Rita

    2009-07-29

    The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Two computational approaches were applied. The first one is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by the sequence alignment method applied to FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with respect to a control HCV sequence from the Core protein, thus giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows for a direct comparison of domains for the presence of common hydrophobicity patterns, upon which the physical interaction would be based. RQA greatly strengthened the reliability of the hypothesis, scoring many cross-recurrences between the hydrophobicity patterns of the FP and CT peptides, far outnumbering chance expectations and pointing to putative interaction sites. Intriguingly, mutations in the CT region of E1, reducing the fusion process in
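
    The RQA step can be pictured as a cross-recurrence computation over Kyte-Doolittle hydrophobicity profiles of the two peptides: a point (i, j) recurs when the hydrophobicity of residue i in FP is within epsilon of residue j in CT. The sequences and epsilon below are illustrative assumptions.

    ```python
    import numpy as np

    KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
          "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
          "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
          "Y": -1.3, "V": 4.2}  # Kyte-Doolittle hydropathy scale

    def cross_recurrence(seq_a, seq_b, eps=0.5):
        """Binary cross-recurrence matrix over hydrophobicity values."""
        a = np.array([KD[r] for r in seq_a])
        b = np.array([KD[r] for r in seq_b])
        return (np.abs(a[:, None] - b[None, :]) < eps).astype(int)

    fp = "GFLGLIAGGIGAVA"  # toy fusion-peptide-like stretch
    ct = "LLVLLLFAGVDA"    # toy pre-anchor/TM-like stretch
    rec = cross_recurrence(fp, ct)
    print("recurrence rate: %.2f" % rec.mean())
    ```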

  20. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    Science.gov (United States)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-03-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys in simulation studies, replacing time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior as a function of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.
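
    Step (2) of this workflow can be mimicked with a toy numpy stand-in for DREAM.3D (this is not the DREAM.3D API): random seed points act as grain centers, and each voxel of the RVE takes the ID of its nearest seed, i.e. a Voronoi tessellation.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N, GRAINS = 32, 20                   # 32^3 voxel grid, 20 grains
    seeds = rng.random((GRAINS, 3)) * N  # random grain centers

    # Voxel coordinates of the grid.
    coords = np.stack(np.meshgrid(*[np.arange(N)] * 3, indexing="ij"), axis=-1)
    coords = coords.reshape(-1, 3).astype(float)

    # Assign each voxel to its nearest seed (a Voronoi tessellation).
    dists = np.linalg.norm(coords[:, None, :] - seeds[None, :, :], axis=-1)
    grain_id = dists.argmin(axis=1).reshape(N, N, N)

    sizes = np.bincount(grain_id.ravel(), minlength=GRAINS)
    print("grain volume fractions:", sizes / sizes.sum())
    ```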

  1. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    Science.gov (United States)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys in simulation studies, replacing time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior as a function of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.

  2. Impact of different beam directions on intensity-modulated radiation therapy dose delivered to functioning lung tissue identified using single-photon emission computed tomography.

    Science.gov (United States)

    Tian, Qin; Zhang, Fucheng; Wang, Yanming; Qu, Weiqiang

    2014-01-01

    To use different beam arrangements and numbers of beams to plan intensity-modulated radiation therapy (IMRT) and investigate their effects on the low and high radiation doses delivered to the functional lung, in order to reduce radiation-induced lung damage. Ten patients with stage I-III non-small cell lung carcinoma (NSCLC) underwent IMRT. Beam arrangements were selected on the basis of orientation and dose-volume histograms to create SPECT-guided IMRT plans that spared the functional lung and maintained target coverage. Four different plans (CT-7, SPECT-7, SPECT-4, and SPECT-5) with different beam arrangements were used. Differences in the conformity index (CI) and heterogeneity index (HI) between the plans were analyzed using a paired t-test. The seven-beam SPECT (SPECT-7) plan reduced the volume of the functional lung irradiated with at least 20 Gy (FV20) and 30 Gy (FV30) by 26.02% ± 15.45% and 14.41% ± 16.66%, respectively, compared to the seven-beam computed tomography (CT-7) plan. The CI significantly differed between the SPECT-7 and SPECT-4 plans and between the SPECT-5 and SPECT-4 plans, but not between the SPECT-5 and SPECT-7 plans. The CIs in the SPECT-5 and SPECT-7 plans were better than that in the SPECT-4 plan. The heterogeneity index significantly differed among the three SPECT plans and was best in the SPECT-7 plan. The incorporation of SPECT images into IMRT planning for NSCLC greatly affected beam angles and the number of beams. Fewer beams and modified beam angles achieved similar or better IMRT quality. The low-dose volumes were lower in SPECT-4.
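
    The functional-lung dose metrics compared between plans reduce to simple masked dose-volume counts; a sketch with synthetic dose and SPECT-derived function maps (the uptake cutoff and array sizes are assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    dose = rng.uniform(0, 60, size=(64, 64, 32))  # Gy, from the treatment plan
    spect_uptake = rng.random((64, 64, 32))       # normalized SPECT counts

    functional = spect_uptake > 0.5               # functional-lung mask (assumed cutoff)

    def fv(dose, mask, threshold_gy):
        """Percent of functional-lung volume receiving at least threshold_gy."""
        return 100.0 * (dose[mask] >= threshold_gy).mean()

    print("FV20: %.1f%%  FV30: %.1f%%"
          % (fv(dose, functional, 20), fv(dose, functional, 30)))
    ```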

  3. Detecting hepatic nodules and identifying feeding arteries of hepatocellular carcinoma: efficacy of cone-beam computed tomography in transcatheter arterial chemoembolization

    Institute of Scientific and Technical Information of China (English)

    Yasuhiro Ushijima; Tsuyoshi Tajima; Akihiro Nishie; Yoshiki Asayama; Kousei Ishigami; Masakazu Hirakawa; Daisuke Kakihara; Daisuke Okamoto; Hiroshi Honda

    2016-01-01

    Aim: To evaluate the effectiveness of using cone-beam computed tomography (CBCT) in transcatheter arterial chemoembolization (TACE) to detect hepatocellular carcinoma (HCC) nodules and their feeding arteries. Methods: Twenty-four patients with HCCs who underwent TACE using CBCT in addition to conventional digital subtraction angiography (DSA) were enrolled. After both conventional DSA and CBCT through the hepatic artery were acquired, TACE was performed. A nodule was defined as an HCC when dense accumulation of iodized oil was found within the nodule on CT obtained 2 weeks after the TACE. The number of detected nodules and identified feeding arteries, and their correlations with anatomical locations, were assessed. Results: A total of 39 HCC nodules (tumor diameter, 7-40 mm; mean, 17.4 ± 7.9 mm) were detected. Thirty-one nodules were detected by DSA alone, but 8 nodules were additionally detected by adding CBCT to DSA. There were 53 feeding arteries associated with the 39 HCC nodules. Among these arteries, 21 were identified by DSA alone; however, 47 were identified by combining CBCT with DSA. Additional feeding arteries, especially for the nodules located in the right and caudate lobes, were identified by CBCT. On the other hand, there was no difference in the detection of nodules between the anatomical locations by CBCT. Conclusion: The use of CBCT in addition to DSA offers the potential to increase the number of detected nodules and the number of their feeding arteries identified in the right and caudate lobes. CBCT may improve the quality of the TACE procedure for HCC compared with DSA alone.

  4. Computed tomography scan to detect traumatic arthrotomies and identify periarticular wounds not requiring surgical intervention: an improvement over the saline load test.

    Science.gov (United States)

    Konda, Sanjit R; Davidovitch, Roy I; Egol, Kenneth A

    2013-09-01

    To report our experience with computed tomography (CT) scans to detect traumatic arthrotomies of the knee (TAK) joint based on the presence of intra-articular air. Retrospective review. Level I trauma center. Sixty-two consecutive patients (63 knees) underwent a CT scan of the knee in the emergency department and had a minimum of 14 days follow-up. Cohort of 37 patients (37 knees) from the original 62 patients who underwent a saline load test (SLT). CT scan and SLT. Positive traumatic arthrotomy of the knee (+TAK) was defined as operating room (OR) confirmation of an arthrotomy or no intra-articular air on CT scan (-iaCT) (and -SLT if performed) with follow-up revealing a septic knee. Periarticular wound equivalent to no traumatic arthrotomy (pw = (-TAK)) was defined as OR evaluation revealing no arthrotomy or -iaCT (and -SLT if performed) with follow-up revealing no septic knee. All 32 knees with intra-articular air on CT scan (+iaCT) had OR confirmation of a TAK and none of these patients had a knee infection at a mean follow-up of 140.0 ± 279.6 days. None of the 31 patients with -iaCT had a knee infection at a mean follow-up of 291.0 ± 548.1 days. Based on these results, the sensitivity and specificity of the CT scan to detect +TAK and pw = (-TAK) was 100%. In a subgroup of 37 patients that received both a CT scan and the conventional SLT, the sensitivity and specificity of the CT scan was 100% compared with 92% for the SLT. CT scanning can identify periarticular wounds that do not require surgical intervention and should be considered a valid diagnostic test in the appropriate clinical setting. Diagnostic Level III. See Instructions for Authors for a complete description of levels of evidence.

  5. Computational Analysis of mRNA Expression Profiles Identifies the ITG Family and PIK3R3 as Crucial Genes for Regulating Triple Negative Breast Cancer Cell Migration

    Directory of Open Access Journals (Sweden)

    Sukhontip Klahan

    2014-01-01

    Full Text Available Triple-negative breast cancer (TNBC) is an aggressive type of breast cancer that does not express estrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor (Her2/neu). TNBC has worse clinical outcomes than other breast cancer subtypes. However, the key molecules and mechanisms of TNBC migration remain unclear. In this study, we compared two normalized microarray datasets from the GEO database between Asian (GSE33926) and non-Asian populations (GSE46581) to determine the molecules and common pathways in TNBC migration. We demonstrated that 16 genes in non-Asian samples and 9 genes in Asian samples are related to TNBC migration. In addition, our analytic results showed that 4 genes, PIK3R3, ITGB1, ITGAL, and ITGA6, were involved in the regulation of actin cytoskeleton. Our results indicated potential genes that link to TNBC migration. This study may help identify novel therapeutic targets for drug development in cancer therapy.

  6. A Fast SVM-Based Tongue’s Colour Classification Aided by k-Means Clustering Identifiers and Colour Attributes as Computer-Assisted Tool for Tongue Diagnosis

    Directory of Open Access Journals (Sweden)

    Nur Diyana Kamarudin

    2017-01-01

    Full Text Available In tongue diagnosis, colour information of tongue body has kept valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to the instable lighting condition and naked eye’s ability to capture the exact colour distribution on the tongue especially the tongue with multicolour substance. To overcome this ambiguity, this paper presents a two-stage tongue’s multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters: image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, true rate classification accuracy of the proposed two-stage classification to diagnose red, light red, and deep red tongue colours is 94%. The number of support vectors in SVM is improved by 41.2%, and the execution time for one image is recorded as 48 seconds.
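
    The two-stage scheme, k-means to isolate colour clusters followed by an SVM on the ambiguous red/light-red cluster, can be sketched with scikit-learn on synthetic pixel data; the cluster count follows the abstract, while the features and labels are invented for illustration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    pixels = rng.random((1000, 3))  # RGB pixels of a tongue image (toy data)

    # Stage 1: cluster pixels into 4 groups
    # (background, deep red, red/light red, transitional).
    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
    red_cluster = np.argmax(km.cluster_centers_[:, 0])  # highest red channel
    red_pixels = pixels[km.labels_ == red_cluster]

    # Stage 2: SVM separates red vs. light red within that cluster (toy labels).
    labels = (red_pixels[:, 0] > red_pixels[:, 0].mean()).astype(int)
    svm = SVC(kernel="rbf").fit(red_pixels, labels)
    print("red-tongue pixels:", int(svm.predict(red_pixels).sum()))
    ```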

  7. A Fast SVM-Based Tongue's Colour Classification Aided by k-Means Clustering Identifiers and Colour Attributes as Computer-Assisted Tool for Tongue Diagnosis

    Science.gov (United States)

    Ooi, Chia Yee; Kawanabe, Tadaaki; Odaguchi, Hiroshi; Kobayashi, Fuminori

    2017-01-01

    In tongue diagnosis, colour information of tongue body has kept valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to the instable lighting condition and naked eye's ability to capture the exact colour distribution on the tongue especially the tongue with multicolour substance. To overcome this ambiguity, this paper presents a two-stage tongue's multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters of image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, true rate classification accuracy of the proposed two-stage classification to diagnose red, light red, and deep red tongue colours is 94%. The number of support vectors in SVM is improved by 41.2%, and the execution time for one image is recorded as 48 seconds.

  8. Use of Anisotropy, 3D Segmented Atlas, and Computational Analysis to Identify Gray Matter Subcortical Lesions Common to Concussive Injury from Different Sites on the Cortex.

    Directory of Open Access Journals (Sweden)

    Praveen Kulkarni

    Full Text Available Traumatic brain injury (TBI) can occur anywhere along the cortical mantle. While the cortical contusions may be random and disparate in their locations, the clinical outcomes are often similar and difficult to explain. Thus a question that arises is, do concussions at different sites on the cortex affect similar subcortical brain regions? To address this question we used a fluid percussion model to concuss the right caudal or rostral cortices in rats. Five days later, diffusion tensor MRI data were acquired for indices of anisotropy (IA) for use in a novel method of analysis to detect changes in gray matter microarchitecture. IA values from over 20,000 voxels were registered into a 3D segmented, annotated rat atlas covering 150 brain areas. Comparisons between left and right hemispheres revealed a small population of subcortical sites with altered IA values. Rostral and caudal concussions were of striking similarity in the impacted subcortical locations, particularly the central nucleus of the amygdala, laterodorsal thalamus, and hippocampal complex. Subsequent immunohistochemical analysis of these sites showed significant neuroinflammation. This study presents three significant findings that advance our understanding and evaluation of TBI: 1) the introduction of a new method to identify highly localized disturbances in discrete gray matter, subcortical brain nuclei without postmortem histology, 2) the use of this method to demonstrate that separate injuries to the rostral and caudal cortex produce the same subcortical disturbances, and 3) the central nucleus of the amygdala, critical in the regulation of emotion, is vulnerable to concussion.
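
    The atlas-based analysis boils down to computing per-region IA statistics from an integer-labeled atlas volume and comparing hemispheres; a sketch with synthetic anisotropy values (the label scheme and region ID are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    fa = rng.random((40, 40, 20))                    # anisotropy indices per voxel
    atlas = rng.integers(0, 150, size=(40, 40, 20))  # 150 annotated brain regions

    def region_means(fa, atlas, n_regions=150):
        """Mean anisotropy per atlas label."""
        sums = np.bincount(atlas.ravel(), weights=fa.ravel(), minlength=n_regions)
        counts = np.bincount(atlas.ravel(), minlength=n_regions)
        return sums / np.maximum(counts, 1)

    means = region_means(fa, atlas)
    amygdala_central = 42  # hypothetical label ID
    print("mean IA in region %d: %.3f" % (amygdala_central, means[amygdala_central]))
    ```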

  9. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing technology, which emerged as a possible solution to the requirements of the internet of things and aims to lower latency and network bandwidth usage by moving a substantial part of computing operations to the network edge. The thesis identifies advantages as well as potential threats and analyses possible solutions to these problems, proceeding to a comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  10. Identifying Activity

    CERN Document Server

    Lewis, Adrian S

    2009-01-01

    Identification of active constraints in constrained optimization is of interest from both practical and theoretical viewpoints, as it holds the promise of reducing an inequality-constrained problem to an equality-constrained problem, in a neighborhood of a solution. We study this issue in the more general setting of composite nonsmooth minimization, in which the objective is a composition of a smooth vector function c with a lower semicontinuous function h, typically nonsmooth but structured. In this setting, the graph of the generalized gradient of h can often be decomposed into a union (nondisjoint) of simpler subsets. "Identification" amounts to deciding which subsets of the graph are "active" in the criticality conditions at a given solution. We give conditions under which any convergent sequence of approximate critical points finitely identifies the activity. Prominent among these properties is a condition akin to the Mangasarian-Fromovitz constraint qualification, which ensures boundedness of the set of...
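
    As background (a standard form under suitable qualification conditions, stated here rather than quoted from the paper): with the composite objective x ↦ h(c(x)), the criticality condition whose "active" pieces are being identified reads

    ```latex
    0 \in \nabla c(x)^{*} \, \partial h\bigl(c(x)\bigr),
    ```

    where ∇c(x)* is the adjoint of the Jacobian of c and ∂h is the generalized gradient of h; identification asks which subsets of the graph of ∂h carry the multipliers realizing this inclusion.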

  11. Parotid incidentaloma identified by combined {sup 18}F-fluorodeoxyglucose whole-body positron emission tomography and computed tomography: findings at grayscale and power Doppler ultrasonography and ultrasound-guided fine-needle aspiration biopsy or core-needle biopsy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Kwon; Rho, Byung Hak [Keimyung University School of Medicine, Department of Radiology, Dongsan Medical Center, Daegu (Korea); Won, Kyoung Sook [Keimyung University School of Medicine, Nuclear Medicine, Dongsan Medical Center, Daegu (Korea)

    2009-09-15

    Twelve parotid incidentalomas in 10 consecutive subjects (nine with a known malignancy elsewhere and one presumptively healthy subject) identified by combined {sup 18}F-fluorodeoxyglucose whole-body positron emission tomography and computed tomography ({sup 18}F-FDG PET/CT) were investigated, with the aim of calculating maximum standardized uptake value (SUV{sub max}) of each FDG-avid focus, and identifying corresponding sonographic and pathologic findings. The results of ultrasound-guided fine-needle aspiration biopsy (FNAB) (n = 9) and core-needle biopsy (CNB) (n = 3) were Warthin tumor in 10 cases, and pleomorphic adenoma and chronic inflammation in one each. SUV{sub max} was 7.0-21.0 g/mL (average 13.7 g/mL) for Warthin tumor, 6.8 g/mL for pleomorphic adenoma, and 7.3 g/mL for chronic inflammation. Each FDG-avid focus corresponded to ovoid (n = 11) or lobulated (n = 1) hypoechoic mass on grayscale ultrasonography (US) and hypervascular mass, except one with chronic inflammation, on power Doppler (PD) US. Parotid incidentaloma identified by {sup 18}F-FDG PET/CT during workup of various malignancies elsewhere does not necessarily signify primary or metastatic malignancy, but indicates a high likelihood of benign lesions, particularly Warthin tumor. Such lesions should be evaluated thoroughly by US and ultrasound-guided FNAB or CNB if parotid disease would change the patient's treatment plan. (orig.)
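
    As background (the standard body-weight-normalized definition, not specific to this study), the quoted SUVmax values follow

    ```latex
    \mathrm{SUV} = \frac{c_{\text{tissue}}\;[\mathrm{kBq/mL}]}{\text{injected activity}\;[\mathrm{kBq}] \,/\, \text{body weight}\;[\mathrm{g}]},
    ```

    which is why the values in the abstract carry units of g/mL.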

  12. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…


  14. ON IDENTIFYING WATER BODY IN REMOTE SENSING IMAGES BASED ON DISTRIBUTED COMPUTING

    Institute of Scientific and Technical Information of China (English)

    杨柳; 田生伟

    2016-01-01

    In order to improve the speed of remote sensing data processing and solve the data-intensive and compute-intensive problems in remote sensing information extraction, we introduced parallel computing ideas into remote sensing image processing and information extraction, and built a distributed water-body extraction model for remote sensing images based on Landsat ETM+ imagery. Taking the Weigan River basin as the study region, we conducted experiments on automatic water-body extraction using several methods, such as single-band thresholding, multi-band spectral relationships, and water indices. Experimental results demonstrate that the model has high identification accuracy, can identify water bodies quickly, and offers stable scalability and elasticity.
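
    As an illustration of the water-index approach mentioned above, here is a minimal sketch assuming McFeeters' NDWI computed from green and near-infrared reflectance (the zero threshold and the toy arrays are illustrative assumptions, not values taken from the study):

      # Sketch: water-body masking with an NDWI; inputs are assumed to be
      # green and NIR reflectance arrays already loaded into memory.
      import numpy as np

      def ndwi_water_mask(green, nir, threshold=0.0):
          """Return a boolean water mask from green and NIR reflectance."""
          green = green.astype(np.float64)
          nir = nir.astype(np.float64)
          with np.errstate(divide="ignore", invalid="ignore"):
              ndwi = (green - nir) / (green + nir)
          return ndwi > threshold

      # Toy 2x2 reflectance tiles: water is bright in green, dark in NIR.
      green = np.array([[0.30, 0.10], [0.25, 0.05]])
      nir = np.array([[0.05, 0.40], [0.04, 0.30]])
      print(ndwi_water_mask(green, nir))  # water pixels -> True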

  15. Nuclear stress perfusion imaging versus computed tomography coronary angiography for identifying patients with obstructive coronary artery disease as defined by conventional angiography: insights from the CorE-64 multicenter study

    Directory of Open Access Journals (Sweden)

    Yutaka Tanami

    2014-08-01

    We investigated the diagnostic accuracy of computed tomography angiography (CTA) versus myocardial perfusion imaging (MPI) for detecting obstructive coronary artery disease (CAD) as defined by conventional quantitative coronary angiography (QCA). Sixty-three patients who were enrolled in the CorE-64 multicenter study underwent CTA, MPI, and QCA imaging. All subjects were referred for cardiac catheterization with suspected or known coronary artery disease. The diagnostic accuracy of quantitative CTA and MPI for identifying patients with 50% or greater coronary arterial stenosis by QCA was evaluated using receiver operating characteristic (ROC) analysis. Pre-defined subgroups were patients with known CAD and those with a calcium score of 400 or over. ROC analysis revealed a greater area under the curve (AUC) for CTA than for MPI in all 63 patients: 0.95 [95% confidence interval (CI): 0.89-1.00] vs 0.65 (95% CI: 0.53-0.77), respectively (P<0.01). Sensitivity, specificity, and positive and negative predictive values were 0.93, 0.95, 0.97, and 0.88, respectively, for CTA and 0.85, 0.45, 0.74, and 0.63, respectively, for MPI. In 48 patients without known CAD, AUC was 0.96 for CTA and 0.67 for SPECT (P<0.01). There was no significant difference in AUC for CTA in patients with a calcium score below 400 versus over 400 (0.93 vs 0.95), but AUC differed for SPECT (0.61 vs 0.95; P<0.01). In a direct comparison, CTA is markedly superior to MPI for detecting obstructive coronary artery disease. Even in subgroups traditionally more challenging for CTA, SPECT does not offer similarly good diagnostic accuracy. CTA may be considered the non-invasive test of choice if diagnosis of obstructive CAD is the purpose of imaging.

  16. “Drug mules” as a radiological challenge: Sensitivity and specificity in identifying internal cocaine in body packers, body pushers and body stuffers by computed tomography, plain radiography and Lodox

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Patricia M., E-mail: patricia.flach@irm.uzh.ch [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Department of Neuroradiology, Inselspital Bern, University of Bern, 3010 Bern (Switzerland); Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Department of Radiology, University Hospital USZ, University of Zurich, Raemistrasse 100, 8091 Zurich (Switzerland); Ross, Steffen G. [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Ampanozi, Garyfalia; Ebert, Lars [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Germerott, Tanja; Hatch, Gary M. [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Thali, Michael J. [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Patak, Michael A. [Department of Radiology, Inselspital Bern, University of Bern, 3010 Bern (Switzerland); Department of Radiology, University Hospital USZ, University of Zurich, Raemistrasse 100, 8091 Zurich (Switzerland)

    2012-10-15

    Purpose: The purpose of our study was to retrospectively evaluate the specificity, sensitivity and accuracy of computed tomography (CT), digital radiography (DR) and low-dose linear slit digital radiography (LSDR, Lodox{sup ®}) in the detection of internal cocaine containers. Methods: Institutional review board approval was obtained. The study collectively consisted of 83 patients (76 males, 7 females, 16–45 years) suspected of having incorporated cocaine drug containers. All underwent radiological imaging; a total of 135 exams were performed: nCT = 35, nDR = 70, nLSDR = 30. An overall calculation for all “drug mules” and a specific evaluation of body packers, pushers and stuffers were performed. The gold standard was stool examination in a dedicated holding cell equipped with a drug toilet. Results: There were 54 drug mules identified in this study. CT of all drug carriers showed the highest diagnostic accuracy (97.1%), sensitivity (100%) and specificity (94.1%). DR in all cases was 71.4% accurate, 58.3% sensitive and 85.3% specific. LSDR of all patients with internal cocaine was 60% accurate, 57.9% sensitive and 63.4% specific. Conclusions: CT was the most accurate test studied. Therefore, the detection of internal cocaine drug packs should be performed by CT, rather than by conventional X-ray, in order to apply the most sensitive exam in the medico-legal investigation of suspected drug carriers. Nevertheless, the higher radiation dose applied by CT than by DR or LSDR needs to be considered. Future studies should evaluate low-dose CT protocols in order to address this concern and reduce the dose.
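
    For reference, the three figures of merit quoted above follow directly from a 2x2 confusion table; a minimal sketch (the counts below are invented for illustration and are not the study's data):

      def diagnostic_metrics(tp, fn, fp, tn):
          """Sensitivity, specificity and accuracy from confusion counts."""
          sensitivity = tp / (tp + fn)                # true-positive rate
          specificity = tn / (tn + fp)                # true-negative rate
          accuracy = (tp + tn) / (tp + fn + fp + tn)
          return sensitivity, specificity, accuracy

      # Illustrative counts only:
      print(diagnostic_metrics(tp=20, fn=1, fp=2, tn=12))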

  17. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration is carried out in a parallel computer that includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying the location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
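
    A minimal sketch of the routing idea, assuming two adjacency maps that record link health (node names and topology are invented for illustration; this is not the patented implementation):

      # Fall back to the second network when the first network's link is bad.
      PRIMARY = {("n0", "n1"): "ok", ("n1", "n2"): "defective"}
      SECONDARY = {("n0", "n1"): "ok", ("n1", "n2"): "ok"}

      def pick_network(src, dst):
          """Choose a data communications network for the hop src -> dst."""
          if PRIMARY.get((src, dst)) == "ok":
              return "primary"
          if SECONDARY.get((src, dst)) == "ok":
              return "secondary"
          raise RuntimeError(f"no healthy link between {src} and {dst}")

      print(pick_network("n1", "n2"))  # -> 'secondary'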

  18. Cold acclimation induced genes of trifoliate orange (Poncirus trifoliata).

    Science.gov (United States)

    Zhang, Can-kui; Lang, Ping; Dane, Fenny; Ebel, Robert C; Singh, Narendra K; Locy, Robert D; Dozier, William A

    2005-03-01

    Commercial citrus varieties are sensitive to low temperature. Poncirus trifoliata is a close relative of Citrus species and has been widely used as a cold-hardy rootstock for citrus production in low-temperature environments. mRNA differential display-reverse transcription (DDRT)-PCR and quantitative relative-RT-PCR were used to study gene expression of P. trifoliata under a gradual cold-acclimation temperature regime. Eight up-regulated cDNA fragments were isolated and sequenced. These fragments showed high similarities at the amino acid level to the following genes with known functions: betaine/proline transporter, water channel protein, aldo-keto reductase, early light-induced protein, nitrate transporter, tetratricopeptide-repeat protein, F-box protein, and ribosomal protein L15. These cold-acclimation up-regulated genes in P. trifoliata are also regulated by osmotic and photo-oxidative signals in other plants.

  19. Comparison of cone-beam computed tomography and periapical radiographs in identifying periapical bone lesions

    Institute of Scientific and Technical Information of China (English)

    张杰; 岳林; 张万林

    2013-01-01

    Objective: To compare the differences between cone-beam computed tomography (CBCT) and periapical radiographs (PR) in identifying periapical bone lesions. Methods: Ten healthy maxillary incisors and 38 maxillary incisors clinically diagnosed with apical periodontitis were included. All teeth were imaged with PR, and the periapical index (PAI) was determined. All teeth were also scanned with CBCT, and the longest diameters of the bone lesion in the incisogingival, mesiodistal, and labiopalatal dimensions were measured and compared in axial, sagittal, and coronal views, respectively. Results: When the PAI was 1, both PR and CBCT images showed a normal periapical region. When the PAI was 2, CBCT showed bone lesions in the periapical region. When the PAI was 3, 4, or 5, the longest diameter ranged from 2.75-7.00, 4.51-10.63, and 6.75-13.38 mm in the incisogingival dimension; 1.88-4.13, 3.98-7.25, and 5.38-10.63 mm in the mesiodistal dimension; and 1.50-5.70, 2.63-6.25, and 4.50-9.91 mm in the labiopalatal dimension. Within each grade, the range of the longest lesion diameter varied in all three dimensions, and across the three grades the ranges in the same dimension overlapped. Conclusions: CBCT can show bone lesions that PR cannot. Using PR to identify bone lesions may underestimate lesion size.

  20. Computational Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel, and homogeneous charge compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a direct numerical simulation of a very large lifted 3D turbulent hydrogen jet flame and 3D large eddy simulations of practical gas burner combustion devices.

  1. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of describing two- and three-dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners is detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
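
    As an illustration of the correlation-based recognition methods surveyed above, here is a brute-force sketch of normalized cross-correlation template matching (the arrays and placement are invented for illustration):

      import numpy as np

      def ncc_match(image, template):
          """Return the (row, col) placement maximising the NCC score."""
          th, tw = template.shape
          t = template - template.mean()
          best, best_pos = -np.inf, (0, 0)
          for r in range(image.shape[0] - th + 1):
              for c in range(image.shape[1] - tw + 1):
                  w = image[r:r + th, c:c + tw]
                  w = w - w.mean()
                  denom = np.sqrt((w * w).sum() * (t * t).sum())
                  if denom > 0:
                      score = (w * t).sum() / denom
                      if score > best:
                          best, best_pos = score, (r, c)
          return best_pos

      img = np.zeros((8, 8))
      img[3:5, 4:6] = np.array([[1.0, 2.0], [3.0, 4.0]])
      print(ncc_match(img, np.array([[1.0, 2.0], [3.0, 4.0]])))  # -> (3, 4)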

  2. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  3. Cloud Computing Technologies

    Directory of Open Access Journals (Sweden)

    Sean Carlin

    2012-06-01

    This paper outlines the key characteristics that cloud computing technologies possess and illustrates the cloud computing stack containing the three essential services (SaaS, PaaS and IaaS) that have come to define the technology and its delivery model. The underlying virtualization technologies that make cloud computing possible are also identified and explained. The various challenges that face cloud computing technologies today are investigated and discussed. The future of cloud computing technologies along with its various applications and trends are also explored, giving a brief outlook of where and how the technology will progress into the future.

  4. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicu...

  5. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  6. Desktop Computing Integration Project

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  7. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  8. Computational Social Creativity.

    Science.gov (United States)

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  9. Sparse Linear Identifiable Multivariate Modeling

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2011-01-01

    In this paper we consider sparse and identifiable linear latent variable (factor) and linear Bayesian network models for parsimonious analysis of multivariate data. We propose a computationally efficient method for joint parameter and model inference, and model comparison. It consists of a fully Bayesian hierarchy for sparse models using slab and spike priors (two-component δ-function and continuous mixtures), non-Gaussian latent factors and a stochastic search over the ordering of the variables. The framework, which we call SLIM (Sparse Linear Identifiable Multivariate modeling), is validated and benchmarked on artificial and real biological data sets. SLIM is closest in spirit to LiNGAM (Shimizu et al., 2006), but differs substantially in inference, Bayesian network structure learning and model comparison. Experimentally, SLIM performs equally well or better than LiNGAM with comparable...
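
    A generic textbook form of the two-component slab-and-spike prior mentioned above, written here with a Gaussian slab (our illustration; the paper's exact parameterisation may differ):

      p(\beta_{ij}) \;=\; (1 - \pi_{ij}) \, \delta(\beta_{ij})
                    \;+\; \pi_{ij} \, \mathcal{N}(\beta_{ij} \mid 0, \tau^{2})

    Here δ is a point mass at zero (the spike) that switches a coefficient off, \pi_{ij} is its inclusion probability, and the continuous slab captures the active coefficients.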

  10. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by different educational methods for their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of the basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  11. Analysing Java Identifier Names

    OpenAIRE

    Butler, Simon

    2016-01-01

    Identifier names are the principal means of recording and communicating ideas in source code and are a significant source of information for software developers and maintainers, and the tools that support their work. This research aims to increase understanding of identifier name content types - words, abbreviations, etc. - and phrasal structures - noun phrases, verb phrases, etc. - by improving techniques for the analysis of identifier names. The techniques and knowledge acquired can be appl...

  12. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

    The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of "characterization problems" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  13. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one that causes illnesses in people. It is a kind of computer program

  14. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  15. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  16. Cloud Computing (2/2)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business world. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, Utility Computing will be discussed and analyzed.

  17. Cloud Computing (1/2)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business world. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, Utility Computing will be discussed and analyzed.

  18. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  19. Computational chemistry

    OpenAIRE

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  20. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor but only a minority of these drive tumor progression. We present the result of discu...

  1. spatially identifying vulnerable areas

    African Journals Online (AJOL)

    System (SMDSS) to identify factors that make forest and game reserves vulnerable ... involving the creation of a Digital Elevation Model (DEM) and slope and settlement layers, derived with the Spatial Analyst (Slope) and Buffer tools.

  2. Distributed Persistent Identifiers System Design

    Directory of Open Access Journals (Sweden)

    Pavel Golodoniuc

    2017-06-01

    The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementation, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have, by and large, catered for identifier uniqueness, integrity, and persistence, regardless of the identifier's application domain. Trustworthiness of these systems has been measured by the criteria first defined by Bütikofer (2009) and further elaborated by Golodoniuc et al. (2016) and Car et al. (2017). Since many PID systems have been largely conceived and developed by a single organisation, they faced challenges for widespread adoption and, most importantly, the ability to survive change of technology. We believe that one cause of once-successful PID systems fading away is the centralisation of support infrastructure, both organisational and computing and data storage systems. In this paper, we propose a PID system design that implements the pillars of a trustworthy system: ensuring identifiers' independence of any particular technology or organisation, implementation of core PID system functions, separation from data delivery, and enabling the system to adapt to future change. We propose decentralisation at all levels (persistent identifier and information object registration, resolution, and data delivery) using Distributed Hash Tables and traditional peer-to-peer networks with information replication and caching mechanisms, thus eliminating the need for a central PID data store. This will increase overall system fault tolerance, thus ensuring its trustworthiness. We also discuss important aspects of the distributed system's governance, such as the notion of the authoritative source and data integrity
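
    A minimal sketch of the decentralised resolution idea, assuming a consistent-hash ring over resolver nodes (node names and the sample identifier are invented; a production DHT such as Kademlia adds replication and routing on top of this):

      import bisect
      import hashlib

      NODES = ["resolver-a", "resolver-b", "resolver-c"]  # invented names

      def ring_position(key):
          """Stable position of a key on the hash ring."""
          return int(hashlib.sha256(key.encode()).hexdigest(), 16)

      RING = sorted((ring_position(n), n) for n in NODES)

      def resolver_for(pid):
          """First node clockwise from the identifier's ring position."""
          idx = bisect.bisect(RING, (ring_position(pid), "")) % len(RING)
          return RING[idx][1]

      print(resolver_for("example:pid/12345"))  # hypothetical identifier

    Because the mapping depends only on the hash and the node list, any peer can resolve an identifier without consulting a central registry.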

  3. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles the particle-wave duality property observed when a quantum system such as a quantum computer passes through a double slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  4. Core of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Prof. C.P.Chandgude

    2017-04-01

    Advances in computing facilities date back to the 1960s with the introduction of mainframes. Each computing paradigm has had its own issues, and cloud computing was introduced with these in mind. Cloud computing has its roots in older technologies such as hardware virtualization, distributed computing, internet technologies, and autonomic computing. Cloud computing can be described with two models: the service model and the deployment model. While providing several services, cloud management's primary role is resource provisioning. While there are several benefits to cloud computing, there are challenges in adopting public clouds because of the dependency on infrastructure that is shared by many enterprises. In this paper, we present the core knowledge of cloud computing, highlighting its key concepts, deployment models, service models, benefits, and security issues related to cloud data. The aim of this paper is to provide a better understanding of cloud computing and to identify important research directions in this field.

  5. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principle, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  6. Identifying Knowledge and Communication

    Directory of Open Access Journals (Sweden)

    Eduardo Coutinho Lourenço de Lima

    2006-12-01

    In this paper, I discuss how the principle of identifying knowledge which Strawson advances in 'Singular Terms and Predication' (1961) and in 'Identifying Reference and Truth-Values' (1964) turns out to constrain communication. The principle states that a speaker's use of a referring expression should invoke identifying knowledge on the part of the hearer, if the hearer is to understand what the speaker is saying, and also that, in so referring, speakers are attentive to hearers' epistemic states. In contrasting it with Russell's Principle (Evans 1982), as well as with the principle of identifying descriptions (Donnellan 1970), I try to show that the principle of identifying knowledge, ultimately a condition for understanding, makes sense only in a situation of conversation. This allows me to conclude that the cooperative feature of communication (Grice 1975) and reference (Clark and Wilkes-Gibbs 1986) holds also at the understanding level. Finally, I discuss where Strawson's views seem to be unsatisfactory, and suggest how they might be improved.

  7. Global Microbial Identifier

    DEFF Research Database (Denmark)

    Wielinga, Peter; Hendriksen, Rene S.; Aarestrup, Frank Møller

    2017-01-01

    The global microbial identifier (GMI) initiative aims to build a database of whole microbial genome sequencing data linked to relevant metadata, which can be used to identify microorganisms, their communities and the diseases they cause. It would be a platform for storing whole genome sequencing (WGS) data ... will likely also enable a much better understanding of the pathogenesis of the infection and the molecular basis of the host response to infection. But the full potential of these advances will only transpire if the data in this area become transferable and thereby comparable, preferably in open-source systems. There is therefore an obvious need to develop a global system of whole microbial genome databases to aggregate, share, mine and use microbiological genomic data, to address global public health and clinical challenges, and most importantly to identify and diagnose infectious diseases.

  8. Contextual Computing

    CERN Document Server

    Porzel, Robert

    2011-01-01

    This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.

  9. Computer Algebra.

    Science.gov (United States)

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  10. Computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  11. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  12. Quantum computing

    OpenAIRE

    Li, Shu-Shen; Long, Gui-lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization.

  13. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  14. Identifying learning styles.

    Science.gov (United States)

    Hughes, Grace

    2016-12-14

    What was the nature of the CPD activity, practice-related feedback and/or event and/or experience in your practice? The article explored different learning styles and outlined some of the models that can be used to identify them. It discussed the limitations of these models, indicating that although they can be helpful in identifying a student's preferred learning style, this is not 'fixed' and might change over time. Learning is also influenced by other factors, such as culture and age.

  15. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  16. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article "Info-computational Constructivism and Cognition" by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has been neither defined theoretically nor implemented practically; (2) it cannot en... cybernetics and Maturana and Varela's theory of autopoiesis, which are both erroneously taken to support info-computationalism.

  17. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  18. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  19. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Complex systems (CS) involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  20. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices which confines the computer as a tool...

  1. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  2. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  4. Computer Ease.

    Science.gov (United States)

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  5. Identifying and Managing Risk.

    Science.gov (United States)

    Abraham, Janice M.

    1999-01-01

    The role of the college or university chief financial officer in institutional risk management is (1) to identify risk (physical, casualty, fiscal, business, reputational, workplace safety, legal liability, employment practices, general liability), (2) to develop a campus plan to reduce and control risk, (3) to transfer risk, and (4) to track and…

  6. Identifying Nursing's Future Leaders.

    Science.gov (United States)

    Gunning, Carolyn S.; Hawken, Patty L.

    1990-01-01

    A study determined that encouraging and supporting students in professional activities while they were still in school would lead those students to participate in professional nursing organizations after they graduated. Organized nursing needs to identify the factors that influence nurses to join organizations and concentrate on these factors to…

  7. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  8. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  9. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  10. Computer Science Research: Computation Directorate

    Energy Technology Data Exchange (ETDEWEB)

    Durst, M.J. (ed.); Grupe, K.F. (ed.)

    1988-01-01

    This report contains short papers in the following areas: large-scale scientific computation; parallel computing; general-purpose numerical algorithms; distributed operating systems and networks; knowledge-based systems; and technology information systems.

  11. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  12. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  13. Distributing an executable job load file to compute nodes in a parallel computer

    Science.gov (United States)

    Gooding, Thomas M.

    2016-08-09

    Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.
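
    A minimal sketch of the class-route construction described above, with an invented node tree and participant set (illustrative only, not the patented implementation): a node stays on the route if it participates in the job itself or must forward data for a descendant that does.

      # Invented toy tree; keys are nodes, values are their children.
      TREE = {"root": ["a", "b"], "a": ["a1", "a2"], "b": [], "a1": [], "a2": []}
      PARTICIPANTS = {"a1", "b"}  # compute nodes taking part in the job

      def class_route(node="root"):
          """Nodes that must forward or receive the executable load file."""
          route = set()
          for child in TREE[node]:
              route |= class_route(child)
          if node in PARTICIPANTS or route:
              route.add(node)  # keep node: it participates or forwards data
          return route

      print(sorted(class_route()))  # -> ['a', 'a1', 'b', 'root']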

  15. Identifying motifs in folktales using topic models

    NARCIS (Netherlands)

    Karsdorp, F.; Bosch, A.P.J. van den

    2013-01-01

    With the undertaking of various folktale digitization initiatives, the need for computational aids to explore these collections is increasing. In this paper we compare Labeled LDA (L-LDA) to a simple retrieval model on the task of identifying motifs in folktales. We show that both methods are well a
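
    A minimal sketch of a simple retrieval baseline of the kind mentioned above, assuming scikit-learn is available (the toy motif profiles and tale are invented and do not reproduce the paper's L-LDA setup or data):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      # Invented motif "profiles" and one query tale.
      motifs = {
          "magic object": "a ring that grants wishes; a cloak of invisibility",
          "animal helper": "a fox guides the hero; a bird warns the youngest son",
      }
      tale = "the hero receives a ring and wishes for a castle"

      vec = TfidfVectorizer(stop_words="english")
      profile_matrix = vec.fit_transform(motifs.values())
      scores = cosine_similarity(vec.transform([tale]), profile_matrix).ravel()
      print(list(motifs)[scores.argmax()])  # -> 'magic object'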

  16. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  17. Featured Image: Identifying Weird Galaxies

    Science.gov (United States)

    Kohler, Susanna

    2017-08-01

    Hoag's Object, an example of a ring galaxy. [NASA/Hubble Heritage Team/Ray A. Lucas (STScI/AURA)] The above image shows PanSTARRS observations of some of the 185 galaxies identified in a recent study as ring galaxies: bizarre and rare irregular galaxies that exhibit stars and gas in a ring around a central nucleus. Ring galaxies could be formed in a number of ways; one theory is that some might form in a galaxy collision, when a smaller galaxy punches through the center of a larger one, triggering star formation around the center. In a recent study, Ian Timmis and Lior Shamir of Lawrence Technological University in Michigan explore ways that we may be able to identify ring galaxies in the overwhelming number of images expected from large upcoming surveys. They develop a computer analysis method that automatically finds ring galaxy candidates based on their visual appearance, and they test their approach on the 3 million galaxy images from the first PanSTARRS data release. To see more of the remarkable galaxies the authors found and to learn more about their identification method, check out the paper below. Citation: Ian Timmis and Lior Shamir 2017 ApJS 231 2. doi:10.3847/1538-4365/aa78a3

  18. Collectively loading an application in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
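
    The leader-based pattern described above resembles a collective broadcast; here is a minimal sketch assuming an MPI runtime via mpi4py (the file name is hypothetical, and rank 0 merely stands in for the job leader; this is not the patented implementation):

      from mpi4py import MPI

      comm = MPI.COMm_WORLD if False else MPI.COMM_WORLD  # standard world communicator
      rank = comm.Get_rank()

      payload = None
      if rank == 0:  # rank 0 plays the job leader and alone touches storage
          with open("job_load_file.bin", "rb") as fh:  # hypothetical file
              payload = fh.read()

      payload = comm.bcast(payload, root=0)  # every participating rank gets a copy
      print(f"rank {rank} received {len(payload)} bytes")

    Having a single leader read the file keeps pressure off the shared file system; the interconnect then fans the payload out to the other ranks.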

  19. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  20. Random Cell Identifiers Assignment

    Directory of Open Access Journals (Sweden)

    Robert Bestak

    2012-01-01

    Despite the integration of advanced functions that enable Femto Access Points (FAPs) to be deployed in a plug-and-play manner, the femtocell concept still leaves several open issues to be resolved. One of them is the assignment of Physical Cell Identifiers (PCIs) to FAPs. This paper analyses a random-based assignment algorithm in LTE systems operating in diverse femtocell scenarios. The performance of the algorithm is evaluated by comparing the number of confusions for various femtocell densities, PCI ranges, and degrees of knowledge of the vicinity. Simulation results show that better knowledge of the vicinity can significantly reduce the number of confusion events.
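
    The confusion metric above can be illustrated with a birthday-problem style simulation; a minimal sketch assuming a confusion means one cell seeing two neighbours with the same PCI (the topology-free model, neighbour counts and the small PCI range are our own illustrative assumptions; 504 is the size of the LTE PCI space):

      import random

      def confusion_probability(pci_range, n_neighbours, trials=10_000):
          """Fraction of trials in which two neighbours share a PCI."""
          confusions = 0
          for _ in range(trials):
              pcis = [random.randrange(pci_range) for _ in range(n_neighbours)]
              if len(set(pcis)) < len(pcis):
                  confusions += 1
          return confusions / trials

      # A wider PCI range makes confusion much rarer at the same density.
      print(confusion_probability(pci_range=30, n_neighbours=8))
      print(confusion_probability(pci_range=504, n_neighbours=8))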

  1. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  2. Quantum Computing

    CERN Document Server

    Steane, A M

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarise not just quantum computing, but the whole subject of quantum information theory. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, the review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the EPR experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory, and, arguably, quantum from classical physics. Basic quantum information ideas are described, including key distribution, teleportation, data compression, quantum error correction, the universal quantum computer and qua...
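
    As a small illustration of the EPR-Bell correlations discussed in the review, the sketch below (an addition for illustration, not taken from the review) builds the Bell state (|00> + |11>)/sqrt(2) as a plain state vector and samples joint computational-basis measurements, which always agree.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Bell state |Phi+> = (|00> + |11>) / sqrt(2) as a 4-component state vector
    # ordered |00>, |01>, |10>, |11>.
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    probs = np.abs(bell) ** 2

    # Sample joint measurements of both qubits in the computational basis.
    outcomes = rng.choice(4, size=10, p=probs)
    for o in outcomes:
        a, b = divmod(o, 2)
        print(f"qubit A -> {a}, qubit B -> {b}")  # the two results always coincide
    ```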

  3. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  4. Programming the social computer.

    Science.gov (United States)

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  5. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Contents include: Introduction and Biological Background (Biological Computation; The Influence of Biology on Mathematics: Historical Examples; Biological Introduction; Models and Simulations); Cellular Automata (Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code); Evolutionary Computation (Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet...
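
    Since the contents cover the Game of Life and the computational universality of cellular automata, a minimal generic implementation of one Game of Life step is sketched below (an illustrative addition, not code from the book).

    ```python
    import numpy as np

    def life_step(grid):
        """One synchronous update of Conway's Game of Life on a toroidal grid."""
        # Sum the eight neighbours by rolling the array in every direction.
        neighbours = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                         for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                         if (dy, dx) != (0, 0))
        # Birth on exactly 3 neighbours; survival on 2 or 3.
        return ((neighbours == 3) | (grid & (neighbours == 2))).astype(np.uint8)

    rng = np.random.default_rng(42)
    grid = (rng.random((20, 20)) < 0.3).astype(np.uint8)
    for _ in range(5):
        grid = life_step(grid)
    print(grid.sum(), "live cells after 5 steps")
    ```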

  6. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  7. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, which tries to unify the GPGPU computing models.
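
    To make the CUDA programming model concrete, here is a hypothetical vector-addition kernel written with Numba's CUDA bindings, chosen to stay in Python; the paper itself targets NVIDIA's C-based CUDA toolkit. The sketch assumes the numba package and a CUDA-capable GPU.

    ```python
    import numpy as np
    from numba import cuda

    @cuda.jit
    def vec_add(a, b, out):
        i = cuda.grid(1)          # global thread index
        if i < out.size:          # guard against out-of-range threads
            out[i] = a[i] + b[i]

    n = 1 << 20
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(a)

    threads = 256
    blocks = (n + threads - 1) // threads
    vec_add[blocks, threads](a, b, out)   # numba copies arrays to/from the device

    assert np.allclose(out, a + b)
    ```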

  8. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  9. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  10. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  11. Computational Deception

    NARCIS (Netherlands)

    Nijholt, Antinus; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our

  12. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    Computer science is the discipline that anchors the computer industry, which has been improving processor performance, communication bandwidth and storage capacity on the so-called "Moore's law" curve, or at the rate of doubling every 18 to 24 months, during the past decades.

  13. Cheater identifiable visual secret sharing scheme

    Institute of Scientific and Technical Information of China (English)

    Gan Zhi; Chen Kefei

    2005-01-01

    The visual secret sharing scheme proposed by Naor and Shamir provides a way to encrypt a secret black-and-white image into shares. A qualified group of participants can recover the secret message without using any cryptographic computation. But the original scheme can easily be corrupted by a malicious participant. We propose an extension of VSS (visual secret sharing) to identify cheaters before the secret is recovered. Without the need for any additional information or cryptographic computation, every participant can verify the validity of the shares of other participants, thus enhancing the security of VSS.
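
    For background, the Naor-Shamir (2,2) construction that this work extends can be sketched in a few lines (an illustrative toy without the proposed cheater-identification extension): each secret pixel is expanded into two subpixels per share, and stacking the shares, a pixelwise OR, reveals the secret with no cryptographic computation.

    ```python
    import random

    def share_pixel(bit, rng):
        """Naor-Shamir (2,2): expand one secret pixel into two 2-subpixel shares.
        bit = 0 (white) or 1 (black); subpixel value 1 means printed black."""
        pattern = rng.choice([(1, 0), (0, 1)])
        complement = (1 - pattern[0], 1 - pattern[1])
        # White: identical patterns (stack shows one black subpixel).
        # Black: complementary patterns (stack shows two black subpixels).
        return pattern, pattern if bit == 0 else complement

    rng = random.Random(7)
    secret = [1, 0, 1, 1, 0]
    shares = [share_pixel(b, rng) for b in secret]
    stacked = [tuple(x | y for x, y in zip(s1, s2)) for s1, s2 in shares]
    decoded = [1 if sum(p) == 2 else 0 for p in stacked]
    assert decoded == secret
    ```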

  14. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast-growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain-independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives: granular computing as structured thinking and as structured problem solving. From the philosophical perspective, or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy at the application level deals with structured problem solving.

  15. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two...... an impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access...
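
    As minimal background on the sharing primitive underlying VSS and MPC, here is a generic additive secret-sharing sketch over a prime field (an illustrative assumption of this edit; the thesis's protocols additionally involve verification and the broadcast channel, both omitted here).

    ```python
    import random

    P = 2_147_483_647        # a Mersenne prime; any prime modulus works
    rng = random.SystemRandom()

    def share(secret, n):
        """Split `secret` into n additive shares modulo P."""
        parts = [rng.randrange(P) for _ in range(n - 1)]
        parts.append((secret - sum(parts)) % P)
        return parts

    def reconstruct(parts):
        return sum(parts) % P

    a, b = 1234, 5678
    sa, sb = share(a, 5), share(b, 5)
    # Each player adds its two local shares; the sums reconstruct a + b,
    # the basic homomorphism that makes multiparty addition "free".
    assert reconstruct([x + y for x, y in zip(sa, sb)]) == (a + b) % P
    ```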

  16. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
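
    Of the computational ingredients listed (path-planning heuristics, steering, collision detection), the easiest to show compactly is grid path-planning. The sketch below is a generic breadth-first search over an occupancy grid, a deliberately simple stand-in for the more sophisticated heuristics the paper reviews.

    ```python
    from collections import deque

    def shortest_path(grid, start, goal):
        """Breadth-first search over a 4-connected occupancy grid.
        grid[r][c] == 1 marks an obstacle (e.g. a building footprint)."""
        rows, cols = len(grid), len(grid[0])
        prev = {start: None}
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            if (r, c) == goal:
                path = []
                while (r, c) != start:
                    path.append((r, c))
                    r, c = prev[(r, c)]
                return [start] + path[::-1]
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nxt = (r + dr, c + dc)
                if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                        and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                    prev[nxt] = (r, c)
                    queue.append(nxt)
        return None

    streets = [[0, 0, 0, 0],
               [1, 1, 0, 1],
               [0, 0, 0, 0]]
    print(shortest_path(streets, (0, 0), (2, 0)))
    ```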

  17. Ubiquitous Computing: Potentials and Challenges

    CERN Document Server

    Sen, Jaydip

    2010-01-01

    The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. Ubiquitous computing will surround users with a comfortable and convenient information environment that merges physical and computational infrastructures into an integrated habitat. This habitat will feature a proliferation of hundreds or thousands of computing devices and sensors that will provide new functionality, offer specialized services, and boost productivity and interaction. This paper presents a comprehensive discussion of the central trends in ubiquitous computing, considering them from technical, social and economic perspectives. It clearly identifies different application areas and sectors that will benefit f

  18. Computer/Information Science

    Science.gov (United States)

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  19. Where to look when identifying roadkilled amphibians?

    Directory of Open Access Journals (Sweden)

    Marc Franch

    2015-12-01

    Full Text Available Roads have multiple effects on wildlife; amphibians are one of the groups most intensely affected by roadkills. Monitoring roadkills is expensive and time consuming, so automated mapping systems for detecting roadkills, based on robotic computer vision techniques, are sorely needed. Amphibians can be recognised by a set of features such as shape, size, colouration, habitat and location. This identification of species by using multiple features at the same time is known as "jizz". In a similar way to human vision, computer vision algorithms must incorporate a prioritisation process when analysing the objects in an image. Our main goal here was to give a numerical priority sequence for particular characteristics of roadkilled amphibians to improve the computing and learning process of algorithms. We asked one hundred and five amateur and professional herpetologists to answer a simple test of five sets, with ten images each, of roadkilled amphibians, in order to determine which body parts or characteristics (body form, colour, and other patterns) are used to identify the species correctly. Anura was the group most easily identified when roadkilled and Caudata was the most difficult. The lower the taxonomic level of the amphibian, the greater the difficulty of identifying it, both in Anura and Caudata. Roadkilled amphibians in general, and the Anura group, were mostly identified by form, by the combination of form and colour, and finally by colour. Caudata was identified mainly by form and colour together and by colour alone. Computer vision algorithms must incorporate these combinations of features, avoiding working exclusively on one specific feature.

  20. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, Space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  1. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  2. Cloud computing security.

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    2010-10-01

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  3. Computer networks forensics

    Directory of Open Access Journals (Sweden)

    Ratomir Đ. Đokić

    2013-02-01

    Full Text Available Digital forensics is a set of scientific methods and procedures for the collection, analysis and presentation of evidence that can be found on computers, servers, computer networks, databases, mobile devices, and all other devices on which data can be stored. Digital forensics of computer networks is the examination of digital evidence that can be found on servers and user devices and that is exchanged in internal or external communication through local or public networks. There is also a need to identify the sites and modes of origin of messages, establish user identification, and detect types of manipulation at account login. This paper presents the basic elements of computer networks, the software used to communicate, and the methods of collecting digital evidence and their analysis.

  4. Chromatin computation.

    Directory of Open Access Journals (Sweden)

    Barbara Bryant

    Full Text Available In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal--and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
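
    To make the model concrete, here is a hypothetical miniature of the idea (not Bryant's actual rule set): nucleosomes form a tape of modification symbols, and each chromatin-modifying complex is a read-write rule over a window of adjacent nucleosomes.

    ```python
    # Toy "chromatin computer": a tape of nucleosome marks plus read-write rules
    # that act on adjacent nucleosomes, loosely analogous to a Turing machine tape.

    RULES = {
        # (left mark, right mark) -> (new left, new right); hypothetical rules
        ("H3K4me", "none"): ("H3K4me", "H3K4me"),   # a mark that spreads rightward
        ("H3K9me", "H3K4me"): ("H3K9me", "none"),   # a repressive mark that erases it
    }

    def step(tape):
        """Apply the first matching rule to each adjacent pair, left to right."""
        tape = list(tape)
        for i in range(len(tape) - 1):
            pair = (tape[i], tape[i + 1])
            if pair in RULES:
                tape[i], tape[i + 1] = RULES[pair]
        return tape

    tape = ["H3K4me", "none", "none", "H3K9me", "none"]
    for _ in range(3):
        tape = step(tape)
        print(tape)
    ```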

  5. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents numerical methods for solving sets of mathematical equations. This volume covers the solution of sets of linear algebraic equations, of high-degree and transcendental equations, numerical methods for finding eigenvalues, and approximate methods for solving ordinary differential equations, partial differential equations and integral equations. The book is intended as a textbook for students in mechanical-mathematical and physics-mathematical faculties specializing in computer mathematics and for persons interested in the
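
    As a flavour of the eigenvalue methods such a text covers, below is a generic power-iteration sketch (a standard textbook algorithm, not code from this book): repeated multiplication by the matrix, with normalization, converges to the dominant eigenvector.

    ```python
    import numpy as np

    def power_iteration(A, iters=100):
        """Estimate the dominant eigenvalue/eigenvector of a square matrix A."""
        v = np.random.default_rng(0).random(A.shape[0])
        for _ in range(iters):
            v = A @ v
            v /= np.linalg.norm(v)
        return v @ A @ v, v          # Rayleigh quotient and unit eigenvector

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    lam, v = power_iteration(A)
    print(lam)                        # ~5.0, the largest eigenvalue of A
    ```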

  6. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  7. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  8. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
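
    As a quick taste of one technique named here, the fast Fourier transform, the following generic numpy snippet (not drawn from the book) recovers the frequencies of a sampled two-tone signal.

    ```python
    import numpy as np

    fs = 1000                         # sampling rate in Hz
    t = np.arange(0, 1, 1 / fs)       # one second of samples
    signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    # The two largest peaks sit at the tones we put in: 50 Hz and 120 Hz.
    peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
    print(sorted(peaks))              # [50.0, 120.0]
    ```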

  9. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  10. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some

  11. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes monthly browsing the Web. That is why the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 until the present time, indicate the emergence of dynamic changes in the ways computers are used and in the circumstances of their use. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web utilization. It was found that in 2005, compared to 2000, the following changes occurred: a significant drop in the number of students who never used computers and the Web; a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students); a decreasing gap in computer skills between students of the first and the third year and between male and female students; and the declining popularity of computer games. It was also demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system. As much as 12% of this group of young people were addicted to the computer. The large amount of leisure time these youths enjoyed induced them to excessive utilization of the Web. Polish housewives are another population group at risk of addiction to the Web. The growing duration of Web chats carried out by ever younger users has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied to computer addiction syndrome is similar to the techniques applied in cases of alcohol or gambling addiction. Individual and group

  12. Quantum Computers

    Science.gov (United States)

    2010-03-04

    efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us...outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application...which the quantum computer depends, causing the quantum mechanically destructive process known as decoherence. Decoherence comes in several forms

  13. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  14. Regions of constrained maximum likelihood parameter identifiability

    Science.gov (United States)

    Lee, C.-H.; Herget, C. J.

    1975-01-01

    This paper considers the parameter identification problem of general discrete-time, nonlinear, multiple-input/multiple-output dynamic systems with Gaussian-white distributed measurement errors. Knowledge of the system parameterization is assumed to be known. Regions of constrained maximum likelihood (CML) parameter identifiability are established. A computation procedure employing interval arithmetic is proposed for finding explicit regions of parameter identifiability for the case of linear systems. It is shown that if the vector of true parameters is locally CML identifiable, then with probability one, the vector of true parameters is a unique maximal point of the maximum likelihood function in the region of parameter identifiability and the CML estimation sequence will converge to the true parameters.
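
    For orientation, the standard Gaussian maximum-likelihood setup behind such results can be written as follows (a conventional formulation supplied for context, not an equation quoted from the paper): the CML estimate maximizes the log-likelihood over the constraint region.

    ```latex
    % Conventional Gaussian ML setup (assumed for illustration):
    %   y_k        -- measurement at time k
    %   h_k(theta) -- model output for parameter vector theta
    %   R          -- covariance of the Gaussian-white measurement errors
    \hat{\theta}_{\mathrm{CML}}
      = \arg\max_{\theta \in \Theta} \ell(\theta), \qquad
    \ell(\theta) = -\frac{1}{2} \sum_{k=1}^{N}
      \bigl(y_k - h_k(\theta)\bigr)^{\top} R^{-1} \bigl(y_k - h_k(\theta)\bigr)
      + \text{const}
    ```

    Here the constraint set corresponds to the region of parameter identifiability, within which, per the abstract, the vector of true parameters is a unique maximal point of the likelihood.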

  15. Computational oncology.

    Science.gov (United States)

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
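
    As a toy instance of the kind of mathematical model mentioned (a generic logistic growth law, chosen for brevity; the article's differential-equation models are far richer), a few lines of Euler integration suffice:

    ```python
    # dN/dt = r N (1 - N/K): logistic growth of a tumour-cell population,
    # with growth rate r and carrying capacity K (illustrative values only).
    r, K = 0.3, 1e9
    N, dt = 1e4, 0.1

    history = []
    for _ in range(1000):          # 100 days of simulated time
        N += dt * r * N * (1 - N / K)
        history.append(N)

    print(f"population after 100 days: {history[-1]:.3e}")
    ```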

  16. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  17. Mathematics and the computer revolution

    Science.gov (United States)

    Atiyah, M. F.

    2016-08-01

    Computers have transformed modern society and mathematics has not escaped. This article, written 30 years ago, highlighted the impact on mathematics at many levels. Despite the exponential growth of computer power and sophistication, the dangers identified 30 years ago remain as pertinent as ever.

  18. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  19. Security Architecture of Cloud Computing

    Directory of Open Access Journals (Sweden)

    V.KRISHNA REDDY

    2011-09-01

    Full Text Available Cloud Computing offers services over the Internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet large data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages regarding the data security of service consumers. This paper aims to highlight the main security issues existing in cloud computing environments. The security issues at various levels of the cloud computing environment are identified and categorized based on the cloud computing architecture. This paper focuses on the usage of Cloud services and the security issues that must be addressed to build cross-domain Internet-connected collaborations.

  20. Possibilities for Healthcare Computing

    Institute of Scientific and Technical Information of China (English)

    Peter Szolovits

    2011-01-01

    Advances in computing technology promise to aid in achieving the goals of healthcare. We review how such changes can support each of the goals of healthcare as identified by the U.S. Institute of Medicine: safety, effectiveness, patient-centricity, timeliness, efficiency, and equitability. We also describe current foci of computing technology research aimed at realizing the ambitious goals for health information technology that have been set by the American Recovery and Reinvestment Act of 2009 and the Health Reform Act of 2010. Finally, we mention efforts to build health information technologies to support improved healthcare delivery in developing countries.

  1. Central nervous system and computation.

    Science.gov (United States)

    Guidolin, Diego; Albertin, Giovanna; Guescini, Michele; Fuxe, Kjell; Agnati, Luigi F

    2011-12-01

    Computational systems are useful in neuroscience in many ways. For instance, they may be used to construct maps of brain structure and activation, or to describe brain processes mathematically. Furthermore, they inspired a powerful theory of brain function, in which the brain is viewed as a system characterized by intrinsic computational activities or as a "computational information processor." Although many neuroscientists believe that neural systems really perform computations, some are more cautious about computationalism or reject it. Thus, does the brain really compute? Answering this question requires getting clear on a definition of computation that is able to draw a line between physical systems that compute and systems that do not, so that we can discern on which side of the line the brain (or parts of it) could fall. In order to shed some light on the role of computational processes in brain function, available neurobiological data will be summarized from the standpoint of a recently proposed taxonomy of notions of computation, with the aim of identifying which brain processes can be considered computational. The emerging picture shows the brain as a very peculiar system, in which genuine computational features act in concert with noncomputational dynamical processes, leading to continuous self-organization and remodeling under the action of external stimuli from the environment and from the rest of the organism.

  2. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step...... with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  3. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  4. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable in driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) on "Springback Predictability" and with the Federal Aviation Administration (FAA) on the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  5. Computational Artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  6. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  7. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  8. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  9. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  10. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  11. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways it is formed in the process of education;...

  12. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  13. Computer immunology.

    Science.gov (United States)

    Forrest, Stephanie; Beauchemin, Catherine

    2007-04-01

    This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.
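
    One classic AIS technique from this research line is negative selection: candidate detectors are generated at random and retained only if they match nothing in a 'self' training set, so anything a retained detector later matches is flagged as anomalous. The sketch below is a generic bit-string toy version; the matching rule and parameters are illustrative assumptions, not the review's specification.

    ```python
    import random

    def matches(detector, string, r=3):
        """r-contiguous-bits rule: match if any r consecutive positions agree."""
        run = 0
        for d, s in zip(detector, string):
            run = run + 1 if d == s else 0
            if run >= r:
                return True
        return False

    def train_detectors(self_set, n_detectors=50, length=12, seed=3):
        """Keep random detectors that match nothing in the self set."""
        rng = random.Random(seed)
        detectors = []
        while len(detectors) < n_detectors:
            cand = [rng.randint(0, 1) for _ in range(length)]
            if not any(matches(cand, s) for s in self_set):
                detectors.append(cand)
        return detectors

    self_set = [[0] * 12, [0] * 6 + [1] * 6]        # "normal" behaviour patterns
    detectors = train_detectors(self_set)
    anomaly = [1, 0] * 6
    print(any(matches(d, anomaly) for d in detectors))   # likely True: flagged
    ```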

  14. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  15. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  16. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  17. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...... set of skills rather than one single skill. Skills acquisition at these layers can be tailored to the specific needs of students. The work presented here builds upon experience from courses for such students from the Humanities in which programming is taught as a tool for other purposes. Results...

  18. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). Key features: illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics; emphasizes algorithmic advances that will allow re-application in other...

  19. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  20. Computational Physics.

    Science.gov (United States)

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  1. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  3. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  5. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  6. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)]

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to gradient descent, conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
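
    For orientation (standard definitions from the antieigenvalue theory referred to above, stated from general knowledge rather than taken from this abstract): for a symmetric positive-definite matrix A with extreme eigenvalues \lambda_1 and \lambda_n, the first antieigenvalue and the operator angle \phi(A) are

        \mu_1(A) = \min_{x \neq 0} \frac{\langle Ax, x \rangle}{\|Ax\| \, \|x\|}
                 = \cos\phi(A) = \frac{2\sqrt{\lambda_1 \lambda_n}}{\lambda_1 + \lambda_n},
        \qquad
        \sin\phi(A) = \frac{\lambda_n - \lambda_1}{\lambda_n + \lambda_1},

    and the classical per-step error bound for steepest descent, \|x_{k+1} - x_*\|_A^2 \le \sin^2\phi(A) \, \|x_k - x_*\|_A^2, is exactly the kind of trigonometric statement alluded to.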

  7. Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  8. Computational evolution: taking liberties.

    Science.gov (United States)

    Correia, Luís

    2010-09-01

    Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. The evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been developed especially for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics, and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ER). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research.
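
    As a concrete reference point for the "standard model" the review starts from, here is a minimal generational evolutionary algorithm in Python (an illustrative sketch of the generic scheme, not code from the article; the OneMax fitness function is a placeholder):

        import random

        def evolve(fitness, n_genes=20, pop_size=50, generations=100,
                   p_mut=0.02, tournament=3):
            # Random initial population of bit-string genomes.
            pop = [[random.randint(0, 1) for _ in range(n_genes)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                scored = [(fitness(g), g) for g in pop]

                def select():
                    # Tournament selection: best of a random subset.
                    return max(random.sample(scored, tournament))[1]

                nxt = []
                while len(nxt) < pop_size:
                    a, b = select(), select()
                    cut = random.randrange(1, n_genes)           # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [1 - g if random.random() < p_mut else g
                             for g in child]                      # bit-flip mutation
                    nxt.append(child)
                pop = nxt
            return max(pop, key=fitness)

        best = evolve(fitness=sum)    # maximize the number of ones ("OneMax")
        print(sum(best), best)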

  9. Verifiable Computation with Massively Parallel Interactive Proofs

    CERN Document Server

    Thaler, Justin; Mitzenmacher, Michael; Pfister, Hanspeter

    2012-01-01

    As the cloud computing paradigm has gained prominence, the need for verifiable computation has grown increasingly urgent. The concept of verifiable computation enables a weak client to outsource difficult computations to a powerful, but untrusted, server. Protocols for verifiable computation aim to provide the client with a guarantee that the server performed the requested computations correctly, without requiring the client to perform the computations herself. By design, these protocols impose a minimal computational burden on the client. However, existing protocols require the server to perform a large amount of extra bookkeeping in order to enable a client to easily verify the results. Verifiable computation has thus remained a theoretical curiosity, and protocols for it have not been implemented in real cloud computing systems. Our goal is to leverage GPUs to reduce the server-side slowdown for verifiable computation. To this end, we identify abundant data parallelism in a state-of-the-art general-purpose...

  10. Digital Identifier Systems: Comparative Evaluation

    Directory of Open Access Journals (Sweden)

    Hamid Reza Khedmatgozar

    2015-02-01

    Identifiers are among the main elements for identifying an object in a digital environment. Digital identifier systems were developed in response to problems such as the violation of persistence and uniqueness of physical identifiers and URLs in the digital environment. These identifiers try to guarantee uniqueness and persistence of hostnames by using indirect names for the Domain Name System (DNS). The main objective of this research is to identify qualified digital identifier systems among other systems. To achieve this objective, the researchers took two major steps: first, identifying the main criteria for distinguishing digital identifiers based on a literature review and a focus group interview; and second, performing a comparative evaluation of common identifier systems in the world. Findings of the first step demonstrated seven main criteria in three domains for distinguishing digital identifier systems: identifier uniqueness and persistence in the identifier features domain; digital identification, digital uniqueness, digital persistence and digital actionability in the digital coverage domain; and globality in the comprehensiveness of scope domain. In the second step, results of the comparative evaluation of common identifier systems indicated that six identifier systems (DOI, Handle, UCI, URN, ARK and PURL) are appropriate choices for use as a digital identifier system. Also, according to these results, three identification systems (NBN, MARIAM and ISNI) were identified as suitable choices for digital identification in certain specialized fields. Given the many benefits of using these identifiers in important applied fields, such as digital content chains and network integration, digital rights management, cross-referencing, digital libraries and citation analysis, the results of this study can help digital environment experts to evaluate digital identifiers and use them effectively in applied fields.

  11. Computational principles of memory.

    Science.gov (United States)

    Chaudhuri, Rishidev; Fiete, Ila

    2016-03-01

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory.

  12. Computationally modeling interpersonal trust

    OpenAIRE

    Jin Joo eLee; Brad eKnox; Jolie eBaumann; Cynthia eBreazeal; David eDeSteno

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our pr...

  13. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  14. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Existing approaches to the regulation of accounting software as one of the types of intangible assets are considered. The features and current state of the legal protection of computer programs are analyzed. The reasons for the need to use patent law as a means of legal protection for individual elements of computer programs are discussed. The influence of the legal aspects of the use of computer programs on their reflection in accounting under national legislation is analyzed. The possible options for the transfer of rights from the copyright owners of computer programs, which should be considered when creating a software accounting system at an enterprise, are analyzed. The characteristics of computer software as an intangible asset under current law are identified and analyzed. The general economic characteristics of computer programs as one of the types of intangible assets are grounded, and the main features distinguishing software from other types of intellectual property are set out.

  15. Studi Perbandingan Layanan Cloud Computing (A Comparative Study of Cloud Computing Services)

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platforms and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing cost, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. The article compares and contrasts the different levels of delivery services and the deployment models, and identifies issues and future directions in cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of the article provides several recommendations for cloud computing service providers and end-users.

  16. Identifying driver mutations in sequenced cancer genomes

    DEFF Research Database (Denmark)

    Raphael, Benjamin J; Dobson, Jason R; Oesper, Layla

    2014-01-01

    High-throughput DNA sequencing is revolutionizing the study of cancer and enabling the measurement of the somatic mutations that drive cancer development. However, the resulting sequencing datasets are large and complex, obscuring the clinically important mutations in a background of errors, noise......, and random mutations. Here, we review computational approaches to identify somatic mutations in cancer genome sequences and to distinguish the driver mutations that are responsible for cancer from random, passenger mutations. First, we describe approaches to detect somatic mutations from high-throughput DNA...... sequencing data, particularly for tumor samples that comprise heterogeneous populations of cells. Next, we review computational approaches that aim to predict driver mutations according to their frequency of occurrence in a cohort of samples, or according to their predicted functional impact on protein...

  17. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understan...

  18. Computational Physics

    Science.gov (United States)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  19. Computational Electromagnetics

    Science.gov (United States)

    2011-02-20

    A collaboration between Caltech's postdoctoral associate N. Albin and O. Bruno has shown that, for a variety of reasons, the first-order... KZK approximation. Related papers: "...KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics.

  20. Computer files.

    Science.gov (United States)

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume. Otherwise, it may be that you will have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware of the fact that the directories suggested as default when installing new software are often not the optimum. For instance, it is better to put different graphics packages under a common subdirectory rather than to install them at the same level as all other packages including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is a bad practice to keep many different and logically unsorted files in the root directory of any of your volumes. Only system and important service files should be kept there. Although any file may be written all over the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way. (ABSTRACT TRUNCATED AT 250 WORDS)

  1. Everything Computes

    Institute of Scientific and Technical Information of China (English)

    Bill; Hofmann

    1999-01-01

    Dear American Professor, I am a student in Beijing. At the beginning of last semester, we four roommates gathered some 10,000 yuan (a big sum here, approximately 1150 USD) and bought a computer, which is our joint property. Since the computer came into our room, it has been used round the clock except for the time we were having classes. So even at midnight, when I woke up from a dream, I could still see

  2. Identifying Broadband Rotational Spectra with Neural Networks

    Science.gov (United States)

    Zaleski, Daniel P.; Prozument, Kirill

    2017-06-01

    A typical broadband rotational spectrum may contain several thousand observable transitions, spanning many species. Identifying the individual spectra, particularly when the dynamic range reaches 1,000:1 or even 10,000:1, can be challenging. One approach is to apply automated fitting routines. In this approach, combinations of 3 transitions can be created to form a "triple", which allows fitting of the A, B, and C rotational constants in a Watson-type Hamiltonian. On a standard desktop computer, with a target molecule of interest, a typical AUTOFIT routine takes 2-12 hours depending on the spectral density. A new approach is to utilize machine learning to train a computer to recognize the patterns (frequency spacing and relative intensities) inherent in rotational spectra and to identify the individual spectra in a raw broadband rotational spectrum. Here, recurrent neural networks have been trained to identify different types of rotational spectra and classify them accordingly. Furthermore, early results in applying convolutional neural networks for spectral object recognition in broadband rotational spectra appear promising. Perez et al. "Broadband Fourier transform rotational spectroscopy for structure determination: The water heptamer." Chem. Phys. Lett., 2013, 571, 1-15. Seifert et al. "AUTOFIT, an Automated Fitting Tool for Broadband Rotational Spectra, and Applications to 1-Hexanal." J. Mol. Spectrosc., 2015, 312, 13-21. Bishop. "Neural networks for pattern recognition." Oxford university press, 1995.
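
    To make the fitting idea concrete, here is a much-simplified sketch (my own illustration for a linear molecule with made-up line frequencies, not the Watson-type asymmetric-top Hamiltonian that AUTOFIT actually fits): the J+1 <- J transition frequency nu(J) = 2B(J+1) - 4D(J+1)^3 is linear in the constants (B, D), so ordinary least squares recovers them.

        import numpy as np

        J = np.array([0.0, 1.0, 2.0, 3.0, 4.0])            # lower-state J values
        nu_obs = np.array([11.0, 21.9, 32.9, 43.8, 54.7])  # hypothetical lines, GHz

        # Design matrix for nu(J) = 2B(J+1) - 4D(J+1)^3
        X = np.column_stack([2.0 * (J + 1), -4.0 * (J + 1) ** 3])
        coef, *_ = np.linalg.lstsq(X, nu_obs, rcond=None)
        B, D = coef
        rms = np.sqrt(np.mean((X @ coef - nu_obs) ** 2))
        print(f"B = {B:.4f} GHz, D = {D:.6f} GHz, rms residual = {rms:.4f} GHz")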

  3. Computer Spectrometers

    Science.gov (United States)

    Dattani, Nikesh S.

    2017-06-01

    Ideally, the cataloguing of spectroscopic linelists would not demand laborious and expensive experiments. Whatever an experiment might achieve, the same information would be attainable by running a calculation on a computer. Kolos and Wolniewicz were the first to demonstrate that calculations on a computer can outperform even the most sophisticated molecular spectroscopic experiments of the time, when their 1964 calculations of the dissociation energies of H_2 and D_{2} were found to be more than 1 cm^{-1} larger than the best experiments by Gerhard Herzberg, suggesting the experiment violated a strict variational principle. As explained in his Nobel Lecture, it took 5 more years for Herzberg to perform an experiment which caught up to the accuracy of the 1964 calculations. Today, numerical solutions to the Schrödinger equation, supplemented with relativistic and higher-order quantum electrodynamics (QED) corrections can provide ro-vibrational spectra for molecules that we strongly believe to be correct, even in the absence of experimental data. Why do we believe these calculated spectra are correct if we do not have experiments against which to test them? All evidence seen so far suggests that corrections due to gravity or other forces are not needed for a computer simulated QED spectrum of ro-vibrational energy transitions to be correct at the precision of typical spectrometers. Therefore a computer-generated spectrum can be considered to be as good as one coming from a more conventional spectrometer, and this has been shown to be true not just for the H_2 energies back in 1964, but now also for several other molecules. So are we at the stage where we can launch an array of calculations, each with just the atomic number changed in the input file, to reproduce the NIST energy level databases? Not quite. But I will show that for the 6e^- molecule Li_2, we have reproduced the vibrational spacings to within 0.001 cm^{-1} of the experimental spectrum, and I will

  4. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  5. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  6. Impact of new computing systems on finite element computations

    Science.gov (United States)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.

    1983-01-01

    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified.

  7. Tensor computations in computer algebra systems

    CERN Document Server

    Korolkova, A V; Sevastyanov, L A

    2014-01-01

    This paper considers three types of tensor computations. On their basis, we attempt to formulate criteria that must be satisfied by a computer algebra system dealing with tensors. We briefly overview the current state of tensor computations in different computer algebra systems. The tensor computations are illustrated with appropriate examples implemented in specific systems: Cadabra and Maxima.
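
    For a concrete feel for the index bookkeeping such systems automate, here is a small numpy sketch (my own illustration, unrelated to the Cadabra/Maxima examples of the paper) that raises an index and forms a full contraction:

        import numpy as np

        # Minkowski metric, signature (-, +, +, +), and its inverse.
        g = np.diag([-1.0, 1.0, 1.0, 1.0])
        g_inv = np.linalg.inv(g)

        T = np.arange(16.0).reshape(4, 4)   # an arbitrary rank-2 tensor T_{mu nu}

        # Raise the first index: T^mu_nu = g^{mu a} T_{a nu}
        T_up = np.einsum("ma,an->mn", g_inv, T)

        # Full contraction: g^{mu a} T_{a mu}
        scalar = np.einsum("ma,am->", g_inv, T)
        print(T_up)
        print("contraction g^{mu a} T_{a mu} =", scalar)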

  8. Computational crystallization.

    Science.gov (United States)

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  9. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  10. Computationally modeling interpersonal trust.

    Science.gov (United States)

    Lee, Jin Joo; Knox, W Bradley; Wormwood, Jolie B; Breazeal, Cynthia; Desteno, David

    2013-01-01

    We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
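
    To make the hidden-Markov-model step concrete, here is a minimal forward-algorithm scorer in Python (an illustrative sketch with made-up parameters and a hypothetical cue alphabet, not the authors' trained model): a cue sequence would be scored under models trained for different trust levels and assigned to the higher-likelihood one.

        import numpy as np

        # Hypothetical cue codes: 0=lean-back, 1=face-touch, 2=arms-crossed, 3=gesture
        pi = np.array([0.6, 0.4])              # initial distribution over 2 hidden states
        A = np.array([[0.7, 0.3],              # state-transition probabilities
                      [0.4, 0.6]])
        B = np.array([[0.1, 0.2, 0.2, 0.5],    # per-state emission probabilities
                      [0.4, 0.3, 0.2, 0.1]])

        def log_likelihood(obs, pi=pi, A=A, B=B):
            """Scaled forward algorithm; returns log P(obs | model)."""
            alpha = pi * B[:, obs[0]]
            s = alpha.sum()
            log_p, alpha = np.log(s), alpha / s
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
                s = alpha.sum()                # rescale to avoid underflow
                log_p += np.log(s)
                alpha /= s
            return log_p

        print(log_likelihood([0, 1, 1, 2, 3]))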

  11. Proactive health computing.

    Science.gov (United States)

    Timpka, T

    2001-08-01

    In an analysis departing from the global health situation, the foundation for a change of paradigm in health informatics based on socially embedded information infrastructures and technologies is identified and discussed. It is shown how an increasing computing and data transmitting capacity can be employed for proactive health computing. As a foundation for ubiquitous health promotion and prevention of disease and injury, proactive health systems use data from multiple sources to supply individuals and communities evidence-based information on means to improve their state of health and avoid health risks. The systems are characterised by: (1) being profusely connected to the world around them, using perceptual interfaces, sensors and actuators; (2) responding to external stimuli at faster than human speeds; (3) networked feedback loops; and (4) humans remaining in control, while being left outside the primary computing loop. The extended scientific mission of this new partnership between computer science, electrical engineering and social medicine is suggested to be the investigation of how the dissemination of information and communication technology on democratic grounds can be made even more important for global health than sanitation and urban planning became a century ago.

  12. Identifying Targets from Filtering Effects

    Science.gov (United States)

    2012-10-24

    H. D. I. Abarbanel, R. Brown, J. J. Sidorowich, and L. S. Tsimring, "The Analysis of Observed Chaotic Data in Physical Systems," Reviews of Modern Physics... A. Taflove and S. C. Hagness, Computational Electrodynamics: The Finite-Difference Time-Domain Method. Norwood, MA: Artech House, 2005.

  13. Identifying marker typing incompatibilities in linkage analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stringham, H.M.; Boehnke, M. [Univ. of Michigan, Ann Arbor, MI (United States)

    1996-10-01

    A common problem encountered in linkage analyses is that execution of the computer program is halted because of genotypes in the data that are inconsistent with Mendelian inheritance. Such inconsistencies may arise because of pedigree errors or errors in typing. In some cases, the source of the inconsistencies is easily identified by examining the pedigree. In others, the error is not obvious, and substantial time and effort are required to identify the responsible genotypes. We have developed two methods for automatically identifying those individuals whose genotypes are most likely the cause of the inconsistencies. First, we calculate the posterior probability of genotyping error for each member of the pedigree, given the marker data on all pedigree members and allowing anyone in the pedigree to have an error. Second, we identify those individuals whose genotypes could be solely responsible for the inconsistency in the pedigree. We illustrate these methods with two examples: one a pedigree error, the second a genotyping error. These methods have been implemented as a module of the pedigree analysis program package MENDEL. 9 refs., 2 figs., 2 tabs.
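
    As a minimal illustration of the kind of inconsistency being detected (my own sketch, far simpler than the pedigree-wide posterior computation implemented in MENDEL), the following checks whether a child's genotype at a single marker is compatible with Mendelian inheritance from the parents:

        from itertools import product

        def mendelian_consistent(mother, father, child):
            """Genotypes are unordered allele pairs, e.g. ('A1', 'A2').
            True if the child can inherit one allele from each parent."""
            possible = {tuple(sorted(p)) for p in product(mother, father)}
            return tuple(sorted(child)) in possible

        print(mendelian_consistent(('A1', 'A2'), ('A2', 'A3'), ('A1', 'A3')))  # True
        print(mendelian_consistent(('A1', 'A1'), ('A2', 'A2'), ('A2', 'A2')))  # False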

  14. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
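
    For readers unfamiliar with the policing mechanism being modeled, here is a minimal token bucket in Python (a generic textbook sketch, not the dynamic model of the paper; the rate and capacity values are arbitrary):

        import time

        class TokenBucket:
            """Conform/exceed decision for arriving packets. Tokens accrue at
            `rate` per second up to `capacity`; a packet costing `size` tokens
            is admitted only if enough tokens are available."""

            def __init__(self, rate, capacity):
                self.rate = rate
                self.capacity = capacity
                self.tokens = capacity
                self.last = time.monotonic()

            def allow(self, size=1):
                now = time.monotonic()
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= size:
                    self.tokens -= size
                    return True      # conforming packet
                return False         # non-conforming: drop or mark

        bucket = TokenBucket(rate=100.0, capacity=20)      # burst of up to 20 packets
        admitted = sum(bucket.allow() for _ in range(50))  # a 50-packet burst
        print(f"{admitted} of 50 packets conformed")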

  15. Social Computing

    CERN Document Server

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  16. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  17. Brain computer

    Directory of Open Access Journals (Sweden)

    Sarah N. Abdulkader

    2015-07-01

    Brain computer interface technology represents a highly growing field of research with application systems. Its contributions to medical fields range from prevention to neuronal rehabilitation for serious injuries. Mind reading and remote communication have their unique fingerprints in numerous fields such as education, self-regulation, production, marketing, and security, as well as games and entertainment. It creates a mutual understanding between users and the surrounding systems. This paper shows the application areas that could benefit from brain waves in facilitating or achieving their goals. We also discuss major usability and technical challenges that face brain signal utilization in various components of a BCI system. Different solutions that aim to limit and decrease their effects have also been reviewed.

  18. Computational micromechanics

    Science.gov (United States)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed mode I-III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  19. New criteria to identify spectrum

    DEFF Research Database (Denmark)

    Jensen, Arne; Krishna, M.

    2005-01-01

    In this paper we give some new criteria for identifying the components of a probability measure, in its Lebesgue decomposition. This enables us to give new criteria to identify spectral types of self-adjoint operators on Hilbert spaces, especially those of interest....

  20. New Criteria to Identify Spectrum

    Indian Academy of Sciences (India)

    A Jensen; M Krishna

    2005-05-01

    In this paper we give some new criteria for identifying the components of a probability measure, in its Lebesgue decomposition. This enables us to give new criteria to identify spectral types of self-adjoint operators on Hilbert spaces, especially those of interest.
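
    For reference (standard background, not taken from either record): every probability measure mu on the real line splits uniquely into pure point, absolutely continuous, and singular continuous parts, and a self-adjoint operator decomposes its Hilbert space accordingly, which is what "identifying spectral types" refers to:

        \mu = \mu_{\mathrm{pp}} + \mu_{\mathrm{ac}} + \mu_{\mathrm{sc}},
        \qquad
        \mathcal{H} = \mathcal{H}_{\mathrm{pp}} \oplus \mathcal{H}_{\mathrm{ac}} \oplus \mathcal{H}_{\mathrm{sc}}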

  1. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges for the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights into future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats, which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide an effectiveness analysis of the proposed countermeasures.

  2. Author Identifiers in Scholarly Repositories

    CERN Document Server

    Warner, Simeon

    2010-01-01

    Bibliometric and usage-based analyses and tools highlight the value of information about scholarship contained within the network of authors, articles and usage data. Less progress has been made on populating and using the author side of this network than the article side, in part because of the difficulty of unambiguously identifying authors. I briefly review a sample of author identifier schemes, and consider use in scholarly repositories. I then describe preliminary work at arXiv to implement public author identifiers, services based on them, and plans to make this information useful beyond the boundaries of arXiv.

  3. Wavelets in scientific computing

    DEFF Research Database (Denmark)

    Nielsen, Ole Møller

    1998-01-01

    such a function well. These properties of wavelets have led to some very successful applications within the field of signal processing. This dissertation revolves around the role of wavelets in scientific computing and it falls into three parts: Part I gives an exposition of the theory of orthogonal, compactly...... is an investigation of the potential for using the special properties of wavelets for solving partial differential equations numerically. Several approaches are identified and two of them are described in detail. The algorithms developed are applied to the nonlinear Schrödinger equation and Burgers' equation...

  4. Computer aided control engineering

    DEFF Research Database (Denmark)

    Szymkat, Maciej; Ravn, Ole

    1997-01-01

    Current developments in the field of Computer Aided Control Engineering (CACE) have a visible impact on the design methodologies and the structure of the software tools supporting them. Today control engineers have at their disposal libraries, packages or programming environments that may...... levels. The major conclusions of the paper are related to identifying the factors affecting software tool integration in a way needed to facilitate design "inter-phase" communication. These are: standard application interfaces, dynamic data exchange mechanisms, code generation techniques and general......

  5. Identifiability, exchangeability and confounding revisited

    OpenAIRE

    Greenland, Sander; Robins, James Matthew

    2009-01-01

    In 1986 the International Journal of Epidemiology published "Identifiability, Exchangeability and Epidemiological Confounding". We review the article from the perspective of a quarter century after it was first drafted and relate it to subsequent developments on confounding, ignorability, and collapsibility.

  6. A Framework for Heterotic Computing

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Computational devices combining two or more different parts, one controlling the operation of the other, derive their power from the interaction, in addition to the capabilities of the parts. Non-classical computation has tended to consider only single computational models: neural, analog, quantum, chemical, biological, neglecting to account for the contribution from the experimental controls. In this position paper, we propose a framework suitable for analysing combined computational models, from abstract theory to practical programming tools. We focus on the simplest example: one system controlled by another through a sequence of operations in which only one system is active at a time; the output from one system becomes the input to the other for the next step, and vice versa. We outline the categorical machinery required for handling diverse computational systems in such combinations, with their interactions explicitly accounted for. Drawing on prior work in refinement and retrenchment, we suggest an appropriate framework for developing programming tools from the categorical framework. We place this work in the context of two contrasting concepts of "efficiency": theoretical comparisons to determine relative computational power do not always reflect the practical comparison of real resources for a finite-sized computational task, especially when the inputs include (approximations of) real numbers. Finally we outline the limitations of our simple model, and identify some of the extensions that will be required to treat more complex interacting computational systems.

  7. Identifying discharge practice training needs.

    Science.gov (United States)

    Lees, L; Emmerson, K

    A training needs analysis tool was developed to identify nurses' discharge training needs and to improve discharge practice. The tool includes 49 elements of discharge practice subdivided into four areas: corporate, operational, clinical and nurse-led discharge. The tool was disseminated to 15 wards on two hospital sites with assistance from the practice development team. Analysis of discharge training is important to assess discharge training needs and to identify staff who may assist with training.

  8. Experimental DNA computing

    NARCIS (Netherlands)

    Henkel, Christiaan

    2005-01-01

    Because of their information storing and processing capabilities, nucleic acids are interesting building blocks for molecular scale computers. Potential applications of such DNA computers range from massively parallel computation to computational gene therapy. In this thesis, several implementations

  9. Individual Identifiability Predicts Population Identifiability in Forensic Microsatellite Markers.

    Science.gov (United States)

    Algee-Hewitt, Bridget F B; Edge, Michael D; Kim, Jaehee; Li, Jun Z; Rosenberg, Noah A

    2016-04-01

    Highly polymorphic genetic markers with significant potential for distinguishing individual identity are used as a standard tool in forensic testing [1, 2]. At the same time, population-genetic studies have suggested that genetically diverse markers with high individual identifiability also confer information about genetic ancestry [3-6]. The dual influence of polymorphism levels on ancestry inference and forensic desirability suggests that forensically useful marker sets with high levels of individual identifiability might also possess substantial ancestry information. We study a standard forensic marker set, the 13 CODIS loci used in the United States and elsewhere [2, 7-9], together with 779 additional microsatellites [10], using direct population structure inference to test whether markers with substantial individual identifiability also produce considerable information about ancestry. Despite having been selected for individual identification and not for ancestry inference [11], the CODIS markers generate nontrivial model-based clustering patterns similar to those of other sets of 13 tetranucleotide microsatellites. Although the CODIS markers have relatively low values of the F_ST divergence statistic, their high heterozygosities produce greater ancestry inference potential than is possessed by less heterozygous marker sets. More generally, we observe that marker sets with greater individual identifiability also tend toward greater population identifiability. We conclude that population identifiability regularly follows as a byproduct of the use of highly polymorphic forensic markers. Our findings have implications for the design of new forensic marker sets and for evaluations of the extent to which individual characteristics beyond identification might be predicted from current and future forensic data.
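
    For context, the divergence statistic mentioned above has the standard population-genetics definition (general background, not quoted from the paper), comparing the mean expected heterozygosity within subpopulations, H_S, with that of the pooled total population, H_T:

        F_{ST} = \frac{H_T - H_S}{H_T}

    A low per-locus F_{ST} does not preclude ancestry inference: as the abstract notes, highly heterozygous loci each carry some ancestry information, and many such loci combined can still cluster individuals by population.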

  10. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  11. Study of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Prashant Anil Patil

    2012-04-01

    Full Text Available This paper gives the detailed information about Quantum computer, and difference between quantum computer and traditional computers, the basis of Quantum computers which are slightly similar but still different from traditional computer. Many research groups are working towards the highly technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. Quantum computer is very much use full for computation purpose in field of Science and Research. Large amount of data and information will be computed, processing, storing, retrieving, transmitting and displaying information in less time with that much of accuracy which is not provided by traditional computers.

  12. Dynamic Method for Identifying Collected Sample Mass

    Science.gov (United States)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
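
    To illustrate the underlying idea, here is a drastically simplified sketch of my own (not the G-Sample algorithm): assume a rigid boom, known accelerations during thruster firings, and Gaussian force-sensor noise, under which the maximum-likelihood mass estimate reduces to least squares on F = m a.

        import numpy as np

        rng = np.random.default_rng(7)

        # Known end-effector accelerations during firings (m/s^2, hypothetical)
        a = rng.uniform(0.05, 0.2, size=200)

        m_true = 0.035                                    # sample mass, kg
        F = m_true * a + rng.normal(0.0, 5e-4, a.size)    # noisy force readings, N

        # With F = m*a + w, w ~ N(0, sigma^2), the ML estimate of m is least squares:
        m_hat = (a @ F) / (a @ a)
        sigma_m = np.sqrt(np.var(F - m_hat * a) / (a @ a))
        print(f"estimated mass: {1e3 * m_hat:.2f} g (true {1e3 * m_true:.2f} g), "
              f"1-sigma {1e3 * sigma_m:.2f} g")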

  13. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  14. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Document Server

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. Also, you will learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  15. Computing with functionals—computability theory or computer science?

    OpenAIRE

    Normann, Dag

    2006-01-01

    We review some of the history of the computability theory of functionals of higher types, and we will demonstrate how contributions from logic and theoretical computer science have shaped this still active subject.

  16. Non-Boolean computing with nanomagnets for computer vision applications

    Science.gov (United States)

    Bhanja, Sanjukta; Karunaratne, D. K.; Panchumarthy, Ravi; Rajaram, Srinath; Sarkar, Sudeep

    2016-02-01

    The field of nanomagnetism has recently attracted tremendous attention as it can potentially deliver low-power, high-speed and dense non-volatile memories. It is now possible to engineer the size, shape, spacing, orientation and composition of sub-100 nm magnetic structures. This has spurred the exploration of nanomagnets for unconventional computing paradigms. Here, we harness the energy-minimization nature of nanomagnetic systems to solve the quadratic optimization problems that arise in computer vision applications, which are computationally expensive. By exploiting the magnetization states of nanomagnetic disks as state representations of a vortex and single domain, we develop a magnetic Hamiltonian and implement it in a magnetic system that can identify the salient features of a given image with more than 85% true positive rate. These results show the potential of this alternative computing method to develop a magnetic coprocessor that might solve complex problems in fewer clock cycles than traditional processors.
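
    The quadratic optimization problems mentioned here have the generic Ising-like form (a standard formulation given for orientation; the paper's magnetic Hamiltonian encodes vortex and single-domain disk states rather than simple binary spins):

        E(s) = -\sum_{i<j} J_{ij} \, s_i s_j - \sum_i h_i s_i, \qquad s_i \in \{-1, +1\},

    and the physical relaxation of the coupled nanomagnets plays the role of a minimizer of E.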

  17. CLOUD COMPUTING SECURITY ISSUES

    Directory of Open Access Journals (Sweden)

    Florin OGIGAU-NEAMTIU

    2012-01-01

    The term "cloud computing" has been in the spotlight for IT specialists in recent years because of its potential to transform this industry. The promised benefits have led companies to invest great sums of money in researching and developing this domain, and great steps have been made towards implementing this technology. Managers have traditionally viewed IT as difficult and expensive, and the promise of cloud computing leads many to think that IT will now be easy and cheap. The reality is that cloud computing has simplified some technical aspects of building computer systems, but the myriad challenges facing the IT environment still remain. Organizations which consider adopting cloud-based services must also understand the many major problems of information policy, including issues of privacy, security, reliability, access, and regulation. The goal of this article is to identify the main security issues and to draw the attention of both decision makers and users to the potential risks of moving data into "the cloud".

  18. An approach to identify urban groundwater recharge

    Directory of Open Access Journals (Sweden)

    E. Vázquez-Suñé

    2010-10-01

    Evaluating the proportion in which waters from different origins are mixed in a given water sample is relevant for many hydrogeological problems, such as quantifying total recharge, assessing groundwater pollution risks, or managing water resources. Our work is motivated by urban hydrogeology, where waters with different chemical signatures can be identified (losses from water supply and sewage networks, infiltration from surface runoff and other water bodies, lateral aquifer inflows, ...). The relative contribution of different sources to total recharge can be quantified by means of solute mass balances, but application is hindered by the large number of potential origins. Hence the need to incorporate data from a large number of conservative species, the uncertainty in source concentrations and measurement errors. We present a methodology to compute mixing ratios and end-member composition, which consists of (i) identification of potential recharge sources, (ii) selection of tracers, (iii) characterization of the hydrochemical composition of potential recharge sources and mixed water samples, and (iv) computation of mixing ratios and reevaluation of end-members. The analysis performed on a data set of samples from the Barcelona city aquifers suggests that the main contributors to total recharge are water supply network losses (22%), sewage network losses (30%), rainfall, concentrated in the non-urbanized areas (17%), runoff infiltration (20%), and the Besòs River (11%). Regarding species, halogens (chloride, fluoride and bromide), sulfate, total nitrogen, and stable isotopes (18O, 2H, and 34S) behaved quite conservatively. Boron, residual alkalinity, EDTA and Zn did not. Yet, including these species in the computations did not affect the proportion estimations significantly.

  20. Football refereeing: Identifying innovative methods

    Directory of Open Access Journals (Sweden)

    Reza MohammadKazemi

    2014-08-01

    Full Text Available The aim of the present study is to identify potential innovations in the football industry. Data were collected from 10 national and international referees, assistant referees and referees’ supervisors in Iran. In this study, technological innovations that assist better refereeing performance are identified. The analysis revealed a significant relationship between the use of new technologies and referees’ performance. The results indicate that elite referees, assistant referees and supervisors agreed on using new technological innovations during the game. According to their comments, this kind of technology helps develop referees’ performance.

  1. Locally identifying coloring of graphs

    CERN Document Server

    Esperet, Louis; Montassier, Mickael; Ochem, Pascal; Parreau, Aline

    2010-01-01

    A vertex-coloring of a graph G is said to be locally identifying if, for any pair (u,v) of adjacent vertices of G with distinct closed neighborhoods, the sets of colors appearing in the closed neighborhoods of u and v are distinct. In this paper, we give several bounds on the minimum number of colors needed in such a coloring for different families of graphs (planar graphs, some subclasses of perfect graphs, graphs with bounded maximum degree) and prove that deciding whether a subcubic bipartite graph with large girth has a locally identifying coloring with 3 colors is an NP-complete problem.
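    The definition translates directly into a checker. A minimal sketch, in which the graph representation, the coloring, and the 4-cycle example are all illustrative:

        def is_locally_identifying(adj, color):
            """For every edge (u, v) whose endpoints have distinct closed
            neighborhoods N[u] != N[v], the color sets seen in N[u] and N[v]
            must differ. adj maps each vertex to its set of neighbors."""
            closed = {v: adj[v] | {v} for v in adj}
            for u in adj:
                for v in adj[u]:
                    if closed[u] != closed[v]:
                        if ({color[w] for w in closed[u]}
                                == {color[w] for w in closed[v]}):
                            return False
            return True

        # On the 4-cycle, a proper 2-coloring fails (opposite vertices see the
        # same color sets), while 3 colors suffice.
        c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
        print(is_locally_identifying(c4, {0: 0, 1: 1, 2: 0, 3: 1}))  # False
        print(is_locally_identifying(c4, {0: 0, 1: 1, 2: 2, 3: 1}))  # True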

  2. Important computer competencies for the nursing profession.

    Science.gov (United States)

    Jiang, Wey-Wen; Chen, Wei; Chen, Yu-Chih

    2004-09-01

    Nursing requires computer competencies. This study aimed at identifying those competencies required for the nursing profession in Taiwan. The Delphi technique was deployed in this study. In the Delphi questionnaires, computer competencies were sorted into seven domains: concepts of hardware, software, and networks; principles of computer applications; skills of computer usage; program design; limitations of the computer; personal and social issues; attitudes toward the computer. In three Delphi questionnaires, nursing informatics experts gave us their opinions on the importance of each computer competency for the nursing profession. The experts also designated when the competency should be cultivated. This study provides a comprehensive list for nursing professionals to check on their computer competence. The results of this study should also serve as good references for teachers and schools in designing related curriculums.

  3. Academic Training Lecture Regular Programme: Cloud Computing

    CERN Multimedia

    2012-01-01

    Cloud Computing (1/2), by Belmiro Rodrigues Moreira (LIP Laboratorio de Instrumentacao e Fisica Experimental de Particulas).   Wednesday, May 30, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN ( 500-1-001 - Main Auditorium ) Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?" by identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  4. Computer technology forecast study for general aviation

    Science.gov (United States)

    Seacord, C. L.; Vaughn, D.

    1976-01-01

    A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.

  5. The importance of trust in computer security

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2014-01-01

    The computer security community has traditionally regarded security as a “hard” property that can be modelled and formally proven under certain simplifying assumptions. Traditional security technologies assume that computer users are either malicious, e.g. hackers or spies, or benevolent, competent...... of the fundamental assumptions that underpin existing computer security technologies and that a new view of computer security is long overdue. In this paper, we examine traditionalmodels, policies and mechanisms of computer security in order to identify areas where the fundamental assumptions may fail. In particular...

  6. Grammar Rules as Computer Algorithms.

    Science.gov (United States)

    Rieber, Lloyd

    1992-01-01

    One college writing teacher engaged his class in the revision of a computer program to check grammar, focusing on improvement of the algorithms for identifying inappropriate uses of the passive voice. Process and problems of constructing new algorithms, effects on student writing, and other algorithm applications are discussed. (MSE)

  7. Evaluating Computer Technology Integration in a Centralized School System

    Science.gov (United States)

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  8. Security Problems in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rola Motawie

    2016-12-01

    Full Text Available Cloud is a pool of computing resources which are distributed among cloud users. Cloud computing has many benefits, such as scalability, flexibility, cost savings, reliability, maintenance, and mobile accessibility. Since cloud-computing technology is growing day by day, it comes with many security problems. Securing the data in the cloud environment is one of the most critical challenges, acting as a barrier to implementing the cloud. Cloud computing introduces many new concepts, such as resource sharing, multi-tenancy, and outsourcing, which create new challenges for the security community. In this work, we provide a comparative study of cloud computing privacy and security concerns. We identify and classify known security threats, cloud vulnerabilities, and attacks.

  9. Translational Perspectives for Computational Neuroimaging.

    Science.gov (United States)

    Stephan, Klaas E; Iglesias, Sandra; Heinzle, Jakob; Diaconescu, Andreea O

    2015-08-19

    Functional neuroimaging has made fundamental contributions to our understanding of brain function. It remains challenging, however, to translate these advances into diagnostic tools for psychiatry. Promising new avenues for translation are provided by computational modeling of neuroimaging data. This article reviews contemporary frameworks for computational neuroimaging, with a focus on forward models linking unobservable brain states to measurements. These approaches-biophysical network models, generative models, and model-based fMRI analyses of neuromodulation-strive to move beyond statistical characterizations and toward mechanistic explanations of neuroimaging data. Focusing on schizophrenia as a paradigmatic spectrum disease, we review applications of these models to psychiatric questions, identify methodological challenges, and highlight trends of convergence among computational neuroimaging approaches. We conclude by outlining a translational neuromodeling strategy, highlighting the importance of openly available datasets from prospective patient studies for evaluating the clinical utility of computational models. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. SOCIODEMOGRAPHIC DATA USED FOR IDENTIFYING ...

    Science.gov (United States)

    Due to unique social and demographic characteristics, various segments of the population may experience exposures different from those of the general population, which, in many cases, may be greater. When risk assessments do not characterize subsets of the general population, the populations that may experience the greatest risk remain unidentified. When such populations are not identified, the social and demographic data relevant to these populations are not considered when preparing exposure estimates, which can underestimate exposure and risk estimates for at-risk populations. Thus, it is necessary for risk or exposure assessors characterizing a diverse population to first identify and then enumerate certain groups within the general population who are at risk for greater contaminant exposures. The document entitled Sociodemographic Data Used for Identifying Potentially Highly Exposed Populations (also referred to as the Highly Exposed Populations document) assists assessors in identifying and enumerating potentially highly exposed populations. This document presents data relating to factors which potentially impact an individual or group's exposure to environmental contaminants based on activity patterns (how time is spent), microenvironments (locations where time is spent), and other sociodemographic data such as age, gender, race and economic status. Populations potentially more exposed to various chemicals of concern, relative to the general population

  11. Identifying the Gifted Child Humorist.

    Science.gov (United States)

    Fern, Tami L.

    1991-01-01

    This study attempted to identify gifted child humorists among 1,204 children in grades 3-6. Final identification of 13 gifted child humorists was determined through application of such criteria as funniness, originality, and exemplary performance or product. The influence of intelligence, development, social factors, sex differences, family…

  12. SNP interaction pattern identifier (SIPI)

    DEFF Research Database (Denmark)

    Lin, Hui-Yi; Chen, Dung-Tsa; Huang, Po-Yu

    2016-01-01

    MOTIVATION: Testing SNP-SNP interactions is considered as a key for overcoming bottlenecks of genetic association studies. However, related statistical methods for testing SNP-SNP interactions are underdeveloped. RESULTS: We propose the SNP Interaction Pattern Identifier (SIPI), which tests 45...

  13. Identifying high-risk medication

    DEFF Research Database (Denmark)

    Sædder, Eva; Brock, Birgitte; Nielsen, Lars Peter

    2014-01-01

    salicylic acid, and beta-blockers; 30 drugs or drug classes caused 82 % of all serious MEs. The top ten drugs involved in fatal events accounted for 73 % of all drugs identified. CONCLUSION: Increasing focus on seven drugs/drug classes can potentially reduce hospitalizations, extended hospitalizations...

  14. Identifying high-level components in combinational circuits

    Energy Technology Data Exchange (ETDEWEB)

    Doom, T.; White, J.; Wojcik, A. [Michigan State Univ., East Lansing, MI (United States). Dept. of Computer Science; Chisholm, G. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1998-07-01

    The problem of finding meaningful subcircuits in a logic layout appears in many contexts in computer-aided design. Existing techniques rely upon finding exact matchings of subcircuit structure within the layout. These syntactic techniques fail to identify functionally equivalent subcircuits that are differently implemented, optimized, or otherwise obfuscated. The authors present a mechanism for identifying functionally equivalent subcircuits that can overcome many of these limitations. Such semantic matching is particularly useful in the field of design recovery.
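    The distinction between syntactic and semantic matching can be made concrete: two subcircuits are functionally equivalent when they compute the same Boolean function, regardless of implementation. The toy sketch below checks this by exhaustive truth-table comparison, which is feasible only for small input counts; the authors' mechanism for real netlists is necessarily more sophisticated, so treat this as an illustration of the concept, not their algorithm.

        from itertools import product

        def truth_table(circuit, n_inputs):
            """Exhaustively evaluate a combinational block over all inputs."""
            return tuple(circuit(*bits) for bits in product((0, 1), repeat=n_inputs))

        # Two syntactically different implementations of XOR:
        impl_a = lambda a, b: (a | b) & ~(a & b) & 1
        impl_b = lambda a, b: (a & ~b & 1) | (~a & b & 1)

        # Equal tables => functionally equivalent subcircuits.
        print(truth_table(impl_a, 2) == truth_table(impl_b, 2))  # True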

  15. Program Facilitates Distributed Computing

    Science.gov (United States)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  16. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  17. Persistent Identifier Practice for Big Data Management at NCI

    Directory of Open Access Journals (Sweden)

    Jingbo Wang

    2017-04-01

    Full Text Available The National Computational Infrastructure (NCI) manages over 10 PB of research data, which is co-located with the high performance computer (Raijin) and an HPC-class 3000-core OpenStack cloud system (Tenjin). In support of this integrated High Performance Computing/High Performance Data (HPC/HPD) infrastructure, NCI's data management practices include building catalogues, DOI minting, data curation, data publishing, and data delivery through a variety of data services. The metadata catalogues, DOIs, THREDDS, and Vocabularies all use different Uniform Resource Locator (URL) styles. A Persistent IDentifier (PID) service provides an important utility to manage URLs in a consistent, controlled and monitored manner to support the robustness of our national 'Big Data' infrastructure. In this paper we demonstrate NCI's approach of utilising its PID Service to consistently manage its persistent identifiers with various applications.
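    Reduced to its essence, a PID service keeps one stable identifier per object and manages the mutable target URL in a single place. A minimal sketch of that idea follows; the identifiers and URLs are invented, and NCI's actual PID Service is an HTTP redirect infrastructure, not an in-memory dictionary.

        # One managed mapping: the PID never changes, its target may.
        pid_registry = {
            "pid:example/climate-run-42": "https://thredds.example.org/climate/run42",
        }

        def resolve(pid: str) -> str:
            """Return the current landing URL for a persistent identifier."""
            return pid_registry[pid]

        def retarget(pid: str, new_url: str) -> None:
            """Services move; the PID stays constant while its target is updated."""
            pid_registry[pid] = new_url

        retarget("pid:example/climate-run-42", "https://data.example.org/climate/run42")
        print(resolve("pid:example/climate-run-42"))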

  18. Identifying patient risks during hospitalization

    Directory of Open Access Journals (Sweden)

    Lucélia Ferreira Lima

    2008-12-01

    Full Text Available Objective: To identify the risks reported at a public institution and to know the main patient risks from the nursing staff point of view. Methods: A retrospective, descriptive and exploratory study. The survey was developed at a hospital in the city of Taboão da Serra, São Paulo, Brazil. The study included all nurses working in care areas who agreed to participate in the study. At the same time, sentinel events occurring in the period from July 2006 to July 2007 were identified. Results: There were 440 sentinel events reported, and the main risks included patient falls, medication errors and pressure ulcers. Sixty-five nurses were interviewed. They also reported patient falls, medication errors and pressure ulcers as the main risks. Conclusions: Risk assessment and implementation of effective preventive actions are necessary to ensure the patient’s safety. Involvement of a multidisciplinary team is one of the steps for a successful process.

  19. On Identifying which Intermediate Nodes Should Code in Multicast Networks

    DEFF Research Database (Denmark)

    Pinto, Tiago; Roetter, Daniel Enrique Lucani; Médard, Muriel

    2013-01-01

    the data packets. Previous work has shown that in lossless wireline networks, the performance of tree-packing mechanisms is comparable to network coding, albeit with added complexity at the time of computing the trees. This means that most nodes in the network need not code. Thus, mechanisms that identify...

  20. Identifiability of Model Properties in Over-Parameterized Model Classes

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2013-01-01

    in the data. In this paper we make some initial steps to extend and adapt basic concepts of computational learnability and statistical identifiability to provide a foundation for investigating learnability in such broader contexts. We exemplify the use of the framework in three different applications...

  1. Identifying the Key Weaknesses in Network Security at Colleges.

    Science.gov (United States)

    Olsen, Florence

    2000-01-01

    A new study identifies and ranks the 10 security gaps responsible for most outsider attacks on college computer networks. The list is intended to help campus system administrators establish priorities as they work to increase security. One network security expert urges that institutions utilize multiple security layers. (DB)

  2. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  3. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects of computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, the development of processors and storage systems, and the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  4. Identifying dependability requirements for space software systems

    Directory of Open Access Journals (Sweden)

    Edgar Toshiro Yano

    2010-09-01

    Full Text Available Computer systems are increasingly used in space, whether in launch vehicles, satellites, ground support and payload systems. Software applications used in these systems have become more complex, mainly due to the high number of features to be met, thus contributing to a greater probability of hazards related to software faults. Therefore, it is fundamental that the specification activity of requirements have a decisive role in the effort of obtaining systems with high quality and safety standards. In critical systems like the embedded software of the Brazilian Satellite Launcher, ambiguity, non-completeness, and lack of good requirements can cause serious accidents with economic, material and human losses. One way to assure quality with safety, reliability and other dependability attributes may be the use of safety analysis techniques during the initial phases of the project in order to identify the most adequate dependability requirements to minimize possible fault or failure occurrences during the subsequent phases. This paper presents a structured software dependability requirements analysis process that uses system software requirement specifications and traditional safety analysis techniques. The main goal of the process is to help to identify a set of essential software dependability requirements which can be added to the software requirement previously specified for the system. The final results are more complete, consistent, and reliable specifications.

  5. Parameter Identifiability in Statistical Machine Learning: A Review.

    Science.gov (United States)

    Ran, Zhi-Yong; Hu, Bao-Gang

    2017-05-01

    This review examines the relevance of parameter identifiability for statistical models used in machine learning. In addition to defining main concepts, we address several issues of identifiability closely related to machine learning, showing the advantages and disadvantages of state-of-the-art research and demonstrating recent progress. First, we review criteria for determining the parameter structure of models from the literature. This has three related issues: parameter identifiability, parameter redundancy, and reparameterization. Second, we review the deep influence of identifiability on various aspects of machine learning from theoretical and application viewpoints. In addition to illustrating the utility and influence of identifiability, we emphasize the interplay among identifiability theory, machine learning, mathematical statistics, information theory, optimization theory, information geometry, Riemann geometry, symbolic computation, Bayesian inference, algebraic geometry, and others. Finally, we present a new perspective together with the associated challenges.

  6. Computational thinking and thinking about computing.

    Science.gov (United States)

    Wing, Jeannette M

    2008-10-28

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  7. 76 Computer Assisted Language Learning (CALL) Software ...

    African Journals Online (AJOL)

    Ike Odimegwu

    paper identified and characterised features and processes through which computer ... sorting through the numerous resources that exist and getting students ready to use ...

  8. Understanding and Predicting Attitudes towards Computers.

    Science.gov (United States)

    Pancer, S. Mark; And Others

    1992-01-01

    The ability of the theory of reasoned action to predict computer-related attitudes and behavior was demonstrated through two studies: a questionnaire on computer behaviors and attitudes; and word processing training involving various levels of persuasive communication based on belief statements identified in the first study. (22 references) (MES)

  9. Computer Literacy Development. A Project Report.

    Science.gov (United States)

    Kapp, Ann B.; Knickerbocker, Addie H.

    The Computer Literacy Development Project in Research and Training in Vocational Education was a multi-phased project designed to determine the feasibility of changing the attitudes of vocational educators toward the use of computers, to identify the classroom and administrative applications of microcomputers, to determine the applications of…

  10. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
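    The grouping step can be pictured as a histogram over calling addresses: in a healthy SPMD run most threads cluster at the same point, and very small groups flag candidate defective threads. A minimal sketch with invented addresses and thread IDs:

        from collections import defaultdict

        def group_threads(call_addresses):
            """Group thread IDs by the address each thread is calling from."""
            groups = defaultdict(list)
            for thread_id, addr in call_addresses.items():
                groups[addr].append(thread_id)
            return dict(groups)

        # Threads 0-6 sit at the same barrier; thread 7 is stuck elsewhere.
        sample = {t: 0x401A2C for t in range(7)}
        sample[7] = 0x4107F0
        for addr, threads in group_threads(sample).items():
            print(hex(addr), "->", threads)  # the singleton group is suspect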

  11. Theory-Guided Technology in Computer Science.

    Science.gov (United States)

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  12. Computer Viruses and Safe Educational Practices.

    Science.gov (United States)

    Azarmsa, Reza

    1991-01-01

    This discussion of computer viruses explains how these viruses may be transmitted, describes their effects on data and/or computer application programs, and identifies three groups that propagate them. Ten major viruses are listed and described, and measures to deal with them are discussed. Nineteen antiviral programs are also listed and…

  14. Identifying flares in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bykerk, Vivian P; Bingham, Clifton O; Choy, Ernest H

    2016-01-01

    Set. METHODS: Candidate flare questions and legacy measures were administered at consecutive visits to Canadian Early Arthritis Cohort (CATCH) patients between November 2011 and November 2014. The American College of Rheumatology (ACR) core set indicators were recorded. Concordance to identify flares...... to flare, with escalation planned in 61%. CONCLUSIONS: Flares are common in rheumatoid arthritis (RA) and are often preceded by treatment reductions. Patient/MD/DAS agreement of flare status is highest in patients worsening from R/LDA. OMERACT RA flare questions can discriminate between patients with...

  15. Identifying the health conscious consumer.

    Science.gov (United States)

    Kraft, F B; Goodell, P W

    1993-01-01

    Individuals who lead a "wellness-oriented" lifestyle are concerned with nutrition, fitness, stress, and their environment. They accept responsibility for their health and are excellent customers for health-related products and services. Those who lack a wellness orientation are identified as higher health risks and become candidates for health promotion program intervention. The authors report a new scale by which to measure the wellness-oriented lifestyle. Scale development procedures are detailed, followed by information from five studies that support its validity. The authors suggest ways health care marketers may use the Wellness Scale to segment and target potential customers and position their products and services.

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing, as well as more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  18. Cloud Computing (4)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    8 Case Study Cloud computing is still a new phenomenon. Although many IT giants are developing their own cloud computing infrastructures, platforms, software, and services, few have really succeeded in becoming cloud computing providers.

  19. PR Educators Stress Computers.

    Science.gov (United States)

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  20. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    Science.gov (United States)

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  1. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  2. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  3. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  4. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available Computed tomography (CT) of the head uses special ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT ...

  5. ASCR Workshop on Quantum Computing for Science

    Energy Technology Data Exchange (ETDEWEB)

    Aspuru-Guzik, Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Van Dam, Wim [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Farhi, Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gaitan, Frank [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Humble, Travis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Landahl, Andrew J [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lucas, Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Preskill, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Svore, Krysta [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wiebe, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Williams, Carl [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  6. Cloud Computing Principles and Paradigms

    CERN Document Server

    Buyya, Rajkumar; Goscinski, Andrzej M

    2010-01-01

    The primary purpose of this book is to capture the state-of-the-art in Cloud Computing technologies and applications. The book also aims to identify potential research directions and technologies that will facilitate the creation of a global marketplace of cloud computing services supporting scientific, industrial, business, and consumer applications. We expect the book to serve as a reference for a larger audience such as systems architects, practitioners, developers, new researchers and graduate-level students. This area of research is relatively recent, and as such has no existing reference book

  7. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we have made an overview on distributed computing. In this paper we studied the difference between parallel and distributed computing, terminologies used in distributed computing, task allocation in distribute...

  8. Introduction to computers

    OpenAIRE

    Rajaraman, A

    1995-01-01

    An article on computer applications for knowledge processing, intended to generate awareness among librarians of the possibilities offered by ICT to improve services. Compares computers and the human brain, provides a historical perspective of the development of computer technology, explains the components of the computer and computer languages, identifies the areas where computers can be applied and their benefits. Explains available storage systems and the database management process. Points out ...

  9. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance in large-scale computing problems by resource sharing. Moreover, rising low-cost computing power coupled with advances in communications/networking and the advent of big data now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  10. Feasibility study: PASS computer environment

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-10

    The Policy Analysis Screening System (PASS) is a computerized information-retrieval system designed to provide analysts in the Department of Energy, Assistant Secretary for Environment, Office of Technology Impacts (DOE-ASEV-OTI) with automated access to articles, computer simulation outputs, energy-environmental statistics, and graphics. Although it is essential that PASS respond quickly to user queries, problems at the computer facility where it was originally installed seriously slowed PASS's operations. Users attempting to access the computer by telephone repeatedly encountered busy signals and, once logged on, experienced unsatisfactory delays in response to commands. Many of the problems stemmed from the system's facility manager having brought another large user onto the system shortly after PASS was implemented, thereby significantly oversubscribing the facility. Although in March 1980 Energy Information Administration (EIA) transferred operations to its own computer facility, OTI has expressed concern that any improvement in computer access time and response time may not be sufficient or permanent. Consequently, a study was undertaken to assess the current status of the system, to identify alternative computer environments, and to evaluate the feasibility of each alternative in terms of its cost and its ability to alleviate current problems.

  11. Computing on Anonymous Quantum Network

    CERN Document Server

    Kobayashi, Hirotada; Tani, Seiichiro

    2010-01-01

    This paper considers distributed computing on an anonymous quantum network, a network in which no party has a unique identifier and quantum communication and computation are available. It is proved that the leader election problem can exactly (i.e., without error in bounded time) be solved with at most the same complexity up to a constant factor as that of exactly computing symmetric functions (without intermediate measurements for a distributed and superposed input), if the number of parties is given to every party. A corollary of this result is a more efficient quantum leader election algorithm than existing ones: the new quantum algorithm runs in O(n) rounds with bit complexity O(mn^2), on an anonymous quantum network with n parties and m communication links. Another corollary is the first quantum algorithm that exactly computes any computable Boolean function with round complexity O(n) and with smaller bit complexity than that of existing classical algorithms in the worst case over all (computable) Boolea...

  12. Curriculum modules, software laboratories, and an inexpensive hardware platform for teaching computational methods to undergraduate computer science students

    Science.gov (United States)

    Peck, Charles Franklin

    Computational methods are increasingly important to 21st century research and education; bioinformatics and climate change are just two examples of this trend. In this context computer scientists play an important role, facilitating the development and use of the methods and tools used to support computationally-based approaches. The undergraduate curriculum in computer science is one place where computational tools and methods can be introduced to facilitate the development of appropriately prepared computer scientists. To facilitate the evolution of the pedagogy, this dissertation identifies, develops, and organizes curriculum materials, software laboratories, and the reference design for an inexpensive portable cluster computer, all of which are specifically designed to support the teaching of computational methods to undergraduate computer science students. Keywords. computational science, computational thinking, computer science, undergraduate curriculum.

  13. Cloud Computing (1)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series will discuss cloud computing technology in the following aspects: the first part provides a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  14. Cloud Computing (2)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series discusses cloud computing technology in the following aspects: the first part provided a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  15. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  16. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  17. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will continue to be, a new way of providing Internet services and computing. This computing approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services with more acceptable pricing and infrastructure. It is precisely the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  18. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  19. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  20. From Computer Forensics to Forensic Computing: Investigators Investigate, Scientists Associate

    OpenAIRE

    Dewald, Andreas; Freiling, Felix C.

    2014-01-01

    This paper draws a comparison of fundamental theories in traditional forensic science and the state of the art in current computer forensics, thereby identifying a certain disproportion between the perception of central aspects in common theory and the digital forensics reality. We propose a separation of what is currently demanded of practitioners in digital forensics into a rigorous scientific part on the one hand, and a more general methodology of searching and seizing digital evidence an...

  1. Computer virus information update CIAC-2301

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's virus database.

  2. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...... are generated through the template in ICAS-MoT and translated into a model object. Once in ICAS-MoT, the model is numerical analyzed, solved and identified. A computer-aided modeling framework integrating systematic model derivation and development tools has been developed. It includes features for model...

  3. Computational enzyme design

    Science.gov (United States)

    Bolon, Daniel N.

    2002-08-01

    The long-term objective of computational enzyme design is the ability to generate efficient protein catalysts for any chemical reaction. This thesis develops and experimentally validates a general computational approach for the design of enzymes with novel function. In order to include catalytic mechanism in protein design, a high-energy state (HES) rotamer (side chain representation) was constructed. In this rotamer, substrate atoms are in a HES. In addition, at least one amino acid side chain is positioned to interact favorably with substrate atoms in their HES and facilitate the reaction. Including an amino acid side chain in the HES rotamer automatically positions substrate relative to a protein scaffold and allows protein design algorithms to search for sequences capable of interacting favorably with the substrate. Because chemical similarity exists between the transition state and the high-energy state, optimizing the protein sequence to interact favorably with the HES rotamer should lead to transition state stabilization. In addition, the HES rotamer model focuses the subsequent computational active site design on a relevant phase space where an amino acid is capable of interacting in a catalytically active geometry with substrate. Using a HES rotamer model of the histidine mediated nucleophilic hydrolysis of p-nitrophenyl acetate, the catalytically inert 108 residue E. coli thioredoxin as a scaffold, and the ORBIT protein design software to compute sequences, an active site scan identified two promising active site designs. Experimentally, both candidate "protozymes" demonstrated catalytic activity significantly above background. In addition, the rate enhancement of one of these "protozymes" was the same order of magnitude as the first catalytic antibodies. Because polar groups are frequently buried at enzyme-substrate interfaces, improved modeling of buried polar interactions may benefit enzyme design. By studying native protein structures, rules have been

  4. RECOVIR Software for Identifying Viruses

    Science.gov (United States)

    Chakravarty, Sugoto; Fox, George E.; Zhu, Dianhui

    2013-01-01

    Most single-stranded RNA (ssRNA) viruses mutate rapidly to generate a large number of strains with highly divergent capsid sequences. Determining the capsid residues or nucleotides that uniquely characterize these strains is critical in understanding the strain diversity of these viruses. RECOVIR (an acronym for "recognize viruses") software predicts the strains of some ssRNA viruses from their limited sequence data. Novel phylogenetic-tree-based databases of protein or nucleic acid residues that uniquely characterize these virus strains are created. Strains of input virus sequences (partial or complete) are predicted through residue-wise comparisons with the databases. RECOVIR uses unique characterizing residues to identify automatically strains of partial or complete capsid sequences of picorna and caliciviruses, two of the most highly diverse ssRNA virus families. Partition-wise comparisons of the database residues with the corresponding residues of more than 300 complete and partial sequences of these viruses resulted in correct strain identification for all of these sequences. This study shows the feasibility of creating databases of hitherto unknown residues uniquely characterizing the capsid sequences of two of the most highly divergent ssRNA virus families. These databases enable automated strain identification from partial or complete capsid sequences of these human and animal pathogens.
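    The core residue-wise comparison can be sketched in a few lines. The characterizing-residue tables below are invented placeholders (RECOVIR derives its real databases from phylogenetic trees), and partial sequences are handled simply by skipping positions beyond the fragment:

        STRAIN_RESIDUES = {
            "strain_A": {12: "K", 57: "G", 101: "W"},
            "strain_B": {12: "R", 57: "G", 101: "F"},
        }

        def identify_strain(capsid_seq):
            """Score each strain by the fraction of its characterizing
            residues matched by the (possibly partial) capsid sequence."""
            def score(residues):
                hits = [capsid_seq[pos] == aa
                        for pos, aa in residues.items() if pos < len(capsid_seq)]
                return sum(hits) / len(hits) if hits else 0.0
            return max(STRAIN_RESIDUES, key=lambda s: score(STRAIN_RESIDUES[s]))

        fragment = "X" * 12 + "K" + "X" * 44 + "G"  # covers positions 12 and 57
        print(identify_strain(fragment))  # strain_A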

  5. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  6. Great Principles of Computing

    OpenAIRE

    Denning, Peter J.

    2008-01-01

    The Great Principles of Computing is a framework for understanding computing as a field of science. The website ...

  7. The Computer Manpower Evolution

    Science.gov (United States)

    Rooney, Joseph J.

    1975-01-01

    Advances and employment outlook in the field of computer science are discussed as well as the problems related to improving the quality of computer education. Specific computer jobs discussed include: data processing machine repairers, systems analysts, programmers, computer and peripheral equipment operators, and keypunch operators. (EA)

  8. Elementary School Computer Literacy.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  9. My Computer Romance

    Science.gov (United States)

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role of computers in his life as a writer. The author narrates that he has been using a computer for nearly twenty years now. He relates that computers have set his writing free. When he started writing, he was just using an electric typewriter. He also relates that his romance with computers is also a…

  10. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  11. Identifying crucial parameter correlations maintaining bursting activity.

    Directory of Open Access Journals (Sweden)

    Anca Doloc-Mihu

    2014-06-01

    Full Text Available Recent experimental and computational studies suggest that linearly correlated sets of parameters (intrinsic and synaptic properties of neurons) allow central pattern-generating networks to produce and maintain their rhythmic activity regardless of changing internal and external conditions. To determine the role of correlated conductances in the robust maintenance of functional bursting activity, we used our existing database of half-center oscillator (HCO) model instances of the leech heartbeat CPG. From the database, we identified functional activity groups of burster (isolated neuron) and half-center oscillator model instances and realistic subgroups of each that showed burst characteristics (principally period and spike frequency) similar to the animal. To find linear correlations among the conductance parameters maintaining functional leech bursting activity, we applied Principal Component Analysis (PCA) to each of these four groups. PCA identified a set of three maximal conductances (leak current, ḡLeak; a persistent K current, ḡK2; and a persistent Na+ current, ḡP) that correlate linearly for the two groups of burster instances but not for the HCO groups. Visualizations of HCO instances in a reduced space suggested that there might be non-linear relationships between these parameters for these instances. Experimental studies have shown that period is a key attribute influenced by modulatory inputs and temperature variations in heart interneurons. Thus, we explored the sensitivity of period to changes in the maximal conductances ḡLeak, ḡK2, and ḡP, and we found that for our realistic bursters the effect of these parameters on period could not be assessed because bursting activity was not maintained when they were varied individually.
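    The PCA step itself is standard. A minimal sketch on synthetic conductance triples (invented stand-in data, not the HCO database) with a built-in linear relationship, so that one dominant principal component emerges, which is the signature of a linearly correlated parameter set:

        import numpy as np

        rng = np.random.default_rng(0)
        g_leak = rng.uniform(5.0, 10.0, 200)
        # Columns: gLeak, gK2, gP for 200 model instances, with gK2 and gP
        # tied linearly to gLeak plus small noise.
        g = np.column_stack([g_leak,
                             2.0 * g_leak + rng.normal(0, 0.3, 200),
                             0.5 * g_leak + rng.normal(0, 0.2, 200)])

        # PCA via eigen-decomposition of the covariance of the centered data.
        cov = np.cov(g - g.mean(axis=0), rowvar=False)
        eigvals = np.linalg.eigvalsh(cov)[::-1]     # descending order
        print((eigvals / eigvals.sum()).round(3))   # one dominant component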

  12. On the Identifiability of Transmission Dynamic Models for Infectious Diseases.

    Science.gov (United States)

    Lintusaari, Jarno; Gutmann, Michael U; Kaski, Samuel; Corander, Jukka

    2016-03-01

    Understanding the transmission dynamics of infectious diseases is important for both biological research and public health applications. It has been widely demonstrated that statistical modeling provides a firm basis for inferring relevant epidemiological quantities from incidence and molecular data. However, the complexity of transmission dynamic models presents two challenges: (1) the likelihood function of the models is generally not computable, and computationally intensive simulation-based inference methods need to be employed, and (2) the model may not be fully identifiable from the available data. While the first difficulty can be tackled by computational and algorithmic advances, the second obstacle is more fundamental. Identifiability issues may lead to inferences that are driven more by prior assumptions than by the data themselves. We consider a popular and relatively simple yet analytically intractable model for the spread of tuberculosis based on classical IS6110 fingerprinting data. We report on the identifiability of the model, also presenting some methodological advances regarding the inference. Using likelihood approximations, we show that the reproductive value cannot be identified from the data available and that the posterior distributions obtained in previous work have likely been substantially dominated by the assumed prior distribution. Further, we show that the inferences are influenced by the assumed infectious population size, which generally has been kept fixed in previous work. We demonstrate that the infectious population size can be inferred if the remaining epidemiological parameters are already known with sufficient precision.
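
    A minimal sketch of the simulation-based inference the abstract alludes to: a rejection-sampling approximate Bayesian computation (ABC) loop around a toy branching-process simulator. The simulator, summary statistics, prior, and tolerance are all illustrative assumptions, not the paper's tuberculosis model or IS6110 data.

    ```python
    # Sketch: ABC rejection sampling for a model whose likelihood is not computable.
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_cluster_sizes(r, n_clusters=200):
        """Toy simulator: Poisson(r) offspring per active case, a few generations."""
        sizes = []
        for _ in range(n_clusters):
            size = active = 1
            for _ in range(5):
                active = rng.poisson(r * active)
                size += active
                if active == 0:
                    break
            sizes.append(size)
        return np.array(sizes)

    def summary(sizes):
        # Low-dimensional summaries of the cluster-size distribution.
        return np.array([sizes.mean(), (sizes == 1).mean()])

    observed = summary(simulate_cluster_sizes(r=0.6))   # pretend these are the data

    # Draw the reproductive value r from a uniform prior; keep draws whose
    # simulated summaries land close to the observed ones.
    accepted = [r for r in rng.uniform(0.0, 1.0, size=3000)
                if np.linalg.norm(summary(simulate_cluster_sizes(r)) - observed) < 0.5]
    print(f"accepted {len(accepted)}; posterior mean r ~ {np.mean(accepted):.2f}")
    ```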

  13. Students’ Choice for Computers

    Institute of Scientific and Technical Information of China (English)

    Cai; Wei

    2015-01-01

    Nowadays, computers are widely used as useful tools in our daily life, so you can see students using computers everywhere. The purpose of our survey is to find the answers to the following questions: 1. What brand of computers do students most often choose? 2. In students' opinion, what is the most important factor when choosing a computer? 3. What do students most want to do with computers? After that, we hope students will know what kind of computer they really need and how many factors must be considered when buying one.

  14. Study on Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    Guo-Liang Chen; Guang-Zhong Sun; Yun-Quan Zhang; Ze-Yao Mo

    2006-01-01

    In this paper, we present a general survey on parallel computing. The main contents include the parallel computer system, which is the hardware platform of parallel computing; the parallel algorithm, which is its theoretical base; and parallel programming, which is its software support. After that, we also introduce some parallel applications and enabling technologies. We argue that parallel computing research should form an integrated methodology of "architecture - algorithm - programming - application"; only in this way can parallel computing research develop continuously and stay realistic.

  15. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directions…

  16. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p…

  17. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  18. Computation in Classical Mechanics

    CERN Document Server

    Timberlake, Todd

    2007-01-01

    There is a growing consensus that physics majors need to learn computational skills, but many departments are still devoid of computation in their physics curriculum. Some departments may lack the resources or commitment to create a dedicated course or program in computational physics. One way around this difficulty is to include computation in a standard upper-level physics course. An intermediate classical mechanics course is particularly well suited for including computation. We discuss the ways we have used computation in our classical mechanics courses, focusing on how computational work can improve students' understanding of physics as well as their computational skills. We present examples of computational problems that serve these two purposes. In addition, we provide information about resources for instructors who would like to include computation in their courses.

  19. Quantum computing with defects.

    Science.gov (United States)

    Weber, J R; Koehl, W F; Varley, J B; Janotti, A; Buckley, B B; Van de Walle, C G; Awschalom, D D

    2010-05-11

    Identifying and designing physical systems for use as qubits, the basic units of quantum information, are critical steps in the development of a quantum computer. Among the possibilities in the solid state, a defect in diamond known as the nitrogen-vacancy (NV⁻) center stands out for its robustness - its quantum state can be initialized, manipulated, and measured with high fidelity at room temperature. Here we describe how to systematically identify other deep center defects with similar quantum-mechanical properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate defect systems. To illustrate these points in detail, we compare electronic structure calculations of the NV⁻ center in diamond with those of several deep centers in 4H silicon carbide (SiC). We then discuss the proposed criteria for similar defects in other tetrahedrally coordinated semiconductors.

  20. Large scale cluster computing workshop

    Energy Technology Data Exchange (ETDEWEB)

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of several thousand processors, used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes), and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  1. Research on Comparison of Cloud Computing and Grid Computing

    OpenAIRE

    Liu Yuxi; Wang Jianhua

    2012-01-01

    The development of the computer industry has been promoted by progress in distributed computing, parallel computing and grid computing, giving rise to the cloud computing movement. This study describes the types of cloud computing services and the similarities and differences between cloud computing and grid computing, discusses where cloud computing improves on grid computing, and reviews problems common to both, including some security issues.

  2. Indexing molecules with chemical graph identifiers.

    Science.gov (United States)

    Gregori-Puigjané, Elisabet; Garriga-Sust, Rut; Mestres, Jordi

    2011-09-01

    Fast and robust algorithms for indexing molecules have been historically considered strategic tools for the management and storage of large chemical libraries. This work introduces a modified and further extended version of the molecular equivalence number naming adaptation of the Morgan algorithm (J Chem Inf Comput Sci 2001, 41, 181-185) for the generation of a chemical graph identifier (CGI). This new version corrects for the collisions recognized in the original adaptation and includes the ability to deal with graph canonicalization, ensembles (salts), and isomerism (tautomerism, regioisomerism, optical isomerism, and geometrical isomerism) in a flexible manner. Validation of the current CGI implementation was performed on the open NCI database and the drug-like subset of the ZINC database containing 260,071 and 5,348,089 structures, respectively. The results were compared with those obtained with some of the most widely used indexing codes, such as the CACTVS hash code and the new InChIKey. The analyses emphasize the fact that compound management activities, like duplicate analysis of chemical libraries, are sensitive to the exact definition of compound uniqueness and thus still depend, to a minor extent, on the type and flexibility of the molecular index being used.
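
    For illustration, here is a toy version of the iterative neighborhood-refinement idea underlying Morgan-style canonical numbering, on which identifiers like the CGI build. This sketch handles only a bare labeled graph; the real algorithm's treatment of canonicalization, salts, and isomerism described above is omitted.

    ```python
    # Sketch: Morgan-like iterative refinement producing a graph-level identifier.
    import hashlib

    def morgan_like_id(adjacency, labels, rounds=4):
        """adjacency: node -> list of neighbors; labels: node -> atom symbol."""
        # Start from atom labels, then repeatedly fold in sorted neighbor codes.
        codes = {v: labels[v] for v in adjacency}
        for _ in range(rounds):
            codes = {
                v: codes[v] + "(" + ".".join(sorted(codes[u] for u in adjacency[v])) + ")"
                for v in adjacency
            }
        # Graph-level identifier: hash the sorted multiset of final atom codes.
        canonical = "|".join(sorted(codes.values()))
        return hashlib.sha256(canonical.encode()).hexdigest()[:16]

    # Ethanol as a hydrogen-suppressed graph: C-C-O
    adj = {0: [1], 1: [0, 2], 2: [1]}
    labels = {0: "C", 1: "C", 2: "O"}
    print(morgan_like_id(adj, labels))
    ```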

  3. Identifying ELIXIR Core Data Resources.

    Science.gov (United States)

    Durinx, Christine; McEntyre, Jo; Appel, Ron; Apweiler, Rolf; Barlow, Mary; Blomberg, Niklas; Cook, Chuck; Gasteiger, Elisabeth; Kim, Jee-Hyub; Lopez, Rodrigo; Redaschi, Nicole; Stockinger, Heinz; Teixeira, Daniel; Valencia, Alfonso

    2016-01-01

    The core mission of ELIXIR is to build a stable and sustainable infrastructure for biological information across Europe. At the heart of this are the data resources, tools and services that ELIXIR offers to the life-sciences community, providing stable and sustainable access to biological data. ELIXIR aims to ensure that these resources are available long-term and that the life-cycles of these resources are managed such that they support the scientific needs of the life-sciences, including biological research. ELIXIR Core Data Resources are defined as a set of European data resources that are of fundamental importance to the wider life-science community and the long-term preservation of biological data. They are complete collections of generic value to life-science, are considered an authority in their field with respect to one or more characteristics, and show high levels of scientific quality and service. Thus, ELIXIR Core Data Resources are of wide applicability and usage. This paper describes the structures, governance and processes that support the identification and evaluation of ELIXIR Core Data Resources. It identifies key indicators which reflect the essence of the definition of an ELIXIR Core Data Resource and support the promotion of excellence in resource development and operation. It describes the specific indicators in more detail and explains their application within ELIXIR's sustainability strategy and science policy actions, and in capacity building, life-cycle management and technical actions. The identification process is currently being implemented and tested for the first time. The findings and outcome will be evaluated by the ELIXIR Scientific Advisory Board in March 2017. Establishing the portfolio of ELIXIR Core Data Resources and ELIXIR Services is a key priority for ELIXIR and publicly marks the transition towards a cohesive infrastructure.

  4. Cloud Computing (3)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: In the preceding two parts of this series, several aspects of cloud computing - including definition, classification, characteristics, typical applications, and service levels - were discussed. This part continues with a discussion of Cloud Computing Open Architecture and the Market-Oriented Cloud. A comparison is made between cloud computing and other distributed computing technologies, and Google's cloud platform is analyzed to determine how distributed computing is implemented in its particular model.

  5. Distributed computing in bioinformatics.

    Science.gov (United States)

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
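
    A minimal sketch of the workload-division strategy described above, using a local process pool as a stand-in for multiple computers; the GC-content task and sequences are placeholder examples, not a specific bioinformatics pipeline.

    ```python
    # Sketch: divide a large per-sequence workload among parallel workers.
    from multiprocessing import Pool

    def gc_content(seq):
        # Fraction of G/C bases; a cheap per-item job standing in for real analysis.
        return (seq.count("G") + seq.count("C")) / len(seq)

    if __name__ == "__main__":
        sequences = ["ATGGCC", "TTAACG", "GGGCCC", "ATATAT"] * 1000
        with Pool(processes=4) as pool:
            results = pool.map(gc_content, sequences)   # workload split across workers
        print(f"processed {len(results)} sequences; mean GC = {sum(results)/len(results):.3f}")
    ```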

  6. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  7. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  8. Ion Trap Quantum Computing

    Science.gov (United States)

    2011-12-01

    …an inspiring speech at the MIT Physics of Computation 1st Conference in 1981, Feynman proposed the development of a computer that would obey the… on ion-trap-based quantum computing for physics and computer science students would include lecture notes, slides, lesson plans, a syllabus, reading lists, videos, demonstrations, and laboratories. LIST OF REFERENCES: [1] R. P. Feynman, "Simulating physics with computers," Int. J…

  9. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. The book discusses algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. Architectures and algorithms for digital signal, speech, and image processing, and specialized architectures for numerical computations, are also elaborated. Other topics include a model for analyzing generalized inter-processor communication, a pipelined architecture for search tree maintenance, and specialized computer organization for raster…

  10. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    Keywords: Resistance to Change; Stress; Adaptation to Computers. This thesis is a study of… resistance to change, overcoming resistance to change, and specific recommendations to overcome resistance… the greater his bewilderment, and the greater his bewilderment, the greater his resistance will be [Ref. 7: p. 539]. Overcoming man's resistance to change…

  11. Algorithmic Detection of Computer Generated Text

    CERN Document Server

    Lavoie, Allen

    2010-01-01

    Computer generated academic papers have been used to expose a lack of thorough human review at several computer science conferences. We assess the problem of classifying such documents. After identifying and evaluating several quantifiable features of academic papers, we apply methods from machine learning to build a binary classifier. In tests with two hundred papers, the resulting classifier correctly labeled papers either as human written or as computer generated with no false classifications of computer generated papers as human and a 2% false classification rate for human papers as computer generated. We believe generalizations of these features are applicable to similar classification problems. While most current text-based spam detection techniques focus on the keyword-based classification of email messages, a new generation of unsolicited computer-generated advertisements masquerade as legitimate postings in online groups, message boards and social news sites. Our results show that taking the formatting…
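
    As a sketch of the general approach, the snippet below trains a binary text classifier from quantifiable features; TF-IDF n-grams and logistic regression stand in for the paper's feature set and learner, and the training snippets are invented placeholders.

    ```python
    # Sketch: binary classification of human-written vs. computer-generated text.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    human = ["We evaluate the classifier on two hundred papers.",
             "Our results generalize to similar classification problems."]
    generated = ["The framework synthesizes epistemologies of stochastic archetypes.",
                 "Multimodal methodologies refine the deployment of semantic paradigms."]

    X = human + generated
    y = [0, 0, 1, 1]                  # 0 = human written, 1 = computer generated

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(X, y)

    print(clf.predict(["Compelling simulation of homogeneous paradigms."]))
    ```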

  12. The importance of trust in computer security

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2014-01-01

    The computer security community has traditionally regarded security as a "hard" property that can be modelled and formally proven under certain simplifying assumptions. Traditional security technologies assume that computer users are either malicious, e.g. hackers or spies, or benevolent, competent and well informed about the security policies. Over the past two decades, however, computing has proliferated into all aspects of modern society, and the spread of malicious software (malware) like worms, viruses and botnets has become an increasing threat. This development indicates a failure in some of the fundamental assumptions that underpin existing computer security technologies, and that a new view of computer security is long overdue. In this paper, we examine traditional models, policies and mechanisms of computer security in order to identify areas where the fundamental assumptions may fail. In particular…

  13. Identifying features in biological sequences: Sixth workshop report

    Energy Technology Data Exchange (ETDEWEB)

    Burks, C. [Los Alamos National Lab., NM (United States); Myers, E. [Univ. of Arizona (United States); Pearson, W.R. [Univ. of Virginia (United States)

    1995-12-31

    This report covers the sixth of an annual series of workshops held at the Aspen Center for Physics concentrating particularly on the identification of features in DNA sequences, and more broadly on related topics in computational molecular biology. The workshop series originally focused primarily on discussion of current needs and future strategies for identifying and predicting the presence of complex functional units on sequenced, but otherwise uncharacterized, genomic DNA. We addressed the need for computationally-based, automatic tools for synthesizing available data about individual consensus sequences and local compositional patterns into the composite objects (e.g., genes) that are -- as composite entities -- the true object of interest when scanning DNA sequences. The workshop was structured to promote sustained informal contact and exchange of expertise between molecular biologists, computer scientists, and mathematicians.

  14. Information-Theoretic Methods for Identifying Relationships among Climate Variables

    CERN Document Server

    Knuth, Kevin H; Rossow, William B

    2014-01-01

    Information-theoretic quantities, such as entropy, are used to quantify the amount of information a given variable provides. Entropies can be used together to compute the mutual information, which quantifies the amount of information two variables share. However, accurately estimating these quantities from data is extremely challenging. We have developed a set of computational techniques that allow one to accurately compute marginal and joint entropies. These algorithms are probabilistic in nature and thus provide information on the uncertainty in our estimates, which enables us to establish the statistical significance of our findings. We demonstrate these methods by identifying relations between cloud data from the International Satellite Cloud Climatology Project (ISCCP) and data from other sources, such as equatorial Pacific sea surface temperatures (SSTs).
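
    A naive plug-in sketch of the quantities involved: estimating marginal entropies and mutual information from a 2-D histogram. The correlated series are synthetic stand-ins for the ISCCP and SST data, and, as the abstract notes, accurate estimation in practice needs more careful, uncertainty-aware estimators than this.

    ```python
    # Sketch: histogram-based entropy and mutual information estimates.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(size=10_000)
    y = 0.8 * x + 0.6 * rng.normal(size=10_000)   # two correlated "climate" series

    # Joint and marginal probabilities from a 2-D histogram of the samples.
    pxy, _, _ = np.histogram2d(x, y, bins=30)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # MI = H(X) + H(Y) - H(X, Y): the information the two variables share.
    mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
    print(f"estimated mutual information: {mi:.3f} bits")
    ```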

  15. CY15 Livermore Computing Focus Areas

    Energy Technology Data Exchange (ETDEWEB)

    Connell, Tom M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cupps, Kim C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); D' Hooge, Trent E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fahey, Tim J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fox, Dave M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Futral, Scott W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gary, Mark R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Goldstone, Robin J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hamilton, Pam G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Heer, Todd M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Long, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mark, Rich J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Morrone, Chris J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shoopman, Jerry D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Slavec, Joe A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, David W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Springmeyer, Becky R [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stearman, Marc D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Watson, Py C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-20

    The LC team undertook a survey of primary Center drivers for CY15. Identified key drivers included enhancing user experience and productivity, pre-exascale platform preparation, process improvement, data-centric computing paradigms, and business expansion. The team organized critical supporting efforts into three cross-cutting focus areas: Improving Service Quality; Monitoring, Automation, Delegation and Center Efficiency; and Next Generation Compute and Data Environments. In each area the team detailed high-level challenges and identified discrete actions to address these issues during the calendar year. Identifying the Center's primary drivers, issues, and plans is intended to serve as a lens focusing LC personnel, resources, and priorities throughout the year.

  16. Heterogeneous Distributed Computing for Computational Aerosciences

    Science.gov (United States)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM [1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  17. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  18. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  19. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  20. Duality quantum computing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this article, we review the development of a newly proposed quantum computer, the duality computer (or duality quantum computer), and the duality mode of quantum computers. The duality computer is based on the particle-wave duality principle of quantum mechanics. Compared to an ordinary quantum computer, the duality quantum computer is a quantum computer on the move, passing through a multi-slit. It offers more computing operations than are possible with an ordinary quantum computer. The two most distinctive operations are the quantum division operation and the quantum combiner operation. The division operation divides the wave function of a quantum computer into many attenuated, identical parts; the combiner operation combines the wave functions in different parts into a single part. The duality mode is a way in which a quantum computer with some extra qubit resource simulates a duality computer. The main structure of the duality quantum computer and the duality mode, their mathematical description, and algorithm designs are reviewed.

  1. Structure-based drug design identifies novel LPA3 antagonists.

    Science.gov (United States)

    Fells, James I; Tsukahara, Ryoko; Liu, Jianxiong; Tigyi, Gabor; Parrill, Abby L

    2009-11-01

    Compound 5 ([5-(3-nitrophenoxy)-1,3-dioxo-1,3-dihydro-2-isoindol-2-yl]acetic acid) was identified as a weak selective LPA3 antagonist (IC50 = 4504 nM) in a virtual screening effort to optimize a dual LPA2/3 antagonist. Structure-based drug design techniques were used to prioritize similarity-search matches of compound 5. This strategy rapidly identified 10 novel antagonists. The two most efficacious compounds identified inhibit activation of the LPA3 receptor by 200 nM LPA with IC50 values of 752 nM and 2992 nM. These compounds additionally define changes to our previously reported pharmacophore that will improve its ability to identify more potent and selective LPA3 receptor antagonists. The results of the combined computational and experimental screening are reported.
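
    For context, here is a sketch of how similarity-search matches might be prioritized by fingerprint similarity, a common step in such virtual screening workflows; it assumes the RDKit toolkit, and the SMILES strings are placeholder structures, not the paper's compounds.

    ```python
    # Sketch: rank candidate molecules by Tanimoto similarity to a query compound.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    def fingerprint(smiles):
        # Morgan (ECFP-like) bit fingerprint of a molecule given as SMILES.
        return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

    # Hypothetical query and candidates (placeholder structures).
    query = fingerprint("OC(=O)CN1C(=O)c2ccccc2C1=O")
    candidates = ["OC(=O)CCN1C(=O)c2ccccc2C1=O", "OC(=O)Cn1ccnc1", "c1ccccc1"]

    # Highest-similarity matches first; these would be prioritized for testing.
    ranked = sorted(candidates,
                    key=lambda s: DataStructs.TanimotoSimilarity(query, fingerprint(s)),
                    reverse=True)
    for smi in ranked:
        print(f"{DataStructs.TanimotoSimilarity(query, fingerprint(smi)):.2f}  {smi}")
    ```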

  2. Computers as Cognitive Media: Examining the Potential of Computers in Education.

    Science.gov (United States)

    Hokanson, B.; Hooper, S.

    2000-01-01

    Examines criticisms of educational computer use, considers how society and schools have reacted to previous technological trends, and outlines relationships between diverse approaches to computer use and the outcomes that can be expected. Describes two approaches to media use, representational and generative, to identify instructional approaches…

  3. On Identifying which Intermediate Nodes Should Code in Multicast Networks

    DEFF Research Database (Denmark)

    Pinto, Tiago; Roetter, Daniel Enrique Lucani; Médard, Muriel

    2013-01-01

    …the data packets. Previous work has shown that in lossless wireline networks, the performance of tree-packing mechanisms is comparable to network coding, albeit with added complexity at the time of computing the trees. This means that most nodes in the network need not code. Thus, mechanisms that identify the intermediate nodes that do require coding are instrumental for the efficient operation of coded networks and can have a significant impact on overall energy consumption. We present a distributed, low-complexity algorithm that allows every node to identify whether it should code and, if so, through what output link…

  4. Lung cancer screening: identifying the high risk cohort

    OpenAIRE

    Marcus, Michael W.; Raji, Olaide Y; John K. Field

    2015-01-01

    Low dose computed tomography (LDCT) is a viable screening tool for early lung cancer detection and mortality reduction. In practice, the success of any lung cancer screening programme will depend on successful identification of individuals at high risk in order to maximise the benefit-harm ratio. Risk prediction models incorporating multiple risk factors have been recognised as a method of identifying individuals at high risk of developing lung cancer. Identification of individuals at high risk…

  5. Using biologically interrelated experiments to identify pathway genes in Arabidopsis

    OpenAIRE

    Kim, Kyungpil; Jiang, Keni; Teng, Siew Leng; Feldman, Lewis J.; Huang, Haiyan

    2012-01-01

    Motivation: Pathway genes are considered a group of genes that work cooperatively in the same pathway, constituting a fundamental functional grouping in a biological process. Identifying pathway genes has been one of the major tasks in understanding biological processes. However, due to the difficulty in characterizing/inferring different types of biological gene relationships, as well as several computational issues arising from dealing with high-dimensional biological data, deducing ge…

  6. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  7. Language and Computers

    CERN Document Server

    Dickinson, Markus; Meurers, Detmar

    2012-01-01

    Language and Computers introduces students to the fundamentals of how computers are used to represent, process, and organize textual and spoken information. Concepts are grounded in real-world examples familiar to students' experiences of using language and computers in everyday life. A real-world introduction to the fundamentals of how computers process language, written specifically for the undergraduate audience, introducing key concepts from computational linguistics. Offers a comprehensive explanation of the problems computers face in handling natural language. Covers a broad spectrum…

  8. Computer techniques for electromagnetics

    CERN Document Server

    Mittra, R

    1973-01-01

    Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers on the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techniques…

  9. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  10. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers - and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos…

  11. Aggregating job exit statuses of a plurality of compute nodes executing a parallel application

    Energy Technology Data Exchange (ETDEWEB)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Mundy, Michael B.

    2015-07-21

    Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
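
    A rough sketch of the aggregation pattern in this claim, written with mpi4py; the status fields and the choice of rank 0 as job leader are illustrative assumptions. Run under e.g. `mpirun -n 4 python agg.py`.

    ```python
    # Sketch: gather per-node exit statuses to a job leader and aggregate them.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each compute node reports how its portion of the parallel application ended.
    my_status = {"rank": rank, "exit_code": 0, "bytes_written": 1024 * rank}

    statuses = comm.gather(my_status, root=0)   # collect every node's exit status

    if rank == 0:
        # The job leader aggregates and would forward the result to the scheduler.
        aggregated = {
            "worst_exit_code": max(s["exit_code"] for s in statuses),
            "total_bytes": sum(s["bytes_written"] for s in statuses),
        }
        print(aggregated)
    ```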

  12. 29 CFR 541.401 - Computer manufacture and repair.

    Science.gov (United States)

    2010-07-01

    …programming or other similarly skilled computer-related occupations identified in § 541.400(b), are also not… 29 CFR § 541.401 (Labor, revised 2010-07-01): Computer manufacture and repair. Part 541, Defining and Delimiting the Exemptions for Executive, Administrative, Professional, Computer and…

  13. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique. Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergraduates…

  14. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  15. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions of frequently appearing operations in big data computing. One dwarf represen...

  16. Computer Intrusions and Attacks.

    Science.gov (United States)

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  17. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  18. Optical Quantum Computing

    National Research Council Canada - National Science Library

    Jeremy L. O'Brien

    2007-01-01

    In 2001, all-optical quantum computing became feasible with the discovery that scalable quantum computing is possible using only single-photon sources, linear optical elements, and single-photon detectors...

  19. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a… What are some common uses of the procedure? CT of the sinuses is primarily used…

  20. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed form…