WorldWideScience

Sample records for computationally identified trifoliate

  1. EVALUATION OF COLOUR IN WHITE AND YELLOW TRIFOLIATE ...

    African Journals Online (AJOL)

    IBUKUN

    2010-03-20

    Department of Food Technology, University of Ibadan, Oyo State, Nigeria. ... Therefore, this work determines the colour in white and yellow trifoliate ... Freshly harvested trifoliate yam tubers were prepared into flour using four ...

  2. California mild CTV strains that break resistance in Trifoliate Orange

    Science.gov (United States)

    This is the final report of a project to characterize California isolates of Citrus tristeza virus (CTV) that replicate in Poncirus trifoliata (trifoliate orange). Next Generation Sequencing (NGS) of viral small interfering RNAs (siRNAs) and assembly of full-length sequences of mild California CTV i...

  3. Evaluation of colour in white and yellow trifoliate yam flours in ...

    African Journals Online (AJOL)

    Colour is one of the important sensory properties that determine the acceptability of food products. Therefore, this work determines the colour in white and yellow trifoliate yam flours in relation to harvesting periods and pre-processing methods. Freshly harvested trifoliate yam tubers were prepared into flour using four ...

  4. Deep sequencing discovery of novel and conserved microRNAs in trifoliate orange (Citrus trifoliata)

    Directory of Open Access Journals (Sweden)

    Yu Huaping

    2010-07-01

    Full Text Available Abstract Background: MicroRNAs (miRNAs) play a critical role in post-transcriptional gene regulation and have been shown to control many genes involved in various biological and metabolic processes. There have been extensive studies to discover miRNAs and analyze their functions in model plant species, such as Arabidopsis and rice. Deep sequencing technologies have facilitated identification of species-specific or lowly expressed as well as conserved or highly expressed miRNAs in plants. Results: In this research, we used Solexa sequencing to discover new microRNAs in trifoliate orange (Citrus trifoliata), which is an important rootstock of citrus. A total of 13,106,753 reads representing 4,876,395 distinct sequences were obtained from a short RNA library generated from small RNA extracted from C. trifoliata flower and fruit tissues. Based on sequence similarity and hairpin structure prediction, we found that 156,639 reads representing 63 sequences from 42 highly conserved miRNA families have perfect matches to known miRNAs. We also identified 10 novel miRNA candidates whose precursors were all potentially generated from citrus ESTs. In addition, five miRNA* sequences were also sequenced. These sequences had not previously been described in other plant species, and accumulation of the 10 novel miRNAs was confirmed by qRT-PCR analysis. Potential target genes were predicted for most conserved and novel miRNAs. Moreover, four target genes, including one encoding IRX12 copper ion binding/oxidoreductase and three encoding NB-LRR disease resistance proteins, were experimentally verified by detection of the miRNA-mediated mRNA cleavage in C. trifoliata. Conclusion: Deep sequencing of short RNAs from C. trifoliata flowers and fruits identified 10 new potential miRNAs and 42 highly conserved miRNA families, indicating that specific miRNAs exist in C. trifoliata. These results show that regulatory miRNAs exist in agronomically important trifoliate orange.

  5. Trifoliate hybrids as rootstocks for Pêra sweet orange tree

    Directory of Open Access Journals (Sweden)

    Jorgino Pompeu Junior

    2014-03-01

    Full Text Available The Rangpur lime (Citrus limonia) has been used as the main rootstock for Pêra sweet orange (C. sinensis) trees. However, its susceptibility to citrus blight and citrus sudden death has led to the use of disease-tolerant rootstocks, such as Cleopatra mandarin (C. reshni), Sunki mandarin (C. sunki) and Swingle citrumelo (C. paradisi x Poncirus trifoliata), which are more susceptible to drought than the Rangpur lime. These mandarin varieties are also less resistant to root rot caused by Phytophthora, and the Swingle citrumelo proved to be incompatible with the Pêra sweet orange. In search of new rootstock varieties, this study aimed at assessing the fruit precocity and yield, susceptibility to tristeza and blight, and occurrence of incompatibility of Pêra sweet orange trees grafted on 12 trifoliate hybrids, on Rangpur lime EEL and on Goutou sour orange, without irrigation. Tristeza and blight are endemic in the experimental area. The Sunki x English (1628) and Changsha x English Small (1710) citrandarins and two other selections of Cleopatra x Rubidoux provided the highest cumulative yields, both in the first three crops and in the total of six crops evaluated. The Cleopatra x Rubidoux (1660) and Sunki x Benecke (1697) citrandarins induced early yield, while the Cravo x Swingle citromonia and C-13 citrange induced later yield. None of the rootstock varieties caused alternate bearing. Pêra sweet orange trees grafted on Swingle citrumelo, on Cleopatra x Swingle (1654) citrandarin and on two selections of Rangpur lime x Carrizo citrange showed bud-union-ring symptoms of incompatibility. None of the plants presented symptoms of tristeza or blight.

  6. Identification and Characterization of Citrus tristeza virus Isolates Breaking Resistance in Trifoliate Orange in California.

    Science.gov (United States)

    Yokomi, Raymond K; Selvaraj, Vijayanandraj; Maheshwari, Yogita; Saponari, Maria; Giampetruzzi, Annalisa; Chiumenti, Michela; Hajeri, Subhas

    2017-07-01

    Most Citrus tristeza virus (CTV) isolates in California are biologically mild and symptomless in commercial cultivars on CTV-tolerant rootstocks. However, to better define California CTV isolates showing divergent serological and genetic profiles, selected isolates were subjected to deep sequencing of small RNAs. Full-length sequences were assembled and annotated, and trifoliate orange resistance-breaking (RB) isolates of CTV were identified. Phylogenetic relationships based on their full genomes placed three isolates in the RB clade: CA-RB-115, CA-RB-AT25, and CA-RB-AT35. The latter two isolates were obtained by aphid transmission from Murcott and Dekopon trees, respectively, containing CTV mixtures. The California RB isolates were further distinguished into two subclades. Group I included CA-RB-115 and CA-RB-AT25, with 99% nucleotide sequence identity to the RB type strain NZRB-G90; group II included CA-RB-AT35, with 99 and 96% sequence identity to Taiwan Pumelo/SP/T1 and HA18-9, respectively. The RB phenotype was confirmed by detecting CTV replication in graft-inoculated Poncirus trifoliata and transmission from P. trifoliata to sweet orange. The California RB isolates induced mild symptoms compared with severe isolates in greenhouse indexing tests. Further examination of 570 CTV accessions, acquired since approximately 1960 and maintained in planta at the Central California Tristeza Eradication Agency, revealed 16 RB-positive isolates based on partial p65 sequences. Six isolates collected from 1992 to 2011 from Tulare and Kern counties were CA-RB-115-like, and 10 isolates collected from 1968 to 2010 from Riverside, Fresno, and Kern counties were CA-RB-AT35-like. The presence of the RB genotype is relevant because P. trifoliata and its hybrids are the most popular rootstocks in California.

  7. Textural and sensory properties of trifoliate yam (Dioscorea dumetorum) flour and stiff dough 'amala'.

    Science.gov (United States)

    Abiodun, O A; Akinoso, R

    2015-05-01

    The use of trifoliate yam (Dioscorea dumetorum) flour for stiff dough 'amala' production is one of the ways to curb under-utilization of the tuber. This study evaluates the textural and sensory properties of trifoliate yam flour and stiff dough. Freshly harvested trifoliate yam tubers were peeled, washed, sliced and blanched (60 °C for 10 min). The sliced yams were soaked in water for 12 h, dried and milled into flour. Pasting viscosities, functional properties, brown index and sensory attributes of the flour and stiff dough were analyzed. Peak, holding-strength and final viscosities ranged from 84.09 to 213.33 RVU, 81.25 to 157.00 RVU and 127.58 to 236.17 RVU, respectively. White raw flour had higher viscosity than the yellow flours. The swelling index, water absorption capacity and bulk density ranged from 1.46 to 2.28, 2.11 to 2.92 ml H2O/g and 0.71 to 0.88 g/cm3, respectively. The blanching method employed improved the swelling index and water absorption capacity of the flour. The brown index values of the flour and stiff dough ranged from 6.73 to 18.36 and 14.63 to 46.72, respectively. Sensory evaluation revealed significant differences in the colour, odour and general acceptability of the product when compared with stiff dough from white yam.

  8. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
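
    For illustration, a minimal Python sketch of the subset-testing loop described in this record. The performance probe, the test-value formula, and all names are invented stand-ins; the patent does not specify them.

        # Hypothetical sketch of the failure-isolation loop (not the patent's code).
        def identify_failure(io_node, compute_nodes, probe, expected_perf,
                             tree_threshold, sample_size=8):
            """Scan subsets of compute nodes; isolate suspects once deviation is large."""
            pool = list(compute_nodes)
            while pool:
                test_set, pool = pool[:sample_size], pool[sample_size:]
                io_perf = probe(io_node)                   # measure the I/O node
                node_perfs = [probe(n) for n in test_set]  # measure the test subset
                # One plausible test value: mean deviation from expected performance.
                devs = [abs(p - expected_perf) for p in [io_perf] + node_perfs]
                test_value = sum(devs) / len(devs)
                if test_value < tree_threshold:
                    continue  # below threshold: subset looks healthy, try another set
                # Not below threshold: test potential problem nodes individually.
                return [n for n in test_set
                        if abs(probe(n) - expected_perf) >= tree_threshold]
            return []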

  9. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  10. A computer model for identifying security system upgrades

    International Nuclear Information System (INIS)

    Lamont, A.

    1988-01-01

    This paper describes a prototype safeguards analysis tool that automatically identifies system weaknesses against an insider adversary and suggests possible upgrades to improve the probability that the adversary will be detected. The tool is based on this premise: as the adversary acts, he or she creates a set of facts that can be detected by safeguards components. Whenever an adversary's planned set of actions creates a set of facts which the security personnel would consider irregular or unusual, we can improve the security system by implementing safeguards that detect those facts. Therefore, an intelligent computer program can suggest upgrades to the facility if we construct a knowledge base that contains information about: (1) the facts created by each possible adversary action, (2) the facts that each possible safeguard can detect, and (3) groups of facts which will be considered irregular whenever they occur together. The authors describe the structure of the knowledge base and show how the above information can be represented in it. They also describe the procedures that a computer program can use to identify missing or weak safeguards and to suggest upgrades.
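
    A toy Python illustration of the three-part knowledge base described in this record; the facts, actions, and safeguards are invented for the example, not taken from the paper.

        # Hypothetical knowledge base: actions -> facts, safeguards -> detectable facts.
        adversary_actions = {
            "forge_badge":    {"badge_anomaly"},
            "enter_at_night": {"after_hours_entry"},
            "move_material":  {"portal_alarm", "inventory_gap"},
        }
        safeguards = {
            "badge_reader_audit": {"badge_anomaly"},
            "entry_log_review":   {"after_hours_entry"},
            "portal_monitor":     {"portal_alarm"},
        }
        irregular_groups = [{"after_hours_entry", "inventory_gap"}]

        def suggest_upgrades(plan):
            """Find irregular fact groups a plan creates that no safeguard detects."""
            created = set().union(*(adversary_actions[a] for a in plan))
            detected = set().union(*safeguards.values())
            for group in irregular_groups:
                if group <= created:  # the plan creates an irregular combination
                    missing = group - detected
                    if missing:
                        print("add safeguards detecting:", missing)

        suggest_upgrades(["enter_at_night", "move_material"])
        # -> add safeguards detecting: {'inventory_gap'}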

  11. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Science.gov (United States)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-03-01

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.
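
    A compact sketch of the plane-building idea on a two-dimensional grid. Treating a "logical plane" as an axis-aligned rectangle whose corner nodes all belong to the subcommunicator is a simplifying assumption for illustration.

        # Enumerate rectangles ("logical planes") anchored at one plane-building node.
        def planes_through(node, members):
            x0, y0 = node
            planes = set()
            for (x1, y1) in members:
                if x1 == x0 or y1 == y0:
                    continue  # degenerate: no extent in one of the two dimensions
                corners = {(x0, y0), (x1, y0), (x0, y1), (x1, y1)}
                if corners <= members:  # all four corners are in the subcommunicator
                    planes.add(((min(x0, x1), min(y0, y1)),
                                (max(x0, x1), max(y0, y1))))
            return planes

        members = {(0, 0), (1, 0), (0, 1), (1, 1), (2, 1)}
        print(planes_through((0, 0), members))  # {((0, 0), (1, 1))}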

  12. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority of these drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.

  13. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at all position angles, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, achieving high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
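
    The circle Hough transform step is available off the shelf; a minimal sketch using OpenCV follows. Parameter values are illustrative, not the paper's tuned settings.

        import cv2
        import numpy as np

        def find_ring_candidates(cutout_gray):
            """Return (x, y, r) circles detected in a grayscale galaxy cutout."""
            blurred = cv2.GaussianBlur(cutout_gray, (5, 5), 0)
            circles = cv2.HoughCircles(
                blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                param1=50,    # Canny edge threshold
                param2=30,    # accumulator threshold: lower -> more (weaker) circles
                minRadius=5, maxRadius=60)
            return [] if circles is None else np.round(circles[0]).astype(int)

        # Synthetic check: a drawn ring should be recovered at roughly (64, 64, 30).
        img = np.zeros((128, 128), np.uint8)
        cv2.circle(img, (64, 64), 30, 255, 2)
        print(find_ring_candidates(img))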

  14. Identifying Benefits and risks associated with utilizing cloud computing

    OpenAIRE

    Shayan, Jafar; Azarnik, Ahmad; Chuprat, Suriayati; Karamizadeh, Sasan; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is an emerging computing model in which IT and computing operations are delivered as services in a highly scalable and cost-effective manner. Recently, adopting this new model in business has become popular. Companies in diverse sectors intend to leverage cloud computing architecture, platforms and applications in order to gain higher competitive advantages. Like other models, cloud computing brings advantages that attract business, but fostering the cloud has also led to some ...

  15. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    Science.gov (United States)

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications serve as major instructional tools to increase undergraduates' motivation at school. In the recreation field, usage of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  16. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2011-06-24

    ... Business Information by Computer Sciences Corporation and Its Identified Subcontractors AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: EPA has authorized its contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access information which has...

  17. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having genomic signatures that differ from the rest of the host genome, and containing mobility genes that allow them to be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
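
    One PAI-associated feature mentioned above is an atypical genomic signature. A minimal sketch flags windows whose GC content deviates from the genome-wide average; the window size and threshold are arbitrary choices for the example.

        def gc_content(seq):
            """Fraction of G/C in an uppercase DNA string."""
            return (seq.count("G") + seq.count("C")) / len(seq)

        def atypical_windows(genome, win=5000, step=1000, delta=0.08):
            """Yield (start, end, gc) for windows deviating from genome-wide GC."""
            mean_gc = gc_content(genome)
            for start in range(0, len(genome) - win + 1, step):
                w = genome[start:start + win]
                if abs(gc_content(w) - mean_gc) > delta:
                    yield start, start + win, round(gc_content(w), 3)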

  18. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    Full Text Available The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses who provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, and other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners who provide high-quality responses.

  19. Alleviation of drought stress by mycorrhizas is related to increased root H2O2 efflux in trifoliate orange.

    Science.gov (United States)

    Huang, Yong-Ming; Zou, Ying-Ning; Wu, Qiang-Sheng

    2017-02-08

    The Non-invasive Micro-test Technique (NMT) is used to measure dynamic changes of specific ions/molecules non-invasively, but NMT-based information about hydrogen peroxide (H2O2) fluxes in different classes of mycorrhizal roots is scarce. The effects of Funneliformis mosseae on plant growth and on H2O2, superoxide radical (O2·-) and malondialdehyde (MDA) concentrations, as well as on H2O2 fluxes in the taproot (TR) and lateral roots (LRs) of trifoliate orange seedlings under well-watered (WW) and drought stress (DS) conditions, were studied. DS strongly inhibited mycorrhizal colonization in the TR and LRs, whereas mycorrhizal inoculation significantly promoted plant growth and biomass production. H2O2, O2·-, and MDA concentrations in leaves and roots were dramatically lower in mycorrhizal seedlings than in non-mycorrhizal seedlings under DS. Compared with non-mycorrhizal seedlings, mycorrhizal seedlings had relatively higher net root H2O2 effluxes in the TR and LRs, especially under WW, as well as significantly higher total root H2O2 effluxes in the TR and LRs under WW and DS. Total root H2O2 effluxes were significantly positively correlated with root colonization but negatively with root H2O2 and MDA concentrations. This suggests that mycorrhizas induce greater H2O2 efflux from the TR and LRs, thus alleviating the oxidative damage of DS in the host plant.

  20. Boron alleviates the aluminum toxicity in trifoliate orange by regulating antioxidant defense system and reducing root cell injury.

    Science.gov (United States)

    Riaz, Muhammad; Yan, Lei; Wu, Xiuwen; Hussain, Saddam; Aziz, Omar; Wang, Yuhan; Imran, Muhammad; Jiang, Cuncang

    2018-02-15

    Aluminium (Al) toxicity is the most important soil constraint for plant growth and development in acid soils (low pH). Boron (B) is an essential micronutrient for the growth and development of higher plants. Previous studies suggest that B might ameliorate Al toxicity; however, no such study has been conducted on trifoliate orange. Thus, a hydroponic study was carried out comprising two Al concentrations, 0 and 400 μM. For each concentration, two B treatments (0 and 10 μM as H3BO3) were applied to investigate B-induced alleviation of Al toxicity and to explore the underlying mechanisms. The results revealed that Al toxicity under B deficiency severely hampered root growth and plant physiology, and caused oxidative stress and membrane damage, leading to severe root injury. However, application of B under Al toxicity improved root elongation and photosynthesis, while reducing Al uptake and mobilization into plant parts. Moreover, B supply regulated the activities of antioxidant enzymes, proline and secondary metabolite (phenylalanine ammonia lyase and polyphenol oxidase) contents, and stabilized the integrity of proteins. Our results imply that B supply promoted root growth as well as the defense system by reducing reactive oxygen species (ROS) and Al concentrations in plant parts, thereby alleviating Al toxicity; this finding might be significant for higher productivity of agricultural plants grown in acidic conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Direct and indirect effects of glomalin, mycorrhizal hyphae, and roots on aggregate stability in rhizosphere of trifoliate orange.

    Science.gov (United States)

    Wu, Qiang-Sheng; Cao, Ming-Qin; Zou, Ying-Ning; He, Xin-hua

    2014-07-25

    To test direct and indirect effects of glomalin, mycorrhizal hyphae, and roots on aggregate stability, perspex pots divided in the middle by 37-μm nylon mesh were used to form root-free hyphae and root/hyphae chambers, and trifoliate orange (Poncirus trifoliata) seedlings were colonized by Funneliformis mosseae or Paraglomus occultum in the root/hyphae chamber. Both fungal species induced significantly higher plant growth, total root length, easily-extractable glomalin-related soil protein (EE-GRSP) and total GRSP (T-GRSP), and mean weight diameter (an aggregate stability indicator). Pearson correlation showed that root colonization and soil hyphal length were significantly positively correlated with EE-GRSP, difficultly-extractable GRSP (DE-GRSP), T-GRSP, and water-stable aggregates in the 2.00-4.00, 0.50-1.00, and 0.25-0.50 mm size fractions. Path analysis indicated that in the root/hyphae chamber, aggregate stability derived from a direct effect of root colonization, EE-GRSP or DE-GRSP, and that the direct effect of EE-GRSP or DE-GRSP was stronger than that of mycorrhizal colonization. In the root-free hyphae chamber, mycorrhiza-mediated aggregate stability was due to the total, but not direct, effects of soil hyphal length, EE-GRSP and T-GRSP. Our results suggest that, among the tested factors, GRSP may be the primary contributor to aggregate stability in the citrus rhizosphere.

  2. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    Science.gov (United States)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in the digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper presents the collective work of specialists in medicine and in applied mathematics and computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  3. Influence of Cultivar on the Postharvest Hardening of Trifoliate Yam (Dioscorea dumetorum Tubers

    Directory of Open Access Journals (Sweden)

    Christian Siadjeu

    2016-01-01

    Full Text Available The influence of cultivar on the postharvest hardening of Dioscorea dumetorum tubers was assessed. Thirty-two cultivars of D. dumetorum tubers were planted in April 2014, harvested at physiological maturity, and stored under prevailing tropical ambient conditions (19–28°C, 60–85% RH) for 0, 5, 14, 21, and 28 days. Samples were evaluated for cooked hardness. Results showed that one cultivar, Ibo sweet 3, was not affected by the hardening phenomenon. The remaining 31 were all subject to the hardening phenomenon to different degrees. Cooked hardness increased more rapidly in cultivars with many roots on the tuber surface than in cultivars with few roots on the tuber surface. When both flesh colour and the number of roots on the tuber surface were considered, cooked hardness in cultivars with yellow flesh and many roots increased more rapidly than in cultivars with white flesh and many roots, whereas cooked hardness in cultivars with yellow flesh and few roots increased more slowly than in cultivars with white flesh and few roots. Cooked hardness in accessions collected at high altitude increased more rapidly than in accessions collected at low altitude. The cultivar Ibo sweet 3 identified in this study could provide important information for breeding programs of D. dumetorum against the postharvest hardening phenomenon.

  4. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  5. Identification and comparative profiling of miRNAs in an early flowering mutant of trifoliate orange and its wild type by genome-wide deep sequencing.

    Directory of Open Access Journals (Sweden)

    Lei-Ming Sun

    Full Text Available MicroRNAs (miRNAs) are a new class of small, endogenous RNAs that play a regulatory role in various biological and metabolic processes by negatively affecting gene expression at the post-transcriptional level. While the number of known Arabidopsis and rice miRNAs is continuously increasing, information regarding miRNAs from woody plants such as citrus remains limited. In this study, Solexa sequencing was performed at different developmental stages on both an early flowering mutant of trifoliate orange (precocious trifoliate orange, Poncirus trifoliata L. Raf.) and its wild type, yielding 141 known miRNAs belonging to 99 families and 75 novel miRNAs across four libraries. A total of 317 potential target genes were predicted based on the 51 novel miRNA families; GO and KEGG annotation revealed that highly ranked miRNA-target genes are those implicated in diverse cellular processes in plants, including development, transcription, protein degradation and cross adaptation. To characterize the miRNAs expressed at the juvenile and adult developmental stages of the mutant and its wild type, the expression profiles of several miRNAs were further analyzed by real-time PCR. The results revealed that most miRNAs were down-regulated at the adult stage compared with the juvenile stage in both the mutant and its wild type. These results indicate that both conserved and novel miRNAs may play important roles in citrus growth and development, stress responses and other physiological processes.

  6. Global identifiability of linear compartmental models--a computer algebra algorithm.

    Science.gov (United States)

    Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C

    1998-01-01

    A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is, thus, a prerequisite for parameter estimation of biological dynamic models. Global identifiability is however difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in nonlinearity degree and number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability) is presented, which combines the topological transfer function method with the Buchberger algorithm, to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general structure compartmental models from general multi input-multi output experiments. Examples of usage of GLOBI to analyze a priori global identifiability of some complex biological compartmental models are provided.
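
    The role of the Buchberger algorithm can be seen on a toy example: the exhaustive-summary equations of a model form a polynomial system, and a lexicographic Groebner basis reveals how many parameter solutions exist. This sketch uses SymPy, not GLOBI itself, and the two-parameter system is invented.

        from sympy import symbols, groebner

        k01, k21 = symbols("k01 k21", positive=True)
        # Suppose the transfer function yields the observable summary:
        #   k01 + k21 = 5 and k01*k21 = 6.
        G = groebner([k01 + k21 - 5, k01 * k21 - 6], k21, k01, order="lex")
        print(G.exprs)  # last polynomial is k01**2 - 5*k01 + 6: two roots
        # (k01 = 2 or 3), so these parameters are only locally, not globally,
        # identifiable from the measurements.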

  7. Identify and rank key factors influencing the adoption of cloud computing for a healthy Electronics

    Directory of Open Access Journals (Sweden)

    Javad Shukuhy

    2015-02-01

    Full Text Available Cloud computing, as a new technology built on Internet infrastructure and new service approaches, can bring significant benefits to the electronic delivery of medical services. Applying this technology in E-Health requires consideration of various factors. The main objective of this study is to identify and rank the factors influencing the adoption of the e-health cloud. Based on the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology fit (HOT-fit) model, 16 sub-factors were identified under four major factors. Through a survey of 60 experts, academics and specialists in health information technology, and with the help of the fuzzy analytic hierarchy process, these factors and sub-factors were ranked. Given the newness of this study, no previous domestic or international work in the literature has addressed this number of criteria. The results show that when deciding to adopt cloud computing in E-Health, technological, human, organizational and environmental factors must be considered, in that order.

  8. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on development of computer programs and investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program - 1 which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors is described along with the Interactive Graphic Orbit Selection program which allows the user to select orbits which satisfy mission requirements and to evaluate the necessary injection accuracy.

  9. Híbridos de trifoliata como porta-enxertos para a laranjeira 'Valência' Trifoliate hybrids as rootstocks for sweet orange 'Valência'

    Directory of Open Access Journals (Sweden)

    Jorgino Pompeu Junior

    2009-07-01

    Full Text Available The objective of this work was to evaluate the productivity and agronomic traits of 'Valência' sweet orange trees budded onto trifoliate (Poncirus trifoliata) hybrid rootstocks. Fruit production, total soluble solids production per plant, and canopy dimensions and production efficiency of 'Valência' sweet orange trees budded onto 13 trifoliate hybrids, grown without irrigation, were evaluated over periods ranging from three to eight years. The trees were also assessed visually for symptoms of tristeza (Citrus tristeza virus) and citrus blight, and the dot immunobinding assay (DIBA) diagnostic test was used to detect blight before symptoms appeared. The trees were eight years old at the start of the evaluations. The 'Sunki' x 'English' citrandarin induced the highest fruit yields over eight harvests, without differing significantly from 'Troyer' citrange. In three years of analysis, the 'Sunki' x 'English' citrandarin, without differing from the 'Troyer' and 'Carrizo' citranges, also induced the highest fruit and soluble solids production per plant. The 'Clementina' x trifoliata citrentin and the 'Cleópatra' x 'Swingle' (715 and 1614), 'Cleópatra' x 'Rubidoux' (1600) and 'Cleópatra' x 'Christian' citrandarins produced 'Valência' trees no taller than 2.5 m. None of the trees showed symptoms of tristeza or citrus blight. Incompatibility was observed between the 'Valência' cultivar and the 'Cravo' x 'Carrizo' trangpur.

  10. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words.

    Science.gov (United States)

    Wang, Bingkun; Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the spurt of online user-generated content on the web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying the polarity of sentiment words only consider positive and negative polarity in terms of a crisp Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment word datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.

  11. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words

    Directory of Open Access Journals (Sweden)

    Bingkun Wang

    2015-01-01

    Full Text Available With the spurt of online user-generated content on the web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying the polarity of sentiment words only consider positive and negative polarity in terms of a crisp Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment word datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.

  12. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words

    Science.gov (United States)

    Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the spurt of online user-generated content on the web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying the polarity of sentiment words only consider positive and negative polarity in terms of a crisp Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment word datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods. PMID:26106409
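
    To make the notion of graded polarity concrete, here is a toy sketch in the spirit of the model: word intensity is aggregated from morpheme intensities, and a sigmoid membership function assigns fuzzy positive/negative degrees. The lexicon values and the aggregation rule are invented, not the paper's.

        import math

        morpheme_polarity = {"好": 0.9, "坏": -0.8, "很": 0.3}  # toy lexicon

        def word_intensity(word):
            """Average the signed intensities of known morphemes in the word."""
            scores = [morpheme_polarity[m] for m in word if m in morpheme_polarity]
            return sum(scores) / len(scores) if scores else 0.0

        def fuzzy_polarity(x, steepness=4.0):
            """Sigmoid membership in the 'positive' fuzzy set; 1 - mu is 'negative'."""
            mu_pos = 1.0 / (1.0 + math.exp(-steepness * x))
            return {"positive": round(mu_pos, 3), "negative": round(1 - mu_pos, 3)}

        print(fuzzy_polarity(word_intensity("很好")))  # mostly positive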

  13. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
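
    A schematic Python rendering of the list-based index described in this record; the storage-node layout and path scheme are invented for the example.

        import hashlib

        index = {}  # filename -> {"locations": [...], "checksum": ...}

        def store_with_replicas(name, data, nodes):
            """Record the file's storage locations and a checksum in the index."""
            digest = hashlib.sha256(data).hexdigest()
            # (Actually writing `data` to each storage node is elided here.)
            index[name] = {"locations": [f"{n}:/data/{name}" for n in nodes],
                           "checksum": digest}

        def lookup(name, read_at):
            """Return the first listed location whose content validates."""
            entry = index[name]
            for loc in entry["locations"]:
                if hashlib.sha256(read_at(loc)).hexdigest() == entry["checksum"]:
                    return loc
            raise IOError("no valid replica of " + name)

        store_with_replicas("out.dat", b"results", ["node1", "node2"])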

  14. Application of Computer Simulation to Identify Erosion Resistance of Materials of Wet-steam Turbine Blades

    Science.gov (United States)

    Korostelyov, D. A.; Dergachyov, K. V.

    2017-10-01

    The problem of identifying the efficiency of using materials, coatings, linings and solderings for wet-steam turbine rotor blades by means of computer simulation is considered. Numerical experiments to determine the erosion resistance of wet-steam turbine blade materials are described. Kinetic curves for the erosion area and the weight of worn rotor blade material have been obtained for the K-300-240 LMP turbine and the turbine of the atomic icebreaker "Lenin". Conclusions are drawn about the effectiveness of using different erosion-resistant materials and rotor blade protection configurations.

  15. Application of artificial neural networks to identify equilibration in computer simulations

    Science.gov (United States)

    Leibowitz, Mitchell H.; Miller, Evan D.; Henry, Michael M.; Jankowski, Eric

    2017-11-01

    Determining which microstates generated by a thermodynamic simulation are representative of the ensemble for which sampling is desired is a ubiquitous, underspecified problem. Artificial neural networks are one type of machine learning algorithm that can provide a reproducible way to apply pattern recognition heuristics to underspecified problems. Here we use the open-source TensorFlow machine learning library and apply it to the problem of identifying which hypothetical observation sequences from a computer simulation are “equilibrated” and which are not. We generate training populations and test populations of observation sequences with embedded linear and exponential correlations. We train a two-neuron artificial network to distinguish the correlated and uncorrelated sequences. We find that this simple network is good enough for > 98% accuracy in identifying exponentially-decaying energy trajectories from molecular simulations.
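
    A hedged re-creation of the experiment's flavor: a tiny dense network trained to separate flat (equilibrated) from exponentially decaying (unequilibrated) sequences. The synthetic data and architecture details are assumptions, not the paper's exact setup.

        import numpy as np
        import tensorflow as tf

        rng = np.random.default_rng(0)
        T = 64  # observations per sequence

        def make_batch(n):
            t = np.linspace(0, 1, T)
            decays = np.exp(-5 * t) + 0.05 * rng.standard_normal((n // 2, T))
            flats = 0.05 * rng.standard_normal((n // 2, T))
            x = np.vstack([decays, flats]).astype("float32")
            y = np.array([0] * (n // 2) + [1] * (n // 2))  # 1 = equilibrated
            return x, y

        model = tf.keras.Sequential([
            tf.keras.layers.Dense(2, activation="relu", input_shape=(T,)),  # two neurons
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        model.fit(*make_batch(2000), epochs=20, verbose=0)
        print(model.evaluate(*make_batch(400), verbose=0))  # [loss, accuracy]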

  16. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites with singular value decomposition (SVD) of the inferred motif activities, to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities was downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
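
    The SVD step can be shown on a placeholder matrix: decompose inferred motif activities (motifs x conditions) into orthogonal components, each a candidate regulatory program. Random data stand in for the MARA output here.

        import numpy as np

        rng = np.random.default_rng(1)
        activities = rng.standard_normal((50, 12))             # 50 motifs x 12 conditions
        activities -= activities.mean(axis=1, keepdims=True)   # center each motif

        U, s, Vt = np.linalg.svd(activities, full_matrices=False)
        var_explained = s**2 / np.sum(s**2)
        # Column k of U holds the motif loadings of program k; row k of Vt holds
        # its profile across conditions. Leading components suggest independent
        # regulatory programs.
        print(np.round(var_explained[:3], 3))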

  17. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    Full Text Available The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10’s activity in multiple rodent models that is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to that of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  18. Highly efficient computer algorithm for identifying layer thickness of atomically thin 2D materials

    Science.gov (United States)

    Lee, Jekwan; Cho, Seungwan; Park, Soohyun; Bae, Hyemin; Noh, Minji; Kim, Beom; In, Chihun; Yang, Seunghoon; Lee, Sooun; Seo, Seung Young; Kim, Jehyun; Lee, Chul-Ho; Shim, Woo-Young; Jo, Moon-Ho; Kim, Dohun; Choi, Hyunyong

    2018-03-01

    Research on layered materials, such as transition-metal dichalcogenides (TMDs), has demonstrated that their optical, electrical and mechanical properties strongly depend on the layer number N. Thus, efficient and accurate determination of N is the most crucial step before the associated device fabrication. The existing experimental technique using an optical microscope is the most widely used one to identify N. However, a critical drawback of this approach is that it relies on extensive laboratory experience to estimate N; it requires a very time-consuming image-searching task assisted by human eyes and secondary measurements, such as atomic force microscopy and Raman spectroscopy, to confirm N. In this work, we introduce a computer algorithm based on image analysis of a quantized optical contrast. We show that our algorithm applies to a wide variety of layered materials, including graphene, MoS2, and WS2, regardless of substrate. The algorithm largely consists of two parts. First, it sets up an appropriate boundary between target flakes and the substrate. Second, to compute N, it automatically calculates the optical contrast of each target using an adaptive RGB estimation process, which results in a matrix of different integer Ns and returns a map of N onto the target flake position. Using conventional desktop computational power, the time taken to display the final N matrix was 1.8 s on average for an image of 1280 by 960 pixels, with a high accuracy of 90% (six estimation errors among 62 samples) when compared to the other methods. To show the effectiveness of our algorithm, we also apply it to TMD flakes transferred onto optically transparent c-axis sapphire substrates and obtain a similarly high accuracy of 94% (two estimation errors among 34 samples).
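
    A condensed sketch of the contrast-quantization idea: compute a scalar optical contrast of each pixel against the substrate and round it to an integer N. The per-layer contrast step is an assumed calibration constant, not the paper's adaptive estimate.

        import numpy as np

        def layer_map(rgb, substrate_rgb, step=0.1, max_n=10):
            """Map an (H, W, 3) float image to integer layer counts per pixel."""
            sub = np.asarray(substrate_rgb, dtype=float)
            contrast = np.abs(rgb - sub).mean(axis=2) / sub.mean()
            n = np.rint(contrast / step).astype(int)  # quantize contrast to N
            return np.clip(n, 0, max_n)

        img = np.full((4, 4, 3), 0.5)
        img[1:3, 1:3] = 0.4                      # mock flake on a brighter substrate
        print(layer_map(img, substrate_rgb=[0.5, 0.5, 0.5]))  # 2 in the flake region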

  19. Integration of experimental and computational methods for identifying geometric, thermal and diffusive properties of biomaterials

    Science.gov (United States)

    Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz

    2016-04-01

    Knowledge of the physical properties of biomaterials is important in understanding and designing processes in the agri-food and wood processing industries. In the study presented in this paper, computational methods were developed and combined with experiments to enhance the identification of agri-food and forest product properties, and to predict heat and water transport in such products. They were based on a finite element model of heat and water transport and supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties and using the estimated values to predict the examined processes, and then comparing predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as input for simulation of the examined processes, enabled a reduction in the uncertainty associated with predictions.
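
    The inverse step can be miniaturized to one dimension: adjust a diffusivity until a forward conduction model reproduces "measured" temperatures. The explicit 1-D scheme and synthetic data below are stand-ins for the paper's 3-D finite element model and real experiments.

        import numpy as np
        from scipy.optimize import least_squares

        def forward(alpha, n=21, steps=200, dt=1e-3):
            """Explicit 1-D heat equation; returns the final temperature profile."""
            T = np.zeros(n)
            T[0] = T[-1] = 1.0                   # fixed boundary temperatures
            dx = 1.0 / (n - 1)
            for _ in range(steps):
                T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            return T

        alpha_true = 0.8
        noise = 1e-3 * np.random.default_rng(2).standard_normal(21)
        measured = forward(alpha_true) + noise   # synthetic "experiment"
        fit = least_squares(lambda a: forward(a[0]) - measured,
                            x0=[0.3], bounds=(0.0, 1.2))
        print(fit.x)  # ~ [0.8]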

  20. Disruption of mycorrhizal extraradical mycelium and changes in leaf water status and soil aggregate stability in rootbox-grown trifoliate orange

    Directory of Open Access Journals (Sweden)

    Ying-Ning eZou

    2015-03-01

    Full Text Available Arbuscular mycorrhizas possess a well-developed extraradical mycelium (ERM) network that explores the surrounding soil for better acquisition of water and nutrients, besides contributing to soil aggregation. Distinctions in ERM functioning were studied in a rootbox system consisting of root+hyphae and root-free hyphae compartments separated by 37-μm nylon mesh with an air gap. Trifoliate orange (Poncirus trifoliata) seedlings were inoculated with Funneliformis mosseae in the root+hyphae compartment, and the ERM network was established between the two compartments. The ERM network in the air gap was disrupted 8 h before harvest (one-time disruption) or repeatedly during seedling acclimation (multiple disruptions). Our results showed that mycorrhizal inoculation induced a significant increase in growth (plant height, stem diameter, and leaf, stem, and root biomass) and physiological characters (leaf relative water content, leaf water potential, and transpiration rate), irrespective of ERM status. Easily-extractable glomalin-related soil protein (EE-GRSP) and total GRSP (T-GRSP) concentrations and mean weight diameter (MWD, an indicator of soil aggregate stability) were significantly higher in the mycorrhizosphere of the root+hyphae and root-free hyphae compartments than in the non-mycorrhizosphere. One-time disruption of the ERM network did not influence plant growth and soil properties but only notably decreased leaf water status. Periodic disruption of the ERM network at weekly intervals markedly inhibited the mycorrhizal effects on plant growth, leaf water, GRSP production, and MWD in the root+hyphae and hyphae chambers. EE-GRSP was the GRSP fraction most responsive to changes in leaf water and MWD under root+hyphae and hyphae conditions. This suggests that periodic disruption of the ERM network had a greater impact than one-time disruption with regard to leaf water, plant growth, and aggregate stability responses, implying that the ERM network aided in developing the host plant metabolically.

  1. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    International Nuclear Information System (INIS)

    Park, Nam-Gyu; Kim, Kyoung-Joo; Kim, Kyoung-Hong; Suh, Jung-Min

    2013-01-01

    Highlights: ► An identification method for the optimal stiffness matrix of a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed-form solution of the problem is derived. ► The method can be extended to systems with a limited number of modes. ► Identification error due to a perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismically induced vibration behavior. The computational time could be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a model, based on an optimal design method, is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The verification of the method is completed by comparing test results and simulation results. This paper shows that the simplified model's dynamic behavior is quite similar to experimental results and that the suggested method is suitable for identifying a reliable mathematical model for fuel assemblies.
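
    The least squares estimate itself fits in a few lines: given a mass matrix M, mode shapes Phi and natural frequencies, the stiffness matrix K minimizing ||K Phi - M Phi Lambda|| can be recovered with a pseudoinverse. This generic reconstruction is a sketch, not the paper's closed-form solution.

        import numpy as np

        def estimate_stiffness(M, Phi, omegas):
            """Least squares K from K @ Phi = M @ Phi @ diag(omega**2)."""
            Lam = np.diag(np.asarray(omegas) ** 2)
            K = M @ Phi @ Lam @ np.linalg.pinv(Phi)  # minimum-norm least squares
            return 0.5 * (K + K.T)                   # enforce symmetry

        # Round-trip check on a 3-DOF chain with all modes retained:
        K_true = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]])
        M = np.eye(3)
        w2, Phi = np.linalg.eigh(K_true)  # eigenvalues are omega**2 when M = I
        print(np.allclose(estimate_stiffness(M, Phi, np.sqrt(w2)), K_true))  # True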

  2. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    Directory of Open Access Journals (Sweden)

    Kumar Parijat Tripathi

    Full Text Available RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing with extraordinary accuracy. It provides a quantitative means to explore the transcriptome of an organism of interest. However, translating this extremely large volume of data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery). It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA (ncRNA) by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is useful software for both NGS and array data. It helps users to characterize de-novo assembled reads obtained from NGS experiments for non-referenced organisms, while it also performs functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. Web application is ...

  3. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    International Nuclear Information System (INIS)

    Coupaud, Sylvie; McLean, Alan N.; Allan, David B.

    2009-01-01

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments in the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites, at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than in thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than males or those with thoracic SCI, respectively. (orig.)

  4. LID: Computer code for identifying atomic and ionic lines below 3500 Angstroms

    International Nuclear Information System (INIS)

    Peek, J.M.; Dukart, R.J.

    1987-08-01

    An interactive computer code has been written to search a data base containing information useful for identifying lines in experimentally-observed spectra or for designing experiments. The data base was the basis for the Kelly and Palumbo critical review of well-resolved lines below 2000 Angstroms, includes lines below 3500 Angstroms for atoms and ions of hydrogen through krypton, and was obtained from R.L. Kelly. The code allows the user to search the data base over a user-specified wavelength region, with the search either limited to atoms or ions of the user's choice or covering all atoms and ions contained in the data base. The line information found in the search is stored in a local file for later reference. A plotting capability is provided to graphically display the lines resulting from the search. Several options are available to control the nature of these graphs. It is also possible to bring in data from another source, such as an experimental spectrum, for display along with the lines from the data-base search. Options for manipulating the experimental spectrum's background intensity and wavelength scale are also available to the user. The intensities of the lines from each ion found in the data-base search can be scaled by a multiplicative constant to better simulate the observed spectrum.
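
    A minimal sketch of the kind of query LID performs: filter a line list by wavelength window and species, with a multiplicative intensity scale for display. The entries and interface here are hypothetical, not the actual LID code.

        # Toy line list: (wavelength in Angstroms, species, relative intensity).
        lines = [
            (1215.67, "H I", 1000.0),
            (1548.20, "C IV", 350.0),
            (2795.53, "Mg II", 600.0),
        ]

        def search(lo, hi, species=None, scale=1.0):
            """Return lines in [lo, hi], optionally for one species, intensities scaled."""
            return sorted((w, s, i * scale) for (w, s, i) in lines
                          if lo <= w <= hi and (species is None or s == species))

        print(search(1200.0, 1600.0))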

  5. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    Energy Technology Data Exchange (ETDEWEB)

    Coupaud, Sylvie [University of Glasgow, Centre for Rehabilitation Engineering, Department of Mechanical Engineering, Glasgow (United Kingdom); Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom); McLean, Alan N.; Allan, David B. [Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom)

    2009-10-15

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments in the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with the lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites, at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than those with thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than males or those with thoracic SCI, respectively. (orig.)

  6. [Key effect genes responding to nerve injury identified by gene ontology and computer pattern recognition].

    Science.gov (United States)

    Pan, Qian; Peng, Jin; Zhou, Xue; Yang, Hao; Zhang, Wei

    2012-07-01

    In order to screen important genes from the large volumes of microarray data generated after nerve injury, we combined the gene ontology (GO) method with computer pattern recognition technology to find key genes responding to nerve injury, and then verified one of these screened-out genes. Data mining and gene ontology analysis of gene chip data GSE26350 were carried out using MATLAB software. Cd44 was selected from the screened-out key gene molecular spectrum by comparing the genes' different GO terms and their positions on the principal component score map. Functional interference was employed to disturb the normal binding of Cd44 and one of its ligands, chondroitin sulfate C (CSC), in order to observe neurite extension. Gene ontology analysis showed that the top genes on the score map (marked by red *) were mainly distributed among molecular function GO terms such as molecular transducer activity, receptor activity and protein binding. Cd44 is one of six effector protein genes, and attracted our attention with its functional diversity. After adding different reagents to the medium to interfere with the normal binding of CSC and Cd44, the inhibition of neurite extension by CSC was relieved to varying degrees. CSC can inhibit neurite extension through binding Cd44 on the neuron membrane. This verifies that important genes in given physiological processes can be identified by gene ontology analysis of gene chip data.
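
    A sketch of the score-map step: project the expression matrix onto its first principal components and flag the genes lying farthest from the origin for GO inspection. The matrix and the cut-off of 20 genes are hypothetical.

        import numpy as np
        from sklearn.decomposition import PCA

        expr = np.random.rand(500, 12)  # hypothetical genes x arrays matrix
        scores = PCA(n_components=2).fit_transform(expr)
        # Genes far from the origin on the score map are candidate key genes.
        dist = np.linalg.norm(scores, axis=1)
        print("candidate key genes (row indices):", np.argsort(dist)[::-1][:20])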

  7. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology; it enables efficient data sharing and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
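
    Locating G-spots in probe sequences reduces to finding runs of guanine; a small sketch follows. The minimum run length of four is an assumption, and the paper's criterion may differ.

        import re

        def g_spots(probe, min_run=4):
            """Return (start, length) of each run of at least min_run guanines."""
            return [(m.start(), len(m.group()))
                    for m in re.finditer(rf"G{{{min_run},}}", probe.upper())]

        print(g_spots("ATGGGGTCGGGGGAT"))  # [(2, 4), (8, 5)]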

  8. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    Science.gov (United States)

    Beltran, Adriana; Salvador, James

    1997-01-01

    In this paper, we show how methods developed for solving a theoretical computer problem of graph isomorphism are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.

  9. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.

  10. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    Science.gov (United States)

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.
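
    As a toy illustration of the clustering step, the sketch below gathers hits around a seed compound by Tanimoto similarity of hashed fingerprints. The fingerprints and the 0.6 cut-off are invented for illustration.

        def tanimoto(fp1, fp2):
            """Tanimoto similarity of two fingerprints stored as sets of on-bits."""
            inter = len(fp1 & fp2)
            return inter / (len(fp1) + len(fp2) - inter)

        hits = {"cmpd_a": {1, 4, 9, 33}, "cmpd_b": {1, 4, 9, 40}, "cmpd_c": {2, 7, 50}}
        seed = hits["cmpd_a"]
        series = [name for name, fp in hits.items() if tanimoto(seed, fp) >= 0.6]
        print(series)  # ['cmpd_a', 'cmpd_b'] -- a seed for one hit series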

  11. E-pharmacovigilance: development and implementation of a computable knowledge base to identify adverse drug reactions.

    Science.gov (United States)

    Neubert, Antje; Dormann, Harald; Prokosch, Hans-Ulrich; Bürkle, Thomas; Rascher, Wolfgang; Sojer, Reinhold; Brune, Kay; Criegee-Rieck, Manfred

    2013-09-01

    Computer-assisted signal generation is an important issue for the prevention of adverse drug reactions (ADRs). However, due to poor standardization of patients' medical data and a lack of computable medical drug knowledge, the specificity of computerized decision support systems for early ADR detection is too low, and thus those systems are not yet implemented in daily clinical practice. We report on a method to formalize knowledge about ADRs based on the Summary of Product Characteristics (SmPCs) and to link it with structured patient data to generate safety signals automatically and with high sensitivity and specificity. A computable ADR knowledge base (ADR-KB) that inherently contains standardized concepts for ADRs (WHO-ART), drugs (ATC) and laboratory test results (LOINC) was built. The system was evaluated in study populations of paediatric and internal medicine inpatients. A total of 262 different ADR concepts related to laboratory findings were linked to 212 LOINC terms. The ADR knowledge base was retrospectively applied to a study population of 970 admissions (474 internal and 496 paediatric patients), who underwent intensive ADR surveillance. The specificity increased from 7% without the ADR-KB up to 73% in internal patients and from 19.6% up to 91% in paediatric inpatients, respectively. This study shows that contextual linkage of patients' medication data with laboratory test results is a useful and reasonable instrument for computer-assisted ADR detection and a valuable step towards a systematic drug safety process. The system enables automated detection of ADRs during clinical practice with a quality close to intensive chart review. © 2013 The Authors. British Journal of Clinical Pharmacology © 2013 The British Pharmacological Society.
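
    The contextual linkage described can be pictured as a joined lookup across ATC, WHO-ART and LOINC vocabularies. In the deliberately simplified sketch below, the drug and lab codes are real vocabulary entries, but the rule set and the abnormal-flag encoding are invented for illustration.

        # ATC drug code -> [(WHO-ART reaction concept, LOINC lab test, abnormal direction)]
        ADR_KB = {
            "J01CA04": [("HEPATIC ENZYMES INCREASED", "1742-6", "high")],  # amoxicillin / ALT
            "M01AB05": [("CREATININE INCREASED", "2160-0", "high")],       # diclofenac / creatinine
        }

        def signals(prescribed_atc_codes, lab_flags):
            """Yield (drug, reaction) whenever a linked lab result is abnormal."""
            for atc in prescribed_atc_codes:
                for reaction, loinc, direction in ADR_KB.get(atc, []):
                    if lab_flags.get(loinc) == direction:
                        yield atc, reaction

        print(list(signals({"J01CA04"}, {"1742-6": "high"})))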

  12. Use of cone beam computed tomography in identifying postmenopausal women with osteoporosis.

    Science.gov (United States)

    Brasileiro, C B; Chalub, L L F H; Abreu, M H N G; Barreiros, I D; Amaral, T M P; Kakehasi, A M; Mesquita, R A

    2017-12-01

    The aim of this study is to correlate radiometric indices from cone beam computed tomography (CBCT) images and bone mineral density (BMD) in postmenopausal women. Quantitative CBCT indices can be used to screen for women with low BMD. Osteoporosis is a disease characterized by the deterioration of bone tissue and the consequent decrease in BMD and increase in bone fragility. Several studies have been performed to assess radiometric indices in panoramic images as low-BMD predictors. The aim of this study is to correlate radiometric indices from CBCT images and BMD in postmenopausal women. Sixty postmenopausal women with indications for dental implants and CBCT evaluation were selected. Dual-energy X-ray absorptiometry (DXA) was performed, and the patients were divided into normal, osteopenia, and osteoporosis groups, according to the World Health Organization (WHO) criteria. Cross-sectional images were used to evaluate the computed tomography mandibular index (CTMI), the computed tomography index (inferior) (CTI (I)) and the computed tomography index (superior) (CTI (S)). Student's t-test was used to compare the differences between the indices of the groups, and the intraclass correlation coefficient (ICC) was used to assess examiner agreement. Statistical analysis showed a high degree of interobserver and intraobserver agreement for all measurements (ICC > 0.80). The mean values of CTMI, CTI (S), and CTI (I) were lower in the osteoporosis group than in the osteopenia and normal groups (p < 0.05). In comparing normal patients and women with osteopenia, there was no statistically significant difference in the mean value of CTI (I) (p = 0.075). Quantitative CBCT indices may help dentists to screen for women with low spinal and femoral bone mineral density so that they can refer postmenopausal women for bone densitometry.

  13. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work concerning the development of a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of software was developed for the identification and analysis of logistics costs for agricultural enterprises.

  14. Identifying Computer-Generated Portraits: The Importance of Training and Incentives.

    Science.gov (United States)

    Mader, Brandon; Banks, Martin S; Farid, Hany

    2017-09-01

    The past two decades have seen remarkable advances in photo-realistic rendering of everything from inanimate objects to landscapes, animals, and humans. We previously showed that despite these tremendous advances, human observers remain fairly good at distinguishing computer-generated from photographic images. Building on these results, we describe a series of follow-up experiments that reveal how to improve observer performance. Of general interest to anyone performing psychophysical studies on Mechanical Turk or similar platforms, we find that observer performance can be significantly improved with the proper incentives.

  15. Computer-based video analysis identifies infants with absence of fidgety movements.

    Science.gov (United States)

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background: Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods: Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post-term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results: Of 241 recordings from 150 infants, 48 (24.1%) were classified with absence of FMs or sporadic FMs using the GMA. The variation of the spatial center of motion (C_SD) during a recording was significantly lower in infants with normal (0.320; 95% confidence interval (CI) 0.309, 0.330) vs. absent or sporadic (0.380; 95% CI 0.361, 0.398) FMs (P<0.001). A triage model with C_SD thresholds chosen for sensitivity of 90% and specificity of 80% gave a 40% referral rate for GMA. Conclusion: Quantitative video analysis during the FMs' period can be used to triage infants at high risk of CP to early intervention or observational GMA.
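
    The quantity of interest, the variation of the spatial center of motion computed from frame differences, can be sketched as below. The frame data and the aggregation into a single C_SD value per recording are illustrative assumptions.

        import numpy as np

        def centroid_of_motion(prev, curr):
            """Centroid of the absolute difference between consecutive frames."""
            diff = np.abs(curr.astype(float) - prev.astype(float))
            total = diff.sum() or 1.0
            ys, xs = np.indices(diff.shape)
            return (xs * diff).sum() / total, (ys * diff).sum() / total

        def c_sd(frames):
            pts = np.array([centroid_of_motion(a, b) for a, b in zip(frames, frames[1:])])
            return pts.std(axis=0).mean()  # one variability summary per recording

        frames = [np.random.rand(48, 64) for _ in range(30)]  # hypothetical video
        print(c_sd(frames))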

  16. Identifying a few foot-and-mouth disease virus signature nucleotide strings for computational genotyping

    Directory of Open Access Journals (Sweden)

    Xu Lizhe

    2008-06-01

    Full Text Available Abstract Background Serotypes of the Foot-and-Mouth disease viruses (FMDVs) have generally been determined by biological experiments. Computational genotyping is not well studied even with the availability of whole viral genomes, due to uneven evolution among genes as well as frequent genetic recombination. Naively using sequence comparison for genotyping achieves only limited success. Results We used 129 FMDV strains with known serotype as training strains to select as many as 140 of the most serotype-specific nucleotide strings. We then constructed a linear-kernel Support Vector Machine classifier using these 140 strings. Under the leave-one-out cross validation scheme, this classifier was able to assign the correct serotype to 127 of these 129 strains, achieving 98.45% accuracy. It also assigned serotypes correctly to an independent test set of 83 other FMDV strains downloaded separately from NCBI GenBank. Conclusion Computational genotyping is much faster and much cheaper than wet-lab based biological experiments, given the availability of detailed molecular sequences. The high accuracy of our proposed method suggests the potential of utilizing a few signature nucleotide strings instead of whole genomes to determine the serotypes of novel FMDV strains.
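
    A minimal analogue of the classifier: represent each genome by counts of short nucleotide strings (character k-mers) and train a linear-kernel SVM. The sequences, labels and k-mer lengths are toy values; the study itself selected 140 specific serotype-discriminative strings.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        genomes = ["ACGGTACGTTAG", "ACGGTTCGTTAA", "TTGGCACGTCAG", "TTGGCTCGTCAA"]
        labels = ["O", "O", "A", "A"]  # hypothetical serotypes

        # Character k-mers stand in for the signature nucleotide strings.
        model = make_pipeline(CountVectorizer(analyzer="char", ngram_range=(6, 8)),
                              LinearSVC())
        model.fit(genomes, labels)
        print(model.predict(["ACGGTACGTTAA"]))  # expected: ['O'] on this toy data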

  17. Identifying Ghanaian Pre-Service Teachers' Readiness for Computer Use: A Technology Acceptance Model Approach

    Science.gov (United States)

    Gyamfi, Stephen Adu

    2016-01-01

    This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…

  18. SABER: a computational method for identifying active sites for new reactions.

    Science.gov (United States)

    Nosrati, Geoffrey R; Houk, K N

    2012-05-01

    A software suite, SABER (Selection of Active/Binding sites for Enzyme Redesign), has been developed for the analysis of atomic geometries in protein structures, using a geometric hashing algorithm (Barker and Thornton, Bioinformatics 2003;19:1644-1649). SABER is used to explore the Protein Data Bank (PDB) to locate proteins with a specific 3D arrangement of catalytic groups to identify active sites that might be redesigned to catalyze new reactions. As a proof-of-principle test, SABER was used to identify enzymes that have the same catalytic group arrangement present in o-succinyl benzoate synthase (OSBS). Among the highest-scoring scaffolds identified by the SABER search for enzymes with the same catalytic group arrangement as OSBS were L-Ala D/L-Glu epimerase (AEE) and muconate lactonizing enzyme II (MLE), both of which have been redesigned to become effective OSBS catalysts, as demonstrated by experiments. Next, we used SABER to search for naturally existing active sites in the PDB with catalytic groups similar to those present in the designed Kemp elimination enzyme KE07. From over 2000 geometric matches to the KE07 active site, SABER identified 23 matches that corresponded to residues from known active sites. The best of these matches, with a 0.28 Å catalytic atom RMSD to KE07, was then redesigned to be compatible with the Kemp elimination using RosettaDesign. We also used SABER to search for potential Kemp eliminases using a theozyme predicted to provide a greater rate acceleration than the active site of KE07, and used Rosetta to create a design based on the proteins identified. Copyright © 2012 The Protein Society.

  19. Coronary plaque quantification and fractional flow reserve by coronary computed tomography angiography identify ischaemia-causing lesions

    DEFF Research Database (Denmark)

    Gaur, Sara; Øvrehus, Kristian Altern; Dey, Damini

    2016-01-01

    AIMS: Coronary plaque characteristics are associated with ischaemia. Differences in plaque volumes and composition may explain the discordance between coronary stenosis severity and ischaemia. We evaluated the association between coronary stenosis severity, plaque characteristics, coronary computed tomography angiography (CTA)-derived fractional flow reserve (FFRCT), and lesion-specific ischaemia identified by FFR in a substudy of the NXT trial (Analysis of Coronary Blood Flow Using CT Angiography: Next Steps). METHODS AND RESULTS: Coronary CTA stenosis, plaque volumes, FFRCT, and FFR were assessed

  20. An Integrated Bioinformatics and Computational Biology Approach Identifies New BH3-Only Protein Candidates.

    Science.gov (United States)

    Hawley, Robert G; Chen, Yuzhong; Riz, Irene; Zeng, Chen

    2012-05-04

    In this study, we utilized an integrated bioinformatics and computational biology approach in search of new BH3-only proteins belonging to the BCL2 family of apoptotic regulators. The BH3 (BCL2 homology 3) domain mediates specific binding interactions among various BCL2 family members. It is composed of an amphipathic α-helical region of approximately 13 residues that has only a few amino acids that are highly conserved across all members. Using a generalized motif, we performed a genome-wide search for novel BH3-containing proteins in the NCBI Consensus Coding Sequence (CCDS) database. In addition to known pro-apoptotic BH3-only proteins, 197 proteins were recovered that satisfied the search criteria. These were categorized according to α-helical content and predictive binding to BCL-xL (encoded by BCL2L1) and MCL-1, two representative anti-apoptotic BCL2 family members, using position-specific scoring matrix models. Notably, the list is enriched for proteins associated with autophagy as well as a broad spectrum of cellular stress responses such as endoplasmic reticulum stress, oxidative stress, antiviral defense, and the DNA damage response. Several potential novel BH3-containing proteins are highlighted. In particular, the analysis strongly suggests that the apoptosis inhibitor and DNA damage response regulator, AVEN, which was originally isolated as a BCL-xL-interacting protein, is a functional BH3-only protein representing a distinct subclass of BCL2 family members.
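
    Scoring candidate BH3 regions with a position-specific scoring matrix can be sketched as follows. The matrix here is random, standing in for one trained on known BH3 sequences, and the query sequence is arbitrary.

        import numpy as np

        AA = "ACDEFGHIKLMNPQRSTVWY"
        rng = np.random.default_rng(0)
        pssm = rng.normal(size=(13, 20))  # placeholder 13-residue BH3 PSSM

        def best_window(seq):
            """Return (score, start) of the best-scoring 13-residue window."""
            idx = {a: i for i, a in enumerate(AA)}
            return max((sum(pssm[j, idx[c]] for j, c in enumerate(seq[s:s + 13])), s)
                       for s in range(len(seq) - 12))

        print(best_window("MEGSDALALRLACIGDEMDVSLRAPRLAQLSEVAMHSLGLA"))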

  1. Automated computation of arbor densities: a step toward identifying neuronal cell types

    Directory of Open Access Journals (Sweden)

    Uygar Sümbül

    2014-11-01

    Full Text Available The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.

  2. Computer Vision System For Locating And Identifying Defects In Hardwood Lumber

    Science.gov (United States)

    Conners, Richard W.; Ng, Chong T.; Cho, Tai-Hoon; McMillin, Charles W.

    1989-03-01

    This paper describes research aimed at developing an automatic cutup system for use in the rough mills of the hardwood furniture and fixture industry. In particular, this paper describes attempts to create the vision system that will power this automatic cutup system. There are a number of factors that make the development of such a vision system a challenge. First there is the innate variability of the wood material itself. No two species look exactly the same, in fact, they can have a significant visual difference in appearance among species. Yet a truly robust vision system must be able to handle a variety of such species, preferably with no operator intervention required when changing from one species to another. Secondly, there is a good deal of variability in the definition of what constitutes a removable defect. The hardwood furniture and fixture industry is diverse in the nature of the products that it makes. The products range from hardwood flooring to fancy hardwood furniture, from simple mill work to kitchen cabinets. Thus depending on the manufacturer, the product, and the quality of the product the nature of what constitutes a removable defect can and does vary. The vision system must be such that it can be tailored to meet each of these unique needs, preferably without any additional program modifications. This paper will describe the vision system that has been developed. It will assess the current system capabilities, and it will discuss the directions for future research. It will be argued that artificial intelligence methods provide a natural mechanism for attacking this computer vision application.

  3. Influence of intracanal post on apical periodontitis identified by cone-beam computed tomography

    International Nuclear Information System (INIS)

    Estrela, Carlos; Porto, Olavo Cesar Lyra; Rodrigues, Cleomar Donizeth; Bueno, Mike Reis; Pecora, Jesus Djalma

    2009-01-01

    The determination of the success of endodontic treatment has often been discussed based on the outcome obtained by periapical radiography. The aim of this study was to verify the influence of intracanal posts on apical periodontitis detected by cone-beam computed tomography (CBCT). A consecutive sample of 1020 images (periapical radiographs and CBCT scans) taken from 619 patients (245 men; mean age, 50.1 years) between February 2008 and September 2009 was used in this study. Presence and length of intracanal posts (short, medium and long) were associated with apical periodontitis (AP). The chi-square test was used for statistical analyses. The significance level was set at p<0.01. The kappa value was used to assess examiner variability. From a total of 591 intracanal posts, AP was observed by periapical radiography in 15.06%, 18.78% and 7.95% for the short, medium and long lengths, respectively (p=0.466). For the same post lengths, AP was verified by CBCT scans in 24.20%, 26.40% and 11.84%, respectively (p=0.154). From a total of 1,020 teeth used in this study, AP was detected in 397 (38.92%) by periapical radiography and in 614 (60.19%) by CBCT scans (p<0.001). The distribution of intracanal posts in the different dental groups showed the highest prevalence in maxillary anterior teeth (54.79%). Intracanal post length did not influence AP. AP was detected more frequently when the CBCT method was used. (author)

  4. Influence of intracanal post on apical periodontitis identified by cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Estrela, Carlos; Porto, Olavo Cesar Lyra; Rodrigues, Cleomar Donizeth [Federal University of Goias (UFG), Goiania, GO (Brazil). Dental School; Bueno, Mike Reis [University of Cuiaba (UNIC), MT (Brazil). Dental School; Pecora, Jesus Djalma, E-mail: estrela3@terra.com.b [University of Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Dental School

    2009-07-01

    The determination of the success of endodontic treatment has often been discussed based on the outcome obtained by periapical radiography. The aim of this study was to verify the influence of intracanal posts on apical periodontitis detected by cone-beam computed tomography (CBCT). A consecutive sample of 1020 images (periapical radiographs and CBCT scans) taken from 619 patients (245 men; mean age, 50.1 years) between February 2008 and September 2009 was used in this study. Presence and length of intracanal posts (short, medium and long) were associated with apical periodontitis (AP). The chi-square test was used for statistical analyses. The significance level was set at p<0.01. The kappa value was used to assess examiner variability. From a total of 591 intracanal posts, AP was observed by periapical radiography in 15.06%, 18.78% and 7.95% for the short, medium and long lengths, respectively (p=0.466). For the same post lengths, AP was verified by CBCT scans in 24.20%, 26.40% and 11.84%, respectively (p=0.154). From a total of 1,020 teeth used in this study, AP was detected in 397 (38.92%) by periapical radiography and in 614 (60.19%) by CBCT scans (p<0.001). The distribution of intracanal posts in the different dental groups showed the highest prevalence in maxillary anterior teeth (54.79%). Intracanal post length did not influence AP. AP was detected more frequently when the CBCT method was used. (author)

  5. Diagnostic Accuracy of Periapical Radiography and Cone-beam Computed Tomography in Identifying Root Canal Configuration of Human Premolars.

    Science.gov (United States)

    Sousa, Thiago Oliveira; Haiter-Neto, Francisco; Nascimento, Eduarda Helena Leandro; Peroni, Leonardo Vieira; Freitas, Deborah Queiroz; Hassan, Bassam

    2017-07-01

    The aim of this study was to assess the diagnostic accuracy of periapical radiography (PR) and cone-beam computed tomographic (CBCT) imaging in the detection of the root canal configuration (RCC) of human premolars. PR and CBCT imaging of 114 extracted human premolars were evaluated by 2 oral radiologists. RCC was recorded according to Vertucci's classification. Micro-computed tomographic imaging served as the gold standard to determine RCC. Accuracy, sensitivity, specificity, and predictive values were calculated. The Friedman test compared both PR and CBCT imaging with the gold standard. CBCT imaging showed higher values for all diagnostic tests compared with PR. Accuracy was 0.55 and 0.89 for PR and CBCT imaging, respectively. There was no difference between CBCT imaging and the gold standard, whereas PR differed from both CBCT and micro-computed tomographic imaging (P < .0001). CBCT imaging was more accurate than PR for evaluating different types of RCC individually. Canal configuration types III, VII, and "other" were poorly identified on CBCT imaging with a detection accuracy of 50%, 0%, and 43%, respectively. With PR, all canal configurations except type I were poorly visible. PR presented low performance in the detection of RCC in premolars, whereas CBCT imaging showed no difference compared with the gold standard. Canals with complex configurations were less identifiable using both imaging methods, especially PR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
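
    For reference, the reported diagnostic quantities all derive from a two-by-two confusion table against the micro-computed tomographic gold standard; a small helper with hypothetical counts is shown below.

        def diagnostics(tp, fp, fn, tn):
            return {
                "accuracy": (tp + tn) / (tp + fp + fn + tn),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Hypothetical counts for CBCT against the gold standard.
        print(diagnostics(tp=55, fp=4, fn=9, tn=46))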

  6. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (Java code and executable file) was developed and tested to support an analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting. The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Its careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
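
    The core counting task, mutually exclusive combinations (order-free) versus permutations (order retained), can be sketched in a few lines. The records are hypothetical, and the published Tool itself is a Java program.

        from collections import Counter

        # Hypothetical record-level disease lists (order may encode, e.g., onset).
        records = [["DM", "HTN"], ["HTN", "DM"], ["DM", "HTN", "CKD"]]

        combos = Counter(frozenset(r) for r in records)  # order ignored
        perms = Counter(tuple(r) for r in records)       # order retained
        print(len(combos), "unique combinations;", len(perms), "unique permutations")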

  7. A novel computational method identifies intra- and inter-species recombination events in Staphylococcus aureus and Streptococcus pneumoniae.

    Directory of Open Access Journals (Sweden)

    Lisa Sanguinetti

    Full Text Available Advances in high-throughput DNA sequencing technologies have driven an explosion in the number of sequenced bacterial genomes. Comparative sequence analysis frequently reveals evidence of homologous recombination occurring with different mechanisms and rates in different species, but the large-scale use of computational methods to identify recombination events is hampered by their high computational costs. Here, we propose a new method to identify recombination events in large datasets of whole genome sequences. Using a filtering procedure on the gene conservation profiles of a test genome against a panel of strains, this algorithm identifies sets of contiguous genes acquired by homologous recombination. The locations of the recombination breakpoints are determined using a statistical test that is able to account for the differences in the natural rate of evolution between different genes. The algorithm was tested on a dataset of 75 genomes of Staphylococcus aureus and 50 genomes comprising different streptococcal species, and was able to detect intra-species recombination events in S. aureus and in Streptococcus pneumoniae. Furthermore, we found evidence of an inter-species exchange of genetic material between S. pneumoniae and Streptococcus mitis, a closely related commensal species that colonizes the same ecological niche. The method has been implemented in an R package, Reco, which is freely available from supplementary material, and provides a rapid screening tool to investigate recombination on a genome-wide scale from sequence data.

  8. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
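
    A minimal version of the sliding-window heuristic: smooth probe log2-ratios with sequentially displaced windows and keep the most extreme 10% per chromosome. The window size, data and threshold rule are illustrative only.

        import numpy as np

        def sliding_scores(log2, win=11):
            """Mean log2-ratio in a window centred on each probe."""
            pad = win // 2
            return np.array([log2[max(0, i - pad):i + pad + 1].mean()
                             for i in range(len(log2))])

        log2 = np.random.default_rng(1).normal(0, 0.2, 500)
        log2[200:240] += 0.8                    # hypothetical gained region
        scores = sliding_scores(log2)
        cut = np.quantile(np.abs(scores), 0.9)  # keep the top 10% of the chromosome
        print(np.flatnonzero(np.abs(scores) >= cut)[:5])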

  9. Use of computed tomography to identify atrial fibrillation associated differences in left atrial wall thickness and density.

    Science.gov (United States)

    Dewland, Thomas A; Wintermark, Max; Vaysman, Anna; Smith, Lisa M; Tong, Elizabeth; Vittinghoff, Eric; Marcus, Gregory M

    2013-01-01

    Left atrial (LA) tissue characteristics may play an important role in atrial fibrillation (AF) induction and perpetuation. Although frequently used in clinical practice, computed tomography (CT) has not been employed to describe differences in LA wall properties between AF patients and controls. We sought to noninvasively characterize AF-associated differences in LA tissue using CT. CT images of the LA were obtained in 98 consecutive patients undergoing AF ablation and in 89 controls. A custom software algorithm was used to measure wall thickness and density in four prespecified regions of the LA. On average, LA walls were thinner in AF patients (-15.5%, 95% confidence interval [CI] -23.2 to -7.8%). This analysis identified significant thinning of the LA wall and regional alterations in tissue density in patients with a history of AF. These findings suggest differences in LA tissue composition can be noninvasively identified and quantified using CT. ©2012, The Authors. Journal compilation ©2012 Wiley Periodicals, Inc.

  10. Avaliação de citrandarins e outros híbridos de trifoliata como porta-enxertos para citros em São Paulo Performance of citrandarins and others trifoliate hybrids rootstocks in Sao Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    Silvia Blumer

    2005-08-01

    Full Text Available Valencia sweet orange trees budded onto citrandarins and other trifoliate hybrid rootstocks from the USDA Horticultural Research Laboratory, Fort Pierce, Florida, were planted in 1988 in Itirapina, São Paulo State, Brazil, on a sandy-textured Red-Yellow Latosol (Oxisol) and managed without irrigation. Tristeza and blight diseases are endemic in this area. The citrandarin Sunki x English (1.628), not differing statistically from Cleopatra x Rubidoux (1.660), Cleopatra x English (710), Cleopatra x Swingle (715) and Rangpur lime x Carrizo citrange (717), induced the highest fruit production in the first five harvests of the experiment (1991-1995), and the first three of these rootstocks were the most productive in the last three harvests. Troyer and Carrizo citranges were significantly inferior to the citrandarins Sunki x English (1.628), Cleopatra x Rubidoux (1.660) and Cleopatra x English (710) in all years except 1994. None of the trees showed symptoms of susceptibility to tristeza or to blight. Seedlings of the rootstocks differed in the lesion area produced by inoculation with Phytophthora parasitica: the citrandarins Cleopatra x Swingle (1.587), Cleopatra x Trifoliata (1.574), Cleopatra x Rubidoux (1.600), Clementine x Trifoliata (1.615) and Rangpur lime x Carrizo citrange (717) were significantly more resistant than Cleopatra x Christian (712), Sunki x English (1.628), Cleopatra x Swingle (715) and Cleopatra x English (710).

  11. Computed Tomography Fractional Flow Reserve Can Identify Culprit Lesions in Aortoiliac Occlusive Disease Using Minimally Invasive Techniques.

    Science.gov (United States)

    Ward, Erin P; Shiavazzi, Daniele; Sood, Divya; Marsden, Allison; Lane, John; Owens, Erik; Barleben, Andrew

    2017-01-01

    Currently, the gold standard diagnostic examination for significant aortoiliac lesions is angiography. Fractional flow reserve (FFR) has a growing body of literature in coronary artery disease as a minimally invasive diagnostic procedure. Improvements in numerical hemodynamics have allowed for an accurate and minimally invasive approach to estimating FFR, utilizing cross-sectional imaging. We aim to demonstrate a similar approach to aortoiliac occlusive disease (AIOD). A retrospective review evaluated 7 patients with claudication and cross-sectional imaging showing AIOD. FFR was subsequently measured during conventional angiogram with pull-back pressures in a retrograde fashion. To estimate computed tomography (CT) FFR, CT angiography (CTA) image data were analyzed using the SimVascular software suite to create a computational fluid dynamics model of the aortoiliac system. Inlet flow conditions were derived based on cardiac output, while 3-element Windkessel outlet boundary conditions were optimized to match the expected systolic and diastolic pressures, with outlet resistance distributed based on Murray's law. The data were evaluated with a Student's t-test and receiver operating characteristic curve. All patients had evidence of AIOD on CT and FFR was successfully measured during angiography. The modeled data were found to have high sensitivity and specificity between the measured and CT FFR (P = 0.986, area under the curve = 1). The average difference between the measured and calculated FFRs was 0.136, with a range from 0.03 to 0.30. CT FFR successfully identified aortoiliac lesions with significant pressure drops that were identified with angiographically measured FFR. CT FFR has the potential to provide a minimally invasive approach to identify flow-limiting stenosis for AIOD. Copyright © 2016 Elsevier Inc. All rights reserved.
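
    One concrete piece of such a workflow is distributing outlet resistance across branches. With flow proportional to radius cubed (Murray's law), parallel branch resistances can be assigned as sketched below; this is a simplification of the boundary-condition tuning the authors describe, with made-up radii.

        def murray_resistances(radii_mm, r_total):
            """Split a total outlet resistance so flow divides as radius**3."""
            cubes = [r ** 3 for r in radii_mm]
            total = sum(cubes)
            # Parallel combination of R_i = r_total * total / cubes_i recovers r_total.
            return [r_total * total / c for c in cubes]

        print(murray_resistances([4.0, 3.2, 2.5], r_total=1.0))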

  12. Sentinel nodes identified by computed tomography-lymphography accurately stage the axilla in patients with breast cancer

    International Nuclear Information System (INIS)

    Motomura, Kazuyoshi; Sumino, Hiroshi; Noguchi, Atsushi; Horinouchi, Takashi; Nakanishi, Katsuyuki

    2013-01-01

    Sentinel node biopsy often results in the identification and removal of multiple nodes as sentinel nodes, although most of these nodes could be non-sentinel nodes. This study investigated whether computed tomography-lymphography (CT-LG) can distinguish sentinel nodes from non-sentinel nodes and whether sentinel nodes identified by CT-LG can accurately stage the axilla in patients with breast cancer. This study included 184 patients with breast cancer and clinically negative nodes. Contrast agent was injected interstitially. The location of sentinel nodes was marked on the skin surface using a CT laser light navigator system. Lymph nodes located just under the marks were first removed as sentinel nodes. Then, all dyed nodes or all hot nodes were removed. The mean number of sentinel nodes identified by CT-LG was significantly lower than that of dyed and/or hot nodes removed (1.1 vs 1.8, p <0.0001). Twenty-three (12.5%) patients had ≥2 sentinel nodes identified by CT-LG removed, whereas 94 (51.1%) of patients had ≥2 dyed and/or hot nodes removed (p <0.0001). Pathological evaluation demonstrated that 47 (25.5%) of 184 patients had metastasis to at least one node. All 47 patients demonstrated metastases to at least one of the sentinel nodes identified by CT-LG. CT-LG can distinguish sentinel nodes from non-sentinel nodes, and sentinel nodes identified by CT-LG can accurately stage the axilla in patients with breast cancer. Successful identification of sentinel nodes using CT-LG may facilitate image-based diagnosis of metastasis, possibly leading to the omission of sentinel node biopsy

  13. A Novel Imaging Technique (X-Map) to Identify Acute Ischemic Lesions Using Noncontrast Dual-Energy Computed Tomography.

    Science.gov (United States)

    Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi

    2017-01-01

    We evaluated whether X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours after the onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with 2 X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after the onset. Using these DECT scans, the X-map was reconstructed based on 3-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) and diffusion-weighted imaging (DWI). The X-map showed more sensitivity to identify the lesions as an area of lower attenuation value than a simulated standard CT in all 6 patients. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in the attenuation value in the peri-infarct area within 1 day after the onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions. However, the X-map cannot replace a simulated standard CT to diagnose acute cerebral infarction. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Advanced computational biology methods identify molecular switches for malignancy in an EGF mouse model of liver cancer.

    Directory of Open Access Journals (Sweden)

    Philip Stegmaier

    Full Text Available The molecular causes by which the epidermal growth factor receptor tyrosine kinase induces malignant transformation are largely unknown. To better understand EGF's transforming capacity, whole genome scans were applied to a transgenic mouse model of liver cancer and subjected to advanced methods of computational analysis to construct de novo gene regulatory networks based on a combination of sequence analysis and entrained graph-topological algorithms. Here we identified transcription factors, processes, key nodes and molecules to connect as yet unknown interacting partners at the level of protein-DNA interaction. Many of those could be confirmed by electromobility band shift assay at recognition sites of gene-specific promoters and by western blotting of nuclear proteins. A novel cellular regulatory circuitry could therefore be proposed that connects cell cycle regulated genes with components of the EGF signaling pathway. Promoter analysis of differentially expressed genes suggested that the majority of regulated transcription factors display specificity to either the pre-tumor or the tumor state. A subsequent search for signal transduction key nodes upstream of the identified transcription factors and their targets suggested that the insulin-like growth factor pathway renders the tumor cells independent of EGF receptor activity. Notably, expression of IGF2, in addition to many components of this pathway, was highly upregulated in tumors. Together, we propose a switch in autocrine signaling to foster tumor growth that was initially triggered by EGF, and demonstrate the knowledge gained from promoter analysis combined with upstream key node identification.

  15. Identifying shared genetic structure patterns among Pacific Northwest forest taxa: insights from use of visualization tools and computer simulations.

    Directory of Open Access Journals (Sweden)

    Mark P Miller

    2010-10-01

    Full Text Available Identifying causal relationships in phylogeographic and landscape genetic investigations is notoriously difficult, but can be facilitated by use of multispecies comparisons. We used data visualizations to identify common spatial patterns within single lineages of four taxa inhabiting Pacific Northwest forests (northern spotted owl: Strix occidentalis caurina; red tree vole: Arborimus longicaudus; southern torrent salamander: Rhyacotriton variegatus; and western white pine: Pinus monticola). Visualizations suggested that, despite occupying the same geographical region and habitats, species responded differently to prevailing historical processes. S. o. caurina and P. monticola demonstrated directional patterns of spatial genetic structure where genetic distances and diversity were greater in southern versus northern locales. A. longicaudus and R. variegatus displayed opposite patterns where genetic distances were greater in northern versus southern regions. Statistical analyses of directional patterns subsequently confirmed observations from visualizations. Based upon regional climatological history, we hypothesized that observed latitudinal patterns may have been produced by range expansions. Subsequent computer simulations confirmed that directional patterns can be produced by expansion events. We discuss phylogeographic hypotheses regarding historical processes that may have produced observed patterns. Inferential methods used here may become increasingly powerful as detailed simulations of organisms and historical scenarios become plausible. We further suggest that inter-specific comparisons of historical patterns take place prior to drawing conclusions regarding effects of current anthropogenic change within landscapes.

  16. Machine Learning Classification to Identify the Stage of Brain-Computer Interface Therapy for Stroke Rehabilitation Using Functional Connectivity

    Directory of Open Access Journals (Sweden)

    Rosaleena Mohanty

    2018-05-01

    Full Text Available Interventional therapy using brain-computer interface (BCI) technology has shown promise in facilitating motor recovery in stroke survivors; however, the impact of this form of intervention on functional networks outside of the motor network specifically is not well-understood. Here, we investigated resting-state functional connectivity (rs-FC) in stroke participants undergoing BCI therapy across stages, namely pre- and post-intervention, to identify discriminative functional changes using a machine learning classifier with the goal of categorizing participants into one of the two therapy stages. Twenty chronic stroke participants with persistent upper-extremity motor impairment received neuromodulatory training using a closed-loop neurofeedback BCI device, and rs-functional MRI (rs-fMRI) scans were collected at four time points: pre-, mid-, post-, and 1 month post-therapy. To evaluate the peak effects of this intervention, rs-FC was analyzed from two specific stages, namely pre- and post-therapy. In total, 236 seeds spanning both motor and non-motor regions of the brain were computed at each stage. A univariate feature selection was applied to reduce the number of features, followed by a principal component-based data transformation used by a linear binary support vector machine (SVM) classifier to classify each participant into a therapy stage. The SVM classifier achieved a cross-validation accuracy of 92.5% using a leave-one-out method. Outside of the motor network, seeds from the fronto-parietal task control, default mode, subcortical, and visual networks emerged as important contributors to the classification. Furthermore, a higher number of functional changes were observed to be strengthening from the pre- to post-therapy stage than the ones weakening, both of which involved motor and non-motor regions of the brain. These findings may provide new evidence to support the potential clinical utility of BCI therapy as a form of stroke
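
    The described analysis chain (univariate feature selection, principal-component transformation, linear SVM, leave-one-out validation) maps naturally onto a pipeline like the one below. The feature counts and data are placeholders, not the study's.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        X = np.random.rand(40, 27730)  # hypothetical rs-FC features, 20 subjects x 2 stages
        y = np.repeat([0, 1], 20)      # 0 = pre-therapy, 1 = post-therapy

        clf = make_pipeline(SelectKBest(f_classif, k=200),
                            PCA(n_components=10),
                            LinearSVC())
        print("LOO accuracy:", cross_val_score(clf, X, y, cv=LeaveOneOut()).mean())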

  17. Field performance of "marsh seedless" grapefruit on trifoliate orange inoculated with viroids in Brazil Desempenho do pomeleiro "marsh seedles" enxertado em trifoliata inoculado com viróides no Brasil

    Directory of Open Access Journals (Sweden)

    Eduardo Sanches Stuchi

    2007-12-01

    Full Text Available Some viroids reduce citrus tree growth and may be used for tree size control, aiming at the establishment of orchards with close tree spacing that may provide higher productivity than conventional ones. To study the effects of citrus viroid inoculation on the vegetative growth, yield and fruit quality of 'Marsh Seedless' grapefruit (Citrus paradisi Macf.) grafted on trifoliate orange [Poncirus trifoliata (L.) Raf.], an experiment was set up in January 1991, in Bebedouro, São Paulo State, Brazil. The experimental design was randomized blocks with four treatments and two plants per plot: viroid isolate Citrus exocortis viroid (CEVd) + Hop stunt viroid (HSVd, CVd-II, a non-cachexia variant) + Citrus viroid III (CVd-III); viroid isolate Hop stunt viroid (HSVd, CVd-II, a non-cachexia variant) + Citrus viroid III (CVd-III); and two controls: healthy buds (control) and no grafting (absolute control). Inoculation was done in the field, six months after planting, by bud grafting. Both isolates reduced tree growth (trunk diameter, plant height, canopy diameter and volume). Trees not inoculated yielded better (average of eleven harvests) than inoculated ones, but the productivity was the same after 150 months. Fruit quality was affected by viroid inoculation but not in a restrictive way. The use of such severe dwarfing isolates for high-density plantings of grapefruit on trifoliate orange rootstock is not recommended.

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  19. IDENTIFYING THE DETERMINANTS OF CLOUD COMPUTING ADOPTION IN A GOVERNMENT SECTOR – A CASE STUDY OF SAUDI ORGANISATION

    OpenAIRE

    Alsanea, Majed; Wainwright, David

    2014-01-01

    The adoption of Cloud Computing technology is an essential step forward within both the public and private sectors, particularly in the context of the current economic crisis. However, the trend is struggling for many reasons. The purpose of this study is to establish the foundations for the development of a framework to guide government organisations through the process of transferring to Cloud Computing technology. The main aim of this research is to evaluate the factors affecting the adopt...

  20. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Directory of Open Access Journals (Sweden)

    Daniel Durstewitz

    2017-06-01

    Full Text Available The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover
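
    The generative model described here can be written compactly: a latent state z[t] evolves as z[t] = A z[t-1] + W max(0, z[t-1]) + h plus Gaussian process noise, and is observed through x[t] = B z[t] plus observation noise. A minimal simulation sketch of that state space form (all dimensions and parameter values are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_plrnn(A, W, h, B, z0, T, sigma_z=0.01, sigma_x=0.05):
        """Simulate a piecewise-linear RNN state space model:
        latent:   z[t] = A @ z[t-1] + W @ relu(z[t-1]) + h + process noise
        observed: x[t] = B @ z[t] + observation noise
        """
        dz, dx = z0.shape[0], B.shape[0]
        z = np.empty((T, dz))
        z[0] = z0
        for t in range(1, T):
            relu = np.maximum(z[t - 1], 0.0)        # the piecewise-linear term
            z[t] = A @ z[t - 1] + W @ relu + h + sigma_z * rng.standard_normal(dz)
        x = z @ B.T + sigma_x * rng.standard_normal((T, dx))
        return z, x

    # Toy setup: 3 latent states observed through 10 recorded "units"
    dz, dx, T = 3, 10, 500
    A = 0.9 * np.eye(dz)                            # stable linear dynamics
    W = 0.1 * rng.standard_normal((dz, dz))         # nonlinear coupling
    B = rng.standard_normal((dx, dz))               # observation matrix
    z, x = simulate_plrnn(A, W, np.zeros(dz), B, rng.standard_normal(dz), T)
    ```

    In the paper these parameters are not set by hand but estimated from the recordings by Expectation-Maximization, with the global Laplace approximation supplying the latent state distribution in the E-step.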

  1. Exploring Students Intentions to Study Computer Science and Identifying the Differences among ICT and Programming Based Courses

    Science.gov (United States)

    Giannakos, Michail N.

    2014-01-01

    Computer Science (CS) courses comprise both Programming and Information and Communication Technology (ICT) issues; however these two areas have substantial differences, inter alia the attitudes and beliefs of the students regarding the intended learning content. In this research, factors from the Social Cognitive Theory and Unified Theory of…

  2. Post-mortem computed tomography angiography utilizing barium sulfate to identify microvascular structures : a preliminary phantom model and case study

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Kuster, Lidy; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2016-01-01

    We investigated the use of computed tomography angiography (CTA) to visualize microvascular structures in a vessel-mimicking phantom and post-mortem (PM) bodies. A contrast agent was used based on 22% barium sulfate, 20% polyethylene glycol and 58% distilled water. A vessel-mimicking phantom

  3. Computer Breakdown as a Stress Factor during Task Completion under Time Pressure: Identifying Gender Differences Based on Skin Conductance

    Directory of Open Access Journals (Sweden)

    René Riedl

    2013-01-01

    Full Text Available In today’s society, as computers, the Internet, and mobile phones pervade almost every corner of life, the impact of Information and Communication Technologies (ICT) on humans is dramatic. The use of ICT, however, may also have a negative side. Human interaction with technology may lead to notable stress perceptions, a phenomenon referred to as technostress. An investigation of the literature reveals that computer users’ gender has largely been ignored in technostress research, treating users as “gender-neutral.” To close this significant research gap, we conducted a laboratory experiment in which we investigated users’ physiological reaction to the malfunctioning of technology. Based on theories which explain that men, in contrast to women, are more sensitive to “achievement stress,” we predicted that male users would exhibit higher levels of stress than women in cases of system breakdown during the execution of a human-computer interaction task under time pressure, if compared to a breakdown situation without time pressure. Using skin conductance as a stress indicator, the hypothesis was confirmed. Thus, this study shows that user gender is crucial to better understanding the influence of stress factors such as computer malfunctions on physiological stress reactions.

  4. Development of the regional EPR and PACS sharing system on the infrastructure of cloud computing technology controlled by patient identifier cross reference manager.

    Science.gov (United States)

    Kondoh, Hiroshi; Teramoto, Kei; Kawai, Tatsurou; Mochida, Maki; Nishimura, Motohiro

    2013-01-01

    The newly developed Oshidori-Net2 provides medical professionals with remote access to the electronic patient record systems (EPRs) and PACSs of four hospitals, from different vendors, using cloud computing technology and a patient identifier cross-reference manager. Operation started in April 2012, and the system was applied to patients who moved between hospitals. The objective is to show the merits and demerits of the new system.
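
    The patient identifier cross-reference manager mentioned here maintains a master index linking each hospital's local patient IDs to one shared master ID, so a record retrieved at one site can be matched to the same patient at another. A toy lookup sketch (all identifiers and structures are invented for illustration):

    ```python
    # Toy sketch of a patient-identifier cross-reference (PIX-style) lookup.
    # Each hospital keeps its own local IDs; a master index links them.
    master_index = {
        "MPI-0001": {"hospital_A": "A-123", "hospital_B": "B-977"},
        "MPI-0002": {"hospital_A": "A-456", "hospital_C": "C-310"},
    }

    def resolve(local_system, local_id):
        """Return the master patient ID for a hospital-local identifier."""
        for mpi, local_ids in master_index.items():
            if local_ids.get(local_system) == local_id:
                return mpi
        return None

    print(resolve("hospital_B", "B-977"))  # -> MPI-0001
    ```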

  5. Binary Logistic Regression Analysis in Assessment and Identifying Factors That Influence Students' Academic Achievement: The Case of College of Natural and Computational Science, Wolaita Sodo University, Ethiopia

    Science.gov (United States)

    Zewude, Bereket Tessema; Ashine, Kidus Meskele

    2016-01-01

    An attempt has been made to assess and identify the major variables that influence student academic achievement at the College of Natural and Computational Science of Wolaita Sodo University in Ethiopia. Study time, peer influence, securing first choice of department, arranging study time outside class, amount of money received from family, good life…

  6. Films, Affective Computing and Aesthetic Experience: Identifying Emotional and Aesthetic Highlights from Multimodal Signals in a Social Setting

    OpenAIRE

    Kostoulas, Theodoros; Chanel, Guillaume; Muszynski, Michal; Lombardo, Patrizia; Pun, Thierry

    2017-01-01

    Over the last few years, affective computing has been strengthening its ties with the humanities, exploring and building understanding of people’s responses to specific artistic multimedia stimuli. “Aesthetic experience” is acknowledged to be the subjective part of some artistic exposure, namely, the inner affective state of a person exposed to some artistic object. In this work, we describe ongoing research activities for studying the aesthetic experience of people when exposed to movie artistic...

  7. Computational and experimental analysis identified 6-diazo-5-oxonorleucine as a potential agent for treating infection by Plasmodium falciparum.

    Science.gov (United States)

    Plaimas, Kitiporn; Wang, Yulin; Rotimi, Solomon O; Olasehinde, Grace; Fatumo, Segun; Lanzer, Michael; Adebiyi, Ezekiel; König, Rainer

    2013-12-01

    Plasmodium falciparum (PF) is the most severe malaria parasite. It is quickly developing resistance to existing drugs, making it indispensable to discover new ones. Effective drugs have been discovered targeting metabolic enzymes of the parasite. In order to predict new drug targets, computational methods can be used employing database information on metabolism. Using these data, we recently performed a computational network analysis of the metabolism of PF. We analyzed the topology of the network to find reactions which are sensitive to perturbations, i.e., when a single enzyme is blocked by drugs. We now used a refined network comprising also the host enzymes, which led to a refined set of five targets: glutamyl-tRNA (gln) amidotransferase, hydroxyethylthiazole kinase, deoxyribose-phosphate aldolase, pseudouridylate synthase, and deoxyhypusine synthase. It was shown elsewhere that glutamyl-tRNA (gln) amidotransferase of other microorganisms can be inhibited by 6-diazo-5-oxonorleucine. Performing a half maximal inhibitory concentration (IC50) assay, we showed that 6-diazo-5-oxonorleucine also severely affects the viability of PF in blood plasma of the human host. We confirmed this in an in vivo study observing Plasmodium berghei infected mice. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. A new computed tomography method to identify meningitis-related cochlear ossification and fibrosis before cochlear implantation.

    Science.gov (United States)

    Ichikawa, Kazunori; Kashio, Akinori; Mori, Harushi; Ochi, Atushi; Karino, Shotaro; Sakamoto, Takashi; Kakigi, Akinobu; Yamasoba, Tatsuya

    2014-04-01

    To develop a new method to determine the presence of intracochlear ossification and/or fibrosis in cochlear implantation candidates with bilateral profound deafness following meningitis. Diagnostic test assessment. A university hospital. This study involved 15 ears from 13 patients with profound deafness following meningitis who underwent cochlear implantation. These ears showed normal structures, soft tissue, partial bony occlusion, and complete bony occlusion in 4, 3, 2, and 6 ears, respectively. We measured radiodensity in Hounsfield units (HU) using 0.5-mm-thick axial high-resolution computed tomography image slices at 3 different levels in the basal turn (the fenestration site and the inferior and ascending segment sites), located along the electrode-insertion path. Pixel-level analysis in the DICOM viewer yielded actual computed tomography values of intracochlear soft tissues by eliminating the partial volume effect. The values were compared with the intraoperative findings. Values for ossification (n = 12) ranged from +547 HU to +1137 HU; for fibrosis (n = 11), from +154 HU to +574 HU; and for fluid (n = 22), from -49 HU to +255 HU. From these values, we developed 2 presets of window width (WW) and window level (WL): (1) WW: 1800, WL: 1100 (200 HU to 2000 HU) and (2) WW: 1500, WL: 1250 (500 HU to 2000 HU). The results using these 2 presets corresponded well with the intraoperative findings. Our new method is easy and feasible for preoperative determination of the presence of cochlear ossification and/or fibrosis that develops following meningitis.
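
    The two presets follow directly from the measured HU ranges, since WW = HUmax - HUmin and WL = (HUmax + HUmin) / 2. A quick arithmetic check (a sketch; the HU thresholds are those reported in the abstract):

    ```python
    def window_from_hu_range(hu_min, hu_max):
        """Convert a target HU display range into a CT window width/level pair."""
        return hu_max - hu_min, (hu_max + hu_min) // 2   # (WW, WL)

    print(window_from_hu_range(200, 2000))   # -> (1800, 1100), preset 1
    print(window_from_hu_range(500, 2000))   # -> (1500, 1250), preset 2
    ```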

  9. The development of bronchiectasis on chest computed tomography in children with cystic fibrosis: can pre-stages be identified?

    Energy Technology Data Exchange (ETDEWEB)

    Tepper, Leonie A. [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Radiology, Erasmus MC, Rotterdam (Netherlands); Caudri, Daan [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Perez Rovira, Adria [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Erasmus MC, Biomedical Imaging Group Rotterdam, Departments of Radiology and Medical Informatics, Rotterdam (Netherlands); Tiddens, Harm A.W.M. [Sophia Children's Hospital, Department of Pediatric Pulmonology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Radiology, Erasmus MC, Rotterdam (Netherlands); Sophia Children's Hospital, Department of Pediatric Pulmonology and Radiology, Erasmus Medical Center, Rotterdam (Netherlands); Bruijne, Marleen de [Erasmus MC, Biomedical Imaging Group Rotterdam, Departments of Radiology and Medical Informatics, Rotterdam (Netherlands); University of Copenhagen, Department of Computer Science, Copenhagen (Denmark)

    2016-12-15

    Bronchiectasis is an important component of cystic fibrosis (CF) lung disease but little is known about its development. We aimed to study the development of bronchiectasis and identify determinants for rapid progression of bronchiectasis on chest CT. Forty-three patients with CF with at least four consecutive biennial volumetric CTs were included. Areas with bronchiectasis on the most recent CT were marked as regions of interest (ROIs). These ROIs were generated on all preceding CTs using deformable image registration. Observers indicated whether: bronchiectasis, mucus plugging, airway wall thickening, atelectasis/consolidation or normal airways were present in the ROIs. We identified 362 ROIs on the most recent CT. In 187 (51.7 %) ROIs bronchiectasis was present on all preceding CTs, while 175 ROIs showed development of bronchiectasis. In 139/175 (79.4 %) no pre-stages of bronchiectasis were identified. In 36/175 (20.6 %) bronchiectatic airways the following pre-stages were identified: mucus plugging (17.7 %), airway wall thickening (1.7 %) or atelectasis/consolidation (1.1 %). Pancreatic insufficiency was more prevalent in the rapid progressors compared to the slow progressors (p = 0.05). Most bronchiectatic airways developed within 2 years without visible pre-stages, underlining the treacherous nature of CF lung disease. Mucus plugging was the most frequent pre-stage. (orig.)

  10. The development of bronchiectasis on chest computed tomography in children with cystic fibrosis: can pre-stages be identified?

    International Nuclear Information System (INIS)

    Tepper, Leonie A.; Caudri, Daan; Perez Rovira, Adria; Tiddens, Harm A.W.M.; Bruijne, Marleen de

    2016-01-01

    Bronchiectasis is an important component of cystic fibrosis (CF) lung disease but little is known about its development. We aimed to study the development of bronchiectasis and identify determinants for rapid progression of bronchiectasis on chest CT. Forty-three patients with CF with at least four consecutive biennial volumetric CTs were included. Areas with bronchiectasis on the most recent CT were marked as regions of interest (ROIs). These ROIs were generated on all preceding CTs using deformable image registration. Observers indicated whether: bronchiectasis, mucus plugging, airway wall thickening, atelectasis/consolidation or normal airways were present in the ROIs. We identified 362 ROIs on the most recent CT. In 187 (51.7 %) ROIs bronchiectasis was present on all preceding CTs, while 175 ROIs showed development of bronchiectasis. In 139/175 (79.4 %) no pre-stages of bronchiectasis were identified. In 36/175 (20.6 %) bronchiectatic airways the following pre-stages were identified: mucus plugging (17.7 %), airway wall thickening (1.7 %) or atelectasis/consolidation (1.1 %). Pancreatic insufficiency was more prevalent in the rapid progressors compared to the slow progressors (p = 0.05). Most bronchiectatic airways developed within 2 years without visible pre-stages, underlining the treacherous nature of CF lung disease. Mucus plugging was the most frequent pre-stage. (orig.)

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. Unusual presentation of metastatic carcinoma cervix with clinically silent primary identified by 18F-fluorodeoxyglucose positron emission tomography/computed tomography

    International Nuclear Information System (INIS)

    Senthil, Raja; Mohapatra, Ranjan Kumar; Srinivas, Shripriya; Sampath, Mouleeswaran Koramadai; Sundaraiya, Sumati

    2016-01-01

    Carcinoma cervix is the most common gynecological malignancy among Indian women. The common symptoms at presentation include abnormal vaginal bleeding, unusual vaginal discharge, pain during coitus, and postmenopausal bleeding. Rarely, patients may present with distant metastases without local symptoms. We present two patients with an unusual presentation of metastatic disease without any gynecological symptoms, where 18F-fluorodeoxyglucose positron emission tomography/computed tomography helped in identifying the primary malignancy in the uterine cervix

  13. Identifying the most infectious lesions in pulmonary tuberculosis by high-resolution multi-detector computed tomography

    International Nuclear Information System (INIS)

    Yeh, Jun Jun; Chen, Solomon Chih-Cheng; Teng, Wen-Bao; Chou, Chun-Hsiung; Hsieh, Shih-Peng; Lee, Tsung-Lung; Wu, Ming-Ting

    2010-01-01

    This study aimed to determine whether characteristics detected by multi-detector computed tomography (MDCT) were predictive of highly infectious, smear-positive, active pulmonary tuberculosis (PTB). Among 124 patients with active PTB, 84 had positive (group 1) and 40 had negative (group 2) smear results for acid-fast bacilli. Multiplanar MDCT, axial conventional CT and chest X-ray images were analysed retrospectively for morphology, number, and segmental (lobe) distribution of lesions. By multivariate analysis, consolidation over any segment of the upper, middle, or lingular lobes, cavitations, and clusters of nodules were associated with group 1, while centrilobular nodules were predictive of group 2. Using five independent variables associated with risk in group 1, a prediction model was created to distinguish between group 1 and group 2. ROC curve analysis showed an area under the curve of 0.951 ± 0.021 for this prediction model. With the ideal cutoff point score of 1, the sensitivity, specificity, and positive predictive values were 84.5%, 97.5%, and 98.0%, respectively. A model to predict smear-positive active PTB on the basis of findings from MDCT may be a useful tool for clinical decisions about isolating patients pending sputum smear results. (orig.)

  14. Identifying the most infectious lesions in pulmonary tuberculosis by high-resolution multi-detector computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, Jun Jun [Pingtung Christian Hospital, Pingtung (China); Mei-Ho Institute of Technology, Pingtung (China); China Medical University, Taichung (China); Chen, Solomon Chih-Cheng [Pingtung Christian Hospital, Pingtung (China); National Taiwan University, Institute of Occupational Medicine and Industrial Hygiene, College of Public Health, Taipei (China); Teng, Wen-Bao; Chou, Chun-Hsiung; Hsieh, Shih-Peng; Lee, Tsung-Lung [Pingtung Christian Hospital, Pingtung (China); Wu, Ming-Ting [National Yang Ming University, Faculty of Medicine, School of Medicine, Taipei (China); Kaohsiung Veterans General Hospital, Section of Thoracic and Circulation Imaging, Department of Radiology, Kaohsiung (China)

    2010-09-15

    This study aimed to determine whether characteristics detected by multi-detector computed tomography (MDCT) were predictive of highly infectious, smear-positive, active pulmonary tuberculosis (PTB). Among 124 patients with active PTB, 84 had positive (group 1) and 40 had negative (group 2) smear results for acid-fast bacilli. Multiplanar MDCT, axial conventional CT and chest X-ray images were analysed retrospectively for morphology, number, and segmental (lobe) distribution of lesions. By multivariate analysis, consolidation over any segment of the upper, middle, or lingular lobes, cavitations, and clusters of nodules were associated with group 1, while centrilobular nodules were predictive of group 2. Using five independent variables associated with risk in group 1, a prediction model was created to distinguish between group 1 and group 2. ROC curve analysis showed an area under the curve of 0.951 ± 0.021 for this prediction model. With the ideal cutoff point score of 1, the sensitivity, specificity, and positive predictive values were 84.5%, 97.5%, and 98.0%, respectively. A model to predict smear-positive active PTB on the basis of findings from MDCT may be a useful tool for clinical decisions about isolating patients pending sputum smear results. (orig.)

  15. In search of Leonardo: computer-based facial image analysis of Renaissance artworks for identifying Leonardo as subject

    Science.gov (United States)

    Tyler, Christopher W.; Smith, William A. P.; Stork, David G.

    2012-03-01

    One of the enduring mysteries in the history of the Renaissance is the adult appearance of the archetypal "Renaissance Man," Leonardo da Vinci. His only acknowledged self-portrait is from an advanced age, and various candidate images of younger men are difficult to assess given the absence of documentary evidence. One clue about Leonardo's appearance comes from the remark of the contemporary historian Vasari that the sculpture of David by Leonardo's master, Andrea del Verrocchio, was based on the appearance of Leonardo when he was an apprentice. Taking a cue from this statement, we suggest that the more mature sculpture of St. Thomas, also by Verrocchio, might also have been a portrait of Leonardo. We tested the possibility that Leonardo was the subject of Verrocchio's sculpture using a novel computational technique for the comparison of three-dimensional facial configurations. Based on quantitative measures of similarity, we also assess whether another pair of candidate two-dimensional images is plausibly attributable as portraits of Leonardo as a young adult. Our results are consistent with the claim that Leonardo is indeed the subject in these works, but we need comparisons with images in a larger corpus of candidate artworks before our results achieve statistical significance.
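
    The paper's comparison technique is not reproduced here; ordinary Procrustes superimposition is one standard way to score the similarity of three-dimensional facial landmark configurations, and a hedged sketch of that generic approach follows (landmarks are synthetic):

    ```python
    import numpy as np
    from scipy.spatial import procrustes

    # Toy landmark sets: (n_landmarks, 3) coordinates digitized from two faces.
    rng = np.random.default_rng(4)
    face_a = rng.standard_normal((30, 3))
    face_b = face_a + 0.05 * rng.standard_normal((30, 3))   # a similar face

    # Disparity = residual after the best translation/rotation/scale fit
    _, _, disparity = procrustes(face_a, face_b)
    print(f"Procrustes disparity: {disparity:.4f} (lower = more similar)")
    ```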

  16. Novel PCA-VIP scheme for ranking MRI protocols and identifying computer-extracted MRI measurements associated with central gland and peripheral zone prostate tumors.

    Science.gov (United States)

    Ginsburg, Shoshana B; Viswanath, Satish E; Bloch, B Nicolas; Rofsky, Neil M; Genega, Elizabeth M; Lenkinski, Robert E; Madabhushi, Anant

    2015-05-01

    To identify computer-extracted features for central gland and peripheral zone prostate cancer localization on multiparametric magnetic resonance imaging (MRI). Preoperative T2-weighted (T2w), diffusion-weighted imaging (DWI), and dynamic contrast-enhanced (DCE) MRI were acquired from 23 men with confirmed prostate cancer. Following radical prostatectomy, the cancer extent was delineated by a pathologist on ex vivo histology and mapped to MRI by nonlinear registration of histology and corresponding MRI slices. In all, 244 features were computer-extracted from MRI, and principal component analysis (PCA) was employed to reduce the data dimensionality so that a generalizable classifier could be constructed. A novel variable importance on projection (VIP) measure for PCA (PCA-VIP) was leveraged to identify computer-extracted MRI features that discriminate between cancer and normal prostate, and these features were used to construct classifiers for cancer localization. Classifiers using features selected by PCA-VIP yielded an area under the curve (AUC) of 0.79 and 0.85 for peripheral zone and central gland tumors, respectively. For tumor localization in the central gland, T2w, DCE, and DWI MRI features contributed 71.6%, 18.1%, and 10.2%, respectively; for peripheral zone tumors, T2w, DCE, and DWI MRI contributed 29.6%, 21.7%, and 48.7%, respectively. PCA-VIP identified relatively stable subsets of MRI features that performed well in localizing prostate cancer on MRI. © 2014 Wiley Periodicals, Inc.
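
    The exact PCA-VIP formula is not given in the abstract; one plausible VIP-style score weights each feature's squared PCA loadings by the variance explained per component. A sketch under that assumption (data and dimensions are synthetic stand-ins for the 244 MRI features):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def pca_vip_scores(X, n_components=10):
        """VIP-style feature importance for PCA: each feature's squared
        loadings weighted by the explained-variance ratio of the components.
        (A sketch only; the published PCA-VIP measure may differ in detail.)"""
        pca = PCA(n_components=n_components).fit(X)
        loadings = pca.components_            # shape (n_components, n_features)
        weights = pca.explained_variance_ratio_
        scores = np.einsum("k,kf->f", weights, loadings ** 2)
        return scores / scores.sum()

    # Toy stand-in for the per-voxel texture/intensity feature matrix
    X = np.random.default_rng(1).standard_normal((200, 244))
    scores = pca_vip_scores(X)
    top = np.argsort(scores)[::-1][:10]       # the 10 most informative features
    ```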

  17. EVALUATION OF THE COMPUTED TOMOGRAPHIC "SENTINEL CLOT SIGN" TO IDENTIFY BLEEDING ABDOMINAL ORGANS IN DOGS WITH HEMOABDOMEN.

    Science.gov (United States)

    Specchi, Swan; Auriemma, Edoardo; Morabito, Simona; Ferri, Filippo; Zini, Eric; Piola, Valentina; Pey, Pascaline; Rossi, Federica

    2017-01-01

    The CT "sentinel clot sign" has been defined as the highest attenuation hematoma adjacent to a bleeding organ in humans with hemoabdomen. The aims of this retrospective descriptive multicenter study were to describe CT findings in a sample of dogs with surgically or necropsy confirmed intra-abdominal bleeding and determine prevalence of the "sentinel clot sign" adjacent to the location of bleeding. Medical records between 2012 and 2014 were searched for dogs with hemoabdomen and in which the origin of the bleeding was confirmed either with surgery or necropsy. Retrieved CT images were reviewed for the presence and localization of the "sentinel clot sign," HU measurements of the "sentinel clot sign" and hemoabdomen, and presence of extravasation of contrast media within the abdominal cavity. Nineteen dogs were included. Three dogs were excluded due to the low amount of blood that did not allow the identification of a "sentinel clot sign." A "sentinel clot sign" was detected in the proximity of the confirmed bleeding organ in 14/16 (88%) of the patients. The mean HU of the "sentinel clot sign" was 56 (range: 43-70) while that of the hemoabdomen was 34 (range: 20-45). Active hemorrhage was identified as extravasation of contrast medium within the peritoneal cavity from the bleeding organ in three dogs. In conclusion, the CT "sentinel clot sign" may be helpful for identifying the source of bleeding in dogs with hemoabdomen. © 2016 American College of Veterinary Radiology.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  1. Sestamibi technetium-99m brain single-photon emission computed tomography to identify recurrent glioma in adults: 201 studies.

    Science.gov (United States)

    Le Jeune, Florence Prigent; Dubois, François; Blond, Serge; Steinling, Marc

    2006-04-01

    In the follow-up of treated gliomas, CT and MRI often cannot differentiate radionecrosis from recurrent tumor. The aim of this study was to assess the value of functional imaging with (99m)Tc-MIBI SPECT in a large series of 201 examinations. MIBI SPECT was performed in 81 patients treated for brain gliomas. A MIBI uptake index was computed as the ratio of counts in the lesion to counts in the contralateral region. SPECT was compared to stereotactic biopsy in 14 cases, or in the other cases to imaging evolution or clinical course at 6 months after the last tomoscintigraphy. Two hundred and one tomoscintigraphies were performed. One hundred and two scans were true positive and 82 scans were true negative. Six scans were false positive (corresponding to 3 patients): 2 patients with an inflammatory reaction after radiosurgery, 1 with no explanation up to now. Eleven scans were false negative (5 patients): 1 patient with a deep peri-ventricular lesion, 2 patients with no contrast enhancement on MRI, 2 patients with a temporal tumor. The sensitivity for tumor recurrence was 90%, specificity 91.5% and accuracy 90.5%. We studied low- and high-grade gliomas separately: sensitivity for tumor recurrence was 91% and 89%, respectively; specificity 100% and 83%; and accuracy 95% and 87%. MIBI SPECT allowed the diagnosis of anaplastic degeneration of low-grade gliomas, sometimes earlier than clinical (5 cases) or MRI (7 cases) signs. Our results confirm the usefulness of MIBI SPECT in the follow-up of treated gliomas for the differential diagnosis between radiation necrosis and tumor recurrence.

  2. ApicoAP: the first computational model for identifying apicoplast-targeted proteins in multiple species of Apicomplexa.

    Directory of Open Access Journals (Sweden)

    Gokcen Cilingir

    Full Text Available Most of the parasites of the phylum Apicomplexa contain a relict prokaryotic-derived plastid called the apicoplast. This organelle is important not only for the survival of the parasite, but its unique properties make it an ideal drug target. The majority of apicoplast-associated proteins are nuclear encoded and targeted post-translationally to the organellar lumen via a bipartite signaling mechanism that requires an N-terminal signal and transit peptide (TP). Attempts to define a consensus motif that universally identifies apicoplast TPs have failed. In this study, we propose a generalized rule-based classification model to identify apicoplast-targeted proteins (ApicoTPs) that use a bipartite signaling mechanism. Given a training set specific to an organism, this model, called ApicoAP, incorporates a procedure based on a genetic algorithm to tailor a discriminating rule that exploits the known characteristics of ApicoTPs. Performance of ApicoAP is evaluated for four labeled datasets of Plasmodium falciparum, Plasmodium yoelii, Babesia bovis, and Toxoplasma gondii proteins. ApicoAP improves the classification accuracy of the published dataset for P. falciparum to 94%, originally 90% using PlasmoAP. We present a parametric model for ApicoTPs and a procedure to optimize the model parameters for a given training set. A major asset of this model is that it is customizable to different parasite genomes. The ApicoAP prediction software is available at http://code.google.com/p/apicoap/ and http://bcb.eecs.wsu.edu.
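
    To illustrate the "genetic algorithm tailors a discriminating rule" idea, the toy GA below optimizes a single threshold rule on one invented peptide feature; the real ApicoAP rule set and feature space are richer than this sketch:

    ```python
    import random
    random.seed(0)

    # Toy data: (feature value, is_apicoplast_targeted) pairs standing in
    # for a real training set of transit-peptide features.
    data = [(random.gauss(2.0, 1.0), 1) for _ in range(50)] + \
           [(random.gauss(0.0, 1.0), 0) for _ in range(50)]

    def accuracy(threshold):
        return sum((x >= threshold) == bool(y) for x, y in data) / len(data)

    def evolve(pop_size=20, generations=40, mut=0.3):
        """Tiny GA: a 'rule' is one threshold; selection + Gaussian mutation."""
        pop = [random.uniform(-3, 5) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=accuracy, reverse=True)
            parents = pop[: pop_size // 2]              # truncation selection
            pop = parents + [p + random.gauss(0, mut) for p in parents]
        return max(pop, key=accuracy)

    best = evolve()
    print(f"threshold={best:.2f}, training accuracy={accuracy(best):.2%}")
    ```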

  3. Computational study of the fibril organization of polyglutamine repeats reveals a common motif identified in beta-helices.

    Science.gov (United States)

    Zanuy, David; Gunasekaran, Kannan; Lesk, Arthur M; Nussinov, Ruth

    2006-04-21

    The formation of fibril aggregates by long polyglutamine sequences is assumed to play a major role in neurodegenerative diseases such as Huntington's disease. Here, we model peptides rich in glutamine through a series of molecular dynamics simulations. Starting from a rigid nanotube-like conformation, we have obtained a new conformational template that shares structural features of a tubular helix and of a beta-helix conformational organization. Our new model can be described as a super-helical arrangement of flat beta-sheet segments linked by planar turns or bends. Interestingly, our comprehensive analysis of the Protein Data Bank reveals that this is a common motif in beta-helices (termed beta-bend), although it has not been identified so far. The motif is based on the alternation of beta-sheet and helical conformation as the protein sequence is followed from the N to the C terminus (beta-alpha(R)-beta-polyPro-beta). We further identify this motif in the ssNMR structure of the protofibril of the amyloidogenic peptide Abeta(1-40). The recurrence of the beta-bend suggests a general mode of connecting long parallel beta-sheet segments that would allow the growth of partially ordered fibril structures. The design allows the peptide backbone to change direction with a minimal loss of main-chain hydrogen bonds. The identification of a coherent organization beyond that of the beta-sheet segments in different folds rich in parallel beta-sheets suggests a higher degree of ordered structure in protein fibrils, in agreement with their low solubility and dense molecular packing.

  4. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    Science.gov (United States)

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.
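
    A sketch of the multivariate logistic regression step, using the two risk factors the abstract reports (percentage of hematoma drained and postoperative CSDH density); the data and coefficients below are invented purely for illustration:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 227  # hematomas, as in the study

    # Synthetic stand-ins for the two reported independent risk factors
    pct_drained = rng.uniform(30, 100, n)        # % of hematoma drained
    post_density = rng.normal(45, 15, n)         # postoperative density (HU)
    logit = -2.0 - 0.05 * (pct_drained - 70) + 0.04 * (post_density - 45)
    recurred = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([pct_drained, post_density])
    model = LogisticRegression().fit(X, recurred)
    odds_ratios = np.exp(model.coef_[0])         # OR per unit of each predictor
    print(dict(zip(["pct_drained", "post_density"], odds_ratios.round(3))))
    ```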

  5. Use of manual alveolar recruitment maneuvers to eliminate atelectasis artifacts identified during thoracic computed tomography of healthy neonatal foals.

    Science.gov (United States)

    Lascola, Kara M; Clark-Price, Stuart C; Joslyn, Stephen K; Mitchell, Mark A; O'Brien, Robert T; Hartman, Susan K; Kline, Kevin H

    2016-11-01

    OBJECTIVE To evaluate use of single manual alveolar recruitment maneuvers (ARMs) to eliminate atelectasis during CT of anesthetized foals. ANIMALS 6 neonatal Standardbred foals. PROCEDURES Thoracic CT was performed on spontaneously breathing anesthetized foals positioned in sternal (n = 3) or dorsal (3) recumbency when foals were 24 to 36 hours old (time 1), 4 days old (time 2), 7 days old (time 3), and 10 days old (time 4). The CT images were collected without ARMs (all times) and during ARMs with an internal airway pressure of 10, 20, and 30 cm H2O (times 2 and 3). Quantitative analysis of CT images measured whole lung and regional changes in attenuation or volume with ARMs. RESULTS Increased attenuation and an alveolar pattern were most prominent in the dependent portion of the lungs. Subjectively, ARMs did not eliminate atelectasis; however, they did incrementally reduce attenuation, particularly in the nondependent portion of the lungs. Quantitative differences in lung attenuation attributable to position of foal were not identified. Lung attenuation decreased significantly (times 2 and 3) and lung volume increased significantly (times 2 and 3) after ARMs. Changes in attenuation and volume were most pronounced in the nondependent portion of the lungs and at ARMs of 20 and 30 cm H2O. CONCLUSIONS AND CLINICAL RELEVANCE Manual ARMs did not eliminate atelectasis but reduced attenuation in nondependent portions of the lungs. Positioning of foals in dorsal recumbency for CT may be appropriate when pathological changes in the ventral portion of the lungs are suspected.

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  7. Soft computing model for optimized siRNA design by identifying off target possibilities using artificial neural network model.

    Science.gov (United States)

    Murali, Reena; John, Philips George; Peter S, David

    2015-05-15

    The ability of small interfering RNA (siRNA) to perform posttranscriptional gene regulation by knocking down targeted genes is an important research topic in functional genomics, biomedical research and cancer therapeutics. Many tools have been developed to design exogenous siRNA with high experimental inhibition. Even though a considerable amount of work has been done on designing exogenous siRNA, the design of effective siRNA sequences is still challenging because the target mRNAs must be selected such that their corresponding siRNAs are likely to be efficient against that target and unlikely to accidentally silence other transcripts due to sequence similarity. In some cases, siRNAs may tolerate mismatches with the target mRNA, but knockdown of genes other than the intended target could have serious consequences. Hence, to design siRNAs, two important concepts must be considered: the ability to knock down target genes and the off-target possibility on any nontarget genes. So before doing gene silencing with siRNAs, it is essential to analyze their off-target effects in addition to their inhibition efficacy against a particular target. Only a few methods have been developed that consider both the efficacy and the off-target possibility of an siRNA against a gene. In this paper we present a new neural network model incorporating whole stacking energy (ΔG) that identifies both the efficacy and the off-target effect of siRNAs against target genes. The tool lists all siRNAs against a particular target with their inhibition efficacy and number of matches or sequence similarity with other genes in the database. We achieved an excellent performance of Pearson Correlation Coefficient (R = 0.74) and Area Under Curve (AUC = 0.906) when the threshold of whole stacking energy is ≥ -34.6 kcal/mol. To the best of the authors' knowledge, this is one of the best scores considering the combined efficacy and off-target possibility of siRNA for silencing a gene. The proposed model
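
    A sketch of the two ingredients the abstract combines: a small neural network scoring inhibition efficacy from an encoded sequence, and a naive mismatch-tolerant scan for off-target similarity. The encoding, model size, thresholds and data below are illustrative, not the paper's:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def one_hot(seq):
        """Flattened one-hot encoding of an RNA sequence (A/C/G/U)."""
        table = {b: i for i, b in enumerate("ACGU")}
        out = np.zeros((len(seq), 4))
        out[np.arange(len(seq)), [table[b] for b in seq]] = 1
        return out.ravel()

    def off_target_hits(sirna, transcripts, max_mismatches=2):
        """Naive off-target screen: count transcripts containing a window
        within max_mismatches of the siRNA guide sequence."""
        k, hits = len(sirna), 0
        for t in transcripts:
            if any(sum(a != b for a, b in zip(sirna, t[i:i + k])) <= max_mismatches
                   for i in range(len(t) - k + 1)):
                hits += 1
        return hits

    # Toy training: random 19-mers with random efficacy labels
    rng = np.random.default_rng(0)
    seqs = ["".join(rng.choice(list("ACGU"), 19)) for _ in range(200)]
    y = rng.random(200)
    X = np.array([one_hot(s) for s in seqs])
    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500,
                         random_state=0).fit(X, y)
    print("off-target hits:", off_target_hits(seqs[0], seqs[1:]))
    ```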

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. Site-Mutation of Hydrophobic Core Residues Synchronically Poise Super Interleukin 2 for Signaling: Identifying Distant Structural Effects through Affordable Computations

    Directory of Open Access Journals (Sweden)

    Longcan Mei

    2018-03-01

    Full Text Available A superkine variant of interleukin-2 with six site mutations away from the binding interface, developed with the yeast display technique, has previously been characterized as undergoing a distal structure alteration which is responsible for its super-potency; it provides an elegant case study with which to gain insight into how to utilize allosteric effects to achieve desirable protein functions. By examining the dynamic network and the allosteric pathways related to those mutated residues using various computational approaches, we found that nanosecond time scale all-atom molecular dynamics simulations can identify the dynamic network as efficiently as an ensemble algorithm. The differentiated pathways for the six core residues form a dynamic network that outlines the area of structure alteration. The results offer the potential of using affordable computing power to predict the allosteric structure of mutants in knowledge-based mutagenesis.
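
    A common way to extract such a dynamic network from an MD trajectory is a residue-residue cross-correlation matrix thresholded into a graph, along whose edges allosteric pathways can then be traced. A minimal sketch (the cutoff and trajectory are illustrative, and this is a generic method rather than necessarily the paper's exact procedure):

    ```python
    import numpy as np

    def dynamic_network(coords, corr_cutoff=0.5):
        """coords: (n_frames, n_residues, 3) C-alpha positions from an MD
        trajectory. Returns an adjacency matrix linking residues whose
        fluctuations are strongly correlated."""
        disp = coords - coords.mean(axis=0)                   # fluctuations
        rmsf = np.sqrt((disp ** 2).sum(axis=2).mean(axis=0))  # per residue
        cij = np.einsum("tia,tja->ij", disp, disp) / disp.shape[0]
        cij /= np.outer(rmsf, rmsf)                           # normalize
        adj = np.abs(cij) >= corr_cutoff
        np.fill_diagonal(adj, False)
        return adj

    # Toy trajectory: 1000 frames, 130 residues (roughly IL-2 sized)
    traj = np.cumsum(
        np.random.default_rng(2).standard_normal((1000, 130, 3)) * 0.01, axis=0)
    adj = dynamic_network(traj)
    print("network edges:", int(adj.sum()) // 2)
    ```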

  11. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  18. ISD97, a computer program to analyze data from a series of in situ measurements on a grid and identify potential localized areas of elevated activity

    International Nuclear Information System (INIS)

    Reginatto, M.; Shebell, P.; Miller, K.M.

    1997-10-01

    A computer program, ISD97, was developed to analyze data from a series of in situ measurements on a grid and identify potential localized areas of elevated activity. The ISD97 code operates using a two-step process. A deconvolution of the data is carried out using the maximum entropy method, and a map of activity on the ground that fits the data within experimental error is generated. This maximum entropy map is then analyzed to determine the locations and magnitudes of potential areas of elevated activity that are consistent with the data. New deconvolutions are then carried out for each potential area of elevated activity identified by the code. Properties of the algorithm are demonstrated using data from actual field measurements
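
    The ISD97 code itself is not reproduced here, but its maximum entropy step can be sketched as a penalized fit: choose the non-negative activity map that maximizes entropy while reproducing the detector readings within experimental error. A toy one-dimensional sketch (the response kernel, noise level and weighting are all illustrative):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def mem_deconvolve(d, R, sigma, lam=1.0):
        """Maximum-entropy deconvolution (sketch): find an activity map a >= 0
        minimizing  chi^2/2 - lam * S(a),  where d ~= R @ a are the in situ
        detector readings and S is the Shannon entropy of the map."""
        def objective(a):
            a = np.clip(a, 1e-12, None)
            chi2 = ((R @ a - d) / sigma) ** 2
            entropy = -(a * np.log(a / a.sum())).sum() / a.sum()
            return 0.5 * chi2.sum() - lam * entropy
        a0 = np.full(R.shape[1], d.mean() / R.sum(axis=1).mean())
        res = minimize(objective, a0, bounds=[(1e-12, None)] * R.shape[1])
        return res.x

    # Toy 1-D grid: 20 detector positions viewing 20 ground cells through a
    # distance-weighted response, with one hot spot at cell 12.
    n = 20
    x = np.arange(n)
    R = 1.0 / (1.0 + (x[:, None] - x[None, :]) ** 2)   # response kernel
    truth = np.ones(n); truth[12] = 15.0
    d = R @ truth + np.random.default_rng(3).normal(0, 0.1, n)
    a_map = mem_deconvolve(d, R, sigma=0.1)
    print("recovered hot spot at cell", int(a_map.argmax()))
    ```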

  19. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  20. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  1. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  2. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, and conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office: Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month (Figure 1). The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB (Figure 2; Figure 3 shows the volume of data moved between CMS sites in the last six months). The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction: Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office: Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign, with over three billion events available in AOD format (Figure 2 shows the number of events per month for 2012). Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  5. Sepsis reconsidered: Identifying novel metrics for behavioral landscape characterization with a high-performance computing implementation of an agent-based model.

    Science.gov (United States)

    Cockrell, Chase; An, Gary

    2017-10-07

    Sepsis affects nearly 1 million people in the United States per year, has a mortality rate of 28-50% and requires more than $20 billion a year in hospital costs. Over a quarter century of research has not yielded a single reliable diagnostic test or a directed therapeutic agent for sepsis. Central to this insufficiency is the fact that sepsis remains a clinical/physiological diagnosis representing a multitude of molecularly heterogeneous pathological trajectories. Advances in computational capabilities offered by High Performance Computing (HPC) platforms call for an evolution in the investigation of sepsis to attempt to define the boundaries of traditional research (bench, clinical and computational) through the use of computational proxy models. We present a novel investigatory and analytical approach, derived from how HPC resources and simulation are used in the physical sciences, to identify the epistemic boundary conditions of the study of clinical sepsis via the use of a proxy agent-based model of systemic inflammation. Current predictive models for sepsis use correlative methods that are limited by patient heterogeneity and data sparseness. We address this issue by using an HPC version of a system-level validated agent-based model of sepsis, the Innate Immune Response ABM (IIRABM), as a proxy system in order to identify boundary conditions for the possible behavioral space for sepsis. We then apply advanced analysis derived from the study of Random Dynamical Systems (RDS) to identify novel means for characterizing system behavior and providing insight into the tractability of traditional investigatory methods. The behavior space of the IIRABM was examined by simulating over 70 million sepsis patients for up to 90 days in a sweep across the following parameters: cardio-respiratory-metabolic resilience; microbial invasiveness; microbial toxigenesis; and degree of nosocomial exposure. In addition to using established methods for describing parameter space, we
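
    The four-parameter sweep described above can be expressed compactly as a grid iteration. The sketch below is a stand-in harness only: `run_iirabm` is a hypothetical stub, and the parameter grids are illustrative, not the values used in the study.

```python
# Sketch of a four-dimensional parameter sweep over the dimensions named in
# the abstract. The model function is a toy placeholder, not the real IIRABM.
from itertools import product

def run_iirabm(resilience, invasiveness, toxigenesis, nosocomial):
    # Toy placeholder: the real agent-based model simulates each patient for
    # up to 90 days and reports outcomes such as healing, chronic
    # inflammation, or death across many stochastic replicates.
    mortality = min(1.0, 0.05 * invasiveness * toxigenesis / resilience)
    return {"mortality": mortality}

resilience_vals = [0.25, 0.5, 0.75, 1.0]     # cardio-respiratory-metabolic
invasiveness_vals = [1, 2, 3, 4]             # microbial invasiveness
toxigenesis_vals = [1, 2, 3, 4]              # microbial toxigenesis
nosocomial_vals = [0, 1, 2]                  # degree of nosocomial exposure

results = {combo: run_iirabm(*combo)
           for combo in product(resilience_vals, invasiveness_vals,
                                toxigenesis_vals, nosocomial_vals)}
print(len(results), "parameter combinations swept")
```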

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions: The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations: Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and Job Robot submission systems have been instrumental in site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites was redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  8. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Full Text Available Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.

  9. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Science.gov (United States)

    McBride, Sebastian; Huelse, Martin; Lee, Mark

    2013-01-01

    Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
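
    Several of the seven requirements listed above (notably the priority map, the saccade threshold, and inhibition of return) can be illustrated in a few lines. The sketch below is a toy combination rule with invented array shapes and threshold; the paper's actual robotic implementation is not reproduced here.

```python
# Toy sketch of a priority map combining bottom-up salience with top-down task
# relevance, thresholded to elicit a saccade (requirements 2, 5 and 6 above).
import numpy as np

rng = np.random.default_rng(1)
bottom_up = rng.random((32, 32))           # salience from image features
top_down = np.zeros((32, 32))
top_down[10:14, 20:24] = 1.0               # task-relevant region (excitation)
inhibition_of_return = np.zeros((32, 32))  # spatial memory of visited loci

# Task relevance expressed as a ratio of excitation to inhibition (req. 6).
priority = bottom_up * (1.0 + top_down) / (1.0 + inhibition_of_return)

SACCADE_THRESHOLD = 1.2                    # threshold function (req. 5)
peak = np.unravel_index(np.argmax(priority), priority.shape)
if priority[peak] > SACCADE_THRESHOLD:
    print("saccade to", peak)
    inhibition_of_return[peak] += 1.0      # medium-term IOR (req. 2)
```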

  10. Development of computational fluid dynamics--habitat suitability (CFD-HSI) models to identify potential passage--Challenge zones for migratory fishes in the Penobscot River

    Science.gov (United States)

    Haro, Alexander J.; Dudley, Robert W.; Chelminski, Michael

    2012-01-01

    A two-dimensional computational fluid dynamics–habitat suitability (CFD–HSI) model was developed to identify potential zones of shallow depth and high water velocity that may present passage challenges for five anadromous fish species in the Penobscot River, Maine, upstream of two existing dams, both under current conditions and following the proposed removal of the dams. Potential depth-challenge zones were predicted for larger species at the lowest flow modeled in the dam-removal scenario. Increasing flows under both scenarios increased the number and size of potential velocity-challenge zones, especially for smaller species. This application of the two-dimensional CFD–HSI model demonstrated its capabilities to estimate the potential effects of flow and hydraulic alteration on the passage of migratory fish.
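
    The challenge-zone test implied by the abstract reduces to masking model cells against species-specific depth and velocity limits. The sketch below uses invented grids and thresholds; the study's actual hydraulic model output and species criteria are not shown.

```python
# Sketch of flagging passage-challenge zones on a 2-D hydraulic model grid.
# Depth/velocity arrays and the species limits are illustrative assumptions.
import numpy as np

depth = np.array([[0.2, 0.8, 1.5],
                  [0.1, 0.6, 2.0]])        # metres, per model cell
velocity = np.array([[0.5, 2.4, 1.0],
                     [3.1, 0.9, 0.7]])     # metres per second, per cell

MIN_DEPTH, MAX_VELOCITY = 0.3, 2.0         # hypothetical limits for one species
print("depth-challenge cells:", np.argwhere(depth < MIN_DEPTH).tolist())
print("velocity-challenge cells:", np.argwhere(velocity > MAX_VELOCITY).tolist())
```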

  11. Systematic procedure for identifying the five main ossification stages of the medial clavicular epiphysis using computed tomography: a practical proposal for forensic age diagnostics.

    Science.gov (United States)

    Wittschieber, Daniel; Schulz, Ronald; Pfeiffer, Heidi; Schmeling, Andreas; Schmidt, Sven

    2017-01-01

    In forensic age estimation of living individuals, computed tomography of the clavicle is widely used for determining the age of majority. To this end, the degree of ossification of the medial clavicular epiphysis can be determined by means of two classification systems complementing each other: a 5-stage system and an additional 6-stage system that further sub-classifies stages 2 and 3. In recent years, practical experience and new data revealed that difficulties and even wrong stage determinations may occur, especially when following only the short descriptions of the fundamental 5-stage system. Based on current literature, this article provides a systematic procedure for identifying the five main ossification stages by listing important preconditions and presenting an algorithm comprising four specific questions. Each question is accompanied by comprehensive and detailed descriptions which specify the criteria used for differentiation. The information is subdivided into "single-slice view" and "multi-slice view." In addition, illustrative case examples and schematic drawings facilitate application of the procedure in forensic practice. The pitfalls associated with the criteria of stage determination are discussed in detail. Finally, two general rules are inferred for assigning correct ossification stages of the medial clavicular epiphysis by means of computed tomography.

  12. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    Full Text Available The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism against ZKV. Here, we develop the rationale and strategy for a new approach to developing CTL vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DENV (dengue virus) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open source Protein Data Bank (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.

  13. Selection of personalized patient therapy through the use of knowledge-based computational models that identify tumor-driving signal transduction pathways.

    Science.gov (United States)

    Verhaegh, Wim; van Ooijen, Henk; Inda, Márcia A; Hatzis, Pantelis; Versteeg, Rogier; Smid, Marcel; Martens, John; Foekens, John; van de Wiel, Paul; Clevers, Hans; van de Stolpe, Anja

    2014-06-01

    Increasing knowledge about signal transduction pathways as drivers of cancer growth has elicited the development of "targeted drugs," which inhibit aberrant signaling pathways. They require a companion diagnostic test that identifies the tumor-driving pathway; however, currently available tests like estrogen receptor (ER) protein expression for hormonal treatment of breast cancer do not reliably predict therapy response, at least in part because they do not adequately assess functional pathway activity. We describe a novel approach to predict signaling pathway activity based on knowledge-based Bayesian computational models, which interpret quantitative transcriptome data as the functional output of an active signaling pathway, by using expression levels of transcriptional target genes. Following calibration on only a small number of cell lines or cohorts of patient data, they provide a reliable assessment of signaling pathway activity in tumors of different tissue origin. As proof of principle, models for the canonical Wnt and ER pathways are presented, including initial clinical validation on independent datasets from various cancer types. ©2014 American Association for Cancer Research.
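
    As a schematic illustration of the idea (not the authors' calibrated models), pathway activity can be scored from target-gene expression with a two-state Bayesian update. All probabilities below are invented placeholders for what calibration would supply.

```python
# Toy two-state naive Bayes sketch: P(pathway active | target-gene expression).
# The per-gene likelihoods stand in for values learned during calibration.
import numpy as np

p_high_given_active = np.array([0.9, 0.8, 0.85])    # assumed likelihoods
p_high_given_inactive = np.array([0.2, 0.3, 0.25])
prior_active = 0.5

observed_high = np.array([True, True, False])       # one tumor sample

def likelihood(p_high, obs):
    # product over genes of P(observation | state)
    return np.prod(np.where(obs, p_high, 1.0 - p_high))

num = likelihood(p_high_given_active, observed_high) * prior_active
den = num + likelihood(p_high_given_inactive, observed_high) * (1 - prior_active)
print("P(pathway active | data) =", round(num / den, 3))
```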

  14. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Korfiati, Aigli; Theofilatos, Konstantinos A.; Likothanassis, Spiridon D.; Tsakalidis, Athanasios K.; Mavroudi, Seferina P.

    2013-01-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and by emphasizing the computational issues that arise from the miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers even if they work on just a single step. © 2013 Elsevier Inc.

  15. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2013-06-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and by emphasizing the computational issues that arise from the miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers even if they work on just a single step. © 2013 Elsevier Inc.

  16. Prevalence and Clinical Import of Thoracic Injury Identified by Chest Computed Tomography but Not Chest Radiography in Blunt Trauma: Multicenter Prospective Cohort Study.

    Science.gov (United States)

    Langdorf, Mark I; Medak, Anthony J; Hendey, Gregory W; Nishijima, Daniel K; Mower, William R; Raja, Ali S; Baumann, Brigitte M; Anglin, Deirdre R; Anderson, Craig L; Lotfipour, Shahram; Reed, Karin E; Zuabi, Nadia; Khan, Nooreen A; Bithell, Chelsey A; Rowther, Armaan A; Villar, Julian; Rodriguez, Robert M

    2015-12-01

    Chest computed tomography (CT) diagnoses more injuries than chest radiography, so-called occult injuries. Wide availability of chest CT has driven a substantial increase in emergency department use, although the incidence and clinical significance of chest CT findings have not been fully described. We determine the frequency, severity, and clinical import of occult injury, as determined by changes in management. These data will better inform clinical decisions, the need for chest CT, and the odds of intervention. Our sample included prospective data (2009 to 2013) on 5,912 patients at 10 Level I trauma center EDs with both chest radiography and chest CT at physician discretion. These patients were 40.6% of 14,553 enrolled in the parent study who had either chest radiography or chest CT. Occult injuries were pneumothorax, hemothorax, sternal or greater than 2 rib fractures, pulmonary contusion, thoracic spine or scapula fracture, and diaphragm or great vessel injury found on chest CT but not on preceding chest radiography. A priori, we categorized thoracic injuries as major (requiring invasive procedures), minor (observation or inpatient pain control >24 hours), or of no clinical significance. The primary outcome was the prevalence and proportion of occult injury with major interventions of chest tube, mechanical ventilation, or surgery. The secondary outcome was minor interventions of admission rate or observation hours because of occult injury. Two thousand forty-eight patients (34.6%) had chest injury on chest radiography or chest CT, whereas 1,454 of these patients (71.0%, 24.6% of all patients) had occult injury. Of these, in 954 patients (46.6% of injured, 16.1% of total), chest CT found injuries not observed on immediately preceding chest radiography. In 500 more patients (24.4% of injured patients, 8.5% of all patients), chest radiography found some injury, but chest CT found occult injury. Chest radiography found all injuries in only 29.0% of injured patients. Two hundred and two

  17. A rare case of extensive skeletal muscle metastases in adenocarcinoma cervix identified by 18F-fluorodeoxyglucose positron emission tomography/computed tomography scan

    International Nuclear Information System (INIS)

    Vishnoi, Madan Gopal; Jain, Anurag; John, Arun Ravi; Paliwal, Dharmesh

    2016-01-01

    Adenocarcinoma cervix is an uncommon histological subtype of carcinoma cervix, and the incidence of skeletal muscle metastases within it is even rarer. We report the identification of extensive fluorodeoxyglucose (FDG)-avid metastatic skeletal muscle deposits in a known case of adenocarcinoma cervix. The largest lesion, representative of muscle deposit in the right deltoid, was histopathologically confirmed to be metastatic poorly differentiated carcinoma. This report also highlights the value of 18F-FDG positron emission tomography/computed tomography (PET/CT) over conventional imaging modalities such as CT and ultrasonography, as it better describes the invasiveness as well as the extent of disease in carcinoma cervix.

  18. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    Science.gov (United States)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems, in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship, give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys by simulation studies to replace time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior in dependence of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.

  19. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process

    Directory of Open Access Journals (Sweden)

    El Sawaf Gamal

    2009-07-01

    Full Text Available Abstract Background The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Results Two computational approaches were applied. The first is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by applying the sequence alignment method to the FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with respect to a control HCV sequence from the Core protein, giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows for a direct comparison of domains for the presence of common hydrophobicity patterns, on which the physical interaction is based. RQA greatly strengthened the reliability of the hypothesis by scoring many cross-recurrences between the FP and CT hydrophobicity patterns, largely outnumbering chance expectations and pointing to putative interaction sites. Intriguingly, mutations in the CT
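
    The first approach above rests on correlating pairwise distance matrices computed from the FP and CT alignments. A minimal sketch of that computation follows, using toy sequences and simple Hamming distances in place of the paper's alignment-derived distances.

```python
# Sketch of the co-evolution test: correlate the off-diagonal entries of
# distance matrices built from two aligned regions (toy data, Hamming distance).
import numpy as np

fp_seqs = ["GFLAH", "GFLSH", "GYLAH", "GFIAH"]   # toy aligned FP fragments
ct_seqs = ["WVLLA", "WVLIA", "WALLA", "WVMLA"]   # toy aligned CT fragments

def distance_matrix(seqs):
    n = len(seqs)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d[i, j] = sum(a != b for a, b in zip(seqs[i], seqs[j]))
    return d

d_fp, d_ct = distance_matrix(fp_seqs), distance_matrix(ct_seqs)
iu = np.triu_indices(len(fp_seqs), k=1)          # upper off-diagonal entries
r = np.corrcoef(d_fp[iu], d_ct[iu])[0, 1]
print("correlation between FP and CT distance matrices:", round(r, 3))
```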

  20. Quantitative perfusion computed tomography measurements of cerebral hemodynamics: Correlation with digital subtraction angiography identified primary and secondary cerebral collaterals in internal carotid artery occlusive disease

    International Nuclear Information System (INIS)

    Cheng Xiaoqing; Tian Jianming; Zuo Changjing; Liu Jia; Zhang Qi; Lu Guangming

    2012-01-01

    Background: The aim of the present study was to assess hemodynamic variations in symptomatic unilateral internal carotid artery occlusion (ICAO) patients with primary collateral flow via the circle of Willis or secondary collateral flow via the ophthalmic artery and/or leptomeningeal collaterals. Methods: Thirty-eight patients with a symptomatic unilateral ICAO were enrolled in the study. Based on digital subtraction angiography (DSA) findings, patients were classified into 2 groups: primary collateral (n = 14) and secondary collateral (n = 24). Collateral flow hemodynamics were investigated with perfusion computed tomography (PCT) by measuring the cerebral blood flow (CBF), cerebral blood volume (CBV) and time to peak (TTP) in the hemispheres ipsilateral and contralateral to the ICAO. Based on the measurements, the ipsilateral-to-contralateral ratio for each parameter was calculated and compared. Results: Irrespective of the collateral patterns, ipsilateral CBF was not significantly different from that of the contralateral hemisphere (P = 0.285); ipsilateral CBV and TTP were significantly increased compared with those of the contralateral hemisphere (P = 0.000 for both CBV and TTP). Furthermore, patients with secondary collaterals had significantly larger ipsilateral-to-contralateral ratios for both CBV (rCBV, P = 0.0197) and TTP (rTTP, P = 0.000) than patients with only primary collaterals. The two groups showed no difference in the ipsilateral-to-contralateral ratio for CBF (rCBF, P = 0.312). Conclusion: Patients with symptomatic unilateral ICAO in our study were in a status of autoregulatory vasodilatation. Moreover, secondary collaterals in ICAO patients were associated with increased ipsilateral CBV and delayed TTP, suggesting severe hemodynamic impairment and presumably an increased risk of ischemic events.

  1. Chest Wall Constriction after the Nuss Procedure Identified from Chest Radiograph and Multislice Computed Tomography Shortly after Removal of the Bar.

    Science.gov (United States)

    Chang, Pei-Yeh; Zeng, Qi; Wong, Kin-Sun; Wang, Chao-Jan; Chang, Chee-Jen

    2016-01-01

    This study radiographically examined the changes in the chest walls of patients with pectus excavatum (PE) after Nuss bar removal, to define the deformation caused by the bar and stabilizer. In the first part of the study, we compared the changes in chest radiographs of patients with PE to a preoperation PE control group. In the second part, we used multislice computed tomography (CT) scans to provide three-dimensional reconstructions with which to evaluate the changes to the thoracic wall. Part 1: From June 2006 to August 2011, 1,125 patients with PE who had posteroanterior chest radiographs taken before undergoing the Nuss procedure at four hospitals were enrolled as a preoperative control group. At the same time, 203 patients who had the bar removed were enrolled as the study group. The maximum dimensions of the outer boundary of the first to ninth rib pairs (R1-R9, rib pair width), chest height, and chest width were measured. Part 2: Thirty-one consecutive patients with PE (20 males and 11 females) who underwent Nuss bar removal were evaluated 7 to 30 days after operation. During this period, a further 34 patients with PE who had undergone CT imaging before bar insertion were evaluated and compared with the postoperative group. Part 1: The width of the lower ribs (R4-R9) after bar removal was significantly less than in the age-matched controls. The ribs adjacent to the bar (R5-R7) showed the greatest restriction. The width of the upper ribs (R1-R3) 2 to 3 years after bar placement did not differ significantly from the controls. Patients who were operated on after 10 years of age had less of a restrictive effect. Three years of bar placement resulted in more restriction than a 2-year period, particularly in patients younger than 10 years old. Part 2: A significant constriction of the chest wall was observed in 13 patients after removal of the Nuss bar. Constriction at ribs 5 to 8 was found to be present adjacent to the site of bar insertion. However

  2. A Fast SVM-Based Tongue’s Colour Classification Aided by k-Means Clustering Identifiers and Colour Attributes as Computer-Assisted Tool for Tongue Diagnosis

    Directory of Open Access Journals (Sweden)

    Nur Diyana Kamarudin

    2017-01-01

    Full Text Available In tongue diagnosis, colour information of the tongue body carries valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to unstable lighting conditions and the naked eye's limited ability to capture the exact colour distribution on the tongue, especially a tongue with multicolour substance. To overcome this ambiguity, this paper presents a two-stage tongue multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters: image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, the true rate classification accuracy of the proposed two-stage classification to diagnose red, light red, and deep red tongue colours is 94%. The number of support vectors in the SVM is reduced by 41.2%, and the execution time for one image is recorded as 48 seconds.

  3. A Fast SVM-Based Tongue's Colour Classification Aided by k-Means Clustering Identifiers and Colour Attributes as Computer-Assisted Tool for Tongue Diagnosis

    Science.gov (United States)

    Ooi, Chia Yee; Kawanabe, Tadaaki; Odaguchi, Hiroshi; Kobayashi, Fuminori

    2017-01-01

    In tongue diagnosis, colour information of the tongue body carries valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to unstable lighting conditions and the naked eye's limited ability to capture the exact colour distribution on the tongue, especially a tongue with multicolour substance. To overcome this ambiguity, this paper presents a two-stage tongue multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters: image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, the true rate classification accuracy of the proposed two-stage classification to diagnose red, light red, and deep red tongue colours is 94%. The number of support vectors in the SVM is reduced by 41.2%, and the execution time for one image is recorded as 48 seconds. PMID:29065640

  4. Computational Characterization of Small Molecules Binding to the Human XPF Active Site and Virtual Screening to Identify Potential New DNA Repair Inhibitors Targeting the ERCC1-XPF Endonuclease

    Directory of Open Access Journals (Sweden)

    Francesco Gentile

    2018-04-01

    Full Text Available The DNA excision repair protein ERCC-1-DNA repair endonuclease XPF (ERCC1-XPF) is a heterodimeric endonuclease essential for the nucleotide excision repair (NER) DNA repair pathway. Although its activity is required to maintain genome integrity in healthy cells, ERCC1-XPF can counteract the effect of DNA-damaging therapies such as platinum-based chemotherapy in cancer cells. Therefore, a promising approach to enhance the effect of these therapies is to combine their use with small molecules which can inhibit the repair mechanisms in cancer cells. Currently, there are no structures available for the catalytic site of the human ERCC1-XPF, which performs the metal-mediated cleavage of a damaged DNA strand at the 5′ side. We adopted a homology modeling strategy to build a structural model of the human XPF nuclease domain which contained the active site, and to extract dominant conformations of the domain using molecular dynamics simulations followed by clustering of the trajectory. We investigated the binding modes of known small molecule inhibitors targeting the active site to build a pharmacophore model. We then performed a virtual screening of the ZINC Is Not Commercial 15 (ZINC15) database to identify new ERCC1-XPF endonuclease inhibitors. Our work provides structural insights regarding the binding mode of small molecules targeting the ERCC1-XPF active site that can be used to rationally optimize such compounds. We also propose a set of new potential DNA repair inhibitors to be considered for combination cancer therapy strategies.

  5. A Fast SVM-Based Tongue's Colour Classification Aided by k-Means Clustering Identifiers and Colour Attributes as Computer-Assisted Tool for Tongue Diagnosis.

    Science.gov (United States)

    Kamarudin, Nur Diyana; Ooi, Chia Yee; Kawanabe, Tadaaki; Odaguchi, Hiroshi; Kobayashi, Fuminori

    2017-01-01

    In tongue diagnosis, colour information of the tongue body carries valuable information regarding the state of disease and its correlation with the internal organs. Qualitatively, practitioners may have difficulty in their judgement due to unstable lighting conditions and the naked eye's limited ability to capture the exact colour distribution on the tongue, especially a tongue with multicolour substance. To overcome this ambiguity, this paper presents a two-stage tongue multicolour classification based on a support vector machine (SVM) whose support vectors are reduced by our proposed k-means clustering identifiers and red colour range for precise tongue colour diagnosis. In the first stage, k-means clustering is used to cluster a tongue image into four clusters: image background (black), deep red region, red/light red region, and transitional region. In the second-stage classification, red/light red tongue images are further classified into red tongue or light red tongue based on the red colour range derived in our work. Overall, the true rate classification accuracy of the proposed two-stage classification to diagnose red, light red, and deep red tongue colours is 94%. The number of support vectors in the SVM is reduced by 41.2%, and the execution time for one image is recorded as 48 seconds.
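
    A minimal sketch of the two-stage scheme described in these records, using scikit-learn: k-means partitions one image's pixel colours into the four named clusters, and an SVM then separates red from light-red images. All data are synthetic stand-ins; the real system derives its red colour range from clinical tongue images.

```python
# Two-stage sketch: k-means colour clustering, then SVM red vs light-red
# classification. Pixel values and class centroids are synthetic assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Stage 1: cluster one image's RGB pixels into four groups (background,
# deep red, red/light red, transitional).
pixels = rng.random((1000, 3))
stage1 = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
print("cluster sizes:", np.bincount(stage1.labels_))

# Stage 2: classify whole images (summarized here by mean RGB) as red vs
# light red with an SVM.
X = np.vstack([rng.normal([0.70, 0.20, 0.20], 0.05, (50, 3)),   # "red"
               rng.normal([0.85, 0.50, 0.50], 0.05, (50, 3))])  # "light red"
y = np.array([0] * 50 + [1] * 50)
clf = SVC(kernel="rbf").fit(X, y)
print("predicted class:", clf.predict([[0.80, 0.45, 0.45]])[0])
```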

  6. Use of Anisotropy, 3D Segmented Atlas, and Computational Analysis to Identify Gray Matter Subcortical Lesions Common to Concussive Injury from Different Sites on the Cortex.

    Directory of Open Access Journals (Sweden)

    Praveen Kulkarni

    Full Text Available Traumatic brain injury (TBI) can occur anywhere along the cortical mantle. While the cortical contusions may be random and disparate in their locations, the clinical outcomes are often similar and difficult to explain. Thus a question that arises is: do concussions at different sites on the cortex affect similar subcortical brain regions? To address this question we used a fluid percussion model to concuss the right caudal or rostral cortices in rats. Five days later, diffusion tensor MRI data were acquired for indices of anisotropy (IA) for use in a novel method of analysis to detect changes in gray matter microarchitecture. IA values from over 20,000 voxels were registered into a 3D segmented, annotated rat atlas covering 150 brain areas. Comparisons between left and right hemispheres revealed a small population of subcortical sites with altered IA values. Rostral and caudal concussions were of striking similarity in the impacted subcortical locations, particularly the central nucleus of the amygdala, laterodorsal thalamus, and hippocampal complex. Subsequent immunohistochemical analysis of these sites showed significant neuroinflammation. This study presents three significant findings that advance our understanding and evaluation of TBI: 1) the introduction of a new method to identify highly localized disturbances in discrete gray matter, subcortical brain nuclei without postmortem histology, 2) the use of this method to demonstrate that separate injuries to the rostral and caudal cortex produce the same subcortical disturbances, and 3) the central nucleus of the amygdala, critical in the regulation of emotion, is vulnerable to concussion.

  7. Trifoliate hybrids as rootstocks for Pêra sweet orange tree

    OpenAIRE

    Jorgino Pompeu Junior; Silvia Blumer

    2014-01-01

    The Rangpur lime (Citrus limonia) has been used as the main rootstock for Pêra sweet orange (C. sinensis) trees. However, its susceptibility to citrus blight and citrus sudden death has led to the use of disease-tolerant rootstocks, such as Cleopatra mandarin (C. reshni), Sunki mandarin (C. sunki) and Swingle citrumelo (C. paradisi x Poncirus trifoliata), which are more susceptible to drought than the Rangpur lime. These mandarin varieties are also less resistant to root rot caused by Phytophthor...

  8. The use of electronic alerts in primary care computer systems to identify the excessive prescription of short-acting beta2-agonists for people with asthma: a systematic review.

    Science.gov (United States)

    McKibben, Shauna; De Simoni, Anna; Bush, Andy; Thomas, Mike; Griffiths, Chris

    2018-04-16

    Computers are increasingly used to improve prescribing decisions in the management of long-term conditions; however, the effects on asthma prescribing remain unclear. We aimed to synthesise the evidence for the use of computerised alerts that identify excessive prescribing of short-acting beta2-agonists (SABAs) to improve asthma management for people with asthma. MEDLINE, CINAHL, Embase, Cochrane and Scopus databases (1990-2016) were searched for randomised controlled trials using electronic alerts to identify excessive prescribing of SABAs for people with asthma in primary care. Inclusion eligibility, quality appraisal (Cochrane risk of bias tool) and data extraction were performed by two independent reviewers. Findings were synthesised narratively. A total of 2035 articles were screened and four trials were eligible. Three studies had low risk of bias: one reported a positive effect on our primary outcome of interest, excessive SABA prescribing; another reported positive effects on the ratio of inhaled corticosteroid (ICS) to SABA prescribing, and on asthma control; a third reported no effect on outcomes of interest. One study at high risk of bias reported a reduction in exacerbations and primary care consultations. There is some evidence that electronic alerts reduce excessive prescribing of SABAs when delivered as part of a multicomponent intervention in an integrated health care system. However, due to the variation in health care systems, intervention design and outcomes measured, further research is required to establish the optimal design of alerting and intervening systems.

  9. Indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced computed tomography: Assessment of the additional diagnostic value of contrast-enhanced ultrasound in the non-cirrhotic liver

    International Nuclear Information System (INIS)

    Quaia, Emilio; De Paoli, Luca; Angileri, Roberta; Cabibbo, Biagio; Cova, Maria Assunta

    2014-01-01

    Objective: To assess the additional diagnostic value of contrast-enhanced ultrasound (CEUS) in the characterization of indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced computed tomography (CT). Methods: Fifty-five solid hepatic lesions (1–4 cm in diameter) in 46 non-cirrhotic patients (26 female, 20 male; mean age ± SD, 55 ± 10 years) underwent CEUS after being detected on contrast-enhanced CT that was considered non-diagnostic after on-site analysis. Two blinded independent readers assessed the CT and CEUS scans and were asked to retrospectively classify each lesion as malignant or benign based on reference diagnostic criteria for the different hepatic lesion histotypes. Diagnostic accuracy and confidence (area under the ROC curve, Az) were assessed by using gadobenate dimeglumine-enhanced magnetic resonance (MR) imaging (n = 30 lesions), histology (n = 7 lesions), or US follow-up (n = 18 lesions) as the reference standards. Results: Final diagnoses included 29 hemangiomas, 3 focal nodular hyperplasias, 1 hepatocellular adenoma, and 22 metastases. The additional review of CEUS after CT images significantly improved (P < 0.05) the diagnostic accuracy (before vs after CEUS review: 49% [20/55] vs 89% [49/55] for reader 1 and 43% [24/55] vs 92% [51/55] for reader 2) and confidence (Az [95% confidence interval] before vs after CEUS review: 0.773 [0.652–0.895] vs 0.997 [0.987–1] for reader 1 and 0.831 [0.724–0.938] vs 0.998 [0.992–1] for reader 2). Conclusions: CEUS improved the characterization of indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced CT by identifying some specific contrast enhancement patterns.

  10. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  11. “Drug mules” as a radiological challenge: Sensitivity and specificity in identifying internal cocaine in body packers, body pushers and body stuffers by computed tomography, plain radiography and Lodox

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Patricia M., E-mail: patricia.flach@irm.uzh.ch [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Department of Neuroradiology, Inselspital Bern, University of Bern, 3010 Bern (Switzerland); Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Department of Radiology, University Hospital USZ, University of Zurich, Raemistrasse 100, 8091 Zurich (Switzerland); Ross, Steffen G. [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Ampanozi, Garyfalia; Ebert, Lars [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Germerott, Tanja; Hatch, Gary M. [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Thali, Michael J. [Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Bern, Buehlstrasse 20, 3012 Bern (Switzerland); Centre for Forensic Imaging and Virtopsy, Institute of Forensic Medicine, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Patak, Michael A. [Department of Radiology, Inselspital Bern, University of Bern, 3010 Bern (Switzerland); Department of Radiology, University Hospital USZ, University of Zurich, Raemistrasse 100, 8091 Zurich (Switzerland)

    2012-10-15

    Purpose: The purpose of our study was to retrospectively evaluate the specificity, sensitivity and accuracy of computed tomography (CT), digital radiography (DR) and low-dose linear slit digital radiography (LSDR, Lodox®) in the detection of internal cocaine containers. Methods: Institutional review board approval was obtained. The study consisted of 83 patients (76 males, 7 females, 16–45 years) suspected of having incorporated cocaine drug containers. All underwent radiological imaging; a total of 135 exams were performed: nCT = 35, nDR = 70, nLSDR = 30. An overall calculation for all “drug mules” and a specific evaluation of body packers, pushers and stuffers were performed. The gold standard was stool examination in a dedicated holding cell equipped with a drug toilet. Results: There were 54 drug mules identified in this study. CT of all drug carriers showed the highest diagnostic accuracy (97.1%), sensitivity (100%) and specificity (94.1%). DR in all cases was 71.4% accurate, 58.3% sensitive and 85.3% specific. LSDR of all patients with internal cocaine was 60% accurate, 57.9% sensitive and 63.4% specific. Conclusions: CT was the most accurate test studied. Therefore, the detection of internal cocaine drug packs should be performed by CT, rather than by conventional X-ray, in order to apply the most sensitive exam in the medico-legal investigation of suspected drug carriers. Nevertheless, the higher radiation dose applied by CT than by DR or LSDR needs to be considered. Future studies should include evaluation of low-dose CT protocols in order to address germane issues and to reduce dosage.
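
    The metrics reported above follow directly from confusion-matrix counts. The sketch below shows the arithmetic with illustrative counts, not the study's data.

```python
# Sensitivity, specificity and accuracy from confusion-matrix counts
# (counts below are illustrative, not taken from the study).
def diagnostic_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)        # true positive rate
    specificity = tn / (tn + fp)        # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_metrics(tp=18, fp=1, tn=16, fn=0)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} accuracy={acc:.1%}")
```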

  12. “Drug mules” as a radiological challenge: Sensitivity and specificity in identifying internal cocaine in body packers, body pushers and body stuffers by computed tomography, plain radiography and Lodox

    International Nuclear Information System (INIS)

    Flach, Patricia M.; Ross, Steffen G.; Ampanozi, Garyfalia; Ebert, Lars; Germerott, Tanja; Hatch, Gary M.; Thali, Michael J.; Patak, Michael A.

    2012-01-01

    Purpose: The purpose of our study was to retrospectively evaluate the specificity, sensitivity and accuracy of computed tomography (CT), digital radiography (DR) and low-dose linear slit digital radiography (LSDR, Lodox®) in the detection of internal cocaine containers. Methods: Institutional review board approval was obtained. The study consisted of 83 patients (76 males, 7 females, 16–45 years) suspected of having incorporated cocaine drug containers. All underwent radiological imaging; a total of 135 exams were performed: nCT = 35, nDR = 70, nLSDR = 30. An overall calculation for all “drug mules” and a specific evaluation of body packers, pushers and stuffers were performed. The gold standard was stool examination in a dedicated holding cell equipped with a drug toilet. Results: There were 54 drug mules identified in this study. CT of all drug carriers showed the highest diagnostic accuracy (97.1%), sensitivity (100%) and specificity (94.1%). DR in all cases was 71.4% accurate, 58.3% sensitive and 85.3% specific. LSDR of all patients with internal cocaine was 60% accurate, 57.9% sensitive and 63.4% specific. Conclusions: CT was the most accurate test studied. Therefore, the detection of internal cocaine drug packs should be performed by CT, rather than by conventional X-ray, in order to apply the most sensitive exam in the medico-legal investigation of suspected drug carriers. Nevertheless, the higher radiation dose applied by CT than by DR or LSDR needs to be considered. Future studies should include evaluation of low-dose CT protocols in order to address germane issues and to reduce dosage.

  13. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration is carried out in a parallel computer that includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, each of which includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying the location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
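
    The rerouting idea in this record can be sketched as a shortest-path search in the second, independent network once a link in the first is flagged defective. The toy topology below is an illustrative assumption, not the patented implementation.

```python
# Sketch: when link (0, 3) in network A is defective, find a route between the
# same nodes through independent network B using breadth-first search.
from collections import deque

network_b = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}   # toy topology

def bfs_route(graph, src, dst):
    parent, queue = {src: None}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:                      # walk parents back to the source
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nbr in graph[node]:
            if nbr not in parent:
                parent[nbr] = node
                queue.append(nbr)
    return None                              # no route available

defective_link = (0, 3)                      # identified in network A
print("reroute via network B:", bfs_route(network_b, *defective_link))
```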

  14. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
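
    The likelihood-maximization step that MXLKID performs can be illustrated with a one-parameter example: under Gaussian measurement noise, maximizing the likelihood is equivalent to minimizing the weighted squared residuals. The model, data, and optimizer below are illustrative stand-ins, not the LRLTRAN implementation.

```python
# One-parameter maximum likelihood identification sketch (not MXLKID itself).
# Noisy observations of an exponential decay; estimate the rate constant k.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
t = np.linspace(0.0, 5.0, 50)
k_true, sigma = 0.8, 0.05
y = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

def neg_log_likelihood(k):
    residuals = y - np.exp(-k * t)                  # model: y(t) = exp(-k t)
    return 0.5 * np.sum((residuals / sigma) ** 2)   # Gaussian-noise NLL

res = minimize_scalar(neg_log_likelihood, bounds=(0.01, 5.0), method="bounded")
print("maximum likelihood estimate of k:", round(res.x, 3))
```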

  15. 31P MR spectroscopy and computational modeling identify a direct relation between Pi content of an alkaline compartment in resting muscle and phosphocreatine resynthesis kinetics in active muscle in humans.

    Directory of Open Access Journals (Sweden)

    Joep W M van Oorschot

    Full Text Available The assessment of mitochondrial properties in skeletal muscle is important in clinical research, for instance in the study of diabetes. The gold standard to measure mitochondrial capacity non-invasively is the phosphocreatine (PCr) recovery rate after exercise, measured by 31P Magnetic Resonance spectroscopy (31P MRS). Here, we sought to expand the evidence base for an alternative method to assess mitochondrial properties which uses 31P MRS measurement of the Pi content of an alkaline compartment attributed to mitochondria (Pi2, as opposed to cytosolic Pi, Pi1) in resting muscle at high magnetic field. Specifically, the PCr recovery rate in human quadriceps muscle was compared with the signal intensity of the Pi2 peak in subjects with varying mitochondrial content of the quadriceps muscle as a result of athletic training, and the results were entered into a mechanistic computational model of mitochondrial metabolism in muscle to test whether the empirical relation between the Pi2/Pi1 ratio and PCr recovery was consistent with theory. Localized 31P spectra were obtained at 7T from resting vastus lateralis muscle to measure the intensity of the Pi2 peak. In the endurance-trained athletes a Pi2/Pi1 ratio of 0.07 ± 0.01 was found, compared to a significantly lower (p < 0.05) Pi2/Pi1 ratio of 0.03 ± 0.01 in the normally active group. Next, PCr recovery kinetics after in-magnet bicycle exercise were measured at 1.5T. For the endurance-trained athletes, a time constant τPCr of 12 ± 3 s was found, compared to 24 ± 5 s in normally active subjects. Without any parameter optimization the computational model prediction matched the experimental data well (r² of 0.75). Taken together, these results suggest that the Pi2 resonance in resting human skeletal muscle observed at 7T provides a quantitative MR-based functional measure of mitochondrial density.
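
    The PCr recovery time constant cited above is conventionally obtained by fitting a monoexponential to post-exercise PCr data. The sketch below does this with synthetic data; the parameter values are illustrative, not the study's measurements.

```python
# Monoexponential fit of post-exercise PCr recovery to extract tau.
# Synthetic data; pcr_end, delta and tau values are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def pcr_recovery(t, pcr_end, delta, tau):
    return pcr_end + delta * (1.0 - np.exp(-t / tau))

rng = np.random.default_rng(4)
t = np.arange(0.0, 120.0, 4.0)               # seconds after end of exercise
data = pcr_recovery(t, 20.0, 15.0, 24.0) + rng.normal(0.0, 0.5, t.size)

popt, _ = curve_fit(pcr_recovery, t, data, p0=(15.0, 10.0, 30.0))
print("fitted tau (s):", round(popt[2], 1))  # cf. 12 s athletes vs 24 s controls
```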

  16. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  17. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  18. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  19. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  20. Sparse Linear Identifiable Multivariate Modeling

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2011-01-01

In this paper we consider sparse and identifiable linear latent variable (factor) and linear Bayesian network models for parsimonious analysis of multivariate data. We propose a computationally efficient method for joint parameter and model inference, and model comparison. It consists of a fully Bayesian hierarchy for sparse models, and is benchmarked on artificial and real biological data sets. SLIM is closest in spirit to LiNGAM (Shimizu et al., 2006), but differs substantially in inference, Bayesian network structure learning and model comparison. Experimentally, SLIM performs equally well or better than LiNGAM with comparable computational complexity.

  1. Thoughts on identifiers

    CERN Multimedia

    CERN. Geneva

    2005-01-01

As business processes and information transactions have become inextricably intertwined with the Web, the importance of assignment, registration, discovery, and maintenance of identifiers has increased. In spite of this, integrated frameworks for managing identifiers have been slow to emerge. Instead, identification systems arise (quite naturally) from immediate business needs without consideration for how they fit into larger information architectures. In addition, many legacy identifier systems further complicate the landscape, making it difficult for content managers to select and deploy identifier systems that meet both the business case and long term information management objectives. This presentation will outline a model for evaluating identifier applications and the functional requirements of the systems necessary to support them. The model is based on a layered analysis of the characteristics of identifier systems, including: * Functional characteristics * Technology * Policy * Business * Social T...

  2. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of "characterization problems" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  3. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  4. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  5. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  6. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code, and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  7. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

Full Text Available It is crucial that gifted and talented students be supported by different educational methods according to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since it caters for the creative function of the basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process including basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  8. Identifying Strategic Scientific Opportunities

    Science.gov (United States)

    As NCI's central scientific strategy office, CRS collaborates with the institute's divisions, offices, and centers to identify research opportunities to advance NCI's vision for the future of cancer research.

  9. Identifying Breast Cancer Oncogenes

    Science.gov (United States)

    2011-10-01

[Fragments of a scanned report form; recoverable content:] Award W81XWH-08-1-0767, “Identifying Breast Cancer Oncogenes”; Principal Investigator: Yashaswi Shrestha (Dana-Farber). From the report body: “… cells we observed that it promoted transformation of HMLE cells, suggesting a tumor suppressive role of Merlin in breast cancer (Figure 4B).”

  10. Using computer technology to identify the appropriate radioactive materials packaging

    International Nuclear Information System (INIS)

    Driscoll, K.L.; Conan, M.R.

    1989-01-01

The Radioactive Materials Packaging (RAMPAC) database is designed to store and retrieve information on all non-classified packages certified for the transport of radioactive materials within the boundaries of the US. The information in RAMPAC is publicly available, and the database has been designed so that individuals without programming experience can search for and retrieve information using a menu-driven system. RAMPAC currently contains information on over 650 radioactive material shipping packages. Information is gathered from the US Department of Energy (DOE), the US Department of Transportation (DOT), and the US Nuclear Regulatory Commission (NRC). RAMPAC is the only tool available to radioactive material shippers that contains and reports packaging information from all three Federal Agencies. The DOT information includes package listings from Canada, France, Germany, Great Britain, and Japan, which have DOT revalidations for their certificates of competent authority and are authorized for use within the US for import and export shipments only. RAMPAC was originally developed in 1981 by DOE as a research and development tool. In recent years, however, RAMPAC has proven to be highly useful to operational personnel. As packages become obsolete or materials to be transported change, shippers of radioactive materials must be able to determine if alternative packages exist before designing new packages. RAMPAC is designed to minimize the time required to make this determination, thus assisting the operational community in meeting their goals.
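
    The kind of retrieval the abstract describes amounts to filtering certified-package records on the properties of the planned shipment. The sketch below invents a minimal record layout for illustration; RAMPAC's actual schema is not described in the source.

```python
# Hypothetical records; field names are assumptions, not RAMPAC's schema.
packages = [
    {"id": "USA/9516/B(U)", "agency": "DOE", "contents": "solid",  "status": "active"},
    {"id": "USA/6400/B",    "agency": "NRC", "contents": "liquid", "status": "obsolete"},
]

def find_alternatives(records, contents, agencies=("DOE", "DOT", "NRC")):
    """Return active certified packages suitable for the material to be shipped."""
    return [r for r in records
            if r["status"] == "active"
            and r["contents"] == contents
            and r["agency"] in agencies]

print(find_alternatives(packages, "solid"))
```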

  11. Computational Approach to Identify Different Injuries by Firearms.

    Science.gov (United States)

    Costa, Sarah Teixeira; Freire, Alexandre Rodrigues; Matoso, Rodrigo Ivo; Daruge Júnior, Eduardo; Rossi, Ana Cláudia; Prado, Felippe Bevilacqua

    2017-03-01

Complications arise in the analysis of gunshot wounds to the maxillofacial region when neither the projectile nor the gun is found at the crime scene. We simulated 5- and 15-cm firing distances at a human mandible to investigate the external morphology of entrance wounds based on firing range. The ammunition models, .40-caliber S&W, .380-caliber, and 9 × 19-mm Luger, were constructed with free-form NURBS surfaces. In a dynamic simulation, projectiles were fired against a 3D model of the mandibular body at 5 and 15 cm. All entrance wounds presented an oval aspect. Maximum diameter and von Mises stress values were 16.5 mm and 50.8 MPa, both for the .40-caliber S&W fired at 5 cm. The maximum energy loss was 138.4 J for the .40 S&W fired at 15 cm. In conclusion, the mandible was most affected by the .40-caliber S&W, and morphological differences were observable in holes caused by different incoming projectile calibers fired at different distances. © 2017 American Academy of Forensic Sciences.

  12. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  13. Distributed Persistent Identifiers System Design

    Directory of Open Access Journals (Sweden)

    Pavel Golodoniuc

    2017-06-01

Full Text Available The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementation, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have, by and large, catered for identifier uniqueness, integrity, and persistence, regardless of the identifier’s application domain. Trustworthiness of these systems has been measured by the criteria first defined by Bütikofer (2009) and further elaborated by Golodoniuc et al. (2016) and Car et al. (2017). Since many PID systems have been largely conceived and developed by a single organisation, they faced challenges for widespread adoption and, most importantly, the ability to survive change of technology. We believe that a cause of once-successful PID systems fading away is the centralisation of support infrastructure – both organisational and computing and data storage systems. In this paper, we propose a PID system design that implements the pillars of a trustworthy system – ensuring identifiers’ independence of any particular technology or organisation, implementation of core PID system functions, separation from data delivery, and enabling the system to adapt to future change. We propose decentralisation at all levels — persistent identifiers and information objects registration, resolution, and data delivery — using Distributed Hash Tables and traditional peer-to-peer networks with information replication and caching mechanisms, thus eliminating the need for a central PID data store. This will increase overall system fault tolerance, thus ensuring its trustworthiness. We also discuss important aspects of the distributed system’s governance, such as the notion of the authoritative source and data integrity.
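
    The decentralised resolution layer the authors argue for can be pictured with a toy consistent-hash ring in which every identifier is owned by several nodes. This is a generic DHT sketch under assumed names, not the paper's design.

```python
import hashlib
from bisect import bisect

def h(key):
    """Stable position on the ring for node names and identifiers alike."""
    return int(hashlib.sha256(key.encode()).hexdigest(), 16)

class PidRing:
    def __init__(self, nodes, replicas=2):
        self.ring = sorted((h(n), n) for n in nodes)
        self.replicas = replicas          # replication removes the central store

    def owners(self, pid):
        keys = [k for k, _ in self.ring]
        start = bisect(keys, h(pid)) % len(self.ring)
        return [self.ring[(start + i) % len(self.ring)][1]
                for i in range(self.replicas)]

ring = PidRing(["node-a", "node-b", "node-c"])
print(ring.owners("pid:example-1234"))    # the nodes able to resolve this PID
```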

  14. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  15. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  16. Cloud Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  17. Cloud Computing (2/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  18. Identifying Knowledge and Communication

    Directory of Open Access Journals (Sweden)

    Eduardo Coutinho Lourenço de Lima

    2006-12-01

Full Text Available In this paper, I discuss how the principle of identifying knowledge which Strawson advances in ‘Singular Terms and Predication’ (1961), and in ‘Identifying Reference and Truth-Values’ (1964), turns out to constrain communication. The principle states that a speaker’s use of a referring expression should invoke identifying knowledge on the part of the hearer, if the hearer is to understand what the speaker is saying, and also that, in so referring, speakers are attentive to hearers’ epistemic states. In contrasting it with Russell’s Principle (Evans 1982), as well as with the principle of identifying descriptions (Donnellan 1970), I try to show that the principle of identifying knowledge, ultimately a condition for understanding, makes sense only in a situation of conversation. This allows me to conclude that the cooperative feature of communication (Grice 1975) and reference (Clark and Wilkes-Gibbs 1986) holds also at the understanding level. Finally, I discuss where Strawson’s views seem to be unsatisfactory, and suggest how they might be improved.

  19. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  20. Identifying and Managing Risk.

    Science.gov (United States)

    Abraham, Janice M.

    1999-01-01

    The role of the college or university chief financial officer in institutional risk management is (1) to identify risk (physical, casualty, fiscal, business, reputational, workplace safety, legal liability, employment practices, general liability), (2) to develop a campus plan to reduce and control risk, (3) to transfer risk, and (4) to track and…

  1. Internally readable identifying tag

    International Nuclear Information System (INIS)

    Jefferts, K.B.; Jefferts, E.R.

    1980-01-01

A method of identifying non-metallic objects by means of X-ray equipment is described in detail. A small metal pin with a number of grooves cut in a pre-determined equi-spaced pattern is implanted into the non-metallic object and by decoding the groove patterns using X-ray equipment, the object is uniquely identified. A specific example of such an application is in studying the migratory habits of fish. The pin inserted into the snout of the fish is 0.010 inch in diameter, 0.040 inch in length with 8 possible positions for grooves if spaced 0.005 inch apart. With 6 of the groove positions available for data, the capacity is 2^6, or 64, combinations; clearly longer pins would increase the data capacity. This method of identification is a major advance over previous techniques which necessitated destruction of the fish in order to recover the identification tag. (UK)
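
    The capacity arithmetic invites a tiny encoder/decoder. In the sketch below the two non-data positions are treated as reference grooves at either end; that use of the reserved positions is an assumption for illustration, since the source only states that 6 of the 8 positions carry data.

```python
def encode(tag_id):
    """Map a 6-bit identifier onto 8 groove positions (2**6 = 64 codes)."""
    assert 0 <= tag_id < 64
    bits = [(tag_id >> i) & 1 for i in range(6)]
    return [1, *bits, 1]                  # assumed reference grooves at each end

def decode(grooves):
    assert len(grooves) == 8 and grooves[0] == grooves[7] == 1
    return sum(bit << i for i, bit in enumerate(grooves[1:7]))

pattern = encode(42)                      # pattern read back from the X-ray image
assert decode(pattern) == 42
```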

  2. Identifying Breast Cancer Oncogenes

    Science.gov (United States)

    2010-10-01

[Fragments of a scanned report form; recoverable content:] The project aims to identify novel drivers of breast oncogenesis, hypothesizing that a kinase gain-of-function screen can improve therapeutics and, consequently, clinical outcomes: a pBabe-Puro-Myr-Flag kinase open reading frame (ORF) library was screened in immortalized human mammary epithelial cells. A further fragment notes that, like other tyrosine kinases with an SH3, SH2 and catalytic domain, the protein in question lacks a native myristylation signal shared by most members of this class [14], [38].

  3. Rock disposal problems identified

    Energy Technology Data Exchange (ETDEWEB)

    Knox, R

    1978-06-01

Mathematical models are the only way of examining the return of radioactivity from nuclear waste to the environment over long periods of time. Work in Britain has helped identify areas where more basic data is required, but initial results look very promising for final disposal of high level waste in hard rock repositories. A recent study reported by the National Radiological Protection Board is examined.

  4. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  5. Grid Computing

    Indian Academy of Sciences (India)

A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  6. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  7. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  8. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  9. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  10. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  11. Spatial Computation

    Science.gov (United States)

    2003-12-01

[Fragments of a scanned report; recoverable content:] The work compares Spatial Computation and today’s microprocessors with approaches to operating system architecture, in particular the controversy between microkernels and monolithic kernels. Both Spatial Computation and microkernels break a relatively monolithic architecture into individual lightweight pieces, well specialized for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address space.

  12. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

An ideal computing system is a computing system with ideal characteristics. The major components of such a hypothetical system, and their performance characteristics, can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  13. Identifying phenomenal consciousness.

    Science.gov (United States)

    Schier, Elizabeth

    2009-03-01

    This paper examines the possibility of finding evidence that phenomenal consciousness is independent of access. The suggestion reviewed is that we should look for isomorphisms between phenomenal and neural activation spaces. It is argued that the fact that phenomenal spaces are mapped via verbal report is no problem for this methodology. The fact that activation and phenomenal space are mapped via different means does not mean that they cannot be identified. The paper finishes by examining how data addressing this theoretical question could be obtained.

  14. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  15. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  16. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  17. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  18. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  19. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  20. List identifies threatened ecosystems

    Science.gov (United States)

    Showstack, Randy

    2012-09-01

    The International Union for Conservation of Nature (IUCN) announced on 9 September that it will develop a new Red List of Ecosystems that will identify which ecosystems are vulnerable or endangered. The list, which is modeled on the group's Red List of Threatened Species™, could help to guide conservation activities and influence policy processes such as the Convention on Biological Diversity, according to the group. “We will assess the status of marine, terrestrial, freshwater, and subterranean ecosystems at local, regional, and global levels,” stated Jon Paul Rodriguez, leader of IUCN's Ecosystems Red List Thematic Group. “The assessment can then form the basis for concerted implementation action so that we can manage them sustainably if their risk of collapse is low or restore them if they are threatened and then monitor their recovery.”

  1. Global Microbial Identifier

    DEFF Research Database (Denmark)

    Wielinga, Peter; Hendriksen, Rene S.; Aarestrup, Frank Møller

    2017-01-01

Whole genome sequencing (WGS) is increasingly used for the typing of microorganisms, for the identification of relevant genes and for the comparison of genomes to detect outbreaks and emerging pathogens. WGS will likely also enable a much better understanding of the pathogenesis of the infection and the molecular basis of the host response to infection. But the full potential of these advances will only transpire if the data in this area become transferable and thereby comparable, preferably in open-source systems. To harness the full potential of WGS, a shared global database of genomes linked to relevant metadata and the necessary software tools needs to be generated, hence the global microbial identifier (GMI) initiative. This tool will ideally be used, amongst others, in the diagnosis of infectious diseases in humans and animals, in the identification of microorganisms in food and environment, and to track and trace microbial agents in all arenas globally. This will require...

  2. Collectively loading an application in a parallel computer

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
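
    Stripped of the control-system machinery, the collective load is one broadcast from the leader. A minimal sketch with mpi4py (run under mpiexec); choosing rank 0 as job leader and the file name are illustrative assumptions, not the patented mechanism.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD            # stands in for the selected subset of nodes
leader = 0                       # one node is designated job leader

if comm.Get_rank() == leader:
    with open("application.bin", "rb") as f:   # hypothetical application image
        app_image = f.read()     # only the leader touches storage
else:
    app_image = None

# A single collective broadcast hands the application to every node.
app_image = comm.bcast(app_image, root=leader)
```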

  3. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, the general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  4. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  5. Radiograph identifying means

    International Nuclear Information System (INIS)

    Sheldon, A.D.

    1983-01-01

    A flexible character-indentable plastics embossing tape is backed by and bonded to a lead strip, not more than 0.025 inches thick, to form a tape suitable for identifying radiographs. The lead strip is itself backed by a relatively thin and flimsy plastics or fabric strip which, when removed, allows the lead plastic tape to be pressure-bonded to the surface to be radiographed. A conventional tape-embossing gun is used to indent the desired characters in succession into the lead-backed tape, without necessarily severing the lead; and then the backing strip is peeled away to expose the layer of adhesive which pressure-bonds the indented tape to the object to be radiographed. X-rays incident on the embossed tape will cause the raised characters to show up dark on the subsequently-developed film, whilst the raised side areas will show up white. Each character will thus stand out on the developed film. (author)

  6. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  7. IDENTIFIABILITY VERSUS HETEROGENEITY IN GROUNDWATER MODELING SYSTEMS

    Directory of Open Access Journals (Sweden)

    A M BENALI

    2003-06-01

Full Text Available Review of history matching of reservoir parameters in groundwater flow raises the problem of identifiability of aquifer systems. Lack of identifiability means that there exist parameters to which the heads are insensitive. From the guidelines of the study of the homogeneous case, we inspect the identifiability of the distributed transmissivity field of heterogeneous groundwater aquifers. These are derived from multiple realizations of a random function Y = log T whose probability distribution function is normal. We follow the identifiability of the autocorrelated block transmissivities through the measure of the sensitivity of the local derivatives DTh = (∂hi/∂Tj), computed for each sample of a population N(0; σY, αY). Results obtained from an analysis of Monte Carlo type suggest that the more a system is heterogeneous, the less it is identifiable.
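
    The sensitivity measure can be approximated by finite differences over Monte Carlo realizations of the transmissivity field. The sketch below uses a deliberately simple 1D serial-block flow model (head drop q/T per block), an illustrative stand-in for the paper's aquifer model, to show how identifiability can be probed as σY grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def heads(T, q=1.0):
    """Steady 1D flow through serial blocks: head drop per block is q/T."""
    return np.cumsum(q / T)

def sensitivities(T, eps=1e-6):
    """Local derivatives dh_i/dT_j by forward finite differences."""
    base = heads(T)
    J = np.empty((base.size, T.size))
    for j in range(T.size):
        Tp = T.copy()
        Tp[j] += eps
        J[:, j] = (heads(Tp) - base) / eps
    return J

for sigma_Y in (0.2, 1.0, 2.0):            # increasing heterogeneity of Y = log T
    T = np.exp(sigma_Y * rng.standard_normal(20))
    J = sensitivities(T)
    print(sigma_Y, np.abs(J).min())        # tiny entries flag weakly identifiable blocks
```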

  8. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

The thesis proposes to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic: as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what this can mean in practice (among them Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  9. Exploiting intrinsic fluctuations to identify model parameters.

    Science.gov (United States)

    Zimmer, Christoph; Sahle, Sven; Pahle, Jürgen

    2015-04-01

    Parameterisation of kinetic models plays a central role in computational systems biology. Besides the lack of experimental data of high enough quality, some of the biggest challenges here are identification issues. Model parameters can be structurally non-identifiable because of functional relationships. Noise in measured data is usually considered to be a nuisance for parameter estimation. However, it turns out that intrinsic fluctuations in particle numbers can make parameters identifiable that were previously non-identifiable. The authors present a method to identify model parameters that are structurally non-identifiable in a deterministic framework. The method takes time course recordings of biochemical systems in steady state or transient state as input. Often a functional relationship between parameters presents itself by a one-dimensional manifold in parameter space containing parameter sets of optimal goodness. Although the system's behaviour cannot be distinguished on this manifold in a deterministic framework it might be distinguishable in a stochastic modelling framework. Their method exploits this by using an objective function that includes a measure for fluctuations in particle numbers. They show on three example models, immigration-death, gene expression and Epo-EpoReceptor interaction, that this resolves the non-identifiability even in the case of measurement noise with known amplitude. The method is applied to partially observed recordings of biochemical systems with measurement noise. It is simple to implement and it is usually very fast to compute. This optimisation can be realised in a classical or Bayesian fashion.
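
    For the immigration-death example the mechanism can be shown directly: the deterministic steady state fixes only the ratio of the two rate constants, while the decay of the stationary autocorrelation (exp(-k2·lag) for this linear system) pins down the death rate itself. A generic Gillespie-plus-autocorrelation sketch, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)

def gillespie_immigration_death(k1, k2, n0, t_end):
    """Exact simulation: immigration at rate k1, death at rate k2*n."""
    t, n, ts, ns = 0.0, n0, [0.0], [n0]
    while t < t_end:
        total = k1 + k2 * n
        t += rng.exponential(1.0 / total)
        n += 1 if rng.random() < k1 / total else -1
        ts.append(t)
        ns.append(n)
    return np.array(ts), np.array(ns)

ts, ns = gillespie_immigration_death(k1=50.0, k2=0.5, n0=100, t_end=4000.0)
grid = np.arange(2000.0, 4000.0, 0.2)              # discard transient, resample
x = ns[np.searchsorted(ts, grid) - 1].astype(float)
m = x.mean()                                       # alone, this fixes only k1/k2
xc = x - m
lag = 10                                           # 10 * 0.2 = 2.0 time units
rho = np.dot(xc[:-lag], xc[lag:]) / np.dot(xc, xc)
k2_hat = -np.log(rho) / 2.0                        # rho ~ exp(-k2 * 2.0)
print("k2:", k2_hat, "k1:", m * k2_hat)            # fluctuations resolve both rates
```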

  10. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
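
    The data-parallel kernel model shared by CUDA, Stream and OpenCL can be tried from Python, assuming the Numba library and a CUDA-capable GPU are available; this is a minimal kernel-launch sketch, not vendor sample code.

```python
import numpy as np
from numba import cuda

@cuda.jit
def vec_add(a, b, out):
    i = cuda.grid(1)                     # global thread index
    if i < out.size:                     # guard the ragged final block
        out[i] = a[i] + b[i]

n = 1 << 20
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vec_add[blocks, threads_per_block](a, b, out)   # Numba handles the transfers
assert np.allclose(out, a + b)
```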

  11. Quantum Computing

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 5; Issue 9. Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay Vishal Gupta. General Article Volume 5 Issue 9 September 2000 pp 69-81. Fulltext. Click here to view fulltext PDF. Permanent link:

  12. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  13. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  14. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  15. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  16. Performing stencil computations

    Energy Technology Data Exchange (ETDEWEB)

    Donofrio, David

    2018-01-16

    A method and apparatus for performing stencil computations efficiently are disclosed. In one embodiment, a processor receives an offset, and in response, retrieves a value from a memory via a single instruction, where the retrieving comprises: identifying, based on the offset, one of a plurality of registers of the processor; loading an address stored in the identified register; and retrieving from the memory the value at the address.
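
    A stencil computation is a fixed-offset neighbourhood update, which is what the per-offset register addressing above accelerates. The access pattern itself is easy to show in NumPy; this sketch illustrates the computation, not the patented hardware mechanism.

```python
import numpy as np

def jacobi_step(u):
    """One sweep of the 2D five-point stencil over interior points."""
    out = u.copy()
    out[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                              u[1:-1, :-2] + u[1:-1, 2:])
    return out

# Every neighbour lives at a fixed offset from the current grid point.
u = np.zeros((64, 64))
u[0, :] = 1.0                     # an illustrative boundary condition
for _ in range(100):
    u = jacobi_step(u)
```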

  17. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

With technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production ..., for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  18. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  19. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  20. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking, which is dominant among educators, is described along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies as well as the increase in the number of tasks for whose effective solution computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  1. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  2. Computer/Information Science

    Science.gov (United States)

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  3. (Some) Computer Futures: Mainframes.

    Science.gov (United States)

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  4. Quantum steady computation

    Energy Technology Data Exchange (ETDEWEB)

    Castagnoli, G. (Dipt. di Informatica, Sistemistica, Telematica, Univ. di Genova, Viale Causa 13, 16145 Genova (IT))

    1991-08-10

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

  5. Quantum steady computation

    International Nuclear Information System (INIS)

    Castagnoli, G.

    1991-01-01

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition

  6. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  7. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  8. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  9. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  10. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  11. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part shows the most important computational techniques, finite element formulation and boundary element formulation, and presents the solutions of viscoelastic problems with Abaqus.

  12. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
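
    Where Stroke's image deblurring was done with coherent optics, the same operation is routinely prototyped digitally today. As a loose modern analogue only (not the optical method of the article), here is a minimal Wiener-deconvolution sketch, assuming scikit-image and SciPy are installed and using a made-up uniform blur kernel:

      import numpy as np
      from scipy.signal import convolve2d
      from skimage import data, restoration

      # Blur a test image with a known point-spread function (PSF)...
      image = data.camera() / 255.0
      psf = np.ones((5, 5)) / 25.0                      # illustrative 5x5 uniform blur
      blurred = convolve2d(image, psf, mode='same', boundary='symm')

      # ...then invert the blur with Wiener deconvolution; `balance` trades
      # sharpness against noise amplification.
      deblurred = restoration.wiener(blurred, psf, balance=0.1)
      print("blurred RMS error:  ", np.sqrt(np.mean((blurred - image) ** 2)))
      print("deblurred RMS error:", np.sqrt(np.mean((deblurred - image) ** 2)))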

  13. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  14. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary; (3) philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  15. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  16. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  17. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  18. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  19. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  20. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical layer.

  1. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of, and developments in, transmission and emission computed tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of emission computed tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single-photon emission tomography, is made. (author)

  2. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  3. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  4. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  5. Computer Education and Computer Use by Preschool Educators

    Science.gov (United States)

    Towns, Bernadette

    2010-01-01

    Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…

  6. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  7. SPARQL-enabled identifier conversion with Identifiers.org

    Science.gov (United States)

    Wimalaratne, Sarala M.; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-01-01

    Motivation: On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. Results: We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. Availability and implementation: The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. Contact: sarala@ebi.ac.uk PMID:25638809

  8. SPARQL-enabled identifier conversion with Identifiers.org.

    Science.gov (United States)

    Wimalaratne, Sarala M; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-06-01

    On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. © The Author 2015. Published by Oxford University Press.
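
    The endpoint URL above comes straight from the abstract; everything else in the following sketch is an illustrative assumption, since the service's actual vocabulary is not described here. It shows how such a SPARQL endpoint can be queried from Python with SPARQLWrapper:

      from SPARQLWrapper import SPARQLWrapper, JSON

      # Endpoint taken from the abstract; the query body is a generic probe,
      # not the service's documented conversion query.
      endpoint = SPARQLWrapper("http://identifiers.org/services/sparql")
      endpoint.setQuery("""
          SELECT ?s ?p ?o
          WHERE { ?s ?p ?o }
          LIMIT 5
      """)
      endpoint.setReturnFormat(JSON)

      results = endpoint.query().convert()
      for b in results["results"]["bindings"]:
          print(b["s"]["value"], b["p"]["value"], b["o"]["value"])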

  9. Quantum Internet: from Communication to Distributed Computing!

    OpenAIRE

    Caleffi, Marcello; Cacciapuoti, Angela Sara; Bianchi, Giuseppe

    2018-01-01

    In this invited paper, the authors discuss the exponential computing speed-up achievable by interconnecting quantum computers through a quantum internet. They also identify key future research challenges and open problems for quantum internet design and deployment.

  10. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  11. Structural Identifiability of Dynamic Systems Biology Models.

    Science.gov (United States)

    Villaverde, Alejandro F; Barreiro, Antonio; Papachristodoulou, Antonis

    2016-10-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.
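
    The observability-based rank test that underlies this approach can be illustrated on a toy model. The following is a minimal sketch of the general idea (in Python with SymPy, not the MATLAB toolbox itself): append the unknown parameter to the state vector with zero dynamics, stack Lie derivatives of the output, and check the rank of their Jacobian; full generic rank indicates local structural identifiability.

      import sympy as sp

      # Toy model: dx/dt = -p*x, output y = x. Treat the parameter p as an
      # extra state with dp/dt = 0 and test observability of (x, p).
      x, p = sp.symbols('x p')
      states = sp.Matrix([x, p])
      f = sp.Matrix([-p * x, 0])          # extended dynamics
      y = sp.Matrix([x])                  # measured output

      # Stack the output and its Lie derivatives along f.
      rows = [y]
      for _ in range(len(states) - 1):
          rows.append(rows[-1].jacobian(states) * f)

      O = sp.Matrix.vstack(*rows).jacobian(states)
      print(O)          # Matrix([[1, 0], [-p, -x]])
      print(O.rank())   # 2 (full) => x and p locally observable/identifiable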

  12. Identifiability of PBPK Models with Applications to ...

    Science.gov (United States)

    Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.

  13. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  14. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two ... Up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show ... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority.

  15. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  16. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. Achieving this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  17. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  18. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  19. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  20. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  1. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  2. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  3. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed from a top-down approach, helping

  4. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  5. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...
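
    As a concrete instance of the kind of computation the book covers, the affine-invariant Riemannian distance between two symmetric positive-definite matrices (the geometry used, e.g., for diffusion tensors) is short to write down. A minimal NumPy/SciPy sketch follows, with arbitrary example matrices; the formula d(A,B) = ||logm(A^{-1/2} B A^{-1/2})||_F is standard:

      import numpy as np
      from scipy.linalg import sqrtm, logm, inv

      def spd_distance(A, B):
          # Affine-invariant distance: || logm(A^{-1/2} B A^{-1/2}) ||_F
          A_inv_sqrt = inv(sqrtm(A))
          M = A_inv_sqrt @ B @ A_inv_sqrt
          return np.linalg.norm(logm(M).real, 'fro')   # .real drops round-off imaginaries

      A = np.array([[2.0, 0.5], [0.5, 1.0]])   # illustrative SPD matrices
      B = np.array([[1.0, 0.2], [0.2, 3.0]])
      print(spd_distance(A, B))                # equals spd_distance(B, A)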

  6. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  7. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  8. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  9. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation – Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  10. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  11. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
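
    The "critical event technique" mentioned here is what is now usually called event-driven simulation: instead of stepping time uniformly, jump straight to the next collision. The sketch below is my own 1D simplification in that spirit, not Dewdney's BOUNCE: balls under gravity between a floor and a heavy piston. Ball-ball collisions are ignored, which is exact for identical point masses in 1D (an elastic collision just swaps velocities), and since ball and piston share the same gravitational acceleration, ball-piston collision times are linear while ball-floor times need a quadratic.

      import math, random

      g, m, M = 9.81, 1.0, 200.0                 # gravity, ball mass, piston mass
      balls = [[random.uniform(0.1, 4.0),        # [position, velocity]
                random.uniform(-5.0, 5.0)] for _ in range(50)]
      X, V = 5.0, 0.0                            # piston position and velocity

      def t_floor(x, v):
          # Positive root of x + v*t - g*t**2/2 = 0 (max() guards round-off).
          return (v + math.sqrt(max(0.0, v * v + 2.0 * g * x))) / g

      def t_piston(x, v):
          # Same acceleration for ball and piston => linear relative motion.
          return (X - x) / (v - V) if v > V else math.inf

      for _ in range(20000):
          # The "critical event": the earliest collision over all balls.
          dt, i, kind = min((t, i, k) for i, (x, v) in enumerate(balls)
                            for t, k in ((t_floor(x, v), 'floor'),
                                         (t_piston(x, v), 'piston')))
          for b in balls:                        # advance everything ballistically
              b[0] += b[1] * dt - 0.5 * g * dt * dt
              b[1] -= g * dt
          X += V * dt - 0.5 * g * dt * dt
          V -= g * dt
          v = balls[i][1]
          if kind == 'floor':                    # elastic bounce off the floor
              balls[i][1] = -v
          else:                                  # elastic ball-piston collision
              balls[i][1] = ((m - M) * v + 2.0 * M * V) / (m + M)
              V = ((M - m) * V + 2.0 * m * v) / (m + M)

      print("piston height after 20000 events:", X)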

  12. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  13. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  14. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  15. Identifying Broadband Rotational Spectra with Neural Networks

    Science.gov (United States)

    Zaleski, Daniel P.; Prozument, Kirill

    2017-06-01

    A typical broadband rotational spectrum may contain several thousand observable transitions, spanning many species. Identifying the individual spectra, particularly when the dynamic range reaches 1,000:1 or even 10,000:1, can be challenging. One approach is to apply automated fitting routines. In this approach, combinations of 3 transitions can be created to form a "triple", which allows fitting of the A, B, and C rotational constants in a Watson-type Hamiltonian. On a standard desktop computer, with a target molecule of interest, a typical AUTOFIT routine takes 2-12 hours depending on the spectral density. A new approach is to utilize machine learning to train a computer to recognize the patterns (frequency spacing and relative intensities) inherent in rotational spectra and to identify the individual spectra in a raw broadband rotational spectrum. Here, recurrent neural networks have been trained to identify different types of rotational spectra and classify them accordingly. Furthermore, early results in applying convolutional neural networks for spectral object recognition in broadband rotational spectra appear promising. Perez et al. "Broadband Fourier transform rotational spectroscopy for structure determination: The water heptamer." Chem. Phys. Lett., 2013, 571, 1-15. Seifert et al. "AUTOFIT, an Automated Fitting Tool for Broadband Rotational Spectra, and Applications to 1-Hexanal." J. Mol. Spectrosc., 2015, 312, 13-21. Bishop. "Neural networks for pattern recognition." Oxford University Press, 1995.
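
    As a toy illustration of the pattern-recognition idea (and only that: the authors use recurrent and convolutional networks on real broadband data, while everything below, from the network to the synthetic spectra, is a made-up stand-in), one can train a small scikit-learn classifier to tell an evenly spaced, linear-rotor-like stick spectrum from unrelated random lines:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      N_BINS = 500                                  # fixed frequency grid

      def stick_spectrum(lines):
          s = np.zeros(N_BINS)
          s[np.clip(lines.astype(int), 0, N_BINS - 1)] = 1.0
          return s

      X, y = [], []
      for _ in range(2000):
          if rng.random() < 0.5:                    # rotor-like: lines at 2B, 4B, 6B, ...
              B = rng.uniform(5.0, 25.0)
              lines, label = 2.0 * B * np.arange(1, 10), 1
          else:                                     # unstructured random lines
              lines, label = rng.uniform(0, N_BINS, size=9), 0
          X.append(stick_spectrum(np.asarray(lines)))
          y.append(label)

      X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y), random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))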

  16. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  17. Computer Security: Competing Concepts

    OpenAIRE

    Nissenbaum, Helen; Friedman, Batya; Felten, Edward

    2001-01-01

    This paper focuses on a tension we discovered in the philosophical part of our multidisciplinary project on values in web-browser security. Our project draws on the methods and perspectives of empirical social science, computer science, and philosophy to identify values embodied in existing web-browser security and also to prescribe changes to existing systems (in particular, Mozilla) so that values relevant to web-browser systems are better served than presently they are. The tension, which ...

  18. Computational Physics' Greatest Hits

    Science.gov (United States)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  19. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
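
    The abstract's idea of review as the application of a maturity assessment framework can be pictured as a small data structure. A minimal sketch follows; the criteria names, level scale, and min-aggregation rule are all illustrative assumptions, since the abstract specifies none of them:

      from dataclasses import dataclass

      @dataclass
      class Criterion:
          name: str
          level: int                     # assessed maturity, e.g. 0 (none) .. 3 (high)

      def review(criteria, required_level):
          # Conservative aggregation: overall maturity is the weakest criterion.
          overall = min(c.level for c in criteria)
          gaps = [c.name for c in criteria if c.level < required_level]
          return overall, gaps

      assessment = [Criterion("verification evidence", 2),
                    Criterion("validation evidence", 1),
                    Criterion("uncertainty quantification", 3)]
      overall, gaps = review(assessment, required_level=2)
      print(f"overall maturity: {overall}; below requirement: {gaps}")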

  20. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next; despite this, there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  1. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Full Text Available Existing approaches to the regulation of accounting for software as one of the types of intangible assets have been considered. The features and current state of the legal protection of computer programs have been analyzed. The reasons for the need to use patent law as a means of legal protection of individual elements of computer programs have been discovered. The influence of the legal aspects of the use of computer programs on their reflection in accounting under national legislation has been analyzed. The possible options for the transfer of rights from the copyright owners of computer programs have been analyzed, which should be considered when creating a software accounting system at an enterprise. The characteristics of computer software as an intangible asset under the current law have been identified and analyzed. The general economic characteristics of computer programs as one of the types of intangible assets have been substantiated. The main distinguishing features of software compared to other types of intellectual property have been allocated.

  2. Studi Perbandingan Layanan Cloud Computing

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing costs, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, and identifies issues and future directions in cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  3. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  4. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  5. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  6. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  7. Report of the Task Force on Computer Charging.

    Science.gov (United States)

    Computer Co-ordination Group, Ottawa (Ontario).

    The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…

  8. Near Identifiability of Dynamical Systems

    Science.gov (United States)

    Hadaegh, F. Y.; Bekey, G. A.

    1987-01-01

    Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.

  9. Identifying parameter regions for multistationarity

    DEFF Research Database (Denmark)

    Conradi, Carsten; Feliu, Elisenda; Mincheva, Maya

    2017-01-01

    is the avoidance of numerical analysis and parameter sampling. The procedure consists of a number of steps. Each of these steps might be addressed algorithmically using various computer programs and available software, or manually. We demonstrate our procedure on several models of gene transcription and cell...

  10. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  11. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  12. Market-Oriented Cloud Computing: Vision, Hype, and Reality for Delivering IT Services as Computing Utilities

    OpenAIRE

    Buyya, Rajkumar; Yeo, Chee Shin; Venugopal, Srikumar

    2008-01-01

    This keynote paper: presents a 21st century vision of computing; identifies various computing paradigms promising to deliver the vision of computing utilities; defines Cloud computing and provides the architecture for creating market-oriented Clouds by leveraging technologies such as VMs; provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; presents...

  13. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
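
    For readers new to the "mathematical fundamentals" mentioned here, the simplest transmission CT modality rests on the Beer-Lambert law along each ray; taking the logarithm of the measured intensity gives the line integral (Radon transform) of the attenuation coefficient, which reconstruction then inverts. In standard notation (not the article's own):

    ```latex
    % Beer-Lambert attenuation along a ray L, and the projection value
    % p(L) that CT reconstruction inverts (the Radon transform of \mu):
    I = I_0 \exp\!\left(-\int_{L}\mu(x,y)\,\mathrm{d}s\right),
    \qquad
    p(L) = -\ln\frac{I}{I_0} = \int_{L}\mu(x,y)\,\mathrm{d}s .
    ```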

  14. Impact of new computing systems on finite element computations

    International Nuclear Information System (INIS)

    Noor, A.K.; Fulton, R.E.; Storaasi, O.O.

    1983-01-01

    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified

  15. The NOAA Dataset Identifier Project

    Science.gov (United States)

    de la Beaujardiere, J.; Mccullough, H.; Casey, K. S.

    2013-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) initiated a project in 2013 to assign persistent identifiers to datasets archived at NOAA and to create informational landing pages about those datasets. The goals of this project are to enable the citation of datasets used in products and results in order to help provide credit to data producers, to support traceability and reproducibility, and to enable tracking of data usage and impact. A secondary goal is to encourage the submission of datasets for long-term preservation, because only archived datasets will be eligible for a NOAA-issued identifier. A team was formed with representatives from the National Geophysical, Oceanographic, and Climatic Data Centers (NGDC, NODC, NCDC) to resolve questions including which identifier scheme to use (answer: Digital Object Identifier - DOI), whether or not to embed semantics in identifiers (no), the level of granularity at which to assign identifiers (as coarsely as reasonable), how to handle ongoing time-series data (do not break into chunks), creation mechanism for the landing page (stylesheet from formal metadata record preferred), and others. Decisions made and implementation experience gained will inform the writing of a Data Citation Procedural Directive to be issued by the Environmental Data Management Committee in 2014. Several identifiers have been issued as of July 2013, with more on the way. NOAA is now reporting the number as a metric to federal Open Government initiatives. This paper will provide further details and status of the project.

  16. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
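
    The abstract does not reproduce the models themselves; as a rough illustration of the HMM-based classification it describes, the sketch below scores a coded cue sequence with the forward algorithm. The function name, the discrete cue coding, and the two-model comparison idea are assumptions for illustration, not the authors' implementation.

    ```python
    import numpy as np

    def sequence_likelihood(obs, start, trans, emit):
        """Forward-algorithm likelihood of a coded cue sequence under one
        HMM; comparing likelihoods from a 'high-trust' and a 'low-trust'
        model classifies a new interaction.

        obs   : list of observation symbol indices (coded nonverbal cues)
        start : (n_states,) initial state probabilities
        trans : (n_states, n_states) state transition matrix
        emit  : (n_states, n_symbols) emission probability matrix
        """
        alpha = start * emit[:, obs[0]]           # initialize forward variable
        for o in obs[1:]:
            alpha = (alpha @ trans) * emit[:, o]  # propagate, absorb evidence
        return float(alpha.sum())
    ```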

  17. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    Only fragments of this briefing survive the slide extraction: computing services underpin the war fighters’ ability to execute the mission, running IT systems that provide medical care, pay the warfighters, and manage maintenance, across 1,400 applications, 18 facilities, 180 software vendors, 18,000+ copies of executive software products, and virtually every type of mainframe.

  18. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  19. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims at trying to identify some methods of syntax analysis which can be used for computer programming languages while putting aside computer devices which influence the choice of the programming language and methods of analysis and compilation. In a first part, the author proposes attempts of formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language

  20. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  1. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  2. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
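
    The abstract does not reproduce the paper's dynamic model; as orientation, here is a minimal sketch of the classical token bucket policing rule on which it builds. The class name and the rate/capacity parameters are illustrative assumptions, not the authors' notation.

    ```python
    import time

    class TokenBucket:
        """Token bucket policer: tokens accrue at `rate` per second up to
        `capacity`; a packet needing `size` tokens is conformant only if
        enough tokens are available when it arrives."""

        def __init__(self, rate: float, capacity: float):
            self.rate = rate            # token fill rate (tokens/s)
            self.capacity = capacity    # bucket depth (burst allowance)
            self.tokens = capacity      # start with a full bucket
            self.last = time.monotonic()

        def allow(self, size: float) -> bool:
            now = time.monotonic()
            # Accrue tokens for the elapsed interval, capped at capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= size:
                self.tokens -= size     # conformant: consume tokens
                return True
            return False                # non-conformant: drop or mark
    ```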

  3. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justifies the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality; the same holds when contrast media are used. An operator's console provides control of the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools.
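
    As a concrete illustration of the region-of-interest quantitation just described, the sketch below computes mean density and area for an outlined region. The function name and array conventions are assumptions for illustration, not taken from the paper.

    ```python
    import numpy as np

    def roi_statistics(image_hu: np.ndarray, mask: np.ndarray, pixel_mm: float):
        """Quantify a region of interest on a CT slice.

        image_hu : 2-D array of CT numbers (Hounsfield units)
        mask     : boolean array of the same shape, True inside the
                   cursor-outlined region
        pixel_mm : pixel edge length in millimetres
        """
        values = image_hu[mask]
        area_mm2 = mask.sum() * pixel_mm ** 2     # region area
        return {
            "mean_hu": float(values.mean()),      # average density
            "std_hu": float(values.std()),        # density spread
            "area_mm2": float(area_mm2),
        }
    ```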

  4. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign computation to a great number of distributed computers rather than local computer ...

  5. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights of the future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide effectiveness analysis of the proposed countermeasures.

  6. Identifying tier one key suppliers.

    Science.gov (United States)

    Wicks, Steve

    2013-01-01

    In today's global marketplace, businesses are becoming increasingly reliant on suppliers for the provision of key processes, activities, products and services in support of their strategic business goals. The result is that now, more than ever, the failure of a key supplier has potential to damage reputation, productivity, compliance and financial performance seriously. Yet despite this, there is no recognised standard or guidance for identifying a tier one key supplier base and, up to now, there has been little or no research on how to do so effectively. This paper outlines the key findings of a BCI-sponsored research project to investigate good practice in identifying tier one key suppliers, and suggests a scalable framework process model and risk matrix tool to help businesses effectively identify their tier one key supplier base.

  7. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the IA, and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures.

  8. Football refereeing: Identifying innovative methods

    Directory of Open Access Journals (Sweden)

    Reza MohammadKazemi

    2014-08-01

    Full Text Available The aim of the present study is to identify potential innovations in the football industry. Data were collected from 10 national and international referees, assistant referees and referees’ supervisors in Iran. In this study, technological innovations are identified that assist better refereeing performances. The analysis revealed a significant relationship between using new technologies and referees’ performance. The results indicate that elite referees, assistant referees and supervisors agreed to use new technological innovations during the game. According to their comments, this kind of technology supports the development of referees’ performance.

  9. Computer generated holographic microtags

    International Nuclear Information System (INIS)

    Sweatt, W.C.

    1998-01-01

    A microlithographic tag comprising an array of individual computer generated holographic patches having feature sizes between 250 and 75 nanometers is disclosed. The tag is a composite hologram made up of the individual holographic patches and contains identifying information when read out with a laser of the proper wavelength and at the proper angles of probing and reading. The patches are fabricated in a steep angle Littrow readout geometry to maximize returns in the -1 diffracted order. The tags are useful as anti-counterfeiting markers because of the extreme difficulty in reproducing them. 5 figs

  10. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  11. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  12. An approach to identify urban groundwater recharge

    Directory of Open Access Journals (Sweden)

    E. Vázquez-Suñé

    2010-10-01

    Full Text Available Evaluating the proportion in which waters from different origins are mixed in a given water sample is relevant for many hydrogeological problems, such as quantifying total recharge, assessing groundwater pollution risks, or managing water resources. Our work is motivated by urban hydrogeology, where waters with different chemical signatures can be identified (losses from water supply and sewage networks, infiltration from surface runoff and other water bodies, lateral aquifer inflows, ...). The relative contribution of different sources to total recharge can be quantified by means of solute mass balances, but application is hindered by the large number of potential origins. Hence the need to incorporate data from a large number of conservative species and to account for uncertainty in source concentrations and measurement errors. We present a methodology to compute mixing ratios and end-member composition, which consists of (i) identification of potential recharge sources, (ii) selection of tracers, (iii) characterization of the hydrochemical composition of potential recharge sources and mixed water samples, and (iv) computation of mixing ratios and reevaluation of end-members. The analysis performed on a data set of samples from the Barcelona city aquifers suggests that the main contributors to total recharge are the water supply network losses (22%), the sewage network losses (30%), rainfall, concentrated in the non-urbanized areas (17%), infiltration from runoff (20%), and the Besòs River (11%). Regarding species, halogens (chloride, fluoride and bromide), sulfate, total nitrogen, and stable isotopes (18O, 2H, and 34S) behaved quite conservatively. Boron, residual alkalinity, EDTA and Zn did not. Yet, including these species in the computations did not significantly affect the proportion estimates.
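
    The mass-balance step above amounts to solving an overdetermined linear system for non-negative mixing fractions that sum to one. Here is a minimal sketch of one common way to do this, non-negative least squares with a weighted closure row; the function name and the weighting trick are illustrative assumptions, not necessarily the authors' exact solver.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def mixing_ratios(end_members: np.ndarray, sample: np.ndarray, w: float = 100.0):
        """Estimate the fraction of each recharge source in a mixed sample.

        end_members : (n_tracers, n_sources) matrix of source concentrations
        sample      : (n_tracers,) observed concentrations in the mixture
        w           : weight enforcing that the fractions sum to one
        """
        n_tracers, n_sources = end_members.shape
        # Append the closure condition sum(f) = 1 as an extra, heavily
        # weighted equation, then solve subject to f >= 0.
        A = np.vstack([end_members, w * np.ones(n_sources)])
        b = np.append(sample, w)
        fractions, residual = nnls(A, b)
        return fractions
    ```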

  13. Effectively identifying user profiles in network and host metrics

    Science.gov (United States)

    Murphy, John P.; Berk, Vincent H.; Gregorio-de Souza, Ian

    2010-04-01

    This work presents a collection of methods that is used to effectively identify users of computer systems based on their particular usage of the software and the network. Not only are we able to identify individual computer users by their behavioral patterns, we are also able to detect significant deviations in their typical computer usage over time, or compared to a group of their peers. For instance, most people have a small, and relatively unique selection of regularly visited websites, certain email services, daily work hours, and typical preferred applications for mandated tasks. We argue that these habitual patterns are sufficiently specific to identify fully anonymized network users. We demonstrate that with only a modest data collection capability, profiles of individual computer users can be constructed so as to uniquely identify a profiled user from among their peers. As time progresses and habits or circumstances change, the methods presented update each profile so that changes in user behavior can be reliably detected over both abrupt and gradual time frames, without losing the ability to identify the profiled user. The primary benefit of our methodology allows one to efficiently detect deviant behaviors, such as subverted user accounts, or organizational policy violations. Thanks to their relative robustness, these techniques can be used in scenarios with very diverse data collection capabilities, and data privacy requirements. In addition to behavioral change detection, the generated profiles can also be compared against pre-defined examples of known adversarial patterns.
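
    A toy version of the profiling idea: represent a user's habits as a normalized frequency vector over observed events and compare profiles with cosine similarity. The event coding and the similarity choice below are illustrative assumptions, not the authors' feature set.

    ```python
    from collections import Counter
    import math

    def profile(events):
        """Build a normalized frequency profile from observed events,
        e.g. visited domains, launched applications, active hours."""
        counts = Counter(events)
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()}

    def similarity(p, q):
        """Cosine similarity between two profiles; values near 1.0 suggest
        the same user, while a drop over time flags deviating behaviour."""
        keys = set(p) | set(q)
        dot = sum(p.get(k, 0.0) * q.get(k, 0.0) for k in keys)
        norm = math.sqrt(sum(v * v for v in p.values())) * \
               math.sqrt(sum(v * v for v in q.values()))
        return dot / norm if norm else 0.0
    ```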

  14. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  15. Identifying people from gait pattern with accelerometers

    Science.gov (United States)

    Ailisto, Heikki J.; Lindholm, Mikko; Mantyjarvi, Jani; Vildjiounaite, Elena; Makela, Satu-Marja

    2005-03-01

    Protecting portable devices is becoming more important, not only because of the value of the devices themselves, but for the value of the data in them and their capability for transactions, including m-commerce and m-banking. An unobtrusive and natural method for identifying the carrier of portable devices is presented. The method uses acceleration signals produced by sensors embedded in the portable device. When the user carries the device, the acceleration signal is compared with the stored template signal. The method consists of finding individual steps, normalizing and averaging them, aligning them with the template and computing cross-correlation, which is used as a measure of similarity. Equal Error Rate of 6.4% is achieved in tentative experiments with 36 test subjects.
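
    The pipeline described here (segment steps, normalize, average, align with the template, score with cross-correlation) can be sketched compactly. The sketch assumes equal-length, resampled step cycles and z-score normalization; it is not the authors' exact procedure.

    ```python
    import numpy as np

    def step_similarity(step: np.ndarray, template: np.ndarray) -> float:
        """Similarity between an averaged, normalized step cycle and the
        enrolled template: the peak of the normalized cross-correlation
        over all alignments (lags); compare against an acceptance
        threshold tuned to the desired error rate."""
        step = (step - step.mean()) / step.std()
        template = (template - template.mean()) / template.std()
        xcorr = np.correlate(step, template, mode="full") / len(template)
        return float(xcorr.max())
    ```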

  16. Method of identifying features in indexed data

    Science.gov (United States)

    Jarman, Kristin H [Richland, WA; Daly, Don Simone [Richland, WA; Anderson, Kevin K [Richland, WA; Wahl, Karen L [Richland, WA

    2001-06-26

    The present invention is a method of identifying features in indexed data, especially useful for distinguishing signal from noise in data provided as a plurality of ordered pairs. Each of the plurality of ordered pairs has an index and a response. The method has the steps of: (a) providing an index window having a first window end located on a first index and extending across a plurality of indices to a second window end; (b) selecting responses corresponding to the plurality of indices within the index window and computing a measure of dispersion of the responses; and (c) comparing the measure of dispersion to a dispersion critical value. Advantages of the present invention include tolerance of low signal-to-noise ratio, signal drift, varying baseline signal and combinations thereof.
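
    A minimal sketch of steps (a)-(c), using the standard deviation as the measure of dispersion; the patent allows any dispersion measure, and the window length and critical value here are user-chosen assumptions.

    ```python
    import numpy as np

    def find_features(index, response, window: int, critical: float):
        """Slide an index window across the ordered pairs and flag windows
        whose response dispersion exceeds the critical value, separating
        signal-bearing regions from baseline noise."""
        response = np.asarray(response, dtype=float)
        flagged = []
        for start in range(len(response) - window + 1):
            segment = response[start:start + window]
            if segment.std() > critical:          # dispersion test
                flagged.append((index[start], index[start + window - 1]))
        return flagged
    ```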

  17. A neural network to identify neutral mesons

    International Nuclear Information System (INIS)

    Lefevre, F.; Lautridou, P.; Marques, M.; Matulewicz, T.; Ostendorf, R.; Schutz, Y.

    1994-01-01

    Both π⁰ and η mesons decay long before they can reach a detector. They predominantly decay by emission of two photons, and are identified by constructing the invariant mass of the photons. Misidentified mesons result from ambiguity in associating photons. Our work tries to select which pair is the most likely to be a physical one rather than a chance one. We first designed a Hopfield neural net, but all the activities converged rapidly towards zero except the highest one. To improve the solution we slowed down the computation in order to let the network explore several states, and forced the activities of all selected pairs to converge towards one. This was achieved by adding links connecting each cell to itself. The gain in network performance is all the more significant when the solid angle covered by the detector is greater than 15%. (D.L.). 5 refs.
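
    For reference, the two-photon invariant mass used for this identification follows directly from the photon energies and opening angle. A minimal sketch; the mass window is an illustrative assumption, and arbitrating between overlapping candidate pairings is exactly what the network above does.

    ```python
    import math

    PI0_MASS, ETA_MASS = 0.1350, 0.5479   # GeV/c^2 (PDG values)

    def invariant_mass(e1: float, e2: float, opening_angle: float) -> float:
        """Invariant mass of a photon pair: m^2 = 2 E1 E2 (1 - cos theta)."""
        return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

    def is_candidate(e1, e2, angle, meson_mass, window=0.020):
        """Keep pairings whose invariant mass falls near the meson mass."""
        return abs(invariant_mass(e1, e2, angle) - meson_mass) < window
    ```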

  18. Cooperative testing of a positive personnel identifier

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Grambihler, A.J.; Graham, D.K.; Bradley, R.G.

    1980-06-01

    HEDL has a requirement to ensure the identification of remote computer terminal operators on a real-time nuclear inventory data base. The integrity of this data base depends on input from authorized individuals. Thus, a key to developing such a system is the ability to positively identify people attempting access to the system. Small scale tests of the Identimat 2000T hand geometry unit with an adjusting algorithm have suggested a promising solution. To prove operational suitability, HEDL, in cooperation with Sandia Laboratories, has designed a large scale test of the Identimat 2000T. Data gathering on error rates, reliability, maintainability, and user acceptance will determine if the Identimat 2000T is suitable for the HEDL application. If proven acceptable, use of the Identimat 2000T can be broadened to many general applications where the security of information, locations and systems is required.

  19. SOCIODEMOGRAPHIC DATA USED FOR IDENTIFYING ...

    Science.gov (United States)

    Due to unique social and demographic characteristics, various segments of the population may experience exposures different from those of the general population, which, in many cases, may be greater. When risk assessments do not characterize subsets of the general population, the populations that may experience the greatest risk remain unidentified. When such populations are not identified, the social and demographic data relevant to these populations are not considered when preparing exposure estimates, which can underestimate exposure and risk estimates for at-risk populations. Thus, it is necessary for risk or exposure assessors characterizing a diverse population to first identify and then enumerate certain groups within the general population who are at risk for greater contaminant exposures. The document entitled Sociodemographic Data Used for Identifying Potentially Highly Exposed Populations (also referred to as the Highly Exposed Populations document) assists assessors in identifying and enumerating potentially highly exposed populations. This document presents data relating to factors which potentially impact an individual or group's exposure to environmental contaminants based on activity patterns (how time is spent), microenvironments (locations where time is spent), and other socio-demographic data such as age, gender, race and economic status. Populations potentially more exposed to various chemicals of concern, relative to the general population

  20. SNP interaction pattern identifier (SIPI)

    DEFF Research Database (Denmark)

    Lin, Hui Yi; Chen, Dung Tsa; Huang, Po Yu

    2017-01-01

    Motivation: Testing SNP-SNP interactions is considered as a key for overcoming bottlenecks of genetic association studies. However, related statistical methods for testing SNP-SNP interactions are underdeveloped. Results: We propose the SNP Interaction Pattern Identifier (SIPI), which tests 45...

  1. Identifying the Gifted Child Humorist.

    Science.gov (United States)

    Fern, Tami L.

    1991-01-01

    This study attempted to identify gifted child humorists among 1,204 children in grades 3-6. Final identification of 13 gifted child humorists was determined through application of such criteria as funniness, originality, and exemplary performance or product. The influence of intelligence, development, social factors, sex differences, family…

  2. Identifying high-risk medication

    DEFF Research Database (Denmark)

    Sædder, Eva; Brock, Birgitte; Nielsen, Lars Peter

    2014-01-01

    salicylic acid, and beta-blockers; 30 drugs or drug classes caused 82 % of all serious MEs. The top ten drugs involved in fatal events accounted for 73 % of all drugs identified. CONCLUSION: Increasing focus on seven drugs/drug classes can potentially reduce hospitalizations, extended hospitalizations...

  3. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  4. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than 'conventional' quantum computers.

  5. Positron emission computed tomography

    International Nuclear Information System (INIS)

    Grover, M.; Schelbert, H.R.

    1985-01-01

    Regional myocardial blood flow and substrate metabolism can be non-invasively evaluated and quantified with positron emission computed tomography (Positron-CT). Tracers of exogenous glucose utilization and fatty acid metabolism are available and have been extensively tested. Specific tracer kinetic models have been developed or are being tested so that glucose and fatty acid metabolism can be measured quantitatively by Positron-CT. Tracers of amino acid and oxygen metabolism are utilized in Positron-CT studies of the brain, and development of such tracers for cardiac studies is in progress. Methods to quantify regional myocardial blood flow are also being developed. Previous studies have demonstrated the ability of Positron-CT to document myocardial infarction. Experimental and clinical studies have begun to identify metabolic markers of reversibly ischemic myocardium. The potential of Positron-CT to reliably detect potentially salvageable myocardium and, hence, to identify appropriate therapeutic interventions is one of the most exciting applications of the technique.

  6. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  7. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. Also, you will learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  8. CLOUD COMPUTING SECURITY ISSUES

    Directory of Open Access Journals (Sweden)

    Florin OGIGAU-NEAMTIU

    2012-01-01

    Full Text Available The term “cloud computing” has been in the spotlight for IT specialists in recent years because of its potential to transform this industry. The promised benefits have led companies to invest great sums of money in researching and developing this domain, and great steps have been made towards implementing this technology. Managers have traditionally viewed IT as difficult and expensive, and the promise of cloud computing leads many to think that IT will now be easy and cheap. The reality is that cloud computing has simplified some technical aspects of building computer systems, but the myriad challenges facing the IT environment still remain. Organizations which consider adopting cloud based services must also understand the many major problems of information policy, including issues of privacy, security, reliability, access, and regulation. The goal of this article is to identify the main security issues and to draw the attention of both decision makers and users to the potential risks of moving data into “the cloud”.

  9. ORCID: Author Identifiers for Librarians

    Directory of Open Access Journals (Sweden)

    Robyn B. Reed

    2017-10-01

    Full Text Available Generating accurate publication lists by researchers can be challenging when faced with scholars who have common names or who have published under name variations. This article describes ORCID and the goal of generating author identifiers for scholars to connect their research outputs. Included are the reasons for having author identifiers as well as the types of information within individual profiles. This article includes information on how academic libraries are playing a role with ORCID initiatives as well as describing how publishers, institutions, and funders are employing ORCID in their workflows. Highlighted is material on academic institutions in Pennsylvania using ORCID. The purpose of the article is to provide an overview of ORCID and its uses to inform librarians about this important initiative.

  10. Device for identifying fuel assembly

    International Nuclear Information System (INIS)

    Imai, Tetsuo; Miyazawa, Tatsuo.

    1982-01-01

    Purpose: To accurately identify a symbol printed on a hanging tool at the upper part of a fuel assembly. Constitution: Optical fibers are bundled to prepare a detector which is disposed at a predetermined position on a hanging tool. This position is set by a guide. The light emitted from an illumination lamp reaches the bottom of a groove printed on the upper surface of the tool and is divided into a weak light reflected upward from the groove and a strong light reflected from the surface outside the groove. When these lights are received by the optical fibers, the fibers corresponding to a grooved position become dark, and the fibers corresponding to an ungrooved position become bright. Since the fuel assembly is identified by the dark and bright pattern of the optical fibers, a different symbol can be machined into the upper surface of the tool for every fuel assembly. (Yoshihara, H.)
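
    Reading the groove pattern amounts to mapping each fiber's dark/bright state to a bit and looking the bit string up in a table of assembly symbols. A toy sketch; the coding convention and the codebook are illustrative assumptions, not part of the patent.

    ```python
    def decode_symbol(fiber_bright, codebook):
        """Translate the dark/bright pattern of the fiber bundle into an
        assembly symbol: a dark fiber sits over a groove ("1"), a bright
        fiber over the plain surface ("0").

        fiber_bright : sequence of booleans, True = bright fiber
        codebook     : dict mapping bit strings to assembly identifiers
        """
        bits = "".join("0" if bright else "1" for bright in fiber_bright)
        return codebook.get(bits)   # None if the pattern is unreadable
    ```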

  11. Identifying patient risks during hospitalization

    Directory of Open Access Journals (Sweden)

    Lucélia Ferreira Lima

    2008-12-01

    Full Text Available Objective: To identify the risks reported at a public institution and to know the main patient risks from the nursing staff point of view. Methods: A retrospective, descriptive and exploratory study. The survey was developed at a hospital in the city of Taboão da Serra, São Paulo, Brazil. The study included all nurses working in care areas who agreed to participate in the study. At the same time, sentinel events occurring in the period from July 2006 to July 2007 were identified. Results: There were 440 sentinel events reported, and the main risks included patient falls, medication errors and pressure ulcers. Sixty-five nurses were interviewed. They also reported patient falls, medication errors and pressure ulcers as the main risks. Conclusions: Risk assessment and implementation of effective preventive actions are necessary to ensure patient’s safety. Involvement of a multidisciplinary team is one of the steps for a successful process.

  12. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...

  13. Optimisation of multiplet identifier processing on a PLAYSTATION® 3

    Science.gov (United States)

    Hattori, Masami; Mizuno, Takashi

    2010-02-01

    To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is a core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output, remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating total computation speed more than five times. As a result, the system processed up to 2100 total microseismic events, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
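
    The core kernel is waveform cross-correlation: two events belong to the same multiplet when the peak of their normalized cross-correlation exceeds a threshold. A minimal sketch of that kernel, FFT-based as one would use for speed; the normalization, the equal-length-window assumption, and the threshold are illustrative, not the paper's exact code.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def max_normalized_cc(a: np.ndarray, b: np.ndarray) -> float:
        """Peak normalized cross-correlation between two equal-length
        waveform windows; at full overlap this equals the Pearson
        correlation coefficient of the two traces."""
        a = (a - a.mean()) / (a.std() * len(a))
        b = (b - b.mean()) / b.std()
        # Correlation via FFT-based convolution with a reversed template.
        return float(fftconvolve(a, b[::-1], mode="full").max())

    # Events whose peak coefficient exceeds, say, 0.9 would be grouped
    # into the same multiplet.
    ```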

  14. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  15. Academic Training Lecture Regular Programme: Cloud Computing

    CERN Multimedia

    2012-01-01

    Cloud Computing (1/2), by Belmiro Rodrigues Moreira (LIP Laboratorio de Instrumentacao e Fisica Experimental de Part).   Wednesday, May 30, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN ( 500-1-001 - Main Auditorium ) Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business world. These lectures aim at explaining "What is Cloud Computing?" by identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, Utility Computing will be discussed and analyzed.

  16. Persistent Identifier Practice for Big Data Management at NCI

    Directory of Open Access Journals (Sweden)

    Jingbo Wang

    2017-04-01

    Full Text Available The National Computational Infrastructure (NCI) manages over 10 PB of research data, which is co-located with the high performance computer (Raijin) and an HPC-class 3000-core OpenStack cloud system (Tenjin). In support of this integrated High Performance Computing/High Performance Data (HPC/HPD) infrastructure, NCI’s data management practices include building catalogues, DOI minting, data curation, data publishing, and data delivery through a variety of data services. The metadata catalogues, DOIs, THREDDS, and Vocabularies all use different Uniform Resource Locator (URL) styles. A Persistent IDentifier (PID) service provides an important utility to manage URLs in a consistent, controlled and monitored manner to support the robustness of our national ‘Big Data’ infrastructure. In this paper we demonstrate NCI’s approach of utilising its PID Service to consistently manage its persistent identifiers with various applications.

  17. Security Problems in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rola Motawie

    2016-12-01

    Full Text Available Cloud is a pool of computing resources which are distributed among cloud users. Cloud computing has many benefits like scalability, flexibility, cost savings, reliability, maintenance and mobile accessibility. Since cloud-computing technology is growing day by day, it comes with many security problems. Securing the data in the cloud environment is one of the most critical challenges and acts as a barrier when implementing the cloud. The many new concepts that the cloud introduces, such as resource sharing, multi-tenancy, and outsourcing, create new challenges for the security community. In this work, we provide a comparative study of cloud computing privacy and security concerns. We identify and classify known security threats, cloud vulnerabilities, and attacks.

  18. Identifying the Key Weaknesses in Network Security at Colleges.

    Science.gov (United States)

    Olsen, Florence

    2000-01-01

    A new study identifies and ranks the 10 security gaps responsible for most outsider attacks on college computer networks. The list is intended to help campus system administrators establish priorities as they work to increase security. One network security expert urges that institutions utilize multiple security layers. (DB)

  19. The benefit of enterprise ontology in identifying business components

    OpenAIRE

    Albani, Antonia

    2006-01-01

    The benefit of enterprise ontology in identifying business components / A. Albani, J. Dietz. - In: Artificial intelligence in theory and practice : IFIP 19th World Computer Congress ; TC 12: IFIP AI 2006 Stream, August 21-24, 2006, Santiago, Chile / ed. by Max Bramer. - New York : Springer, 2006. - S. 1-12. - (IFIP ; 217)

  20. Identifying flares in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bykerk, Vivian P; Bingham, Clifton O; Choy, Ernest H

    2016-01-01

    ... Set. METHODS: Candidate flare questions and legacy measures were administered at consecutive visits to Canadian Early Arthritis Cohort (CATCH) patients between November 2011 and November 2014. The American College of Rheumatology (ACR) core set indicators were recorded. Concordance to identify flares ... to flare, with escalation planned in 61%. CONCLUSIONS: Flares are common in rheumatoid arthritis (RA) and are often preceded by treatment reductions. Patient/MD/DAS agreement of flare status is highest in patients worsening from R/LDA. OMERACT RA flare questions can discriminate between patients with ...

  1. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  2. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96)

  3. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  4. Mobile computing initiatives within pharmacy education.

    Science.gov (United States)

    Cain, Jeff; Bird, Eleanora R; Jones, Mikael

    2008-08-15

    To identify mobile computing initiatives within pharmacy education, including how devices are obtained, supported, and utilized within the curriculum. An 18-item questionnaire was developed and delivered to academic affairs deans (or closest equivalent) of 98 colleges and schools of pharmacy. Fifty-four colleges and schools completed the questionnaire for a 55% completion rate. Thirteen of those schools have implemented mobile computing requirements for students. Twenty schools reported they were likely to formally consider implementing a mobile computing initiative within 5 years. Numerous models of mobile computing initiatives exist in terms of device obtainment, technical support, infrastructure, and utilization within the curriculum. Responders identified flexibility in teaching and learning as the most positive aspect of the initiatives and computer-aided distraction as the most negative. Numerous factors should be taken into consideration when deciding if and how a mobile computing requirement should be implemented.

  5. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  6. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  7. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  8. Persistent Identifiers as Boundary Objects

    Science.gov (United States)

    Parsons, M. A.; Fox, P. A.

    2017-12-01

    In 1989, Leigh Star and Jim Griesemer defined the seminal concept of 'boundary objects'. These 'objects' are what Latour calls 'immutable mobiles' that enable communication and collaboration across difference by helping meaning to be understood in different contexts. As Star notes, they are a sort of arrangement that allows different groups to work together without (a priori) consensus. Part of the idea is to recognize and allow for the 'interpretive flexibility' that is central to much of the 'constructivist' approach in the sociology of science. Persistent Identifiers (PIDs) can clearly act as boundary objects, but people do not usually assume that they enable interpretive flexibility. After all, they are meant to be unambiguous, machine-interpretable identifiers of defined artifacts. In this paper, we argue that PIDs can fill at least two roles: 1) that of the standardized form, where there is strong agreement on what is being represented and how, and 2) that of the idealized type, a more conceptual concept that allows many different representations. We further argue that these seemingly abstract conceptions actually help us implement PIDs more effectively to link data, publications, various other artifacts, and especially people. Considering PIDs as boundary objects can help us address issues such as what level of granularity is necessary for PIDs, what metadata should be directly associated with PIDs, and what purpose is the PID serving (reference, provenance, credit, etc.). In short, sociological theory can improve data sharing standards and their implementation in a way that enables broad interdisciplinary data sharing and reuse. We will illustrate this with several specific examples of Earth science data.

  9. The Computer Backgrounds of Soldiers in Army Units: FY01

    National Research Council Canada - National Science Library

    Singh, Harnam

    2002-01-01

    A multi-year research effort was instituted in FY99 to examine soldiers' experiences with computers, self-perceptions of their computer skill, and their ability to identify frequently used, Windows-based icons...

  10. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  11. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. The following years brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  12. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  13. Computed Tomography (CT) -- Head

    Medline Plus

    When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  14. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  15. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  16. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  17. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling to increase the efficiency of bioenergy research. Plant cell walls are the source of biofuels and biomaterials, and NREL's modeling investigates their properties. Quantum mechanical models are used to study chemical and electronic properties and processes to reduce barriers.

  18. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  19. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
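
    The grouping step described above is simple to sketch. The following Python fragment is an illustration of the idea only, not the patented implementation; the trace input format is hypothetical.

        from collections import defaultdict

        def group_threads_by_call_site(thread_call_addresses):
            """Group thread ids by their tuple of calling-instruction addresses.

            thread_call_addresses: dict mapping thread id -> list of addresses
            (hypothetical input; a real debugger would harvest these from the
            stack traces of the running program).
            """
            groups = defaultdict(list)
            for tid, addresses in thread_call_addresses.items():
                groups[tuple(addresses)].append(tid)
            return groups

        # Threads 0 and 2 share a call path; thread 1 is the outlier and thus
        # a candidate defective thread worth inspecting first.
        traces = {0: [0x4004f0, 0x400522],
                  1: [0x4004f0, 0x4009ab],
                  2: [0x4004f0, 0x400522]}
        for path, tids in group_threads_by_call_site(traces).items():
            print([hex(a) for a in path], '->', tids)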

  20. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that last to this day. Along with the development of computer systems technology, computer viruses have found new ways to spread through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  1. RECOVIR Software for Identifying Viruses

    Science.gov (United States)

    Chakravarty, Sugoto; Fox, George E.; Zhu, Dianhui

    2013-01-01

    Most single-stranded RNA (ssRNA) viruses mutate rapidly to generate a large number of strains with highly divergent capsid sequences. Determining the capsid residues or nucleotides that uniquely characterize these strains is critical in understanding the strain diversity of these viruses. RECOVIR (an acronym for "recognize viruses") software predicts the strains of some ssRNA viruses from their limited sequence data. Novel phylogenetic-tree-based databases of protein or nucleic acid residues that uniquely characterize these virus strains are created. Strains of input virus sequences (partial or complete) are predicted through residue-wise comparisons with the databases. RECOVIR uses unique characterizing residues to identify automatically strains of partial or complete capsid sequences of picorna and caliciviruses, two of the most highly diverse ssRNA virus families. Partition-wise comparisons of the database residues with the corresponding residues of more than 300 complete and partial sequences of these viruses resulted in correct strain identification for all of these sequences. This study shows the feasibility of creating databases of hitherto unknown residues uniquely characterizing the capsid sequences of two of the most highly divergent ssRNA virus families. These databases enable automated strain identification from partial or complete capsid sequences of these human and animal pathogens.
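
    The residue-wise comparison at the heart of this approach can be sketched in a few lines. The database layout and scoring below are hypothetical simplifications for illustration; RECOVIR's actual phylogenetic-tree-based databases are far richer.

        def identify_strain(sequence, strain_db):
            """Score each strain by the fraction of its characterizing residues
            that match the input capsid sequence (0-based positions).

            strain_db: dict mapping strain name -> {position: residue}
            (hypothetical format for illustration).
            """
            scores = {}
            for strain, residues in strain_db.items():
                known = [(pos, aa) for pos, aa in residues.items() if pos < len(sequence)]
                if not known:
                    continue  # no comparable positions in this partial sequence
                matches = sum(1 for pos, aa in known if sequence[pos] == aa)
                scores[strain] = matches / len(known)
            best = max(scores, key=scores.get)
            return best, scores

        db = {'strainA': {3: 'K', 10: 'W'}, 'strainB': {3: 'R', 10: 'F'}}
        print(identify_strain('MAXKVVQTTSWGFQ', db))  # partial sequences work too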

  2. ASCR Workshop on Quantum Computing for Science

    Energy Technology Data Exchange (ETDEWEB)

    Aspuru-Guzik, Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Van Dam, Wim [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Farhi, Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gaitan, Frank [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Humble, Travis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Landahl, Andrew J [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lucas, Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Preskill, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Svore, Krysta [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wiebe, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Williams, Carl [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  3. Computational mechanics research at ONR

    International Nuclear Information System (INIS)

    Kushner, A.S.

    1986-01-01

    Computational mechanics is not an identified program at the Office of Naval Research (ONR), but rather plays a key role in the Solid Mechanics, Fluid Mechanics, Energy Conversion, and Materials Science programs. The basic philosophy of the Mechanics Division at ONR is to support fundamental research which expands the basis for understanding, predicting, and controlling the behavior of solid and fluid materials and systems at the physical and geometric scales appropriate to the phenomena of interest. It is shown in this paper that a strong commonality of computational mechanics drivers exists for the forefront research areas in both solid and fluid mechanics

  4. Cloud Computing Principles and Paradigms

    CERN Document Server

    Buyya, Rajkumar; Goscinski, Andrzej M

    2010-01-01

    The primary purpose of this book is to capture the state-of-the-art in Cloud Computing technologies and applications. The book also aims to identify potential research directions and technologies that will facilitate the creation of a global marketplace of cloud computing services supporting scientific, industrial, business, and consumer applications. We expect the book to serve as a reference for a larger audience, such as systems architects, practitioners, developers, new researchers and graduate-level students. This area of research is relatively recent, and as such has no existing reference boo

  5. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  6. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Cloud computing is, and will remain, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics offered. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  7. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  8. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  9. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and observation of images in computers and the hardware and software used, including personal computers, networks and workstations. The use of special filters determines image quality.

  10. Identifying crucial parameter correlations maintaining bursting activity.

    Directory of Open Access Journals (Sweden)

    Anca Doloc-Mihu

    2014-06-01

    Recent experimental and computational studies suggest that linearly correlated sets of parameters (intrinsic and synaptic properties of neurons) allow central pattern-generating networks to produce and maintain their rhythmic activity regardless of changing internal and external conditions. To determine the role of correlated conductances in the robust maintenance of functional bursting activity, we used our existing database of half-center oscillator (HCO) model instances of the leech heartbeat CPG. From the database, we identified functional activity groups of burster (isolated neuron) and half-center oscillator model instances and realistic subgroups of each that showed burst characteristics (principally period and spike frequency) similar to the animal. To find linear correlations among the conductance parameters maintaining functional leech bursting activity, we applied Principal Component Analysis (PCA) to each of these four groups. PCA identified a set of three maximal conductances (leak current, ḡLeak; a persistent K current, ḡK2; and a persistent Na+ current, ḡP) that correlate linearly for the two groups of burster instances but not for the HCO groups. Visualizations of HCO instances in a reduced space suggested that there might be non-linear relationships between these parameters for these instances. Experimental studies have shown that period is a key attribute influenced by modulatory inputs and temperature variations in heart interneurons. Thus, we explored the sensitivity of period to changes in the maximal conductances ḡLeak, ḡK2, and ḡP, and we found that for our realistic bursters the effect of these parameters on period could not be assessed because bursting activity was not maintained when they were varied individually.
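
    The core of the analysis, PCA over a matrix of maximal-conductance parameters, is easy to reproduce in outline. The sketch below uses synthetic parameters with one built-in linear correlation, not the leech HCO database; scikit-learn's PCA is assumed.

        import numpy as np
        from sklearn.decomposition import PCA

        # Synthetic stand-in for a database of model instances: each row is one
        # instance, columns are maximal conductances (gLeak, gK2, gP, gOther).
        rng = np.random.default_rng(0)
        n = 500
        g_leak = rng.uniform(5, 15, n)
        g_k2 = 2.0 * g_leak + rng.normal(0.0, 0.5, n)   # built-in correlation
        g_p = 0.5 * g_leak + rng.normal(0.0, 0.5, n)
        g_other = rng.uniform(0, 10, n)                 # uncorrelated parameter
        X = np.column_stack([g_leak, g_k2, g_p, g_other])

        # After standardization, principal components with near-zero variance
        # correspond to approximate linear constraints among the conductances.
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        pca = PCA().fit(Xs)
        print('explained variance ratios:', np.round(pca.explained_variance_ratio_, 3))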

  11. Computer aided control engineering

    DEFF Research Database (Denmark)

    Szymkat, Maciej; Ravn, Ole

    1997-01-01

    Current developments in the field of Computer Aided Control Engineering (CACE) have a visible impact on the design methodologies and the structure of the software tools supporting them. Today control engineers have at their disposal libraries, packages or programming environments that may ... in CACE, enhancing efficient flow of information between the tools supporting the following phases of the design process. In principle, this flow has to be two-way, and more or less automated, in order to enable the engineer to observe the propagation of the particular design decisions taken at various ... levels. The major conclusions of the paper are related to identifying the factors affecting software tool integration in a way needed to facilitate design "inter-phase" communication. These are: standard application interfaces, dynamic data exchange mechanisms, code generation techniques and general...

  12. Identifying ELIXIR Core Data Resources.

    Science.gov (United States)

    Durinx, Christine; McEntyre, Jo; Appel, Ron; Apweiler, Rolf; Barlow, Mary; Blomberg, Niklas; Cook, Chuck; Gasteiger, Elisabeth; Kim, Jee-Hyub; Lopez, Rodrigo; Redaschi, Nicole; Stockinger, Heinz; Teixeira, Daniel; Valencia, Alfonso

    2016-01-01

    The core mission of ELIXIR is to build a stable and sustainable infrastructure for biological information across Europe. At the heart of this are the data resources, tools and services that ELIXIR offers to the life-sciences community, providing stable and sustainable access to biological data. ELIXIR aims to ensure that these resources are available long-term and that the life-cycles of these resources are managed such that they support the scientific needs of the life-sciences, including biological research. ELIXIR Core Data Resources are defined as a set of European data resources that are of fundamental importance to the wider life-science community and the long-term preservation of biological data. They are complete collections of generic value to life-science, are considered an authority in their field with respect to one or more characteristics, and show high levels of scientific quality and service. Thus, ELIXIR Core Data Resources are of wide applicability and usage. This paper describes the structures, governance and processes that support the identification and evaluation of ELIXIR Core Data Resources. It identifies key indicators which reflect the essence of the definition of an ELIXIR Core Data Resource and support the promotion of excellence in resource development and operation. It describes the specific indicators in more detail and explains their application within ELIXIR's sustainability strategy and science policy actions, and in capacity building, life-cycle management and technical actions. The identification process is currently being implemented and tested for the first time. The findings and outcome will be evaluated by the ELIXIR Scientific Advisory Board in March 2017. Establishing the portfolio of ELIXIR Core Data Resources and ELIXIR Services is a key priority for ELIXIR and publicly marks the transition towards a cohesive infrastructure.

  13. DIA-datasnooping and identifiability

    Science.gov (United States)

    Zaminpardaz, S.; Teunissen, P. J. G.

    2018-04-01

    In this contribution, we present and analyze datasnooping in the context of the DIA method. As the DIA method for the detection, identification and adaptation of mismodelling errors is concerned with estimation and testing, it is the combination of both that needs to be considered. This combination is rigorously captured by the DIA estimator. We discuss and analyze the DIA-datasnooping decision probabilities and the construction of the corresponding partitioning of misclosure space. We also investigate the circumstances under which two or more hypotheses are nonseparable in the identification step. By means of a theorem on the equivalence between the nonseparability of hypotheses and the inestimability of parameters, we demonstrate that one can forget about adapting the parameter vector for hypotheses that are nonseparable. However, as this concerns the complete vector and not necessarily functions of it, we also show that parameter functions may exist for which adaptation is still possible. It is shown what this adaptation looks like and how it changes the structure of the DIA estimator. To demonstrate the performance of the various elements of DIA-datasnooping, we apply the theory to some selected examples. We analyze how geometry changes in the measurement setup affect the testing procedure, by studying their partitioning of misclosure space, the decision probabilities and the minimal detectable and identifiable biases. The difference between these two minimal biases is highlighted by showing the difference between their corresponding contributing factors. We also show that if two alternative hypotheses, say Hi and Hj, are nonseparable, the testing procedure may have different levels of sensitivity to Hi-biases compared to the same Hj-biases.
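
    Datasnooping in this tradition rests on Baarda's w-test, which screens each observation for a blunder using the least-squares residuals. The numpy sketch below shows only the test statistics under an assumed linear model, not the full DIA estimator with its adaptation step.

        import numpy as np

        def w_tests(A, y, Q):
            """Baarda w-test statistics: one per observation, testing the
            hypothesis of a single blunder in that observation.
            A: m x n design matrix, y: m observations, Q: m x m cofactor matrix.
            """
            Qi = np.linalg.inv(Q)
            N = A.T @ Qi @ A
            x_hat = np.linalg.solve(N, A.T @ Qi @ y)
            v = y - A @ x_hat                       # least-squares residuals
            Qv = Q - A @ np.linalg.solve(N, A.T)    # cofactor matrix of residuals
            w = np.empty(len(y))
            for i in range(len(y)):
                c = np.zeros(len(y)); c[i] = 1.0
                w[i] = (c @ Qi @ v) / np.sqrt(c @ Qi @ Qv @ Qi @ c)
            return w  # |w_i| above a critical value flags observation i

        A = np.array([[1., 0.], [1., 1.], [1., 2.], [1., 3.]])
        y = np.array([0.1, 1.0, 2.1, 8.0])   # last observation carries a blunder
        print(np.round(w_tests(A, y, np.eye(4)), 2))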

  14. Progress in computer vision.

    Science.gov (United States)

    Jain, A. K.; Dorai, C.

    Computer vision has emerged as a challenging and important area of research, both as an engineering and a scientific discipline. The growing importance of computer vision is evident from the fact that it was identified as one of the "Grand Challenges" and also from its prominent role in the National Information Infrastructure. While the design of a general-purpose vision system continues to be elusive, machine vision systems are being used successfully in specific application domains. Building a practical vision system requires a careful selection of appropriate sensors, extraction and integration of information from available cues in the sensed data, and evaluation of system robustness and performance. The authors discuss and demonstrate advantages of (1) multi-sensor fusion, (2) combination of features and classifiers, (3) integration of visual modules, and (4) admissibility and goal-directed evaluation of vision algorithms. The requirements of several prominent real-world applications such as biometry, document image analysis, image and video database retrieval, and automatic object model construction offer exciting problems and new opportunities to design and evaluate vision algorithms.

  15. Interactive Computer Graphics

    Science.gov (United States)

    Kenwright, David

    2000-01-01

    Aerospace data analysis tools that significantly reduce the time and effort needed to analyze large-scale computational fluid dynamics simulations have emerged this year. The current approach for most postprocessing and visualization work is to explore the 3D flow simulations with one of a dozen or so interactive tools. While effective for analyzing small data sets, this approach becomes extremely time consuming when working with data sets larger than one gigabyte. An active area of research this year has been the development of data mining tools that automatically search through gigabyte data sets and extract the salient features with little or no human intervention. With these so-called feature extraction tools, engineers are spared the tedious task of manually exploring huge amounts of data to find the important flow phenomena. The software tools identify features such as vortex cores, shocks, separation and attachment lines, recirculation bubbles, and boundary layers. Some of these features can be extracted in a few seconds; others take minutes to hours on extremely large data sets. The analysis can be performed off-line in a batch process, either during or following the supercomputer simulations. These computations have to be performed only once, because the feature extraction programs search the entire data set and find every occurrence of the phenomena being sought. Because the important questions about the data are being answered automatically, interactivity is less critical than it is with traditional approaches.

  16. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  17. Identifying Memory Allocation Patterns in HEP Software

    Science.gov (United States)

    Kama, S.; Rauschmayr, N.

    2017-10-01

    HEP applications perform an excessive amount of allocations/deallocations within short time intervals, which results in memory churn, poor locality and performance degradation. These issues have been known for a decade, but due to the complexity of software frameworks and the billions of allocations in a single job, until recently no efficient mechanism was available to correlate them with source code lines. However, with the advent of the Big Data era, many tools and platforms are now available for large scale memory profiling. This paper presents a prototype program developed to track and identify each single (de-)allocation. The CERN IT Hadoop cluster is used to compute key memory metrics, such as locality, variation, lifetime and density of allocations. The prototype further provides a web-based visualization back-end that allows the user to explore the results generated on the Hadoop cluster. Plotting these metrics for every single allocation over time gives new insight into an application's memory handling. For instance, it shows which algorithms cause which kinds of memory allocation patterns, which function flows cause how many short-lived objects, what the most commonly allocated sizes are, etc. The paper gives an insight into the prototype and shows profiling examples for the LHC reconstruction, digitization and simulation jobs.
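
    The metrics named above follow directly from an allocation trace. As a toy illustration (the real prototype ingests billions of events on Hadoop), the trace format here is hypothetical:

        from collections import Counter

        def allocation_stats(trace):
            """Compute allocation lifetimes and a size histogram from a trace.

            trace: iterable of (timestamp, event, address, size) tuples with
            event in {'alloc', 'free'} (hypothetical format for illustration).
            """
            live = {}                  # address -> (allocation time, size)
            lifetimes, sizes = [], Counter()
            for t, event, addr, size in trace:
                if event == 'alloc':
                    live[addr] = (t, size)
                    sizes[size] += 1
                elif event == 'free' and addr in live:
                    t0, _ = live.pop(addr)
                    lifetimes.append(t - t0)
            return lifetimes, sizes, live   # 'live' holds leaked/long-lived blocks

        trace = [(0, 'alloc', 0x10, 64), (1, 'alloc', 0x20, 48),
                 (2, 'free', 0x10, 64), (9, 'free', 0x20, 48)]
        print(allocation_stats(trace))     # short- vs long-lived objects stand out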

  18. Indexing molecules with chemical graph identifiers.

    Science.gov (United States)

    Gregori-Puigjané, Elisabet; Garriga-Sust, Rut; Mestres, Jordi

    2011-09-01

    Fast and robust algorithms for indexing molecules have been historically considered strategic tools for the management and storage of large chemical libraries. This work introduces a modified and further extended version of the molecular equivalence number naming adaptation of the Morgan algorithm (J Chem Inf Comput Sci 2001, 41, 181-185) for the generation of a chemical graph identifier (CGI). This new version corrects for the collisions recognized in the original adaptation and includes the ability to deal with graph canonicalization, ensembles (salts), and isomerism (tautomerism, regioisomerism, optical isomerism, and geometrical isomerism) in a flexible manner. Validation of the current CGI implementation was performed on the open NCI database and the drug-like subset of the ZINC database containing 260,071 and 5,348,089 structures, respectively. The results were compared with those obtained with some of the most widely used indexing codes, such as the CACTVS hash code and the new InChIKey. The analyses emphasize the fact that compound management activities, like duplicate analysis of chemical libraries, are sensitive to the exact definition of compound uniqueness and thus still depend, to a minor extent, on the type and flexibility of the molecular index being used. Copyright © 2011 Wiley Periodicals, Inc.
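
    The Morgan algorithm underlying such identifiers assigns atoms extended-connectivity values by iterative refinement. A bare-bones sketch on an adjacency-list graph follows; it omits the canonicalization, ensemble and isomerism handling that the CGI adds on top.

        def morgan_ranks(adjacency):
            """Refine atom invariants by summing neighbour values, starting
            from atom degrees, until the number of distinct values stops
            increasing (classic Morgan extended connectivity).
            adjacency: dict mapping atom index -> list of neighbour indices.
            """
            ec = {a: len(nbrs) for a, nbrs in adjacency.items()}   # degrees
            n_classes = len(set(ec.values()))
            while True:
                new_ec = {a: sum(ec[b] for b in adjacency[a]) for a in adjacency}
                new_classes = len(set(new_ec.values()))
                if new_classes <= n_classes:
                    return ec
                ec, n_classes = new_ec, new_classes

        # n-hexane carbon skeleton: a simple chain C0-C1-C2-C3-C4-C5.
        chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
        print(morgan_ranks(chain))  # three symmetry classes: ends, next-to-ends, middle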

  19. Light-driven movements of the trifoliate leaves of bean (Phaseolus vulgaris L.). Spectral and functional analysis

    International Nuclear Information System (INIS)

    Koller, D.; Ritter, S.; Fork, D.C.

    1996-01-01

    The light-driven responses of the terminal leaflet of bean were analyzed spectrally and functionally. Laminar elevation increases rapidly in response to continuous overhead exposure of its pulvinus to blue light. This response is enhanced in its early stages by simultaneous exposure to red light. The pulvinus responds similarly to continuous overhead unmixed red, or far-red light, albeit at much lower rates. The response to overhead red, alone, or during enhancement of the response to blue, was not affected by simultaneous far-red. However, the response to blue alone, or enhanced by mixture with red, was partially inhibited by simultaneous exposure to far-red. The results suggest that the response to blue resulted mostly from a blue-absorbing pigment system, but may involve some absorption by phytochrome, while responses to red or far-red, with and without blue, may be mediated by high-irradiance responses of phytochrome. Functional differences between the responses to red and blue become apparent when the abaxial (lower), or lateral sectors of the pulvinus are exposed to them, separately and in combination. These differences suggest that red controls the photonastic unfolding of the pulvinus, whereas blue controls its phototropic responses. These responses co-exist in the same tissue, but are separate and additive. (author)

  20. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students' computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex relations by identifying different patterns of students' school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from ..., raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy.

  1. Inclusive vision for high performance computing at the CSIR

    CSIR Research Space (South Africa)

    Gazendam, A

    2006-02-01

    ... and computationally intensive applications. A number of different technologies and standards were identified as core to the open and distributed high-performance infrastructure envisaged...

  2. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  3. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  4. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  5. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, MIchael J.; Erikson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  6. From Computer Forensics to Forensic Computing: Investigators Investigate, Scientists Associate

    OpenAIRE

    Dewald, Andreas; Freiling, Felix C.

    2014-01-01

    This paper draws a comparison of fundamental theories in traditional forensic science and the state of the art in current computer forensics, thereby identifying a certain disproportion between the perception of central aspects in common theory and the digital forensics reality. We propose a separation of what is currently demanded of practitioners in digital forensics into a rigorous scientific part on the one hand, and a more general methodology of searching and seizing digital evidence an...

  7. Computer virus information update CIAC-2301

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's virus database.

  8. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
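
    The exponential cost is easy to exhibit: exact two-terminal reliability by brute force enumerates all 2^E edge states. The sketch below is illustrative and deliberately naive; it is not one of the decomposition techniques the paper explores.

        from itertools import product

        def two_terminal_reliability(edges, p, s, t):
            """Probability that s and t remain connected when each edge works
            independently with probability p. Enumerates all 2**len(edges)
            edge states, so the cost grows exponentially with system size.
            """
            def connected(up_edges):
                reached, frontier = {s}, [s]
                while frontier:
                    u = frontier.pop()
                    for a, b in up_edges:
                        for x, y in ((a, b), (b, a)):
                            if x == u and y not in reached:
                                reached.add(y)
                                frontier.append(y)
                return t in reached

            total = 0.0
            for state in product([True, False], repeat=len(edges)):
                up = [e for e, ok in zip(edges, state) if ok]
                if connected(up):
                    k = sum(state)
                    total += p**k * (1 - p)**(len(edges) - k)
            return total

        # The classic five-edge bridge network between nodes 0 and 3.
        bridge = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
        print(round(two_terminal_reliability(bridge, 0.9, 0, 3), 6))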

  9. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  10. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  11. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p

  12. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  13. Quantum computing with defects.

    Science.gov (United States)

    Weber, J R; Koehl, W F; Varley, J B; Janotti, A; Buckley, B B; Van de Walle, C G; Awschalom, D D

    2010-05-11

    Identifying and designing physical systems for use as qubits, the basic units of quantum information, are critical steps in the development of a quantum computer. Among the possibilities in the solid state, a defect in diamond known as the nitrogen-vacancy (NV(-1)) center stands out for its robustness--its quantum state can be initialized, manipulated, and measured with high fidelity at room temperature. Here we describe how to systematically identify other deep center defects with similar quantum-mechanical properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate defect systems. To illustrate these points in detail, we compare electronic structure calculations of the NV(-1) center in diamond with those of several deep centers in 4H silicon carbide (SiC). We then discuss the proposed criteria for similar defects in other tetrahedrally coordinated semiconductors.

  14. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  15. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  16. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency as well as productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  17. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  18. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters of scale 1000s of processors and be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  19. Structural parameter identifiability analysis for dynamic reaction networks

    DEFF Research Database (Denmark)

    Davidescu, Florin Paul; Jørgensen, Sten Bay

    2008-01-01

    For a given set of measured variables it is desirable to investigate which parameters may be estimated prior to spending computational effort on the actual estimation. This contribution addresses the structural parameter identifiability problem for the typical case of reaction network models. The proposed analysis is performed in two phases. The first phase determines the structurally identifiable reaction rates based on reaction network stoichiometry. The second phase assesses the structural parameter identifiability of the specific kinetic rate expressions using a generating series expansion method based on Lie derivatives. The proposed systematic two-phase methodology is illustrated on a mass action based model for an enzymatically catalyzed reaction pathway network where only a limited set of variables is measured. The methodology clearly pinpoints the structurally identifiable parameters.
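
    The generating-series test in the second phase can be sketched with sympy on a toy model: stack iterated Lie derivatives of the output and check the rank of their Jacobian with respect to the parameters; full rank indicates local structural identifiability. The one-state model below is illustrative, not the paper's reaction network.

        import sympy as sp

        x, x0, k1, k2 = sp.symbols('x x0 k1 k2', positive=True)
        f = -k1 * x      # dynamics: dx/dt = -k1*x
        h = k2 * x       # measured output: y = k2*x

        # Generating series coefficients = iterated Lie derivatives of h along f.
        coeffs, L = [], h
        for _ in range(3):
            coeffs.append(L)
            L = sp.diff(L, x) * f

        # Rank of the Jacobian w.r.t. (k1, k2, x0) at the initial state x0.
        series = sp.Matrix([c.subs(x, x0) for c in coeffs])
        J = series.jacobian(sp.Matrix([k1, k2, x0]))
        print(J.rank())  # 2 < 3: only k1 and the product k2*x0 are identifiable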

  20. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinear and chaotic systems can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithms, to design autonomous systems that can adapt and respond to environmental conditions.
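
    A threshold-based chaos gate of the kind surveyed here fits in a few lines: encode the two logic inputs as shifts of the logistic map's initial condition, iterate once in the fully chaotic regime, and threshold the result. The constants below are one hand-tuned choice that realizes NAND; they are illustrative rather than taken from the paper.

        def chaotic_nand(x1, x2, x_star=0.325, delta=0.25, threshold=0.75):
            """One-iteration chaos gate: inputs shift the initial condition,
            the logistic map (r = 4, chaotic regime) evolves it, and
            thresholding reads out the logic value.
            """
            x0 = x_star + delta * (x1 + x2)
            x_next = 4.0 * x0 * (1.0 - x0)
            return int(x_next >= threshold)

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, '->', chaotic_nand(a, b))   # prints the NAND truth table
        # Re-tuning (x_star, delta, threshold) reprograms the same map into
        # other gates, which is the programmability that chaos computing exploits.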

  1. Radiological difficulty in identifying unicompartmental knee replacement dislocation

    Directory of Open Access Journals (Sweden)

    Mr Oruaro Adebayo Onibere, MBBS, MRCS

    2017-09-01

    Unicondylar knee replacement is a relatively common elective orthopedic procedure but is not often seen in the Emergency Department setting, so familiarity with its normal clinical and radiological appearances is difficult to gain. Dislocation of the mobile-bearing component ("spacer") is a known complication of unicondylar knee replacements, and these patients will initially present to the Accident and Emergency Department. In this setting, an accurate and prompt diagnosis is necessary to appropriately manage the patient's condition. Identifying dislocated mobile bearings on plain radiographs is a recognized radiological challenge. These patients may need further imaging, such as a computed tomography scan, to identify the dislocated mobile bearing.

  2. On Identifying which Intermediate Nodes Should Code in Multicast Networks

    DEFF Research Database (Denmark)

    Pinto, Tiago; Roetter, Daniel Enrique Lucani; Médard, Muriel

    2013-01-01

    ... the data packets. Previous work has shown that in lossless wireline networks, the performance of tree-packing mechanisms is comparable to network coding, albeit with added complexity at the time of computing the trees. This means that most nodes in the network need not code. Thus, mechanisms that identify the intermediate nodes that do require coding are instrumental for the efficient operation of coded networks and can have a significant impact on overall energy consumption. We present a distributed, low complexity algorithm that allows every node to identify if it should code and, if so, through what output link

  3. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions that capture frequently appearing units of computation in big data computing. One dwarf represen...

  4. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  5. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  6. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Know Your Personal Computer: Introduction to Computers. Siddhartha Kumar Ghoshal. Series Article, Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  7. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  8. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and minimizing the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  9. Computer Modeling of Platinum Reforming Reactors | Momoh ...

    African Journals Online (AJOL)

    Instead of using a theoretical approach, this paper considers a computer model as a means of assessing the reformate composition for three-stage fixed-bed reactors in a platforming unit. This is done by identifying the many possible hydrocarbon transformation reactions that are peculiar to the process unit, identify the ...

  10. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  11. The importance of trust in computer security

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2014-01-01

    The computer security community has traditionally regarded security as a "hard" property that can be modelled and formally proven under certain simplifying assumptions. Traditional security technologies assume that computer users are either malicious, e.g. hackers or spies, or benevolent, competent and well informed about the security policies. Over the past two decades, however, computing has proliferated into all aspects of modern society, and the spread of malicious software (malware) like worms, viruses and botnets has become an increasing threat. This development indicates a failure in some of the fundamental assumptions that underpin existing computer security technologies, and that a new view of computer security is long overdue. In this paper, we examine traditional models, policies and mechanisms of computer security in order to identify areas where the fundamental assumptions may fail. In particular...

  12. Office ergonomics: deficiencies in computer workstation design.

    Science.gov (United States)

    Shikdar, Ashraf A; Al-Kindi, Mahmoud A

    2007-01-01

    The objective of this research was to study and identify ergonomic deficiencies in computer workstation design in typical offices. Physical measurements and a questionnaire were used to study 40 workstations. Major ergonomic deficiencies were found in physical design and layout of the workstations, employee postures, work practices, and training. The consequences in terms of user health and other problems were significant. Forty-five percent of the employees used nonadjustable chairs, 48% of computers faced windows, 90% of the employees used computers more than 4 hrs/day, 45% of the employees adopted bent and unsupported back postures, and 20% used office tables for computers. Major problems reported were eyestrain (58%), shoulder pain (45%), back pain (43%), arm pain (35%), wrist pain (30%), and neck pain (30%). These results indicated serious ergonomic deficiencies in office computer workstation design, layout, and usage. Strategies to reduce or eliminate ergonomic deficiencies in computer workstation design were suggested.

  13. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  14. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  15. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of the hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  16. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of the hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  17. CY15 Livermore Computing Focus Areas

    Energy Technology Data Exchange (ETDEWEB)

    Connell, Tom M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cupps, Kim C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); D' Hooge, Trent E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fahey, Tim J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fox, Dave M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Futral, Scott W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gary, Mark R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Goldstone, Robin J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hamilton, Pam G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Heer, Todd M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Long, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mark, Rich J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Morrone, Chris J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shoopman, Jerry D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Slavec, Joe A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, David W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Springmeyer, Becky R [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stearman, Marc D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Watson, Py C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-20

    The LC team undertook a survey of primary Center drivers for CY15. Identified key drivers included enhancing user experience and productivity, pre-exascale platform preparation, process improvement, data-centric computing paradigms and business expansion. The team organized critical supporting efforts into three cross-cutting focus areas: Improving Service Quality; Monitoring, Automation, Delegation and Center Efficiency; and Next Generation Compute and Data Environments. In each area the team detailed high-level challenges and identified discrete actions to address these issues during the calendar year. Identifying the Center's primary drivers, issues, and plans is intended to serve as a lens focusing LC personnel, resources, and priorities throughout the year.

  18. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal," in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics.

  19. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  20. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on its screen containing many words that name new concepts. Those words come from the terminology used by specialists. A common vocabulary between computer terminology and the lexis of everyday language comes into existence. The article deals with the part of computer terminology that passes into everyday usage and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  1. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial plasma confinement and of charged-particle beam propagation is described. Typical uses of computers for the simulation and control of laboratory and space plasma experiments and for data acquisition in these experiments are considered. Basic computational methods applied in plasma physics are discussed. Future trends of computer use in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the necessity of applying more powerful computers.

  2. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  3. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos...

  4. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  5. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  6. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray-trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray-trace speed has been made with the LINPACK benchmark, which allows the ray-trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.
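
    The abstract's estimation idea lends itself to a tiny sketch: fit a relationship between LINPACK scores and measured ray-trace speeds, then predict a new machine's ray-trace speed from its LINPACK score alone. The Python sketch below is ours, and the benchmark numbers in it are hypothetical placeholders, not data from the paper.

```python
import numpy as np

# HYPOTHETICAL benchmark pairs (not from the paper): LINPACK score vs.
# measured ray-trace speed for a handful of machines.
linpack_mflops = np.array([1.2, 4.5, 12.0, 38.0, 95.0])
rays_per_sec = np.array([300.0, 1100.0, 3200.0, 9800.0, 26000.0])

# Fit in log-log space, where a roughly proportional relationship is linear.
slope, intercept = np.polyfit(np.log(linpack_mflops), np.log(rays_per_sec), 1)

def estimate_ray_speed(linpack_score):
    """Estimate ray-trace speed (rays/s) from a LINPACK score (MFLOPS)."""
    return float(np.exp(intercept + slope * np.log(linpack_score)))

print(estimate_ray_speed(50.0))
```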

  7. Aggregating job exit statuses of a plurality of compute nodes executing a parallel application

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Mundy, Michael B.

    2015-07-21

    Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
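
    The steps in this record read like a small reduction protocol, so a sketch may help. The Python fragment below is our own minimal, library-free rendering of the idea (a job-leader node gathers one exit status per compute node in the subset and reduces them to a single aggregated status); the class and field names are illustrative assumptions, not the patent's data structures.

```python
from dataclasses import dataclass

@dataclass
class ExitStatus:
    node: str
    code: int        # 0 = success, nonzero = failure (illustrative convention)
    detail: str = ""

def aggregate(statuses):
    """Reduce per-node exit statuses into one status for the whole subset."""
    failures = [s for s in statuses if s.code != 0]
    if not failures:
        return ExitStatus("job-leader", 0, "all nodes ok")
    worst = max(failures, key=lambda s: s.code)
    return ExitStatus("job-leader", worst.code,
                      f"{len(failures)} node(s) failed; worst: {worst.node}")

# The job-leader node receives one status per compute node in the subset...
reports = [ExitStatus("node0", 0),
           ExitStatus("node1", 1, "segfault"),
           ExitStatus("node2", 0)]
# ...and forwards a single aggregated exit status.
print(aggregate(reports))
```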

  8. Evaluation of valvular heart diseases with computed tomography

    International Nuclear Information System (INIS)

    Tomoda, Haruo; Hoshiai, Mitsumoto; Matsuyama, Seiya

    1982-01-01

    Forty-two patients with valvular heart diseases were studied with a third-generation computed tomographic system. The cardiac chambers (the atria and ventricles) were evaluated semiquantitatively, and valvular calcification was easily detected with computed tomography. Computed tomography was most valuable in revealing left atrial thrombi which were not identified by other diagnostic procedures in some cases. (author)

  9. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  10. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
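
    A brief sketch may clarify the formulation: a Nash equilibrium of a bimatrix game is a global minimum (with value zero) of a nonnegative "regret" function, which a global optimizer can then attack. The example below is our own illustration using SciPy's differential evolution (one of the three techniques the paper studies, though not the paper's code) on the matching-pennies game.

```python
import numpy as np
from scipy.optimize import differential_evolution

A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs
B = -A                                     # column player's payoffs (zero-sum)

def regret(x):
    p = np.array([x[0], 1.0 - x[0]])       # row player's mixed strategy
    q = np.array([x[1], 1.0 - x[1]])       # column player's mixed strategy
    # Gain each player could obtain by deviating to a pure strategy:
    r1 = np.maximum(0.0, A @ q - p @ A @ q)
    r2 = np.maximum(0.0, B.T @ p - p @ B @ q)
    # Nonnegative, and zero exactly at a Nash equilibrium.
    return np.sum(r1**2) + np.sum(r2**2)

result = differential_evolution(regret, bounds=[(0, 1), (0, 1)], seed=0)
print(result.x, result.fun)   # expect ~[0.5, 0.5] with objective ~0
```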

  11. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  12. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  13. Searching with Quantum Computers

    OpenAIRE

    Grover, Lov K.

    2000-01-01

    This article introduces quantum computation by analogy with probabilistic computation. A basic description of the quantum search algorithm is given by representing the algorithm as a C program in a novel way.
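
    Since the article's novelty is rendering the quantum search algorithm as an ordinary program, a comparable state-vector simulation in Python/NumPy may be a useful companion (our own sketch, not the article's C program). It searches N = 2^n items for one marked index in roughly (pi/4)*sqrt(N) iterations.

```python
import numpy as np

n, marked = 4, 11            # N = 2**n items; item 11 is the "marked" one
N = 2 ** n
state = np.full(N, 1.0 / np.sqrt(N))    # uniform superposition over all items

iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[marked] *= -1.0               # oracle: flip the marked amplitude
    state = 2.0 * state.mean() - state  # diffusion: inversion about the mean

probability = state ** 2
print(int(np.argmax(probability)), float(probability[marked]))  # 11, ~0.96
```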

  14. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5.

  15. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  16. Know Your Personal Computer

    Indian Academy of Sciences (India)


  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  18. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discuss this process

  19. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  20. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  1. Quantum Computer Science

    Science.gov (United States)

    Mermin, N. David

    2007-08-01

    Preface; 1. Cbits and Qbits; 2. General features and some simple examples; 3. Breaking RSA encryption with a quantum computer; 4. Searching with a quantum computer; 5. Quantum error correction; 6. Protocols that use just a few Qbits; Appendices; Index.

  2. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  3. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. What are some common uses of the procedure? CT ...

  4. Computer Technology Directory.

    Science.gov (United States)

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  5. My Computer Is Learning.

    Science.gov (United States)

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  6. What is Computed Tomography?

    Science.gov (United States)

    ... Medical X-ray Imaging: What is Computed Tomography? ... Chest X-ray Image ... Computed Tomography (CT): Although also based on the variable absorption ...

  7. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  8. Computing for Belle

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    ... cm^-2 s^-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, its struggles to manage large amounts of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  9. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  10. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed form...
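
    A minimal modern illustration of the workflow described above (input an equation, obtain a closed form) can be given with SymPy, an open-source computer algebra system; the abstract's own examples are MACSYMA, MAPLE, muMATH, REDUCE and SMP.

```python
import sympy as sp

x = sp.symbols('x')

# Solve a quadratic symbolically: exact roots, not floating-point approximations.
print(sp.solve(sp.Eq(x**2 - 4*x + 1, 0), x))   # [2 - sqrt(3), 2 + sqrt(3)]

# Non-numeric computation: an exact indefinite integral in closed form.
print(sp.integrate(sp.exp(-x) * sp.sin(x), x))
```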

  11. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  12. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  13. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  14. Computed Tomography (CT) -- Head

    Medline Plus


  15. Intimacy and Computer Communication.

    Science.gov (United States)

    Robson, Dave; Robson, Maggie

    1998-01-01

    Addresses the relationship between intimacy and communication that is based on computer technology. Discusses definitions of intimacy and the nature of intimate conversations that use computers as a communications medium. Explores implications for counseling. (MKA)

  16. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ... ray beam follows a spiral path. A special computer program processes this large volume of data to ...

  17. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  18. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  19. Nanoelectronics: Metrology and Computation

    International Nuclear Information System (INIS)

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example

  20. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    paradigms: few sensors/complex computations and many sensors/simple computation. Challenges with Nano-enabled Neuromorphic Chips: A wide variety of... (Final technical report, May 2013; approved for public release.)

  1. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretic developments and new computational alg...

  2. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author explains how to set up and run a computed tomography department, including advice on how the room should be designed.

  3. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  5. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  6. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  7. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the research of the educational resources and possibilities of modern computer games. The "internal" educational aspects of computer games include an educational mechanism (a separate or integrated "tutorial") and the representation of a real or even fantastic educational process within virtual worlds. The "external" dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  8. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique: human head-work is being automated and man is losing function. (orig.) [de]

  9. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  10. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
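
    The patent-style description above maps naturally onto a small data structure. Here is our own minimal Python sketch of the described behaviour (log events, search the history, undo selected past events); the class and field names are assumptions for illustration, not the patent's interface.

```python
import os
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Event:
    description: str
    undo: Callable[[], None]   # action that reverses the event
    undone: bool = False

@dataclass
class Logbook:
    history: List[Event] = field(default_factory=list)

    def log(self, description, undo):
        """Record an event together with the action that undoes it."""
        self.history.append(Event(description, undo))

    def search(self, term):
        """Find past events whose description mentions `term`."""
        return [e for e in self.history if term in e.description]

    def undo_events(self, events):
        """Undo the selected past events (once each)."""
        for e in events:
            if not e.undone:
                e.undo()
                e.undone = True

# Usage: log a file creation, then undo every event mentioning "temp".
book = Logbook()
open("temp_a.txt", "w").close()
book.log("created temp_a.txt", lambda: os.remove("temp_a.txt"))
book.undo_events(book.search("temp"))
```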

  11. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  12. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  13. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  14. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer
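
    To make the object under discussion concrete, here is a standard textbook construction, not code from the paper: a NumPy simulation of a discrete-time quantum walk on a line with a Hadamard coin. Its probability distribution spreads linearly in the number of steps, in contrast to the square-root spreading of a classical random walk.

```python
import numpy as np

steps = 100
positions = 2 * steps + 1
# amp[pos, coin]: coin state 0 = "move left", coin state 1 = "move right"
amp = np.zeros((positions, 2), dtype=complex)
amp[steps, 0] = 1.0 / np.sqrt(2)     # symmetric initial coin state
amp[steps, 1] = 1j / np.sqrt(2)

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard coin

for _ in range(steps):
    amp = amp @ H.T                   # flip the coin at every position
    shifted = np.zeros_like(amp)
    shifted[:-1, 0] = amp[1:, 0]      # coin-0 component steps left
    shifted[1:, 1] = amp[:-1, 1]      # coin-1 component steps right
    amp = shifted

x = np.arange(-steps, steps + 1)
prob = (np.abs(amp) ** 2).sum(axis=1)
print(prob.sum(), np.sqrt((prob * x**2).sum()))   # norm ~1; spread grows ~linearly
```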

  15. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  16. Visitor's Computer Guidelines | CTIO

    Science.gov (United States)


  17. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching CT at medical school should be aligned...

  18. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Computed Tomography (CT) - Head: Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  19. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)
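
    The image-building step mentioned above is classically based on back-projection of the measured projections. The following deliberately simplified, unfiltered back-projection sketch in Python is our own, for intuition only; real SPECT/PET reconstruction adds filtering and corrections for attenuation, scatter and noise.

```python
import numpy as np
from scipy.ndimage import rotate

def back_project(sinogram, angles_deg, size):
    """Unfiltered back-projection: smear each 1-D projection across the
    image plane at its acquisition angle and accumulate."""
    image = np.zeros((size, size))
    for proj, angle in zip(sinogram, angles_deg):
        smear = np.tile(proj, (size, 1))        # constant along the ray direction
        image += rotate(smear, angle, reshape=False, order=1)
    return image / len(angles_deg)

# Toy phantom: a square region of radiotracer "activity".
size = 64
phantom = np.zeros((size, size))
phantom[24:40, 24:40] = 1.0

# Simulated acquisition: line-integral projections at 60 angles.
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = [rotate(phantom, -a, reshape=False, order=1).sum(axis=0)
            for a in angles]

recon = back_project(sinogram, angles, size)
print(recon.shape, float(recon.max()))   # blurred but recognizable square
```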

  20. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Computed Tomography (CT) - Sinuses: Computed tomography (CT) of the sinuses uses special x-ray equipment ... Patient undergoing computed tomography (CT) scan. ...

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Computed Tomography (CT) - Head: Computed tomography (CT) of the head uses special x-ray equipment ... Patient undergoing computed tomography (CT) scan. ...

  2. Beyond the Computer Literacy.

    Science.gov (United States)

    Streibel, Michael J.; Garhart, Casey

    1985-01-01

    Describes the approach taken in an educational computing course for pre- and in-service teachers. Outlines the basic operational, analytical, and evaluation skills that are emphasized in the course, suggesting that these skills go beyond the attainment of computer literacy and can assist in the effective use of computers. (ML)

  3. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  4. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  5. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  6. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954, while reviewing the theory of communication and cybernetics, the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle.

  7. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  8. Computer narratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    Relations and interactions between literature and computer games were examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can help us understand basic concepts of artistic communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  9. Quantum computing with defects

    Science.gov (United States)

    Varley, Joel

    2011-03-01

    The development of a quantum computer is contingent upon the identification and design of systems for use as qubits, the basic units of quantum information. One of the most promising candidates consists of a defect in diamond known as the nitrogen-vacancy (NV-1) center, since it is an individually-addressable quantum system that can be initialized, manipulated, and measured with high fidelity at room temperature. While the success of the NV-1 stems from its nature as a localized "deep-center" point defect, no systematic effort has been made to identify other defects that might behave in a similar way. We provide guidelines for identifying other defect centers with similar properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate systems. To elucidate these points, we compare electronic structure calculations of the NV-1 center in diamond with those of several deep centers in 4H silicon carbide (SiC). Using hybrid functionals, we report formation energies, configuration-coordinate diagrams, and defect-level diagrams to compare and contrast the properties of these defects. We find that the NC VSi-1 center in SiC, a structural analog of the NV-1 center in diamond, may be a suitable center with very different optical transition energies. We also discuss how the proposed criteria can be translated into guidelines to discover NV analogs in other tetrahedrally coordinated materials. This work was performed in collaboration with J. R. Weber, W. F. Koehl, B. B. Buckley, A. Janotti, C. G. Van de Walle, and D. D. Awschalom. This work was supported by ARO, AFOSR, and NSF.

  10. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  11. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer that can operate according to quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review we highlight recent developments which point the way to quantum computing on the basis of solid-state nanostructures, after some general considerations concerning quantum information science and introducing a set of basic requirements for any quantum computer proposal. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address some semiconductor approaches based on spin-orbit coupling in semiconductor nanostructures. (authors)

  12. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  13. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address ...

  14. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  15. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue is ...

  16. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  17. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  18. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
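
    The first capability listed above, manipulating expressions in an algebra generated by operators, can be illustrated with SymPy's noncommutative symbols (our own example, not the paper's system).

```python
import sympy as sp

# Noncommuting symbols stand in for abstract operators.
A, B = sp.symbols('A B', commutative=False)

# Products keep their order, so expansion respects noncommutativity:
print(sp.expand((A + B)**2))   # A**2 + A*B + B*A + B**2, not A**2 + 2*A*B + B**2

# The commutator [A, B], the basic object of operator expansions:
comm = A*B - B*A
print(sp.expand(A*comm - comm*A))  # the nested commutator [A, [A, B]], expanded
```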

  19. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularity...

  20. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing