WorldWideScience

Sample records for acm computing surveys

  1. News from the Library: A one-stop-shop for computing literature: ACM Digital Library

    CERN Multimedia

    CERN Library

    2011-01-01

    The Association for Computing Machinery, ACM, is the world’s largest educational and scientific computing society. Among other services, the ACM provides the computing field's premier Digital Library and serves its members and the computing profession with leading-edge publications, conferences, and career resources. The ACM Digital Library is available to the CERN community. The most popular journal here at CERN is Communications of the ACM. However, the collection offers access to a series of other important academic journals, such as the Journal of the ACM, and even the full text of a number of classic books. In addition, users have access to the ACM Guide to Computing Literature, the most comprehensive bibliographic database focusing on computing, integrated with ACM’s full-text articles and including features such as ACM Author Profile Pages, which provide bibliographic and bibliometric data for over 1,000,000 authors in the field. ACM Digital Library is an excellent com...

  2. ACM CCS 2013-2015 Student Travel Support

    Science.gov (United States)

    2016-10-29

    ACM CCS 2013-2015 Student Travel Support Under the ARO-funded effort titled “ACM CCS 2013-2015 Student Travel Support,” from 2013 to 2015, George...Computer and Communications Security (ACM CCS). The views, opinions and/or findings contained in this report are those of the author(s) and should not...AGENCY NAME(S) AND ADDRESS (ES) U.S. Army Research Office P.O. Box 12211 Research Triangle Park, NC 27709-2211 travel grant, acm ccs REPORT

  3. ACME-III and ACME-IV Final Campaign Reports

    Energy Technology Data Exchange (ETDEWEB)

    Biraud, S. C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-01

    The goals of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s third and fourth Airborne Carbon Measurements (ACME) field campaigns, ACME-III and ACME-IV, are: 1) to measure and model the exchange of CO2, water vapor, and other greenhouse gases by the natural, agricultural, and industrial ecosystems of the Southern Great Plains (SGP) region; 2) to develop quantitative approaches to relate these local fluxes to the concentration of greenhouse gases measured at the Central Facility tower and in the atmospheric column above the ARM SGP Central Facility, 3) to develop and test bottom-up measurement and modeling approaches to estimate regional scale carbon balances, and 4) to develop and test inverse modeling approaches to estimate regional scale carbon balance and anthropogenic sources over continental regions. Regular soundings of the atmosphere from near the surface into the mid-troposphere are essential for this research.

  4. Preliminary proceedings of the 2001 ACM SIGPLAN Haskell workshop

    NARCIS (Netherlands)

    Hinze, R.

    2001-01-01

    This volume contains the preliminary proceedings of the 2001 ACM SIGPLAN Haskell Workshop, which was held on 2nd September 2001 in Firenze, Italy. The final proceedings will be published by Elsevier Science as an issue of Electronic Notes in Theoretical Computer Science (Volume 59). The

  5. ACM Bundles on Del Pezzo surfaces

    Directory of Open Access Journals (Sweden)

    Joan Pons-Llopis

    2009-11-01

    Full Text Available ACM rank 1 bundles on del Pezzo surfaces are classified in terms of the rational normal curves that they contain. A complete list of ACM line bundles is provided. Moreover, for any del Pezzo surface X of degree less than or equal to six and for any n ≥ 2, we construct a family of dimension ≥ n − 1 of non-isomorphic simple ACM bundles of rank n on X.

  6. Porcine bladder acellular matrix (ACM): protein expression, mechanical properties

    International Nuclear Information System (INIS)

    Farhat, Walid A; Chen Jun; Haig, Jennifer; Antoon, Roula; Litman, Jessica; Yeger, Herman; Sherman, Christopher; Derwin, Kathleen

    2008-01-01

    Experimentally, porcine bladder acellular matrix (ACM) that mimics extracellular matrix has excellent potential as a bladder substitute. Herein we investigated the spatial localization and expression of different key cellular and extracellular proteins in the ACM; furthermore, we evaluated the inherent mechanical properties of the resultant ACM prior to implantation. Using a proprietary decellularization method, the DNA contents in both ACM and normal bladder were measured; in addition, we used immunohistochemistry and western blots to quantify and localize the different cellular and extracellular components, and finally mechanical testing was performed using a uniaxial mechanical testing machine. The mean DNA content was significantly lower in the ACM than in the normal bladder. Furthermore, the immunohistochemical and western blot analyses showed that collagen I and IV were preserved in the ACM, whereas collagen III was possibly denatured. Elastin, laminin and fibronectin were mildly reduced in the ACM. Although the ACM did not exhibit nucleated cells, residual cellular components (actin, myosin, vimentin and others) were still present. There was, on the other hand, no significant difference in mean stiffness between the ACM and the bladder. Although our decellularization method is effective in removing nuclear material from the bladder while maintaining its inherent mechanical properties, further work is mandatory to determine whether these residual DNA and cellular remnants would lead to any immune reaction, or if the mechanical properties of the ACM are preserved upon implantation and cellularization.

  7. Porcine bladder acellular matrix (ACM): protein expression, mechanical properties.

    Science.gov (United States)

    Farhat, Walid A; Chen, Jun; Haig, Jennifer; Antoon, Roula; Litman, Jessica; Sherman, Christopher; Derwin, Kathleen; Yeger, Herman

    2008-06-01

    Experimentally, porcine bladder acellular matrix (ACM) that mimics extracellular matrix has excellent potential as a bladder substitute. Herein we investigated the spatial localization and expression of different key cellular and extracellular proteins in the ACM; furthermore, we evaluated the inherent mechanical properties of the resultant ACM prior to implantation. Using a proprietary decellularization method, the DNA contents in both ACM and normal bladder were measured; in addition, we used immunohistochemistry and western blots to quantify and localize the different cellular and extracellular components, and finally mechanical testing was performed using a uniaxial mechanical testing machine. The mean DNA content was significantly lower in the ACM than in the normal bladder. Furthermore, the immunohistochemical and western blot analyses showed that collagen I and IV were preserved in the ACM, whereas collagen III was possibly denatured. Elastin, laminin and fibronectin were mildly reduced in the ACM. Although the ACM did not exhibit nucleated cells, residual cellular components (actin, myosin, vimentin and others) were still present. There was, on the other hand, no significant difference in mean stiffness between the ACM and the bladder. Although our decellularization method is effective in removing nuclear material from the bladder while maintaining its inherent mechanical properties, further work is mandatory to determine whether these residual DNA and cellular remnants would lead to any immune reaction, or if the mechanical properties of the ACM are preserved upon implantation and cellularization.

  8. Porcine bladder acellular matrix (ACM): protein expression, mechanical properties

    Energy Technology Data Exchange (ETDEWEB)

    Farhat, Walid A [Department of Surgery, Division of Urology, University of Toronto and Hospital for Sick Children, Toronto, ON M5G 1X8 (Canada); Chen Jun; Haig, Jennifer; Antoon, Roula; Litman, Jessica; Yeger, Herman [Department of Developmental and Stem Cell Biology, Research Institute, Hospital for Sick Children, Toronto, ON M5G 1X8 (Canada); Sherman, Christopher [Department of Anatomic Pathology, Sunnybrook and Women' s College Health Sciences Centre, Toronto, ON (Canada); Derwin, Kathleen [Department of Biomedical Engineering, Lerner Research Institute and Orthopaedic Research Center, Cleveland Clinic Foundation, Cleveland, OH (United States)], E-mail: walid.farhat@sickkids.ca

    2008-06-01

    Experimentally, porcine bladder acellular matrix (ACM) that mimics extracellular matrix has excellent potential as a bladder substitute. Herein we investigated the spatial localization and expression of different key cellular and extracellular proteins in the ACM; furthermore, we evaluated the inherent mechanical properties of the resultant ACM prior to implantation. Using a proprietary decellularization method, the DNA contents in both ACM and normal bladder were measured; in addition, we used immunohistochemistry and western blots to quantify and localize the different cellular and extracellular components, and finally mechanical testing was performed using a uniaxial mechanical testing machine. The mean DNA content was significantly lower in the ACM than in the normal bladder. Furthermore, the immunohistochemical and western blot analyses showed that collagen I and IV were preserved in the ACM, whereas collagen III was possibly denatured. Elastin, laminin and fibronectin were mildly reduced in the ACM. Although the ACM did not exhibit nucleated cells, residual cellular components (actin, myosin, vimentin and others) were still present. There was, on the other hand, no significant difference in mean stiffness between the ACM and the bladder. Although our decellularization method is effective in removing nuclear material from the bladder while maintaining its inherent mechanical properties, further work is mandatory to determine whether these residual DNA and cellular remnants would lead to any immune reaction, or if the mechanical properties of the ACM are preserved upon implantation and cellularization.

  9. AcmD, a homolog of the major autolysin AcmA of Lactococcus lactis, binds to the cell wall and contributes to cell separation and autolysis

    NARCIS (Netherlands)

    Visweswaran, Ganesh Ram R; Steen, Anton; Leenhouts, Kees; Szeliga, Monika; Ruban, Beata; Hesseling-Meinders, Anne; Dijkstra, Bauke W; Kuipers, Oscar P; Kok, Jan; Buist, Girbe

    2013-01-01

    Lactococcus lactis expresses the homologous glucosaminidases AcmB, AcmC, AcmA and AcmD. The latter two have three C-terminal LysM repeats for peptidoglycan binding. AcmD has much shorter intervening sequences separating the LysM repeats and a lower isoelectric point (4.3) than AcmA (10.3). Under

  10. Quark ACM with topologically generated gluon mass

    Science.gov (United States)

    Choudhury, Ishita Dutta; Lahiri, Amitabha

    2016-03-01

    We investigate the effect of a small, gauge-invariant mass of the gluon on the anomalous chromomagnetic moment (ACM) of quarks by perturbative calculations at the one-loop level. The mass of the gluon is taken to have been generated via a topological mass generation mechanism, in which the gluon acquires a mass through its interaction with an antisymmetric tensor field Bμν. For a small gluon mass ( ACM at momentum transfer q^2 = -M_Z^2. We compare those with the ACM calculated for a gluon mass arising from a Proca mass term. We find that the ACM of the up, down, strange and charm quarks varies significantly with the gluon mass, while the ACM of the top and bottom quarks shows negligible gluon-mass dependence. The mechanism of gluon mass generation is most important for the strange quark's ACM, but not so much for the other quarks. We also show the results at q^2 = -m_t^2. We find that the dependence on the gluon mass at q^2 = -m_t^2 is much weaker than at q^2 = -M_Z^2 for all quarks.

  11. Porosity of porcine bladder acellular matrix: impact of ACM thickness.

    Science.gov (United States)

    Farhat, Walid; Chen, Jun; Erdeljan, Petar; Shemtov, Oren; Courtman, David; Khoury, Antoine; Yeger, Herman

    2003-12-01

    The objectives of this study were to examine the porosity of bladder acellular matrix (ACM) using deionized (DI) water as the model fluid and dextran as the indicator macromolecule, and to correlate the porosity with ACM thickness. Porcine urinary bladders from pigs weighing 20-50 kg were sequentially extracted in detergent-containing solutions, and to modify the ACM thickness, stretched bladders were acellularized in the same manner. Luminal and abluminal ACM specimens were subjected to a fixed static DI water pressure (10 cm), and water passing through the specimens was collected at specific time intervals. For the macromolecule porosity testing, the diffusion rate and direction of 10,000 MW fluorescein-labeled dextrans across ACM specimens mounted in Ussing's chambers were measured. Both experiments were repeated on the thin stretched ACM. In both ACM types, the fluid porosity in both directions did not decrease with increased test duration (3 h); in addition, the abluminal surface was more porous to fluid than the luminal surface. On the other hand, when comparing thin to thick ACM, the porosity in either direction was higher in the thick ACM. Macromolecule porosity, as measured by absorbance, was higher for the abluminal side of the thick ACM than for the luminal side, but this characteristic was reversed in the thin ACM. Comparing thin to thick ACM, the luminal side in the thin ACM was more porous to dextran than in the thick ACM, but this characteristic was reversed for the abluminal side. The porcine bladder ACM possesses directional porosity, and acellularizing stretched urinary bladders may increase structural density and alter fluid and macromolecule porosity. Copyright 2003 Wiley Periodicals, Inc. J Biomed Mater Res 67A: 970-974, 2003

  12. Distribution of the ACME-arcA gene among meticillin-resistant Staphylococcus haemolyticus and identification of a novel ccr allotype in ACME-arcA-positive isolates.

    Science.gov (United States)

    Pi, Borui; Yu, Meihong; Chen, Yagang; Yu, Yunsong; Li, Lanjuan

    2009-06-01

    The aim of this study was to investigate the prevalence and characteristics of ACME (arginine catabolic mobile element)-arcA-positive isolates among meticillin-resistant Staphylococcus haemolyticus (MRSH). ACME-arcA, native arcA and SCCmec elements were detected by PCR. Susceptibilities to 10 antimicrobial agents were compared between ACME-arcA-positive and -negative isolates by the chi-square test. PFGE was used to investigate the clonal relatedness of ACME-arcA-positive isolates. The phylogenetic relationships of ACME-arcA and native arcA were analysed using the neighbour-joining method of the MEGA software. A total of 42 (47.7 %) of 88 isolates, distributed in 13 PFGE types, were positive for the ACME-arcA gene. There were no significant differences in antimicrobial susceptibility between ACME-arcA-positive and -negative isolates. A novel ccr allotype (ccrAB(SHP)) was identified in ACME-arcA-positive isolates. Among the 42 ACME-arcA-positive isolates: 8 isolates harboured SCCmec V; 8 isolates harboured a class C1 mec complex and ccrAB(SHP); 22 isolates harbouring a class C1 mec complex and 4 isolates harbouring a class C2 mec complex were negative for all known ccr allotypes. ACME-arcA-positive isolates were found in MRSH for the first time, with high prevalence and clonal diversity, which suggests mobility of ACME within MRSH. The results from this study reveal that MRSH is likely to be one of the potential reservoirs of ACME for Staphylococcus aureus.
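    The abstract above compares antimicrobial susceptibility between ACME-arcA-positive and -negative isolates with a chi-square test. As a minimal illustration of that kind of comparison (not the authors' code, and with purely hypothetical counts), a 2x2 contingency table can be tested as follows in Python:

    from scipy.stats import chi2_contingency

    # Rows: ACME-arcA-positive vs. -negative isolates
    # Columns: resistant vs. susceptible to one antimicrobial agent
    # The counts are hypothetical placeholders, not data from the study.
    table = [[30, 12],
             [28, 18]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
    # A p-value above the chosen significance level would indicate no
    # significant difference in susceptibility between the two groups.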

  13. Transient Inverse Calibration of the Site-Wide Groundwater Flow Model (ACM-2): FY03 Progress Report

    International Nuclear Information System (INIS)

    Vermeul, Vince R.; Bergeron, Marcel P.; Cole, C R.; Murray, Christopher J.; Nichols, William E.; Scheibe, Timothy D.; Thorne, Paul D.; Waichler, Scott R.; Xie, YuLong

    2003-01-01

    this task is to assess uncertainty based on the most current model (ACM-2), this preliminary work provided an effective basis for developing the approach and implementation methodology. A strategy was developed to facilitate inverse calibration analysis of the large number of stochastic ACMs generated. These stochastic ACMs are random selections from a range of possible model structures, all of which are consistent with available observations. However, a single inverse run requires many forward flow model runs, and full inverse analysis of the large number of combinations of stochastic alternative models is not now computationally feasible. Thus, a two-part approach was developed: (1) full inverse modeling of selected realizations combined with limited forward modeling and (2) implementation of the UCODE/CFEST inverse modeling framework to enhance computational capabilities

  14. Additive Construction with Mobile Emplacement (ACME)

    Science.gov (United States)

    Vickers, John

    2015-01-01

    The Additive Construction with Mobile Emplacement (ACME) project is developing technology to build structures on planetary surfaces using in-situ resources. The project focuses on the construction of both 2D (landing pads, roads, and structure foundations) and 3D (habitats, garages, radiation shelters, and other structures) infrastructure needs for planetary surface missions. The ACME project seeks to raise the Technology Readiness Level (TRL) of two components needed for planetary surface habitation and exploration: 3D additive construction (e.g., contour crafting), and excavation and handling technologies (to effectively and continuously produce in-situ feedstock). Additionally, the ACME project supports the research and development of new materials for planetary surface construction, with the goal of reducing the amount of material to be launched from Earth.

  15. CLIC-ACM: Acquisition and Control System

    CERN Document Server

    Bielawski, B; Magnoni, S

    2014-01-01

    CLIC [1] (Compact Linear Collider) is a world-wide collaboration to study the next terascale lepton collider, relying upon a very innovative concept of two-beam acceleration. In this scheme, the power is transported to the main accelerating structures by a primary electron beam. The Two Beam Module (TBM) is a compact integration with a high filling factor of all components: RF, Magnets, Instrumentation, Vacuum, Alignment and Stabilization. This paper describes the very challenging aspects of designing the compact system to serve as a dedicated Acquisition & Control Module (ACM) for all signals of the TBM. Very delicate conditions must be considered, in particular radiation doses that could reach several kGy in the tunnel. In such severe conditions shielding and hardened electronics will have to be taken into consideration. In addition, with more than 300 ADC&DAC channels per ACM and about 21000 ACMs in total, it appears clearly that power consumption will be an important issue. It is also obvious that...

  16. Automated software system for checking the structure and format of ACM SIG documents

    Science.gov (United States)

    Mirza, Arsalan Rahman; Sah, Melike

    2017-04-01

    Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents are automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents for representing the structure and format of these documents by using OWL (Web Ontology Language). Then, the metadata is extracted automatically in RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether the documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation and user study evaluations.
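    Since a .docx file is a ZIP archive whose main part (word/document.xml) is OOXML, structural metadata of the kind ADFCS mines can be read with standard tooling. The following is a minimal sketch of that general OOXML approach, not the ADFCS implementation itself; the file name is hypothetical:

    import zipfile
    import xml.etree.ElementTree as ET

    # Clark-notation namespace for WordprocessingML (the OOXML main document part)
    W_NS = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

    def paragraph_styles(docx_path):
        """Return the paragraph style id of every paragraph in the document body."""
        with zipfile.ZipFile(docx_path) as zf:
            root = ET.fromstring(zf.read("word/document.xml"))
        styles = []
        for para in root.iter(W_NS + "p"):
            style = para.find(f"{W_NS}pPr/{W_NS}pStyle")
            styles.append(style.get(W_NS + "val") if style is not None else "Normal")
        return styles

    # Usage (hypothetical file): a format checker could then test, for example,
    # whether the first paragraph uses the expected "Title" style.
    # print(paragraph_styles("acm_sig_paper.docx"))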

  17. ACME: A scalable parallel system for extracting frequent patterns from a very long sequence

    KAUST Repository

    Sahli, Majed

    2014-10-02

    Modern applications, including bioinformatics, time series, and web log analysis, require the extraction of frequent patterns, called motifs, from one very long (i.e., several gigabytes) sequence. Existing approaches are either heuristics that are error-prone, or exact (also called combinatorial) methods that are extremely slow, therefore, applicable only to very small sequences (i.e., in the order of megabytes). This paper presents ACME, a combinatorial approach that scales to gigabyte-long sequences and is the first to support supermaximal motifs. ACME is a versatile parallel system that can be deployed on desktop multi-core systems, or on thousands of CPUs in the cloud. However, merely using more compute nodes does not guarantee efficiency, because of the related overheads. To this end, ACME introduces an automatic tuning mechanism that suggests the appropriate number of CPUs to utilize, in order to meet the user constraints in terms of run time, while minimizing the financial cost of cloud resources. Our experiments show that, compared to the state of the art, ACME supports three orders of magnitude longer sequences (e.g., DNA for the entire human genome); handles large alphabets (e.g., English alphabet for Wikipedia); scales out to 16,384 CPUs on a supercomputer; and supports elastic deployment in the cloud.
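    The automatic tuning mechanism described above suggests how many CPUs to use so that a run-time constraint is met while the cloud cost stays low. A toy sketch of that kind of decision is shown below; the fixed-efficiency scaling model and all numbers are simplifying assumptions for illustration, not ACME's actual tuner:

    def suggest_cpus(total_work_cpu_hours, deadline_hours,
                     parallel_efficiency=0.8, max_cpus=16384):
        """Smallest CPU count meeting the deadline under an assumed scaling model."""
        for cpus in range(1, max_cpus + 1):
            runtime = total_work_cpu_hours / (cpus * parallel_efficiency)
            if runtime <= deadline_hours:
                return cpus  # fewest CPUs that meet the constraint => lowest rental cost
        return None  # deadline cannot be met within the allowed CPU budget

    # Example: 2000 CPU-hours of motif extraction to finish within 4 hours
    # print(suggest_cpus(2000, 4))  # -> 625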

  18. ACME: A scalable parallel system for extracting frequent patterns from a very long sequence

    KAUST Repository

    Sahli, Majed; Mansour, Essam; Kalnis, Panos

    2014-01-01

    -long sequences and is the first to support supermaximal motifs. ACME is a versatile parallel system that can be deployed on desktop multi-core systems, or on thousands of CPUs in the cloud. However, merely using more compute nodes does not guarantee efficiency

  19. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    Science.gov (United States)

    Smith, B.

    2015-12-01

    In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME), with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley Leadership Computing Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify that a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command line scripts and programs
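    As a minimal illustration of the kind of model-vs-observation diagnostic such a framework generates (plain NumPy here, not the UVCMetrics API; the array shapes and the field being compared are assumptions for the example), an area-weighted bias and RMSE can be computed as follows:

    import numpy as np

    def bias_and_rmse(model, obs, lats):
        """Area-weighted mean bias and RMSE between model and observed 2-D (lat, lon) fields."""
        weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(model)
        diff = model - obs
        bias = np.average(diff, weights=weights)
        rmse = np.sqrt(np.average(diff ** 2, weights=weights))
        return bias, rmse

    # Usage with synthetic fields (illustrative only):
    # lats = np.linspace(-90, 90, 96)
    # model = np.random.rand(96, 144); obs = np.random.rand(96, 144)
    # print(bias_and_rmse(model, obs, lats))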

  20. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method, which showed good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel was proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that the proposed ACM system achieves increasingly significant coding gains together with higher throughput.
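    The core of any ACM controller is picking, from the supported code rates, the highest one the current channel can sustain. The sketch below illustrates that selection for the 2/3 to 5/6 rate range mentioned above; the intermediate rates and the SNR thresholds are placeholder assumptions, not results from the paper:

    from fractions import Fraction

    # (code rate, assumed minimum SNR in dB for the target error rate)
    RATE_TABLE = [
        (Fraction(5, 6), 6.0),
        (Fraction(4, 5), 5.0),
        (Fraction(3, 4), 4.0),
        (Fraction(2, 3), 3.0),
    ]

    def select_rate(snr_db):
        """Return the highest supported code rate whose SNR threshold is met."""
        for rate, threshold in RATE_TABLE:
            if snr_db >= threshold:
                return rate
        return None  # channel too poor: request more redundancy / retransmission

    # print(select_rate(4.4))  # -> Fraction(3, 4)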

  1. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    Science.gov (United States)

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at ACM SIGCHI 2013, the Conference on Human Factors in Computing Systems (also known as CHI), held April 27 to May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    Science.gov (United States)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  3. Analysis on working pressure selection of ACME integral test facility

    International Nuclear Information System (INIS)

    Chen Lian; Chang Huajian; Li Yuquan; Ye Zishen; Qin Benke

    2011-01-01

    An integral effects test facility, the advanced core cooling mechanism experiment (ACME) facility, was designed to verify the performance of the passive safety system and to validate the safety analysis codes of a pressurized water reactor power plant. Three test facilities for the AP1000 design were introduced and reviewed, and the problems resulting from the different working pressures of these test facilities were analyzed. A detailed description was then presented of the working pressure selection for the ACME facility and of its characteristics, and the approach to establishing the desired initial test conditions was discussed. The selected working pressure of 9.3 MPa covers almost all important passive safety system conditions and enables the ACME to simulate LOCAs with the same pressure and property similitude as the prototype. The ACME design is expected to be an advanced core cooling integral test facility design. (authors)

  4. Characterization of a Novel Arginine Catabolic Mobile Element (ACME) and Staphylococcal Chromosomal Cassette mec Composite Island with Significant Homology to Staphylococcus epidermidis ACME type II in Methicillin-Resistant Staphylococcus aureus Genotype ST22-MRSA-IV.

    LENUS (Irish Health Repository)

    Shore, Anna C

    2011-02-22

    The arginine catabolic mobile element (ACME) is prevalent among ST8-MRSA-IVa (USA300) isolates and evidence suggests that ACME enhances the ability of ST8-MRSA-IVa to grow and survive on its host. ACME has been identified in a small number of isolates belonging to other MRSA clones but is widespread among coagulase-negative staphylococci (CoNS). This study reports the first description of ACME in two distinct strains of the pandemic ST22-MRSA-IV clone. A total of 238 MRSA isolates recovered in Ireland between 1971 and 2008 were investigated for ACME using a DNA microarray. Twenty-three isolates (9.7%) were ACME-positive, all were either MRSA genotype ST8-MRSA-IVa (7/23, 30%) or ST22-MRSA-IV (16/23, 70%). Whole-genome sequencing and comprehensive molecular characterization revealed the presence of a novel 46-kb ACME and SCCmec composite island (ACME/SCCmec-CI) in ST22-MRSA-IVh isolates (n = 15). This ACME/SCCmec-CI consists of a 12-kb DNA region previously identified in ACME type II in S. epidermidis ATCC 12228, a truncated copy of the J1 region of SCCmec I and a complete SCCmec IVh element. The composite island has a novel genetic organization with ACME located within orfX and SCCmec located downstream of ACME. One pvl-positive ST22-MRSA-IVa isolate carried ACME located downstream of SCCmec IVa as previously described in ST8-MRSA-IVa. These results suggest that ACME has been acquired by ST22-MRSA-IV on two independent occasions. At least one of these instances may have involved horizontal transfer and recombination events between MRSA and CoNS. The presence of ACME may enhance dissemination of ST22-MRSA-IV, an already successful MRSA clone.

  5. Contribution of the collagen adhesin Acm to pathogenesis of Enterococcus faecium in experimental endocarditis.

    Science.gov (United States)

    Nallapareddy, Sreedhar R; Singh, Kavindra V; Murray, Barbara E

    2008-09-01

    Enterococcus faecium is a multidrug-resistant opportunist causing difficult-to-treat nosocomial infections, including endocarditis, but there are no reports experimentally demonstrating E. faecium virulence determinants. Our previous studies showed that some clinical E. faecium isolates produce a cell wall-anchored collagen adhesin, Acm, and that an isogenic acm deletion mutant of the endocarditis-derived strain TX0082 lost collagen adherence. In this study, we show with a rat endocarditis model that TX0082 Deltaacm::cat is highly attenuated versus wild-type TX0082, both in established (72 h) vegetations (P Acm the first factor shown to be important for E. faecium pathogenesis. In contrast, no mortality differences were observed in a mouse peritonitis model. While 5 of 17 endocarditis isolates were Acm nonproducers and failed to adhere to collagen in vitro, all had an intact, highly conserved acm locus. Highly reduced acm mRNA levels (>or=50-fold reduction relative to an Acm producer) were found in three of these five nonadherent isolates, including the sequenced strain TX0016, by quantitative reverse transcription-PCR, indicating that acm transcription is downregulated in vitro in these isolates. However, examination of TX0016 cells obtained directly from infected rat vegetations by flow cytometry showed that Acm was present on 40% of cells grown during infection. Finally, we demonstrated a significant reduction in E. faecium collagen adherence by affinity-purified anti-Acm antibodies from E. faecium endocarditis patient sera, suggesting that Acm may be a potential immunotarget for strategies to control this emerging pathogen.

  6. Asbestos-Containing Materials (ACM) and Demolition

    Science.gov (United States)

    There are specific federal regulatory requirements that require the identification of asbestos-containing materials (ACM) in many of the residential buildings that are being demolished or renovated by a municipality.

  7. Proceedings of the 2nd ACM SIGSPATIAL International Workshop on Indoor Spatial Awareness

    DEFF Research Database (Denmark)

    These proceedings contain the papers selected for presentation at the Second International Workshop on Indoor Spatial Awareness, hosted by ACM SIGSPATIAL and held in conjunction with the18th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems (ACM SIGSPATIAL GIS...

  8. De afschaffing van de bezwaarfase bij boetebesluiten van de ACM

    NARCIS (Netherlands)

    Jans, J.H.; Outhuijse, A.

    As of 1 March 2013, the Autoriteit Consument en Markt (ACM) came into being through the merger of the NMa, OPTA and the Consumentenautoriteit. To allow the ACM to operate decisively, it is proposed to simplify the enforcement instruments relating to the ACM's market supervision. One of the proposals

  9. ACME: A Basis for Architecture Exchange

    National Research Council Canada - National Science Library

    Wile, David

    2003-01-01

    .... It remains useful in that role, but since the project's inception the Acme language and its support toolkit have grown into a solid foundation upon which new software architecture design and analysis...

  10. Characterization of a novel arginine catabolic mobile element (ACME) and staphylococcal chromosomal cassette mec composite island with significant homology to Staphylococcus epidermidis ACME type II in methicillin-resistant Staphylococcus aureus genotype ST22-MRSA-IV.

    LENUS (Irish Health Repository)

    Shore, Anna C

    2011-05-01

    The arginine catabolic mobile element (ACME) is prevalent among methicillin-resistant Staphylococcus aureus (MRSA) isolates of sequence type 8 (ST8) and staphylococcal chromosomal cassette mec (SCCmec) type IVa (USA300) (ST8-MRSA-IVa isolates), and evidence suggests that ACME enhances the ability of ST8-MRSA-IVa to grow and survive on its host. ACME has been identified in a small number of isolates belonging to other MRSA clones but is widespread among coagulase-negative staphylococci (CoNS). This study reports the first description of ACME in two distinct strains of the pandemic ST22-MRSA-IV clone. A total of 238 MRSA isolates recovered in Ireland between 1971 and 2008 were investigated for ACME using a DNA microarray. Twenty-three isolates (9.7%) were ACME positive, and all were either MRSA genotype ST8-MRSA-IVa (7/23, 30%) or MRSA genotype ST22-MRSA-IV (16/23, 70%). Whole-genome sequencing and comprehensive molecular characterization revealed the presence of a novel 46-kb ACME and staphylococcal chromosomal cassette mec (SCCmec) composite island (ACME/SCCmec-CI) in ST22-MRSA-IVh isolates (n=15). This ACME/SCCmec-CI consists of a 12-kb DNA region previously identified in ACME type II in S. epidermidis ATCC 12228, a truncated copy of the J1 region of SCCmec type I, and a complete SCCmec type IVh element. The composite island has a novel genetic organization, with ACME located within orfX and SCCmec located downstream of ACME. One PVL locus-positive ST22-MRSA-IVa isolate carried ACME located downstream of SCCmec type IVa, as previously described in ST8-MRSA-IVa. These results suggest that ACME has been acquired by ST22-MRSA-IV on two independent occasions. At least one of these instances may have involved horizontal transfer and recombination events between MRSA and CoNS. The presence of ACME may enhance dissemination of ST22-MRSA-IV, an already successful MRSA clone.

  11. Highlights from ACM SIGSPATIAL GIS 2011: the 19th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems: (Chicago, Illinois - November 1 - 4, 2011)

    DEFF Research Database (Denmark)

    Jensen, Christian S.; Ofek, Eyal; Tanin, Egemen

    2012-01-01

    ACM SIGSPATIAL GIS 2011 was the 19th gathering of the premier event on spatial information and Geographic Information Systems (GIS). It is also the fourth year that the conference was held under the auspices of ACM's most recent special interest group, SIGSPATIAL. Since its start in 1993, the con...

  12. Proceedings of the 2014 ACM international conference on Interactive experiences for TV and online video

    NARCIS (Netherlands)

    P. Olivier (Patrick); P. Wright; T. Bartindale; M. Obrist (Marianna); P.S. Cesar Garcia (Pablo Santiago); S. Basapur

    2014-01-01

    It is our great pleasure to introduce the 2014 ACM International Conference on Interactive Experiences for Television and Online Video -- ACM TVX 2014. ACM TVX is a leading annual conference that brings together international researchers and practitioners from a wide range of

  13. Inhibition of Enterococcus faecium adherence to collagen by antibodies against high-affinity binding subdomains of Acm.

    Science.gov (United States)

    Nallapareddy, Sreedhar R; Sillanpää, Jouko; Ganesh, Vannakambadi K; Höök, Magnus; Murray, Barbara E

    2007-06-01

    Strains of Enterococcus faecium express a cell wall-anchored protein, Acm, which mediates adherence to collagen. Here, we (i) identify the minimal and high-affinity binding subsegments of Acm and (ii) show that anti-Acm immunoglobulin Gs (IgGs) purified against these subsegments reduced E. faecium TX2535 strain collagen adherence up to 73 and 50%, respectively, significantly more than the total IgGs against the full-length Acm A domain (28%) (P Acm adherence with functional subsegment-specific antibodies raises the possibility of their use as therapeutic or prophylactic agents.

  14. Cloud Computing at the Tactical Edge

    Science.gov (United States)

    2012-10-01

    Web Services and Cloud Computing to Create Next-Generation Mobile Applications,” 627-634. Proceedings of the 24th ACM SIGPLAN Conference Companion on...the 1st Workshop on Mobile Middleware. Leuven, Belgium, December 2008. ACM, 2008. [Kumar 2010] Kumar, K. & Lu, Y. “ Cloud Computing for Mobile Users

  15. 76 FR 64943 - Proposed Cercla Administrative Cost Recovery Settlement; ACM Smelter and Refinery Site, Located...

    Science.gov (United States)

    2011-10-19

    ... Settlement; ACM Smelter and Refinery Site, Located in Cascade County, MT AGENCY: Environmental Protection... projected future response costs concerning the ACM Smelter and Refinery NPL Site (Site), Operable Unit 1..., Helena, MT 59626. Mr. Sturn can be reached at (406) 457-5027. Comments should reference the ACM Smelter...

  16. Study on the percent of frequency of ACME-Arca in clinical isolates ...

    African Journals Online (AJOL)

    ACME is an arginine catabolic mobile element of Staphylococcus epidermidis that encodes specific virulence factors. The purpose of this study was to examine the specific features and prevalence of ACME-arcA in methicillin-resistant Staphylococcus epidermidis isolates obtained from clinical samples in Isfahan.

  17. On-resin conversion of Cys(Acm)-containing peptides to their corresponding Cys(Scm) congeners.

    Science.gov (United States)

    Mullen, Daniel G; Weigel, Benjamin; Barany, George; Distefano, Mark D

    2010-05-01

    The Acm protecting group for the thiol functionality of cysteine is removed under conditions (Hg(2+)) that are orthogonal to the acidic milieu used for global deprotection in Fmoc-based solid-phase peptide synthesis. This use of a toxic heavy metal for deprotection has limited the usefulness of Acm in peptide synthesis. The Acm group may be converted to the Scm derivative that can then be used as a reactive intermediate for unsymmetrical disulfide formation. It may also be removed by mild reductive conditions to generate unprotected cysteine. Conversion of Cys(Acm)-containing peptides to their corresponding Cys(Scm) derivatives in solution is often problematic because the sulfenyl chloride reagent used for this conversion may react with the sensitive amino acids tyrosine and tryptophan. In this protocol, we report a method for on-resin Acm to Scm conversion that allows the preparation of Cys(Scm)-containing peptides under conditions that do not modify other amino acids. (c) 2010 European Peptide Society and John Wiley & Sons, Ltd.

  18. CNTF-ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through upregulating L-type calcium channel activity.

    Science.gov (United States)

    Sun, Meiqun; Liu, Hongli; Xu, Huanbai; Wang, Hongtao; Wang, Xiaojing

    2016-09-01

    A specialized culture medium termed ciliary neurotrophic factor-treated astrocyte-conditioned medium (CNTF-ACM) allows investigators to assess the peripheral effects of CNTF-induced activated astrocytes upon cultured neurons. CNTF-ACM has been shown to upregulate neuronal L-type calcium channel current activity, which has been previously linked to changes in mitochondrial respiration and oxidative stress. Therefore, the aim of this study was to evaluate CNTF-ACM's effects upon mitochondrial respiration and oxidative stress in rat cortical neurons. Cortical neurons, CNTF-ACM, and untreated control astrocyte-conditioned medium (UC-ACM) were prepared from neonatal Sprague-Dawley rat cortical tissue. Neurons were cultured in either CNTF-ACM or UC-ACM for a 48-h period. Changes in the following parameters before and after treatment with the L-type calcium channel blocker isradipine were assessed: (i) intracellular calcium levels, (ii) mitochondrial membrane potential (ΔΨm), (iii) oxygen consumption rate (OCR) and adenosine triphosphate (ATP) formation, (iv) intracellular nitric oxide (NO) levels, (v) mitochondrial reactive oxygen species (ROS) production, and (vi) susceptibility to the mitochondrial complex I toxin rotenone. CNTF-ACM neurons displayed the following significant changes relative to UC-ACM neurons: (i) increased intracellular calcium levels (p ACM (p ACM promotes mitochondrial respiration and oxidative stress in cortical neurons through elevating L-type calcium channel activity.

  19. The acellular matrix (ACM) for bladder tissue engineering: A quantitative magnetic resonance imaging study.

    Science.gov (United States)

    Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A

    2010-08-01

    Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue-engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.

  20. VANET '13: Proceeding of the Tenth ACM International Workshop on Vehicular Inter-networking, Systems, and Applications

    NARCIS (Netherlands)

    Gozalvez, J.; Kargl, Frank; Mittag, J.; Kravets, R.; Tsai, M.; Unknown, [Unknown

    This year marks a very important date for the ACM international workshop on Vehicular inter-networking, systems, and applications, as ACM VANET now celebrates its 10th edition. Starting in 2004 as the "ACM international workshop on Vehicular ad hoc networks", already the change in title indicates that

  1. Clinical isolates of Enterococcus faecium exhibit strain-specific collagen binding mediated by Acm, a new member of the MSCRAMM family.

    Science.gov (United States)

    Nallapareddy, Sreedhar R; Weinstock, George M; Murray, Barbara E

    2003-03-01

    A collagen-binding adhesin of Enterococcus faecium, Acm, was identified. Acm shows 62% similarity to the Staphylococcus aureus collagen adhesin Cna over the entire protein and is more similar to Cna (60% and 75% similarity with Cna A and B domains respectively) than to the Enterococcus faecalis collagen-binding adhesin, Ace, which shares homology with Acm only in the A domain. Despite the detection of acm in 32 out of 32 E. faecium isolates, only 11 of these (all clinical isolates, including four vancomycin-resistant endocarditis isolates and seven other isolates) exhibited binding to collagen type I (CI). Although acm from three CI-binding vancomycin-resistant E. faecium clinical isolates showed 100% identity, analysis of acm genes and their promoter regions from six non-CI-binding strains identified deletions or mutations that introduced stop codons and/or IS elements within the gene or the promoter region in five out of six strains, suggesting that the presence of an intact functional acm gene is necessary for binding of E. faecium strains to CI. Recombinant Acm A domain showed specific and concentration-dependent binding to collagen, and this protein competed with E. faecium binding to immobilized CI. Consistent with the adherence phenotype and sequence data, probing with Acm-specific IgGs purified from anti-recombinant Acm A polyclonal rabbit serum confirmed the surface expression of Acm in three out of three collagen-binding clinical isolates of E. faecium tested, but in none of the strains with a non-functional pseudo acm gene. Introduction of a functional acm gene into two non-CI-binding natural acm mutant strains conferred a CI-binding phenotype, further confirming that native Acm is sufficient for the binding of E. faecium to CI. These results demonstrate that acm, which encodes a potential virulence factor, is functional only in certain infection-derived clinical isolates of E. faecium, and suggest that Acm is the primary adhesin responsible for the

  2. Formation of personality’s acme-qualities as a component of physical education specialists’ acmeological competence

    Directory of Open Access Journals (Sweden)

    T.Hr. Dereka

    2016-10-01

    Full Text Available Purpose: to determine the characteristics of the formation of acme-qualities in physical education specialists and to determine correlations between their components. Material: students of the “Physical education” specialty (n=194) participated in the research. Special tests were used to assess personality qualities: organizational abilities, communicative abilities, creative potential, need for achievement, level of emotional awareness, control of emotions, and others. Results: we determined the components of the personality acme-competence of physical education specialists. We found the density and direction of correlations and the influence of the acme-qualities on the personality component. Based on the results of factor analysis, we grouped and classified the components into four factors and created a visual picture of them. The accumulated percentage of dispersion of the studied factors was determined. Conclusions: continuous professional training of physical education specialists on acme-principles resulted in the formation of personality acme-qualities. These qualities facilitate the manifestation of the personality’s activity in the process of professional formation and constant self-perfection.

  3. Hypofractionated stereotactic radiotherapy (HFSRT) for who grade I anterior clinoid meningiomas (ACM).

    Science.gov (United States)

    Demiral, Selcuk; Dincoglan, Ferrat; Sager, Omer; Gamsiz, Hakan; Uysal, Bora; Gundem, Esin; Elcim, Yelda; Dirican, Bahar; Beyzadeoglu, Murat

    2016-11-01

    While microsurgical resection plays a central role in the management of ACMs, extensive surgery may be associated with substantial morbidity, particularly for tumors in intimate association with critical structures. In this study, we evaluated the use of HFSRT in the management of ACM. A total of 22 patients with ACM were treated using HFSRT. Frameless image-guided volumetric modulated arc therapy (VMAT) was performed with a 6 MV linear accelerator (LINAC). The total dose was 25 Gy delivered in five fractions over five consecutive treatment days. Local control (LC) and progression-free survival (PFS) rates were calculated using the Kaplan-Meier method. The Common Terminology Criteria for Adverse Events, version 4.0, was used in toxicity grading. Of the total of 22 patients, the outcomes of 19 patients with at least 36 months of periodic follow-up were assessed. Median patient age was 40 years (range, 24-77 years). Median follow-up time was 53 months (range, 36-63 months). LC and PFS rates were 100% and 89.4% at 1 and 3 years, respectively. Only two patients (10.5%) experienced clinical deterioration during the follow-up period. LINAC-based HFSRT offers high rates of LC and PFS for patients with ACMs.

  4. Representation of deforestation impacts on climate, water, and nutrient cycles in the ACME earth system model

    Science.gov (United States)

    Cai, X.; Riley, W. J.; Zhu, Q.

    2017-12-01

    Deforestation causes a series of changes to the climate, water, and nutrient cycles. Employing a state-of-the-art earth system model—ACME (Accelerated Climate Modeling for Energy), we comprehensively investigate the impacts of deforestation on these processes. We first assess the performance of the ACME Land Model (ALM) in simulating runoff, evapotranspiration, albedo, and plant productivity at 42 FLUXNET sites. The single column mode of ACME is then used to examine climate effects (temperature cooling/warming) and responses of runoff, evapotranspiration, and nutrient fluxes to deforestation. This approach separates local effects of deforestation from global circulation effects. To better understand the deforestation effects in a global context, we use the coupled (atmosphere, land, and slab ocean) mode of ACME to demonstrate the impacts of deforestation on global climate, water, and nutrient fluxes. Preliminary results showed that the land component of ACME has advantages in simulating these processes and that local deforestation has potentially large impacts on runoff and atmospheric processes.

  5. Autolysis of Lactococcus lactis caused by induced overproduction of its major autolysin, AcmA

    NARCIS (Netherlands)

    Buist, Girbe; Karsens, H; Nauta, A; van Sinderen, D; Venema, G; Kok, J

    The optical density of a culture of Lactococcus lactis MG1363 was reduced by more than 60% during prolonged stationary phase. This reduction in optical density (autolysis) was almost absent in a culture of an isogenic mutant containing a deletion in the major autolysin gene, acmA. An acmA mutant carrying

  6. Timely activation of budding yeast APCCdh1 involves degradation of its inhibitor, Acm1, by an unconventional proteolytic mechanism.

    Directory of Open Access Journals (Sweden)

    Michael Melesse

    Full Text Available Regulated proteolysis mediated by the ubiquitin proteasome system is a fundamental and essential feature of the eukaryotic cell division cycle. Most proteins with cell cycle-regulated stability are targeted for degradation by one of two related ubiquitin ligases, the Skp1-cullin-F box protein (SCF) complex or the anaphase-promoting complex (APC). Here we describe an unconventional cell cycle-regulated proteolytic mechanism that acts on the Acm1 protein, an inhibitor of the APC activator Cdh1 in budding yeast. Although Acm1 can be recognized as a substrate by the Cdc20-activated APC (APCCdc20) in anaphase, APCCdc20 is neither necessary nor sufficient for complete Acm1 degradation at the end of mitosis. An APC-independent, but 26S proteasome-dependent, mechanism is sufficient for complete Acm1 clearance from late mitotic and G1 cells. Surprisingly, this mechanism appears distinct from the canonical ubiquitin targeting pathway, exhibiting several features of ubiquitin-independent proteasomal degradation. For example, Acm1 degradation in G1 requires neither lysine residues in Acm1 nor assembly of polyubiquitin chains. Acm1 was stabilized though by conditional inactivation of the ubiquitin activating enzyme Uba1, implying some requirement for the ubiquitin pathway, either direct or indirect. We identified an amino terminal predicted disordered region in Acm1 that contributes to its proteolysis in G1. Although ubiquitin-independent proteasome substrates have been described, Acm1 appears unique in that its sensitivity to this mechanism is strictly cell cycle-regulated via cyclin-dependent kinase (Cdk) phosphorylation. As a result, Acm1 expression is limited to the cell cycle window in which Cdk is active. We provide evidence that failure to eliminate Acm1 impairs activation of APCCdh1 at mitotic exit, justifying its strict regulation by cell cycle-dependent transcription and proteolytic mechanisms. Importantly, our results reveal that strict cell

  7. Timely Activation of Budding Yeast APCCdh1 Involves Degradation of Its Inhibitor, Acm1, by an Unconventional Proteolytic Mechanism

    Science.gov (United States)

    Melesse, Michael; Choi, Eunyoung; Hall, Hana; Walsh, Michael J.; Geer, M. Ariel; Hall, Mark C.

    2014-01-01

    Regulated proteolysis mediated by the ubiquitin proteasome system is a fundamental and essential feature of the eukaryotic cell division cycle. Most proteins with cell cycle-regulated stability are targeted for degradation by one of two related ubiquitin ligases, the Skp1-cullin-F box protein (SCF) complex or the anaphase-promoting complex (APC). Here we describe an unconventional cell cycle-regulated proteolytic mechanism that acts on the Acm1 protein, an inhibitor of the APC activator Cdh1 in budding yeast. Although Acm1 can be recognized as a substrate by the Cdc20-activated APC (APCCdc20) in anaphase, APCCdc20 is neither necessary nor sufficient for complete Acm1 degradation at the end of mitosis. An APC-independent, but 26S proteasome-dependent, mechanism is sufficient for complete Acm1 clearance from late mitotic and G1 cells. Surprisingly, this mechanism appears distinct from the canonical ubiquitin targeting pathway, exhibiting several features of ubiquitin-independent proteasomal degradation. For example, Acm1 degradation in G1 requires neither lysine residues in Acm1 nor assembly of polyubiquitin chains. Acm1 was stabilized though by conditional inactivation of the ubiquitin activating enzyme Uba1, implying some requirement for the ubiquitin pathway, either direct or indirect. We identified an amino terminal predicted disordered region in Acm1 that contributes to its proteolysis in G1. Although ubiquitin-independent proteasome substrates have been described, Acm1 appears unique in that its sensitivity to this mechanism is strictly cell cycle-regulated via cyclin-dependent kinase (Cdk) phosphorylation. As a result, Acm1 expression is limited to the cell cycle window in which Cdk is active. We provide evidence that failure to eliminate Acm1 impairs activation of APCCdh1 at mitotic exit, justifying its strict regulation by cell cycle-dependent transcription and proteolytic mechanisms. Importantly, our results reveal that strict cell-cycle expression profiles

  8. Comparative study of numerical schemes of TVD3, UNO3-ACM and optimized compact scheme

    Science.gov (United States)

    Lee, Duck-Joo; Hwang, Chang-Jeon; Ko, Duck-Kon; Kim, Jae-Wook

    1995-01-01

    Three different schemes are employed to solve the benchmark problem. The first one is a conventional TVD-MUSCL (Monotone Upwind Schemes for Conservation Laws) scheme. The second scheme is a UNO3-ACM (Uniformly Non-Oscillatory Artificial Compression Method) scheme. The third scheme is an optimized compact finite difference scheme modified by us: the 4th-order Runge-Kutta time stepping, the 4th-order pentadiagonal compact spatial discretization with the maximum resolution characteristics. The problems of category 1 are solved by using the second (UNO3-ACM) and third (Optimized Compact) schemes. The problems of category 2 are solved by using the first (TVD3) and second (UNO3-ACM) schemes. The problem of category 5 is solved by using the first (TVD3) scheme. It can be concluded from the present calculations that the Optimized Compact scheme and the UNO3-ACM show good resolutions for category 1 and category 2, respectively.
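
    As background for the third scheme above, the following minimal Python sketch illustrates classical 4th-order Runge-Kutta time stepping for a semi-discretized problem du/dt = rhs(u, t). It is an illustration only, with a simple periodic central difference standing in for the pentadiagonal compact spatial operator actually used in the paper.

        import numpy as np

        def rk4_step(rhs, u, t, dt):
            """Advance u by one classical 4th-order Runge-Kutta step for du/dt = rhs(u, t)."""
            k1 = rhs(u, t)
            k2 = rhs(u + 0.5 * dt * k1, t + 0.5 * dt)
            k3 = rhs(u + 0.5 * dt * k2, t + 0.5 * dt)
            k4 = rhs(u + dt * k3, t + dt)
            return u + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

        # Toy example: linear advection u_t + a*u_x = 0 on a periodic grid, with a
        # second-order central difference standing in for the compact operator.
        a, N = 1.0, 200
        dx, dt = 1.0 / N, 0.002
        x = np.arange(N) * dx
        u = np.exp(-200.0 * (x - 0.5) ** 2)            # initial Gaussian pulse
        rhs = lambda v, t: -a * (np.roll(v, -1) - np.roll(v, 1)) / (2.0 * dx)
        for n in range(100):
            u = rk4_step(rhs, u, n * dt, dt)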

  9. An Unexpected Location of the Arginine Catabolic Mobile Element (ACME) in a USA300-Related MRSA Strain

    DEFF Research Database (Denmark)

    Damkjær Bartels, Mette; Hansen, Lars H.; Boye, Kit

    2011-01-01

    In methicillin resistant Staphylococcus aureus (MRSA), the arginine catabolic mobile element (ACME) was initially described in USA300 (t008-ST8) where it is located downstream of the staphylococcal cassette chromosome mec (SCCmec). A common health-care associated MRSA in Copenhagen, Denmark (t024......-ST8) is clonally related to USA300 and is frequently PCR positive for the ACME specific arcA-gene. This study is the first to describe an ACME element upstream of the SCCmec in MRSA. By traditional SCCmec typing schemes, the SCCmec of t024-ST8 strain M1 carries SCCmec IVa, but full sequencing...... of SCCmec, M1 had two new DR between the orfX gene and the J3 region of the SCCmec. The region between the orfX DR (DR1) and DR2 contained the ccrAB4 genes. An ACME II-like element was located between DR2 and DR3. The entire 26,468 bp sequence between DR1 and DR3 was highly similar to parts of the ACME...

  10. A SURVEY ON UBIQUITOUS COMPUTING

    Directory of Open Access Journals (Sweden)

    Vishal Meshram

    2016-01-01

    Full Text Available This work presents a survey of ubiquitous computing research, which is the emerging domain that integrates communication technologies into day-to-day life activities. This research paper provides a classification of the research areas on the ubiquitous computing paradigm. In this paper, we present common architecture principles of ubiquitous systems and analyze important aspects in context-aware ubiquitous systems. In addition, this research work presents a novel architecture of a ubiquitous computing system and a survey of sensors needed for applications in ubiquitous computing. The goals of this research work are three-fold: i) serve as a guideline for researchers who are new to ubiquitous computing and want to contribute to this research area, ii) provide a novel system architecture for ubiquitous computing systems, and iii) provide further research directions required for quality-of-service assurance of ubiquitous computing.

  11. Twenty Years of Creativity Research in Human-Computer Interaction: Current State and Future Directions

    DEFF Research Database (Denmark)

    Frich Pedersen, Jonas; Biskjaer, Michael Mose; Dalsgaard, Peter

    2018-01-01

    Creativity has been a growing topic within the ACM community since the 1990s. However, no clear overview of this trend has been offered. We present a thorough survey of 998 creativity-related publications in the ACM Digital Library collected using keyword search to determine prevailing approaches......, topics, and characteristics of creativity-oriented Human-Computer Interaction (HCI) research. A selected sample based on yearly citations yielded 221 publications, which were analyzed using constant comparison analysis. We found that HCI is almost exclusively responsible for creativity......-oriented publications; they focus on collaborative creativity rather than individual creativity; there is a general lack of definition of the term ‘creativity’; empirically based contributions are prevalent; and many publications focus on new tools, often developed by researchers. On this basis, we present three...

  12. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  13. Computational Sociolinguistics: A Survey

    NARCIS (Netherlands)

    Nguyen, Dong-Phuong; Doğruöz, A. Seza; Rosé, Carolyn P.; de Jong, Franciska M.G.

    Language is a social phenomenon and variation is inherent to its social nature. Recently, there has been a surge of interest within the computational linguistics (CL) community in the social dimension of language. In this article we present a survey of the emerging field of “computational

  14. Computational Sociolinguistics: A Survey.

    NARCIS (Netherlands)

    de Jong, F.M.G.; Nguyen, Dong

    2016-01-01

    Language is a social phenomenon and variation is inherent to its social nature. Recently, there has been a surge of interest within the computational linguistics (CL) community in the social dimension of language. In this article we present a survey of the emerging field of “computational

  15. Proceedings of the ACM SIGIR Workshop ''Searching Spontaneous Conversational Speech''

    NARCIS (Netherlands)

    de Jong, Franciska M.G.; Oard, Douglas; Ordelman, Roeland J.F.; Raaijmakers, Stephan

    2007-01-01

    The Proceedings contain the contributions to the workshop on Searching Spontaneous Conversational Speech organized in conjunction with the 30th ACM SIGIR, Amsterdam 2007. The papers reflect some of the emerging focus areas and cross-cutting research topics, together addressing evaluation metrics,

  16. Molecular characteristics of clinical methicillin-resistant Staphylococcus pseudintermedius harboring arginine catabolic mobile element (ACME) from dogs and cats.

    Science.gov (United States)

    Yang, Ching; Wan, Min-Tao; Lauderdale, Tsai-Ling; Yeh, Kuang-Sheng; Chen, Charles; Hsiao, Yun-Hsia; Chou, Chin-Cheng

    2017-06-01

    This study aimed to investigate the presence of arginine catabolic mobile element (ACME) and its associated molecular characteristics in methicillin-resistant Staphylococcus pseudintermedius (MRSP). Among the 72 S. pseudintermedius recovered from various infection sites of dogs and cats, 52 (72.2%) were MRSP. ACME-arcA was detected commonly (69.2%) in these MRSP isolates, and was more frequently detected in those from the skin than from other body sites (P=0.047). There was a wide genetic diversity among the ACME-arcA-positive MRSP isolates, which comprised three SCCmec types (II-III, III and V) and 15 dru types with two predominant clusters (9a and 11a). Most MRSP isolates were multidrug-resistant. Since S. pseudintermedius could serve as a reservoir of ACME, further research on this putative virulence factor is recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. AcmB Is an S-Layer-Associated β-N-Acetylglucosaminidase and Functional Autolysin in Lactobacillus acidophilus NCFM.

    Science.gov (United States)

    Johnson, Brant R; Klaenhammer, Todd R

    2016-09-15

    Autolysins, also known as peptidoglycan hydrolases, are enzymes that hydrolyze specific bonds within bacterial cell wall peptidoglycan during cell division and daughter cell separation. Within the genome of Lactobacillus acidophilus NCFM, there are 11 genes encoding proteins with peptidoglycan hydrolase catalytic domains, 9 of which are predicted to be functional. Notably, 5 of the 9 putative autolysins in L. acidophilus NCFM are S-layer-associated proteins (SLAPs) noncovalently colocalized along with the surface (S)-layer at the cell surface. One of these SLAPs, AcmB, a β-N-acetylglucosaminidase encoded by the gene lba0176 (acmB), was selected for functional analysis. In silico analysis revealed that acmB orthologs are found exclusively in S-layer-forming species of Lactobacillus. Chromosomal deletion of acmB resulted in aberrant cell division, autolysis, and autoaggregation. Complementation of acmB in the ΔacmB mutant restored the wild-type phenotype, confirming the role of this SLAP in cell division. The absence of AcmB within the exoproteome had a pleiotropic effect on the extracellular proteins covalently and noncovalently bound to the peptidoglycan, which likely led to the observed decrease in the binding capacity of the ΔacmB strain for mucin and extracellular matrices fibronectin, laminin, and collagen in vitro. These data suggest a functional association between the S-layer and the multiple autolysins noncovalently colocalized at the cell surface of L. acidophilus NCFM and other S-layer-producing Lactobacillus species. Lactobacillus acidophilus is one of the most widely used probiotic microbes incorporated in many dairy foods and dietary supplements. This organism produces a surface (S)-layer, which is a self-assembling crystalline array found as the outermost layer of the cell wall. The S-layer, along with colocalized associated proteins, is an important mediator of probiotic activity through intestinal adhesion and modulation of the mucosal immune

  18. A functional collagen adhesin gene, acm, in clinical isolates of Enterococcus faecium correlates with the recent success of this emerging nosocomial pathogen.

    Science.gov (United States)

    Nallapareddy, Sreedhar R; Singh, Kavindra V; Okhuysen, Pablo C; Murray, Barbara E

    2008-09-01

    Enterococcus faecium recently evolved from a generally avirulent commensal into a multidrug-resistant health care-associated pathogen causing difficult-to-treat infections, but little is known about the factors responsible for this change. We previously showed that some E. faecium strains express a cell wall-anchored collagen adhesin, Acm. Here we analyzed 90 E. faecium isolates (99% acm(+)) and found that the Acm protein was detected predominantly in clinically derived isolates, while the acm gene was present as a transposon-interrupted pseudogene in 12 of 47 isolates of nonclinical origin. A highly significant association between clinical (versus fecal or food) origin and collagen adherence (P Acm detected by whole-cell enzyme-linked immunosorbent assay and flow cytometry. Thirty-seven of 41 sera from patients with E. faecium infections showed reactivity with recombinant Acm, while only 4 of 30 community and hospitalized patient control group sera reacted (P Acm were present in all 14 E. faecium endocarditis patient sera. Although pulsed-field gel electrophoresis indicated that multiple strains expressed collagen adherence, multilocus sequence typing demonstrated that the majority of collagen-adhering isolates, as well as 16 of 17 endocarditis isolates, are part of the hospital-associated E. faecium genogroup referred to as clonal complex 17 (CC17), which has emerged globally. Taken together, our findings support the hypothesis that Acm has contributed to the emergence of E. faecium and CC17 in nosocomial infections.

  19. Comparison of the chemical behaviour of humanized ACMS VS. Human IGG radiolabeled with 99mTc

    International Nuclear Information System (INIS)

    Rivero Santamaria, Alejandro; Zayas Crespo, Francisco; Mesa Duennas, Niurka; Castillo Vitloch, Adolfo J.

    2003-01-01

    The purpose of this work is to compare the chemical behaviour of humanized AcMs vs. human IgG radiolabeled with 99mTc. To this end, three immunoglobulins were analyzed: the IgG (human), the humanized monoclonal antibody R3 (Acm-R3h) and the humanized monoclonal antibody T1. The results obtained reveal slight differences as regards the behaviour of these immunoglobulins before the labelling with 99Tc, which shows differences in the chemical behaviour of these proteins. Although in theory the modifications that are made to the AcMs in order to humanize them must not affect their chemical behaviour, the obtained data indicate that the conditions for their radiolabelling should not be extrapolated from other proteins; on the contrary, particular procedures should be elaborated for each AcM-h

  20. Selected Publications in Image Understanding and Computer Vision from 1974 to 1983

    Science.gov (United States)

    1985-04-18

    Germany, September 26-28, 1978), Plenum, New York, 1979. 9. Reconnaissance des Formes et Intelligence Artificielle (2ème Congrès AFCET-IRIA, Toulouse)...the last decade. Abbreviations: AI Artificial Intelligence; BC Biological Cybernetics; CACM Communications of the ACM; CG Computer Graphics; ...Intelligence; PACM Proceedings of the ACM; P-IEEE Proceedings of the IEEE; P-NCC Proceedings of the National Computer Conference; PR Pattern Recognition; PRL

  1. Green Cloud Computing: A Literature Survey

    Directory of Open Access Journals (Sweden)

    Laura-Diana Radu

    2017-11-01

    Full Text Available Cloud computing is a dynamic field of information and communication technologies (ICTs), introducing new challenges for environmental protection. Cloud computing technologies have a variety of application domains, since they offer scalability, are reliable and trustworthy, and offer high performance at relatively low cost. The cloud computing revolution is redesigning modern networking, and offering promising environmental protection prospects as well as economic and technological advantages. These technologies have the potential to improve energy efficiency and to reduce carbon footprints and (e-)waste. These features can transform cloud computing into green cloud computing. In this survey, we review the main achievements of green cloud computing. First, an overview of cloud computing is given. Then, recent studies and developments are summarized, and environmental issues are specifically addressed. Finally, future research directions and open problems regarding green cloud computing are presented. This survey is intended to serve as up-to-date guidance for research with respect to green cloud computing.

  2. HPDC ´12 : proceedings of the 21st ACM symposium on high-performance parallel and distributed computing, June 18-22, 2012, Delft, The Netherlands

    NARCIS (Netherlands)

    Epema, D.H.J.; Kielmann, T.; Ripeanu, M.

    2012-01-01

    Welcome to ACM HPDC 2012! This is the twenty-first year of HPDC and we are pleased to report that our community continues to grow in size, quality and reputation. The program consists of three days packed with presentations on the latest developments in high-performance parallel and distributed

  3. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    . Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project......, documenting all of the project’s four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been...... done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporate ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has...

  4. Listening to professional voices: draft 2 of the ACM code of ethics and professional conduct

    OpenAIRE

    Flick, Catherine; Brinkman, Bo; Gotterbarn, D. W.; Miller, Keith; Vazansky, Kate; Wolf, Marty J.

    2017-01-01

    For the first time since 1992, the ACM Code of Ethics and Professional Conduct (the Code) is being updated. The Code Update Task Force in conjunction with the Committee on Professional Ethics is seeking advice from ACM members on the update. We indicated many of the motivations for changing the Code when we shared Draft 1 of Code 2018 with the ...

  5. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Sabry, Amr

    This issue of HOSC is dedicated to the general topic of continuations. It grew out of the third ACM SIGPLAN Workshop on Continuations (CW'01), which took place in London, UK on January 16, 2001 [3]. The notion of continuation is ubiquitous in many different areas of computer science, including...... and streamline Filinski's earlier work in the previous special issue of HOSC (then LISP and Symbolic Computation) that grew out of the first ACM SIGPLAN Workshop on Continuations [1, 2]. Hasegawa and Kakutani's article is the journal version of an article presented at FOSSACS 2001 and that received the EATCS...

  6. Searching Spontaneous Conversational Speech. Proceedings of ACM SIGIR Workshop (SSCS2008)

    NARCIS (Netherlands)

    Köhler, J.; Larson, M; de Jong, Franciska M.G.; Ordelman, Roeland J.F.; Kraaij, W.

    2008-01-01

    The second workshop on Searching Spontaneous Conversational Speech (SSCS 2008) was held in Singapore on July 24, 2008 in conjunction with the 31st Annual International ACM SIGIR Conference. The goal of the workshop was to bring the speech community and the information retrieval community together.

  7. Cloud Computing Security Issue: Survey

    Science.gov (United States)

    Kamal, Shailza; Kaur, Rajpreet

    2011-12-01

    Cloud computing has been a growing field in the IT industry since 2007, when it was proposed by IBM. Other companies such as Google, Amazon, and Microsoft provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS and PaaS. The services and resources are shared through virtualization, which runs multiple applications on the cloud. This discussion gives a survey of the challenges around security issues in cloud computing and describes some standards and protocols that present how security can be managed.

  8. Extra-Margins in ACM's Adjusted NMa ‘Mortgage-Rate-Calculation Method

    NARCIS (Netherlands)

    Dijkstra, M.; Schinkel, M.P.

    2013-01-01

    We analyse the development since 2004 of our concept of extra-margins on Dutch mortgages (Dijkstra & Schinkel, 2012), based on funding cost estimations in ACM (2013), which are an update of those in NMa (2011). Neither costs related to increased mortgage-specific risks, nor the inclusion of Basel

  9. Gender and Computers: Two Surveys of Computer-Related Attitudes.

    Science.gov (United States)

    Wilder, Gita; And Others

    1985-01-01

    Describes two surveys used to (1) determine sex differences in attitudes toward computers and video games among schoolchildren and the relationship of these attitudes to attitudes about science, math, and writing; and (2) determine sex differences in attitudes toward computing among a select group of highly motivated college freshmen. (SA)

  10. A Prediction of the Damping Properties of Hindered Phenol AO-60/polyacrylate Rubber (AO-60/ACM) Composites through Molecular Dynamics Simulation

    Science.gov (United States)

    Yang, Da-Wei; Zhao, Xiu-Ying; Zhang, Geng; Li, Qiang-Guo; Wu, Si-Zhu

    2016-05-01

    Molecular dynamics (MD) simulation, a molecular-level method, was applied to predict the damping properties of AO-60/polyacrylate rubber (AO-60/ACM) composites before experimental measurements were performed. MD simulation results revealed that two types of hydrogen bond were formed, namely type A, (AO-60)-OH•••O=C-(ACM), and type B, (AO-60)-OH•••O=C-(AO-60). Then, the AO-60/ACM composites were fabricated and tested by dynamic mechanical thermal analysis (DMTA) to verify the accuracy of the MD simulation. DMTA results showed that the introduction of AO-60 could remarkably improve the damping properties of the composites, including an increase in the glass transition temperature (Tg) and in the loss factor (tan δ); the AO-60/ACM (98/100) composite showed the best damping performance among the composites, which verified the simulation experimentally.

  11. ACM-based automatic liver segmentation from 3-D CT images by combining multiple atlases and improved mean-shift techniques.

    Science.gov (United States)

    Ji, Hongwei; He, Jiangping; Yang, Xin; Deklerck, Rudi; Cornelis, Jan

    2013-05-01

    In this paper, we present an autocontext model (ACM)-based automatic liver segmentation algorithm, which combines ACM, multiatlases, and mean-shift techniques to segment the liver from 3-D CT images. Our algorithm is a learning-based method and can be divided into two stages. At the first stage, i.e., the training stage, ACM is performed to learn a sequence of classifiers in each atlas space (based on each atlas and other aligned atlases). With the use of multiple atlases, multiple sequences of ACM-based classifiers are obtained. At the second stage, i.e., the segmentation stage, the test image will be segmented in each atlas space by applying each sequence of ACM-based classifiers. The final segmentation result will be obtained by fusing segmentation results from all atlas spaces via a multiclassifier fusion technique. Specifically, in order to speed up segmentation, given a test image, we first use an improved mean-shift algorithm to perform over-segmentation and then implement the region-based image labeling instead of the original inefficient pixel-based image labeling. The proposed method is evaluated on the datasets of the MICCAI 2007 liver segmentation challenge. The experimental results show that the average volume overlap error and the average surface distance achieved by our method are 8.3% and 1.5 mm, respectively, which are comparable to the results reported in the existing state-of-the-art work on liver segmentation.
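
    To make the autocontext idea above concrete, the sketch below (an assumption-laden illustration, not the authors' code) shows the training loop in a single atlas space: each classifier in the sequence is trained on appearance features concatenated with the posterior map produced by the previous classifier. The feature extractor extract_features is a hypothetical helper that returns one row per voxel, a random forest stands in for whatever classifier the paper actually uses, binary 0/1 labels are assumed, and the multi-atlas fusion and mean-shift steps are omitted.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def train_autocontext(images, labels, extract_features, n_stages=3):
            """Train a sequence of classifiers; each stage sees the previous stage's
            posterior probability map as an extra feature (the autocontext idea)."""
            classifiers = []
            # Start from an uninformative context map (probability 0.5 everywhere).
            contexts = [np.full(img.shape, 0.5) for img in images]
            for stage in range(n_stages):
                X, y = [], []
                for img, lab, ctx in zip(images, labels, contexts):
                    feats = extract_features(img)          # shape: (n_voxels, n_features)
                    X.append(np.column_stack([feats, ctx.ravel()]))
                    y.append(lab.ravel())                  # binary 0/1 voxel labels assumed
                clf = RandomForestClassifier(n_estimators=50).fit(np.vstack(X), np.concatenate(y))
                classifiers.append(clf)
                # Update context maps (foreground posterior, class index 1) for the next stage.
                contexts = [
                    clf.predict_proba(
                        np.column_stack([extract_features(img), ctx.ravel()])
                    )[:, 1].reshape(img.shape)
                    for img, ctx in zip(images, contexts)
                ]
            return classifiers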

  12. A Novel Observation-Guided Approach for Evaluating Mesoscale Convective Systems Simulated by the DOE ACME Model

    Science.gov (United States)

    Feng, Z.; Ma, P. L.; Hardin, J. C.; Houze, R.

    2017-12-01

    Mesoscale convective systems (MCSs) are the largest type of convective storms that develop when convection aggregates and induces mesoscale circulation features. Over North America, MCSs contribute over 60% of the total warm-season precipitation and over half of the extreme daily precipitation in the central U.S. Our recent study (Feng et al. 2016) found that the observed increases in springtime total and extreme rainfall in this region are dominated by increased frequency and intensity of long-lived MCSs*. To date, global climate models typically do not run at a resolution high enough to explicitly simulate individual convective elements and may not have adequate process representations for MCSs, resulting in a large deficiency in projecting changes of the frequency of extreme precipitation events in future climate. In this study, we developed a novel observation-guided approach specifically designed to evaluate simulated MCSs in the Department of Energy's climate model, Accelerated Climate Modeling for Energy (ACME). The ACME model has advanced treatments for convection and subgrid variability and for this study is run at 25 km and 100 km grid spacings. We constructed a robust MCS database consisting of over 500 MCSs from 3 warm-season observations by applying a feature-tracking algorithm to 4-km resolution merged geostationary satellite and 3-D NEXRAD radar network data over the Continental US. This high-resolution MCS database is then down-sampled to the 25 and 100 km ACME grids to re-characterize key MCS properties. The feature-tracking algorithm is adapted with the adjusted characteristics to identify MCSs from ACME model simulations. We demonstrate that this new analysis framework is useful for evaluating ACME's warm-season precipitation statistics associated with MCSs, and provides insights into the model process representations related to extreme precipitation events for future improvement. *Feng, Z., L. R. Leung, S. Hagos, R. A. Houze, C. D. Burleyson

  13. Call for participation first ACM workshop on education in computer security

    OpenAIRE

    Irvine, Cynthia; Orman, Hilarie

    1997-01-01

    The security of information systems and networks is a growing concern. Experts are needed to design and organize the protection mechanisms for these systems. Both government and industry increasingly seek individuals with knowledge and skills in computer security. In the past, most traditional computer science curricula bypassed formal studies in computer security altogether. An understanding of computer security was achieved largely through on-the-job ...

  14. HT 2011 : Proceedings of the 22nd ACM Conference on Hypertext and Hypermedia, Eindhoven, The Netherlands, June 6-9, 2011

    NARCIS (Netherlands)

    De Bra, P.M.E.; Gronbak, K.

    2011-01-01

    Foreword. It is our great pleasure to welcome you to ACM Hypertext 2011, the 22nd ACM Conference on Hypertext and Hypermedia, and to the "Land of the Innovator", the campus of the Eindhoven University of Technology, located in the "city of light" Eindhoven, the Netherlands. Hypertext is an exciting

  15. A survey of computer science capstone course literature

    Science.gov (United States)

    Dugan, Robert F., Jr.

    2011-09-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software process phases, project type, documentation, tools, groups, and instructor administration. We reflected on these issues and the computer science capstone course we have taught for seven years. The survey summarized, organized, and synthesized the literature to provide a referenced resource for computer science instructors and researchers interested in computer science capstone courses.

  16. ADAPTIF CONSERVATION (ACM MODEL IN INCREASING FAMILY SUPPORT AND COMPLIANCE TREATMENT IN PATIENT WITH PULONARY TUBERCULOSIS IN SURABAYA CITY REGION

    Directory of Open Access Journals (Sweden)

    Siti Nur Kholifah

    2017-04-01

    Full Text Available Introduction: Tuberculosis (TB) in Indonesia is still a health problem and the prevalence rate is high. Discontinuing medication and lack of family support are among the causes. A number of strategies to overcome this have seemingly not succeeded. The roles and responsibilities of family nursing are crucial to improving the participation and motivation of individuals, families and communities in prevention, including of pulmonary tuberculosis. Unfortunately, models for pulmonary tuberculosis are currently unavailable. The combination of adaptation and conservation to complementarily improve family support and compliance with medication is introduced in this study. Method: This research intended to analyze the Adaptive Conservation Model (ACM) in extending family support and treatment compliance. Modeling steps included model analysis, expert validation, field trial, implementation and recommendation of the output model. The research subjects were 15 families who implemented family Assistance and Supervision in Medication (ASM) and another 15 families with ACM. Result: The study revealed that ACM is better than ASM with respect to family support and medication compliance. It supports the role of the environment as an influential factor on individual health beliefs, values and decision making. Therefore, it is advised to apply ACM to enhance family support and compliance of pulmonary TB patients. Discussion: Social and family support in the ACM group was obtained by developing interaction through communication. Family interaction is necessary to improve family support for pulmonary tuberculosis patients, and social support acts as a motivator to maintain compliance with medication.

  17. Development and first application of an Aerosol Collection Module (ACM) for quasi online compound specific aerosol measurements

    Science.gov (United States)

    Hohaus, Thorsten; Kiendler-Scharr, Astrid; Trimborn, Dagmar; Jayne, John; Wahner, Andreas; Worsnop, Doug

    2010-05-01

    Atmospheric aerosols influence climate and human health on regional and global scales (IPCC, 2007). In many environments, organics are a major fraction of the aerosol influencing its properties. Due to the huge variety of organic compounds present in atmospheric aerosol, current measurement techniques are far from providing a full speciation of organic aerosol (Hallquist et al., 2009). The development of new techniques for compound specific measurements with high time resolution is a timely issue in organic aerosol research. Here we present first laboratory characterisations of an aerosol collection module (ACM), which was developed to allow for the sampling and transfer of atmospheric PM1 aerosol. The system consists of an aerodynamic lens system focussing particles into a beam. This beam is directed onto a surface 3.4 mm in diameter, which is cooled to -30 °C with liquid nitrogen. After collection the aerosol sample can be evaporated from the surface by heating it to up to 270 °C. The sample is transferred through a 60 cm long line with a carrier gas. In order to test the ACM for linearity and sensitivity, we combined it with a GC-MS system. The tests were performed with octadecane aerosol. The octadecane mass as measured with the ACM-GC-MS was compared with the mass calculated from the SMPS-derived total volume. The data correlate well (R2 0.99, slope of linear fit 1.1), indicating 100 % collection efficiency. From 150 °C to 270 °C no effect of desorption temperature on transfer efficiency could be observed. The ACM-GC-MS system was proven to be linear over the mass range 2-100 ng and has a detection limit of ~ 2 ng. First experiments applying the ACM-GC-MS system were conducted at the Jülich Aerosol Chamber. Secondary organic aerosol (SOA) was formed from ozonolysis of 600 ppbv of β-pinene. The major oxidation product nopinone was detected in the aerosol and could be shown to decrease from 2 % of the total aerosol to 0.5 % of the aerosol over the 48 hours of
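
    The linearity check described above (octadecane mass from the ACM-GC-MS versus mass derived from SMPS volume) is essentially a least-squares calibration. The short sketch below, using made-up numbers, shows how such a fit and its R2 would typically be computed.

        import numpy as np

        # Hypothetical calibration data (ng): mass from SMPS volume vs. mass measured by ACM-GC-MS.
        smps_mass = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])
        acm_mass  = np.array([2.3, 5.4, 11.2, 27.1, 54.8, 109.5])

        slope, intercept = np.polyfit(smps_mass, acm_mass, 1)     # straight-line fit
        r2 = np.corrcoef(smps_mass, acm_mass)[0, 1] ** 2          # coefficient of determination
        print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, R^2 = {r2:.3f}")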

  18. Responses of Mixed-Phase Cloud Condensates and Cloud Radiative Effects to Ice Nucleating Particle Concentrations in NCAR CAM5 and DOE ACME Climate Models

    Science.gov (United States)

    Liu, X.; Shi, Y.; Wu, M.; Zhang, K.

    2017-12-01

    Mixed-phase clouds frequently observed in the Arctic and mid-latitude storm tracks have substantial impacts on the surface energy budget, precipitation and climate. In this study, we first implement two empirical parameterizations (Niemand et al. 2012 and DeMott et al. 2015) of heterogeneous ice nucleation for mixed-phase clouds in the NCAR Community Atmosphere Model Version 5 (CAM5) and DOE Accelerated Climate Model for Energy Version 1 (ACME1). Model simulated ice nucleating particle (INP) concentrations based on Niemand et al. and DeMott et al. are compared with those from the default ice nucleation parameterization based on the classical nucleation theory (CNT) in CAM5 and ACME, and with in situ observations. Significantly higher INP concentrations (by up to a factor of 5) are simulated from Niemand et al. than from DeMott et al. and CNT, especially over the dust source regions in both CAM5 and ACME. Interestingly, the ACME model simulates higher INP concentrations than CAM5, especially in the Polar regions. This is also the case when we nudge the two models' winds and temperature towards the same reanalysis, indicating more efficient transport of aerosols (dust) to the Polar regions in ACME. Next, we examine the responses of model simulated cloud liquid water and ice water contents to different INP concentrations from three ice nucleation parameterizations (Niemand et al., DeMott et al., and CNT) in CAM5 and ACME. Changes in liquid water path (LWP) reach as much as 20% in the Arctic regions in ACME between the three parameterizations, while the LWP changes are smaller and limited in the Northern Hemispheric mid-latitudes in CAM5. Finally, the impacts on cloud radiative forcing and dust indirect effects on mixed-phase clouds are quantified with the three ice nucleation parameterizations in CAM5 and ACME.

  19. Proceedings of the 16th ACM SIGPLAN international conference on Functional programming

    DEFF Research Database (Denmark)

    Danvy, Olivier

    Welcome to the 16th ACM SIGPLAN International Conference on Functional Programming -- ICFP'11. The picture, on the front cover, is of Mount Fuji, seen from the 20th floor of the National Institute of Informatics (NII). It was taken by Sebastian Fischer in January 2011. In Japanese, the characters...

  20. A Survey on PageRank Computing

    OpenAIRE

    Berkhin, Pavel

    2005-01-01

    This survey reviews the research related to PageRank computing. Components of a PageRank vector serve as authority weights for web pages independent of their textual content, solely based on the hyperlink structure of the web. PageRank is typically used as a web search ranking component. This defines the importance of the model and the data structures that underlie PageRank processing. Computing even a single PageRank is a difficult computational task. Computing many PageRanks is a much mor...
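
    For reference, the core computation the survey is concerned with can be sketched in a few lines of Python as the textbook power iteration (the standard formulation, not any particular accelerated method from the survey); the tiny link graph at the end is made up for illustration.

        import numpy as np

        def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
            """Power iteration for PageRank; adj[i, j] = 1 if page i links to page j."""
            n = adj.shape[0]
            out_deg = adj.sum(axis=1)
            # Row-stochastic transition matrix; dangling nodes jump uniformly.
            P = np.where(out_deg[:, None] > 0, adj / np.maximum(out_deg[:, None], 1), 1.0 / n)
            r = np.full(n, 1.0 / n)
            for _ in range(max_iter):
                r_new = damping * (P.T @ r) + (1.0 - damping) / n
                if np.abs(r_new - r).sum() < tol:
                    break
                r = r_new
            return r_new

        # Tiny example: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1.
        A = np.array([[0, 1, 0], [0, 0, 1], [1, 1, 0]], dtype=float)
        print(pagerank(A))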

  1. Microstructure and chemical bonding of DLC films deposited on ACM rubber by PACVD

    NARCIS (Netherlands)

    Martinez-Martinez, D.; Schenkel, M.; Pei, Y.T.; Sánchez-López, J.C.; Hosson, J.Th.M. De

    2011-01-01

    The microstructure and chemical bonding of DLC films prepared by plasma assisted chemical vapor deposition on acrylic rubber (ACM) are studied in this paper. The temperature variation produced by the ion impingement during plasma cleaning and subsequent film deposition was used to modify the film

  2. Computer-Assisted Spanish-Composition Survey--1986.

    Science.gov (United States)

    Harvey, T. Edward

    1986-01-01

    A survey of high school and higher education teachers' (N=208) attitudes regarding the use of computers for Spanish-composition instruction revealed that: the lack of foreign-character support remains the major frustration; most teachers used Apple or IBM computers; and there was mixed opinion regarding the real versus the expected benefits of…

  3. Theoretical interpretation of the nuclear structure of 88Se within the ACM and the QPM models.

    Science.gov (United States)

    Gratchev, I. N.; Thiamova, G.; Alexa, P.; Simpson, G. S.; Ramdhane, M.

    2018-02-01

    The four-parameter algebraic collective model (ACM) Hamiltonian is used to describe the nuclear structure of 88Se. It is shown that the ACM is capable of providing a reasonable description of the excitation energies and relative positions of the ground-state band and γ band. The most probable interpretation of the nuclear structure of 88Se is that of a transitional nucleus. The Quasiparticle-plus-Phonon Model (QPM) was also applied to describe the nuclear motion in 88Se. Preliminary calculations show that the collectivity of the second excited 2+ state is weak and that this state contains a strong two-quasiparticle component.

  4. Artificial Intelligence for Human Computing

    NARCIS (Netherlands)

    Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.; Unknown, [Unknown

    2007-01-01

    This book constitutes the thoroughly refereed post-proceedings of two events discussing AI for Human Computing: one Special Session during the Eighth International ACM Conference on Multimodal Interfaces (ICMI 2006), held in Banff, Canada, in November 2006, and a Workshop organized in conjunction

  5. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    Science.gov (United States)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
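
    The RUNDMC-to-CAT data flow described above is plain inter-process communication; as a loose illustration only (Python here, whereas ACME itself uses PVM and C-level Unix sockets), a minimal Unix-domain-socket exchange of one track record might look as follows. The socket path and record fields are hypothetical.

        import json
        import os
        import socket

        SOCK_PATH = "/tmp/acme_demo.sock"   # hypothetical path, illustration only

        def serve_one_record():
            """Consumer side (CAT-like process): accept a connection and read one record."""
            if os.path.exists(SOCK_PATH):
                os.unlink(SOCK_PATH)
            with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
                srv.bind(SOCK_PATH)
                srv.listen(1)
                conn, _ = srv.accept()
                with conn:
                    return json.loads(conn.recv(4096).decode())

        def send_record(record):
            """Producer side (RUNDMC-like process): send one flight-track record."""
            with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as cli:
                cli.connect(SOCK_PATH)
                cli.sendall(json.dumps(record).encode())

        # Example record with hypothetical fields: aircraft id, time, position, altitude.
        # send_record({"id": "AAL123", "t": 0.0, "x": 10.2, "y": 44.7, "alt": 33000})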

  6. GPUs: An Emerging Platform for General-Purpose Computation

    Science.gov (United States)

    2007-08-01

    programming; real-time cinematic-quality graphics. PeakStream (26): license required (limited-time no-cost evaluation program); commercially...folding.stanford.edu (accessed 30 March 2007). 2. Fan, Z.; Qiu, F.; Kaufman, A.; Yoakum-Stover, S. GPU Cluster for High Performance Computing. ACM/IEEE... (accessed 30 March 2007). 8. Goodnight, N.; Wang, R.; Humphreys, G. Computation on Programmable Graphics Hardware. IEEE Computer Graphics and

  7. ACME - Algorithms for Contact in a Multiphysics Environment API Version 1.0

    International Nuclear Information System (INIS)

    BROWN, KEVIN H.; SUMMERS, RANDALL M.; GLASS, MICHEAL W.; GULLERUD, ARNE S.; HEINSTEIN, MARTIN W.; JONES, REESE E.

    2001-01-01

    An effort is underway at Sandia National Laboratories to develop a library of algorithms to search for potential interactions between surfaces represented by analytic and discretized topological entities. This effort is also developing algorithms to determine forces due to these interactions for transient dynamics applications. This document describes the Application Programming Interface (API) for the ACME (Algorithms for Contact in a Multiphysics Environment) library

  8. Equivalency of Paper versus Tablet Computer Survey Data

    Science.gov (United States)

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  9. Effects of activated ACM on expression of signal transducers in cerebral cortical neurons of rats.

    Science.gov (United States)

    Wang, Xiaojing; Li, Zhengli; Zhu, Changgeng; Li, Zhongyu

    2007-06-01

    To explore the roles of astrocytes in the epileptogenesis, astrocytes and neurons were isolated, purified and cultured in vitro from cerebral cortex of rats. The astrocytes were activated by ciliary neurotrophic factor (CNTF) and astrocytic conditioned medium (ACM) was collected to treat neurons for 4, 8 and 12 h. By using Western blot, the expression of calmodulin dependent protein kinase II (CaMK II), inducible nitric oxide synthase (iNOS) and adenylate cyclase (AC) was detected in neurons. The results showed that the expression of CaMK II, iNOS and AC was increased significantly in the neurons treated with ACM from 4 h to 12 h (PACM and such signal pathways as NOS-NO-cGMP, Ca2+/CaM-CaMK II and AC-cAMP-PKA might take part in the signal transduction of epileptogenesis.

  10. Evaluating tablet computers as a survey tool in rural communities.

    Science.gov (United States)

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  11. Numerical analysis of boosting scheme for scalable NMR quantum computation

    International Nuclear Information System (INIS)

    SaiToh, Akira; Kitagawa, Masahiro

    2005-01-01

    Among initialization schemes for ensemble quantum computation beginning at thermal equilibrium, the scheme proposed by Schulman and Vazirani [in Proceedings of the 31st ACM Symposium on Theory of Computing (STOC'99) (ACM Press, New York, 1999), pp. 322-329] is known for the simple quantum circuit to redistribute the biases (polarizations) of qubits and small time complexity. However, our numerical simulation shows that the number of qubits initialized by the scheme is rather smaller than expected from the von Neumann entropy because of an increase in the sum of the binary entropies of individual qubits, which indicates a growth in the total classical correlation. This result--namely, that there is such a significant growth in the total binary entropy--disagrees with that of their analysis
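
    For context on the entropy argument above (standard definitions restated here, not taken from the paper's derivation): a spin with polarization bias ε has binary entropy h((1+ε)/2), and the von Neumann entropy of the initial thermal state bounds the number m of qubits that any reversible boosting scheme can bring close to purity out of n spins,

        h\!\left(\tfrac{1+\varepsilon}{2}\right) = -\tfrac{1+\varepsilon}{2}\log_2\tfrac{1+\varepsilon}{2} - \tfrac{1-\varepsilon}{2}\log_2\tfrac{1-\varepsilon}{2},
        \qquad
        m \le n\left[1 - h\!\left(\tfrac{1+\varepsilon}{2}\right)\right].

    The simulation result summarized above can then be read as the sum of individual binary entropies growing during the procedure, so that fewer qubits reach near-purity than this bound alone would suggest.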

  12. A survey on Barrett's esophagus analysis using machine learning.

    Science.gov (United States)

    de Souza, Luis A; Palm, Christoph; Mendel, Robert; Hook, Christian; Ebigbo, Alanna; Probst, Andreas; Messmann, Helmut; Weber, Silke; Papa, João P

    2018-05-01

    This work presents a systematic review concerning recent studies and technologies of machine learning for Barrett's esophagus (BE) diagnosis and treatment. The use of artificial intelligence is a brand new and promising way to evaluate such a disease. We compile some works published in well-established databases, such as Science Direct, IEEEXplore, PubMed, Plos One, Multidisciplinary Digital Publishing Institute (MDPI), Association for Computing Machinery (ACM), Springer, and Hindawi Publishing Corporation. Each selected work has been analyzed to present its objective, methodology, and results. The BE progression to dysplasia or adenocarcinoma shows a complex pattern to be detected during endoscopic surveillance. Therefore, it is valuable to assist its diagnosis and automatic identification using computer analysis. The evaluation of BE dysplasia can be performed through manual or automated segmentation using machine learning techniques. Finally, in this survey, we reviewed recent studies focused on the automatic detection of the neoplastic region for classification purposes using machine learning methods. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. 80 A/cm2 electron beams from metal targets irradiated by KrCl and XeCl excimer lasers

    Science.gov (United States)

    Beloglazov, A.; Martino, M.; Nassisi, V.

    1996-05-01

    Due to the growing demand for high-current and long-duration electron-beam devices, laser electron sources were investigated in our laboratory. Experiments on electron-beam generation and propagation from aluminium and copper targets illuminated by XeCl (308 nm) and KrCl (222 nm) excimer lasers, were carried out under plasma ignition due to laser irradiation. This plasma supplied a spontaneous accelerating electric field of about 370 kV/m without an external accelerating voltage. By applying the modified one-dimensional Poisson equation, we computed the expected current and we also estimated the plasma concentration during the accelerating process. At 40 kV of accelerating voltage, an output current pulse of about 80 A/cm2 was detected from an Al target irradiated by the shorter wavelength laser.
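
    For orientation only (the paper applies a modified one-dimensional Poisson equation whose exact modification is not reproduced here), the baseline space-charge-limited current density for a planar one-dimensional gap of width d at voltage V is the Child-Langmuir expression,

        J_{CL} = \frac{4\varepsilon_0}{9}\sqrt{\frac{2e}{m_e}}\,\frac{V^{3/2}}{d^2},

    which is the standard starting point that plasma-cathode calculations of this kind modify to account for the expanding plasma boundary and the spontaneous accelerating field reported above.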

  14. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    Science.gov (United States)

    Seow, P.; Win, M. T.; Wong, J. H. D.; Abdullah, N. A.; Ramli, N.

    2016-03-01

    Gliomas are tumours arising from the interstitial tissue of the brain which are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time consuming, subject to intra- and inter-rater variability, and hindered by diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D post-gadolinium brain image data were acquired with a 3T MRI fast spoiled gradient echo sequence from four histologically proven high-grade glioma patients. Preprocessing of the images, which included subtraction and skull stripping, was performed and then followed by ACM segmentation. The results of the automatic segmentation method were compared against the manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results of the clinical data showed the potential of the ACM model in the application of fast and large scale tumour segmentation in medical imaging.

  15. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    International Nuclear Information System (INIS)

    Seow, P; Win, M T; Wong, J H D; Ramli, N; Abdullah, N A

    2016-01-01

    Gliomas are tumours arising from the interstitial tissue of the brain which are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time consuming, subject to intra- and inter-rater variability, and hindered by diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D post-gadolinium brain image data were acquired with a 3T MRI fast spoiled gradient echo sequence from four histologically proven high-grade glioma patients. Preprocessing of the images, which included subtraction and skull stripping, was performed and then followed by ACM segmentation. The results of the automatic segmentation method were compared against the manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results of the clinical data showed the potential of the ACM model in the application of fast and large scale tumour segmentation in medical imaging. (paper)

  16. Validation of the Adolescent Concerns Measure (ACM): Evidence from Exploratory and Confirmatory Factor Analysis

    Science.gov (United States)

    Ang, Rebecca P.; Chong, Wan Har; Huan, Vivien S.; Yeo, Lay See

    2007-01-01

    This article reports the development and initial validation of scores obtained from the Adolescent Concerns Measure (ACM), a scale which assesses concerns of Asian adolescent students. In Study 1, findings from exploratory factor analysis using 619 adolescents suggested a 24-item scale with four correlated factors--Family Concerns (9 items), Peer…

  17. A Survey of Computer Science Capstone Course Literature

    Science.gov (United States)

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  18. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    Full Text Available of research articles published in SACJ over its first 40 volumes of the journal using the ACM Computing Classification Scheme as basis. In their analysis the authors divided the publications into three cycles of more or less six years in order to identify...

  19. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff personnel to more productively use their time by using computers to perform the mechanical acquisition, analyses, and storage of data. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs

  20. Public computing options for individuals with cognitive impairments: survey outcomes.

    Science.gov (United States)

    Fox, Lynn Elizabeth; Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Prideaux, Jason

    2009-09-01

    To examine availability and accessibility of public computing for individuals with cognitive impairment (CI) who reside in the USA. A telephone survey was administered as a semi-structured interview to 145 informants representing seven types of public facilities across three geographically distinct regions using a snowball sampling technique. An Internet search of wireless (Wi-Fi) hotspots supplemented the survey. Survey results showed the availability of public computer terminals and Internet hotspots was greatest in the urban sample, followed by the mid-sized and rural cities. Across seven facility types surveyed, libraries had the highest percentage of access barriers, including complex queue procedures, login and password requirements, and limited technical support. University assistive technology centres and facilities with a restricted user policy, such as brain injury centres, had the lowest incidence of access barriers. Findings suggest optimal outcomes for people with CI will result from a careful match of technology and the user that takes into account potential barriers and opportunities to computing in an individual's preferred public environments. Trends in public computing, including the emergence of widespread Wi-Fi and limited access to terminals that permit auto-launch applications, should guide development of technology designed for use in public computing environments.

  1. Formal Specification and Analysis of Cloud Computing Management

    Science.gov (United States)

    2012-01-24

    Cloud Computing in a Nutshell. We begin this introduction to Cloud Computing with a famous quote by Larry Ellison: “The interesting thing about...the wording of some of our ads.” — Larry Ellison, Oracle CEO [106]. In view of this statement, we summarize the essential aspects of Cloud Computing... [1] M. Abadi, M. Burrows, M. Manasse, and T. Wobber. Moderately hard, memory-bound functions. ACM Transactions on Internet Technology, 5(2):299–327

  2. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    systems that would fall under the Exascale rubric . In this chapter, we first discuss the attributes by which achievement of the label “Exascale” may be...Carrington, and E. Strohmaier. A Genetic Algorithms Approach to Modeling the Performance of Memory-bound Computations. Reno, NV, November 2007. ACM/IEEE... genetic stochasticity (random mating, mutation, etc). Outcomes are thus stochastic as well, and ecologists wish to ask questions like, “What is the

  3. Paradigms and laboratories in the core computer science curriculum: An overview

    NARCIS (Netherlands)

    Hartel, Pieter H.; Hertzberger, L.O.

    1995-01-01

    Recent issues of the bulletin of the ACM SIGCSE have been scrutinised to find evidence that the use of laboratory sessions and different programming paradigms improves the learning of difficult concepts and techniques, such as recursion and problem solving. Many authors in the surveyed literature believe

  4. TWO NOVEL ACM (ACTIVE CONTOUR MODEL) METHODS FOR INTRAVASCULAR ULTRASOUND IMAGE SEGMENTATION

    International Nuclear Information System (INIS)

    Chen, Chi Hau; Potdat, Labhesh; Chittineni, Rakesh

    2010-01-01

    One of the attractive image segmentation methods is the Active Contour Model (ACM) which has been widely used in medical imaging as it always produces sub-regions with continuous boundaries. Intravascular ultrasound (IVUS) is a catheter based medical imaging technique which is used for quantitative assessment of atherosclerotic disease. Two methods of ACM realizations are presented in this paper. The gradient descent flow based on minimizing energy functional can be used for segmentation of IVUS images. However this local operation alone may not be adequate to work with the complex IVUS images. The first method presented consists of basically combining the local geodesic active contours and global region-based active contours. The advantage of combining the local and global operations is to allow curves deforming under the energy to find only significant local minima and delineate object borders despite noise, poor edge information and heterogeneous intensity profiles. Results for this algorithm are compared to standard techniques to demonstrate the method's robustness and accuracy. In the second method, the energy function is appropriately modified and minimized using a Hopfield neural network. Proper modifications in the definition of the bias of the neurons have been introduced to incorporate image characteristics. The method overcomes distortions in the expected image pattern, such as due to the presence of calcium, and employs a specialized structure of the neural network and boundary correction schemes which are based on a priori knowledge about the vessel geometry. The presented method is very fast and has been evaluated using sequences of IVUS frames.
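
    As a rough illustration of the kind of functional combined in the first method above (a generic textbook form, not the authors' exact energy), a hybrid of a geodesic edge term and a Chan-Vese-style region term for a closed contour C in an image I can be written as

        E(C, c_1, c_2) = \mu \oint_C g\big(|\nabla I|\big)\, ds
            + \lambda_1 \int_{\text{inside}(C)} \big(I(x) - c_1\big)^2\, dx
            + \lambda_2 \int_{\text{outside}(C)} \big(I(x) - c_2\big)^2\, dx,

    where g is an edge-stopping function that is small at strong gradients, c_1 and c_2 are the mean intensities inside and outside the contour, and μ, λ_1, λ_2 balance the local edge-driven and global region-driven terms; gradient-descent evolution of such an energy is what lets the curve settle on significant minima despite noise, poor edge information and heterogeneous intensity profiles, as described in the abstract.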

  5. Application of computers in a radiological survey program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    Computers have become increasingly important in data analysis and data management as well as assisting in report preparation in the Oak Ridge National Laboratory (ORNL) Radiological Survey Activities (RASA) Program. The primary function of the RASA program is to collect, analyze, report, and manage data collected to characterize the radiological condition of potentially contaminated sites identified in the Department of Energy's (DOE) remedial action programs. Three different computer systems are routinely utilized in ORNL/RASA operations. Two of these systems are employed in specific functions. A Nuclear Data (ND) 682 is used to perform isotopic analysis of gamma spectroscopic data generated by high-purity germanium detectors for air, water and soil samples. The ND682 employs a 16,000-channel analyzer that is routinely used with four germanium spectrometers. Word processing and data management are accomplished using the INtext system implemented on a DEC PDP-11 computer. A group of personal computers is used to perform a diverse set of functions. These computer systems are Commodore Business Machines (CBM) Model 8032 with a dual floppy disk storage medium and line printers (with optional X-Y plotters). The CBMs are utilized for: (1) data analysis -- raw data from radiation detection instrumentation are stored and manipulated with customized computer programs; (2) data reduction -- raw data are converted into report-ready tables using customized programs; (3) data management -- radionuclide data on each air, water and soil sample are stored on diskettes along with the location of archived samples; and (4) program management -- site surveys, report status, and program budget information are tracked in computer files to provide up-to-date information on program status.

  6. Pair Programming as a Modern Method of Teaching Computer Science

    OpenAIRE

    Irena Nančovska Šerbec; Branko Kaučič; Jože Rugelj

    2008-01-01

    At the Faculty of Education, University of Ljubljana, we educate future computer science teachers. Besides didactic, pedagogical, mathematical and other interdisciplinary knowledge, students gain programming knowledge and skills that are crucial for computer science teachers. For all courses, the main emphasis is the acquisition of professional competences related to the teaching profession and the programming profile. The latter are selected according to the well-known document, the ACM C...

  7. Enabling Chemistry of Gases and Aerosols for Assessment of Short-Lived Climate Forcers: Improving Solar Radiation Modeling in the DOE-ACME and CESM models

    Energy Technology Data Exchange (ETDEWEB)

    Prather, Michael [Univ. of California, Irvine, CA (United States)

    2018-01-12

    This proposal seeks to maintain the DOE-ACME (offshoot of CESM) as one of the leading CCMs to evaluate near-term climate mitigation. It will implement, test, and optimize the new UCI photolysis codes within CESM CAM5 and new CAM versions in ACME. Fast-J is a high-order-accuracy (8 stream) code for calculating solar scattering and absorption in a single column atmosphere containing clouds, aerosols, and gases that was developed at UCI and implemented in CAM5 under the previous BER/SciDAC grant.

  8. Construction of improved temperature-sensitive and mobilizable vectors and their use for constructing mutations in the adhesin-encoding acm gene of poorly transformable clinical Enterococcus faecium strains.

    Science.gov (United States)

    Nallapareddy, Sreedhar R; Singh, Kavindra V; Murray, Barbara E

    2006-01-01

    Inactivation by allelic exchange in clinical isolates of the emerging nosocomial pathogen Enterococcus faecium has been hindered by lack of efficient tools, and, in this study, transformation of clinical isolates was found to be particularly problematic. For this reason, a vector for allelic replacement (pTEX5500ts) was constructed that includes (i) the pWV01-based gram-positive repAts replication region, which is known to confer a high degree of temperature intolerance, (ii) Escherichia coli oriR from pUC18, (iii) two extended multiple-cloning sites located upstream and downstream of one of the marker genes for efficient cloning of flanking regions for double-crossover mutagenesis, (iv) transcriptional terminator sites to terminate undesired readthrough, and (v) a synthetic extended promoter region containing the cat gene for allelic exchange and a high-level gentamicin resistance gene, aph(2'')-Id, to distinguish double-crossover recombination, both of which are functional in gram-positive and gram-negative backgrounds. To demonstrate the functionality of this vector, the vector was used to construct an acm (encoding an adhesin to collagen from E. faecium) deletion mutant of a poorly transformable multidrug-resistant E. faecium endocarditis isolate, TX0082. The acm-deleted strain, TX6051 (TX0082Deltaacm), was shown to lack Acm on its surface, which resulted in the abolishment of the collagen adherence phenotype observed in TX0082. A mobilizable derivative (pTEX5501ts) that contains oriT of Tn916 to facilitate conjugative transfer from the transformable E. faecalis strain JH2Sm::Tn916 to E. faecium was also constructed. Using this vector, the acm gene of a nonelectroporable E. faecium wound isolate was successfully interrupted. Thus, pTEX5500ts and its mobilizable derivative demonstrated their roles as important tools by helping to create the first reported allelic replacement in E. faecium; the constructed acm deletion mutant will be useful for assessing the

  9. Computing the scattering properties of participating media using Lorenz-Mie theory

    DEFF Research Database (Denmark)

    2007-01-01

    This source code implements Lorenz-Mie theory using the formulas presented in the SIGGRAPH 2007 paper: J. R. Frisvad, N. J. Christensen, and H. W. Jensen: "Computing the Scattering Properties of Participating Media Using Lorenz-Mie Theory". Copyright (c) ACM 2007. This is the author's version...
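
    For readers unfamiliar with the underlying theory, the core of a Lorenz-Mie computation is the evaluation of the scattering coefficients a_n and b_n from Riccati-Bessel functions. The sketch below is a generic textbook-style implementation for a sphere with a real relative refractive index (it is not the authors' source code, and absorbing media, which their paper handles, would require complex-argument Bessel functions):

      # Generic Lorenz-Mie coefficients for a non-absorbing sphere; illustrative sketch only.
      import numpy as np
      from scipy.special import spherical_jn, spherical_yn

      def mie_coefficients(m, x):
          """m: real relative refractive index, x: size parameter 2*pi*r/wavelength."""
          n_max = int(np.ceil(x + 4.0 * x**(1.0/3.0) + 2))      # common series-truncation heuristic
          n = np.arange(1, n_max + 1)

          # Riccati-Bessel functions psi_n(rho) = rho*j_n(rho) and xi_n(rho) = rho*h1_n(rho).
          def psi(rho):  return rho * spherical_jn(n, rho)
          def dpsi(rho): return spherical_jn(n, rho) + rho * spherical_jn(n, rho, derivative=True)
          def xi(rho):   return rho * (spherical_jn(n, rho) + 1j * spherical_yn(n, rho))
          def dxi(rho):  return (spherical_jn(n, rho) + 1j * spherical_yn(n, rho)) + rho * (
                                 spherical_jn(n, rho, derivative=True) + 1j * spherical_yn(n, rho, derivative=True))

          mx = m * x
          a = (m*psi(mx)*dpsi(x) - psi(x)*dpsi(mx)) / (m*psi(mx)*dxi(x) - xi(x)*dpsi(mx))
          b = (psi(mx)*dpsi(x) - m*psi(x)*dpsi(mx)) / (psi(mx)*dxi(x) - m*xi(x)*dpsi(mx))
          return n, a, b

      def efficiencies(m, x):
          """Scattering and extinction efficiencies from the coefficient series."""
          n, a, b = mie_coefficients(m, x)
          q_sca = (2.0 / x**2) * np.sum((2*n + 1) * (np.abs(a)**2 + np.abs(b)**2))
          q_ext = (2.0 / x**2) * np.sum((2*n + 1) * (a + b).real)
          return q_ext, q_sca

      # Example (hypothetical values): efficiencies(1.33, 2*np.pi*1.0/0.55)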

  10. A Cellular Automata Approach to Computer Vision and Image Processing.

    Science.gov (United States)

    1980-09-01

    the ACM, vol. 15, no. 9, pp. 827-837. [Duda and Hart] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973 ... Center TR-738, 1979. [Farley] Arthur M. Farley and Andrzej Proskurowski, "Gossiping in Grid Graphs", University of Oregon Computer Science Department CS-TR

  11. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  12. Proceeding of the ACM/IEEE-CS Joint Conference on Digital Libraries (1st, Roanoke, Virginia, June 24-28, 2001).

    Science.gov (United States)

    Association for Computing Machinery, New York, NY.

    Papers in this Proceedings of the ACM/IEEE-CS Joint Conference on Digital Libraries (Roanoke, Virginia, June 24-28, 2001) discuss: automatic genre analysis; text categorization; automated name authority control; automatic event generation; linked active content; designing e-books for legal research; metadata harvesting; mapping the…

  13. Toward Practical Verification of Outsourced Computations Using Probabilistically Checkable Proofs (PCPs)

    Science.gov (United States)

    2010-07-12

    Germany, 1999. [8] L. Babai, L. Fortnow, L. A. Levin, and M. Szegedy. Checking Computations in Polylogarithmic Time. In STOC, 1991. [9] A. Ben-David ... their work. J. ACM, 42(1):269–291, 1995. [12] D. Chaum, C. Crépeau, and I. Damgard. Multiparty unconditionally secure protocols. In STOC, 1988. [13

  14. acme: The Amendable Coal-Fire Modeling Exercise. A C++ Class Library for the Numerical Simulation of Coal-Fires

    Science.gov (United States)

    Wuttke, Manfred W.

    2017-04-01

    At LIAG, we use numerical models to develop and enhance understanding of coupled transport processes and to predict the dynamics of the system under consideration. Topics include geothermal heat utilization, subrosion processes, and spontaneous underground coal fires. Although the details make it inconvenient if not impossible to apply a single code implementation to all systems, their investigations go along similar paths: they all depend on the solution of coupled transport equations. We thus saw a need for a modular code system with open access for the various communities to maximize the shared synergistic effects. To this purpose we develop the oops! (open object-oriented parallel solutions) toolkit, a C++ class library for the numerical solution of mathematical models of coupled thermal, hydraulic and chemical processes. This is used to develop problem-specific libraries like acme (amendable coal-fire modeling exercise), a class library for the numerical simulation of coal-fires, and applications like kobra (Kohlebrand, German for coal-fire), a numerical simulation code for standard coal-fire models. The basic principle of the oops! code system is the provision of data types for the description of space- and time-dependent data fields, the description of terms of partial differential equations (PDEs), and their discretisation and solving methods. Coupling of different processes, each described by its particular PDE, is modeled by an automatic timescale-ordered operator-splitting technique. acme is a derived coal-fire-specific application library, depending on oops!. If specific functionalities of general interest are implemented and have been tested, they will be assimilated into the main oops! library. Interfaces to external pre- and post-processing tools are easily implemented. Thus a construction kit which can be arbitrarily amended is formed. With the kobra application constructed with acme we study the processes and propagation of shallow coal seam fires in particular in
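
    As a language-agnostic illustration of the operator-splitting idea described above (the actual oops!/acme classes are C++ and are not reproduced here), the following Python sketch advances a shared 1-D temperature field by applying two processes sequentially, sub-stepping the faster one; all names and coefficients are assumptions made for the example:

      # Minimal sequential (Lie) operator-splitting sketch for two coupled processes; illustrative only.
      import numpy as np

      def diffusion_step(T, dt, dx, kappa):
          """Explicit finite-difference step for dT/dt = kappa * d2T/dx2 (insulated ends)."""
          lap = np.zeros_like(T)
          lap[1:-1] = (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
          return T + dt * kappa * lap

      def reaction_step(T, dt, rate, T_ambient):
          """Exact step for linear relaxation dT/dt = -rate * (T - T_ambient)."""
          return T_ambient + (T - T_ambient) * np.exp(-rate * dt)

      def split_step(T, dt, dx, kappa, rate, T_ambient):
          """One coupled step: sub-step the fast (diffusive) process, then apply the slow one."""
          n_sub = max(1, int(np.ceil(dt / (0.4 * dx**2 / kappa))))   # respect explicit stability limit
          for _ in range(n_sub):
              T = diffusion_step(T, dt / n_sub, dx, kappa)
          return reaction_step(T, dt, rate, T_ambient)

      # Example: a hot spot cooling and spreading along a 1-D profile (made-up parameters).
      x = np.linspace(0.0, 1.0, 101)
      T = 20.0 + 400.0 * np.exp(-((x - 0.5) / 0.05)**2)
      for _ in range(100):
          T = split_step(T, dt=1.0, dx=x[1]-x[0], kappa=1e-4, rate=1e-3, T_ambient=20.0)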

  15. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  16. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  17. Automatic 2D segmentation of airways in thorax computed tomography images

    International Nuclear Information System (INIS)

    Cavalcante, Tarique da Silveira; Cortez, Paulo Cesar; Almeida, Thomaz Maia de; Felix, John Hebert da Silva; Holanda, Marcelo Alcantara

    2013-01-01

    Introduction: Much of the world population is affected by pulmonary diseases, such as bronchial asthma, bronchitis and bronchiectasis. Bronchial diagnosis is based on the state of the airways. In this sense, the automatic segmentation of the airways in Computed Tomography (CT) scans is a critical step in aiding the diagnosis of these diseases. Methods: This paper evaluates algorithms for automatic airway segmentation, using a Multilayer Perceptron (MLP) neural network and Lung Densities Analysis (LDA) for detecting airways, along with Region Growing (RG), ACM Balloon and ACM Topology Adaptive (Active Contour Method variants) to segment them. Results: We obtained results in three stages: comparative analysis of the detection algorithms MLP and LDA against a gold standard acquired by three physicians with expertise in CT imaging of the chest; comparative analysis of the segmentation algorithms ACM Balloon, ACM Topology Adaptive, MLP and RG; and evaluation of possible combinations of segmentation and detection algorithms, resulting in the complete method for automatic 2D segmentation of the airways. Conclusion: The low incidence of false negatives and the significant reduction of false positives result in a similarity coefficient and sensitivity exceeding 91% and 87%, respectively, for a combination of algorithms with satisfactory segmentation quality. (author)
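
    Of the segmentation algorithms compared above, region growing is the simplest to state; a minimal 2-D sketch is given below for orientation (seed choice, connectivity and the homogeneity criterion are illustrative assumptions, not the parameters used in the paper):

      # Minimal 2-D region-growing sketch from a seed pixel; illustrative only.
      import numpy as np
      from collections import deque

      def region_grow(image, seed, threshold):
          """Grow a region of pixels whose intensity is within `threshold` of the seed value."""
          mask = np.zeros(image.shape, dtype=bool)
          seed_value = float(image[seed])
          queue = deque([seed])
          mask[seed] = True
          while queue:
              r, c = queue.popleft()
              for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):       # 4-connected neighbours
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < image.shape[0] and 0 <= nc < image.shape[1] and not mask[nr, nc]:
                      if abs(float(image[nr, nc]) - seed_value) <= threshold:
                          mask[nr, nc] = True
                          queue.append((nr, nc))
          return mask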

  18. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
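
    As a concrete example of the metamodel idea, the sketch below fits a second-order polynomial response surface to a handful of samples of an "expensive" analysis and then predicts at new design points; the test function and sample sizes are made up for illustration and are not from the survey:

      # Minimal second-order response-surface metamodel fitted by least squares; illustrative only.
      import numpy as np

      def quadratic_design_matrix(X):
          """Columns: 1, x_i, x_i^2, and pairwise interactions x_i*x_j."""
          n, d = X.shape
          cols = [np.ones(n)]
          cols += [X[:, i] for i in range(d)]
          cols += [X[:, i] ** 2 for i in range(d)]
          cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i + 1, d)]
          return np.column_stack(cols)

      def fit_metamodel(X, y):
          beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
          return beta

      def predict(beta, X):
          return quadratic_design_matrix(X) @ beta

      # Example: replace an "expensive" analysis (here a cheap stand-in) with the surrogate.
      rng = np.random.default_rng(0)
      X_train = rng.uniform(-1.0, 1.0, size=(30, 2))                      # design of experiments
      y_train = X_train[:, 0] ** 2 + 0.5 * X_train[:, 0] * X_train[:, 1]  # stand-in analysis code
      beta = fit_metamodel(X_train, y_train)
      y_hat = predict(beta, rng.uniform(-1.0, 1.0, size=(5, 2)))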

  19. A survey of energy saving techniques for mobile computers

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Havinga, Paul J.M.

    1997-01-01

    Portable products such as pagers, cordless and digital cellular telephones, personal audio equipment, and laptop computers are increasingly being used. Because these applications are battery powered, reducing power consumption is vital. In this report we first give a survey of techniques for

  20. Data mining in soft computing framework: a survey.

    Science.gov (United States)

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.

  1. A Survey of Computer Use by Undergraduate Psychology Departments in Virginia.

    Science.gov (United States)

    Stoloff, Michael L.; Couch, James V.

    1987-01-01

    Reports a survey of computer use in psychology departments in Virginia's four-year colleges. Results showed that faculty, students, and clerical staff used word processing, statistical analysis, and database management most frequently. The three most common computer brands were the Apple II family, IBM PCs, and the Apple Macintosh. (Author/JDH)

  2. DISTRIBUTED COMPUTING SUPPORT CONTRACT USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all variety of desktops throughout CERN as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site and it provides the installation activities of the IT Division PC Service. We have published a questionnaire which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link ...

  3. DISTRIBUTED COMPUTING SUPPORT SERVICE USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all variety of desktops throughout CERN as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site and it provides the installation activities of the IT Division PC Service. We have published a questionnaire, which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link...

  4. Analysis of Residual Nuclide in a ACM and ACCT of 100-MeV proton beamline By measurement X-ray Spectrum

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jeong-Min; Yun, Sang-Pil; Kim, Han-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Atomic Energy Research Institute, Gyeongju (Korea, Republic of)

    2015-10-15

    The proton beam is provided to users at various energies ranging from 20 MeV to 100 MeV. Protons generated by the ion source are accelerated to 100 MeV and delivered to the target through a bending magnet and an AC magnet. During operation, relatively high-dose X-rays are emitted due to collisions of protons with beamline components. This radiation remains after the accelerator is turned off, and the residual nuclides are analyzed through measurement of the X-ray spectrum in order to identify the components that are the primary source of the residual nuclides detected from the AC magnet (ACM) and its associated components (ACCT). From analysis of the X-ray spectra generated from the AC magnet (ACM) and AC current transformer (ACCT) of the 100 MeV beamline after proton beam irradiation, most of the residual nuclides were identified, and it can be seen that they originate from beam loss in the stainless steel.

  5. Cloud computing for energy management in smart grid - an application survey

    International Nuclear Information System (INIS)

    Naveen, P; Ing, Wong Kiing; Danquah, Michael Kobina; Sidhu, Amandeep S; Abu-Siada, Ahmed

    2016-01-01

    The smart grid is an emerging energy system in which information technology, tools and techniques are applied to make the grid run more efficiently. It possesses demand response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To address these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid. (paper)
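
    Economic power dispatch of the kind mentioned above can be illustrated with the classical equal-incremental-cost rule for generators with quadratic cost curves; the solver sketch below (generator data and the bisection tolerance are illustrative assumptions) is the sort of computation a cloud service could expose:

      # Minimal economic dispatch via the equal-incremental-cost (lambda search) rule; illustrative only.
      import numpy as np

      def dispatch(demand, b, c, p_min, p_max, tol=1e-6):
          """Return generator outputs minimizing sum(a + b*P + c*P^2) s.t. sum(P) = demand."""
          b, c, p_min, p_max = map(np.asarray, (b, c, p_min, p_max))

          def output(lam):
              # Each unit runs where its incremental cost b + 2*c*P equals lam, within its limits.
              return np.clip((lam - b) / (2.0 * c), p_min, p_max)

          lo, hi = (b + 2*c*p_min).min(), (b + 2*c*p_max).max()       # bracket the multiplier
          while hi - lo > tol:
              lam = 0.5 * (lo + hi)
              if output(lam).sum() < demand:
                  lo = lam
              else:
                  hi = lam
          return output(0.5 * (lo + hi))

      # Example: three units serving a 400 MW load (made-up cost coefficients).
      P = dispatch(400.0, b=[7.0, 7.5, 8.0], c=[0.004, 0.005, 0.006],
                   p_min=[50, 50, 50], p_max=[250, 200, 150])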

  6. ARM Airborne Carbon Measurements VI (ARM-ACME VI) Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Biraud, Sebastien [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-05-01

    From October 1, 2015 through September 30, 2016, AAF deployed a Cessna 206 aircraft over the Southern Great Plains, collecting observations of trace gas mixing ratios over the ARM/SGP Central Facility. The aircraft payload included two Atmospheric Observing Systems (AOS Inc.) analyzers for continuous measurements of CO2, and a 12-flask sampler for analysis of carbon cycle gases (CO2, CO, CH4, N2O, 13CO2). The aircraft payload also includes solar/infrared radiation measurements. This research (supported by DOE ARM and TES programs) builds upon previous ARM-ACME missions. The goal of these measurements is to improve understanding of: (a) the carbon exchange of the ARM region; (b) how CO2 and associated water and energy fluxes influence radiative forcing, convective processes, and CO2 concentrations over the ARM region, and (c) how greenhouse gases are transported on continental scales.

  7. A Survey of Current Computer Information Science (CIS) Students.

    Science.gov (United States)

    Los Rios Community Coll. District, Sacramento, CA. Office of Institutional Research.

    This document is a survey designed to be completed by current students of Computer Information Science (CIS) in the Los Rios Community College District (LRCCD), which consists of three community colleges: American River College, Cosumnes River College, and Sacramento City College. The students are asked about their educational goals and how…

  8. Empirical Validation and Application of the Computing Attitudes Survey

    Science.gov (United States)

    Dorn, Brian; Elliott Tew, Allison

    2015-01-01

    Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…

  9. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    Science.gov (United States)

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  10. Approaches and Tools Used to Teach the Computer Input/Output Subsystem: A Survey

    Science.gov (United States)

    Larraza-Mendiluze, Edurne; Garay-Vitoria, Nestor

    2015-01-01

    This paper surveys how the computer input/output (I/O) subsystem is taught in introductory undergraduate courses. It is important to study the educational process of the computer I/O subsystem because, in the curricula recommendations, it is considered a core topic in the area of knowledge of computer architecture and organization (CAO). It is…

  11. Selective desulfurization of cysteine in the presence of Cys(Acm) in polypeptides obtained by native chemical ligation.

    Science.gov (United States)

    Pentelute, Brad L; Kent, Stephen B H

    2007-02-15

    Increased versatility for the synthesis of proteins and peptides by native chemical ligation requires the ability to ligate at positions other than Cys. Here, we report that Raney nickel can be used under standard conditions for the selective desulfurization of Cys in the presence of Cys(Acm). This simple and practical tactic enables the more common Xaa-Ala junctions to be used as ligation sites for the chemical synthesis of Cys-containing peptides and proteins. [reaction: see text].

  12. Gender stereotypes, aggression, and computer games: an online survey of women.

    Science.gov (United States)

    Norris, Kamala O

    2004-12-01

    Computer games were conceptualized as a potential mode of entry into computer-related employment for women. Computer games contain increasing levels of realism and violence, as well as biased gender portrayals. It has been suggested that aggressive personality characteristics attract people to aggressive video games, and that more women do not play computer games because they are socialized to be non-aggressive. To explore gender identity and aggressive personality in the context of computers, an online survey was conducted on women who played computer games and women who used the computer but did not play computer games. Women who played computer games perceived their online environments as less friendly but experienced less sexual harassment online, were more aggressive themselves, and did not differ in gender identity, degree of sex role stereotyping, or acceptance of sexual violence when compared to women who used the computer but did not play video games. Finally, computer gaming was associated with decreased participation in computer-related employment; however, women with high masculine gender identities were more likely to use computers at work.

  13. Second-Language Composition Instruction, Computers and First-Language Pedagogy: A Descriptive Survey.

    Science.gov (United States)

    Harvey, T. Edward

    1987-01-01

    A national survey of full-time instructional faculty (N=208) at universities, 2-year colleges, and high schools regarding attitudes toward using computers in second-language composition instruction revealed that Apple and IBM-PC computers predominated, that the lack of foreign character support was a major frustration, and mixed opinions about real…

  14. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  15. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  16. Education in interactive media: a survey on the potentials of computers for visual literacy

    OpenAIRE

    Güleryüz, Hakan

    1996-01-01

    Ankara: Bilkent University, Department of Graphic Design and Institute of Fine Arts, 1996. Thesis (Master's) -- Bilkent University, 1996. Includes bibliographical references (leaves 89-94). This study aims at investigating the potential of multimedia and computers in design. For this purpose, a general survey of the historical development of computers, their use in education, and the possibilities related to the use of technology in education is conducted. Based on this survey, the dep...

  17. Preventive maintenance plan of the air-conditioning duct using the ACM-sensor

    International Nuclear Information System (INIS)

    Fukuba, Kazushi; Ito, Takanobu; Kojima, Akiko; Tanji, Kazuhiro; Sato, Yuki

    2013-01-01

    For air-conditioning ducts it is difficult to predict when corrosion severe enough to affect function will occur, so the current maintenance approach is mostly corrective. We therefore used six types of test pieces together with an ACM sensor in order to derive the corrosion rate from the corrosion environment and its relationship to the corrosion quantity of the test pieces. In addition, various molded duct articles were used to check the degree of corrosion introduced when the duct is processed. As a result, we selected the shell body constituting the duct and the optimal combination with the flange, based on the corrosion rates obtained for the various test pieces. By predicting the duct life from the corrosion rate, preventive disposal can be performed before corrosion severe enough to affect function occurs, leading to stable and safe operation through appropriate maintenance of the equipment. (author)

  18. Reliability of a computer and Internet survey (Computer User Profile) used by adults with and without traumatic brain injury (TBI).

    Science.gov (United States)

    Kilov, Andrea M; Togher, Leanne; Power, Emma

    2015-01-01

    To determine test-retest reliability of the 'Computer User Profile' (CUP) in people with and without TBI. The CUP was administered on two occasions to people with and without TBI. The CUP investigated the nature and frequency of participants' computer and Internet use. Intra-class correlation coefficients and kappa coefficients were computed to measure reliability of individual CUP items. Descriptive statistics were used to summarize the content of responses. Sixteen adults with TBI and 40 adults without TBI were included in the study. All participants were reliable in reporting demographic information, frequency of social communication and leisure activities and computer/Internet habits and usage. Adults with TBI were reliable in 77% of their responses to survey items. Adults without TBI were reliable in 88% of their responses to survey items. The CUP was practical and valuable in capturing information about social, leisure, communication and computer/Internet habits of people with and without TBI. Adults without TBI scored more items with satisfactory reliability overall in their surveys. Future studies may include larger samples and could also include an exploration of how people with/without TBI use other digital communication technologies. This may provide further information on determining technology readiness for people with TBI in therapy programmes.
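
    For readers unfamiliar with the agreement statistics mentioned above, the sketch below computes Cohen's kappa for a categorical item answered on two occasions; the item, categories and responses are hypothetical and are not the CUP data:

      # Minimal Cohen's kappa sketch for test-retest agreement on a categorical item; illustrative only.
      import numpy as np

      def cohens_kappa(ratings_1, ratings_2):
          """Chance-corrected agreement between two administrations of the same item."""
          r1, r2 = np.asarray(ratings_1), np.asarray(ratings_2)
          categories = np.union1d(r1, r2)
          observed = np.mean(r1 == r2)                               # raw agreement
          expected = sum(np.mean(r1 == cat) * np.mean(r2 == cat) for cat in categories)
          return (observed - expected) / (1.0 - expected)

      # Example: a frequency-of-use item answered twice by the same respondents (made-up data).
      time_1 = ["daily", "weekly", "daily", "never", "weekly", "daily"]
      time_2 = ["daily", "weekly", "weekly", "never", "weekly", "daily"]
      print(cohens_kappa(time_1, time_2))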

  19. How to Implement Rigorous Computer Science Education in K-12 Schools? Some Answers and Many Questions

    Science.gov (United States)

    Hubwieser, Peter; Armoni, Michal; Giannakos, Michail N.

    2015-01-01

    Aiming to collect various concepts, approaches, and strategies for improving computer science education in K-12 schools, we edited this second special issue of the "ACM TOCE" journal. Our intention was to collect a set of case studies from different countries that would describe all relevant aspects of specific implementations of…

  20. Design of ET(B) receptor agonists: NMR spectroscopic and conformational studies of ET7-21[Leu7, Aib11, Cys(Acm)15].

    Science.gov (United States)

    Hewage, Chandralal M; Jiang, Lu; Parkinson, John A; Ramage, Robert; Sadler, Ian H

    2002-03-01

    In a previous report we have shown that the endothelin-B receptor-selective linear endothelin peptide, ET-1[Cys (Acm)1,15, Ala3, Leu7, Aib11], folds into an alpha-helical conformation in a methanol-d3/water co-solvent [Hewage et al. (1998) FEBS Lett., 425, 234-238]. To study the requirements for the structure-activity relationships, truncated analogues of this peptide were subjected to further studies. Here we report the solution conformation of ET7-21[Leu7, Aib11, Cys(Acm)15], in a methanol-d3/water co-solvent at pH 3.6, by NMR spectroscopic and molecular modelling studies. Further truncation of this short peptide results in it displaying poor agonist activity. The modelled structure shows that the peptide folds into an alpha-helical conformation between residues Lys9-His16, whereas the C-terminus prefers no fixed conformation. This truncated linear endothelin analogue is pivotal for designing endothelin-B receptor agonists.

  1. A Survey of Service Composition Mechanisms in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Brønsted, Jeppe; Hansen, Klaus Marius; Ingstrup, Mads

    2007-01-01

    Composition of services, i.e., providing new services by combining existing ones, is a pervasive idea in ubiquitous computing. We surveyed the field by looking at what features are actually present in technologies that support service composition in some form. Condensing this into a list of features allowed us to discuss the qualitative merits and drawbacks of various approaches to service composition, focusing in particular on usability, adaptability and efficiency. Moreover, we found that further research is needed into quality-of-service assurance of composites and into contingency management for composites, one of the concerns differentiating service composition in ubiquitous computing from its counterpart in less dynamic settings.

  2. Recent changes in flood damage in the United States from observations and ACME model

    Science.gov (United States)

    Leng, G.; Leung, L. R.

    2017-12-01

    Despite efforts to mitigate flood hazards in flood-prone areas, survey- and report-based flood databases show that flood damage has increased and emerged as one of the most costly disasters in the United States since the 1990s. Understanding the mechanism driving the changes in flood damage is therefore critical for reducing flood risk. In this study, we first conduct a comprehensive analysis of the changing characteristics of flood damage at the local, state and national levels. Results show a significant increasing trend in the number of flood hazards, causing economic losses of up to $7 billion per year. The ratio of flood events that caused tangible economic cost to the total flood events exhibited a non-significant increasing trend before 2007 followed by a significant decrease, indicating a changing vulnerability to floods. Analysis also reveals distinct spatial and temporal patterns in the threshold intensity of flood hazards with tangible economic cost. To understand the mechanism behind the increasing flood damage, we develop a flood damage economic model coupled with the integrated hydrological modeling system of ACME that features a river routing model with an inundation parameterization and a water use and regulation model. The model is evaluated over the country against historical records. Several numerical experiments are then designed to explore the mechanisms behind the recent changes in flood damage from the perspective of flood hazard, exposure and vulnerability, which constitute flood damage. The role of human activities such as reservoir operations and water use in modifying regional floods is also explored using the new tool, with the goal of improving understanding and modeling of vulnerability to flood hazards.

  3. A computer network attack taxonomy and ontology

    CSIR Research Space (South Africa)

    Van Heerden, RP

    2012-01-01

    Full Text Available: ... of the attack that occur after the attack goal has been achieved, and occurs because the attacker loses control of some systems. For example, after the launch of a DDOS (Distributed Denial of Service) attack, zombie computers may still connect to the target ... -scrap-value-of-a-hacked-pc-revisited/. Lancor, L., & Workman, R. (2007). Using Google Hacking to Enhance Defense Strategies. ACM SIGCSE Bulletin, 39(1), 491-495. Lau, F., Rubin, S. H., Smith, M. H., & Trajkovic, L. (2000). Distributed Denial of Service...

  4. Interactive Synthesis of Code Level Security Rules

    Science.gov (United States)

    2017-04-01

    Proceedings of the 9th ACM conference on Computer and communications security, pages 235–244. ACM, 2002. [19] J. Davis. Hacking of government computers ... Inductive programming meets the real world. Communications of the ACM, 58(11):90–99, 2015. [24] S. Hallem, B. Chelf, Y. Xie, and D. Engler. A system and ... Software Engineering, pages 462–473. ACM, 2015. [37] S. H. Muggleton, D. Lin, and A. Tamaddoni-Nezhad. Meta-interpretive learning of higher-order dyadic

  5. Pomegranate MR images analysis using ACM and FCM algorithms

    Science.gov (United States)

    Morad, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.

    2011-10-01

    Segmentation of an image plays an important role in image processing applications. In this paper, segmentation of pomegranate magnetic resonance (MR) images is explored. Pomegranate has valuable nutritional and medicinal properties, and its maturity indices and the quality of its internal tissues play an important role in the sorting process, where reliable determination of these features cannot easily be achieved by a human operator. Seeds and soft tissues are the main internal components of pomegranate. For research purposes, such as non-destructive investigation to determine the ripening index and the percentage of seeds during the growth period, segmentation of the internal structures should be performed as exactly as possible. In this paper, we present an automatic algorithm to segment the internal structure of pomegranate. Since the intensity of the stem and calyx is close to that of the internal tissues, stem and calyx pixels are usually assigned to the internal tissues by the segmentation algorithm. To solve this problem, first, the fruit shape is extracted from its background using an active contour model (ACM). Then the stem and calyx are removed using morphological filters. Finally, the image is segmented by fuzzy c-means (FCM). The experimental results show an accuracy of 95.91% in the presence of the stem and calyx, while the accuracy of segmentation increases to 97.53% when the stem and calyx are first removed by morphological filters.
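
    The final FCM step can be illustrated with a minimal fuzzy c-means implementation on pixel intensities; the cluster count, fuzzifier and synthetic data below are assumptions for illustration, not the authors' settings:

      # Minimal fuzzy c-means (FCM) clustering of pixel intensities; illustrative only.
      import numpy as np

      def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
          """x: 1-D array of intensities. Returns cluster centres and the membership matrix."""
          rng = np.random.default_rng(seed)
          u = rng.random((len(x), n_clusters))
          u /= u.sum(axis=1, keepdims=True)                   # memberships sum to 1 per pixel
          for _ in range(n_iter):
              w = u ** m
              centres = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
              d = np.abs(x[:, None] - centres[None, :]) + 1e-12
              u = 1.0 / (d ** (2.0 / (m - 1.0)))              # standard FCM membership update
              u /= u.sum(axis=1, keepdims=True)
          return centres, u

      # Example: intensities drawn from three tissue-like groups (made-up data).
      rng = np.random.default_rng(1)
      pixels = np.concatenate([rng.normal(mu, 5.0, 500) for mu in (40.0, 120.0, 200.0)])
      centres, memberships = fuzzy_c_means(pixels)
      labels = memberships.argmax(axis=1)                     # hard segmentation if needed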

  6. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    Science.gov (United States)

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) deal with the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property and overcoming some drawbacks of other ACMs, such as becoming trapped in local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs, SOM-based ACMs, and their relationship, and review in a comprehensive fashion the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736
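
    The idea behind SOM-based active contours, namely that contour prototypes (the SOM weights) are attracted towards detected edge points while a neighbourhood function spreads the update along the contour, can be sketched in a few lines; the update rule below is a simplified illustration and omits the many refinements found in the models this survey reviews:

      # Simplified SOM-style contour update; illustrative only, not a complete segmentation model.
      import numpy as np

      def som_contour_step(prototypes, edge_points, lr=0.1, sigma=2.0):
          """prototypes: float (N, 2) contour points; edge_points: (M, 2) detected edge locations."""
          n = len(prototypes)
          for p in edge_points:
              winner = np.argmin(np.linalg.norm(prototypes - p, axis=1))  # best-matching prototype
              idx = np.arange(n)
              ring_dist = np.minimum(np.abs(idx - winner), n - np.abs(idx - winner))  # closed contour
              h = np.exp(-(ring_dist ** 2) / (2.0 * sigma ** 2))          # neighbourhood function
              prototypes += lr * h[:, None] * (p - prototypes)            # pull contour towards edge point
          return prototypes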

  7. Phantom-Calibrated versus Automatic Coronary Artery Mass Quantification with Multidetector-Row Computed Tomography: In Vitro and In Vivo Study

    International Nuclear Information System (INIS)

    Serafin, Z.; Lasek, W.; Laskowska, K.

    2008-01-01

    Background: Coronary artery calcium scoring is used as a method for cardiovascular risk stratification and monitoring of coronary heart disease. Automatic software-based calcium mass calculation has been proposed to improve the performance of the procedure. Purpose: To compare two algorithms of calcium mass measurement, automatic and phantom calibrated, with respect to correlation, measurement error, and accuracy in vitro and in vivo. Material and Methods: A cardiac phantom with calcium cylinder inserts was scanned with sequential non-overlapping collimation 4x2.5 mm, at 120 kV and 165 mAs. Fifty adults (37 men; mean age 46.2 years) were examined with the same settings using prospective electrocardiographic triggering to detect and quantify coronary artery calcifications. Calculations were performed with two methods: software-based automatic calcium mass measurement (ACM) and phantom-calibrated calcium mass measurement (CCM). Results: The total phantom calcium masses measured with ACM and CCM were 175.0±13.8 mg and 163.0±4.4 mg, respectively (P<0.0001), and ACM produced a higher mean error (4.5 vs. 3.2; P<0.05). Results of ACM and CCM were strongly correlated to each other (R=0.73-0.96; P<0.0001). Mean image noise in the patient study was 8.72±1.68 HU. Results of patient calcium scoring with ACM and CCM were significantly different (median 70.3 mg and 59.7 mg, respectively; P<0.0001), with a mean systematic error of 17.5% (limit of agreement between 14.6% and 20.4%). The use of ACM resulted in an altered quartile classification for 14% of patients, as compared to CCM; all of these patients were classified into a higher category. Conclusion: Our data indicate that multidetector-row computed tomography coronary calcium mass determination based on dedicated phantom calibration shows lower measurement error than an automatic software-based calculation method. The tested automatic software does not yet seem to be a reliable option for calcium mass measurement

  8. Phantom-Calibrated versus Automatic Coronary Artery Mass Quantification with Multidetector-Row Computed Tomography: In Vitro and In Vivo Study

    Energy Technology Data Exchange (ETDEWEB)

    Serafin, Z.; Lasek, W.; Laskowska, K. (Dept. of Radiology and Diagnostic Imaging, Nicolaus Copernicus Univ., Collegium Medicum, Bydgoszcz (Poland))

    2008-11-15

    Background: Coronary artery calcium scoring is used as a method for cardiovascular risk stratification and monitoring of coronary heart disease. Automatic software-based calcium mass calculation has been proposed to improve the performance of the procedure. Purpose: To compare two algorithms of calcium mass measurement, automatic and phantom calibrated, with respect to correlation, measurement error, and accuracy in vitro and in vivo. Material and Methods: A cardiac phantom with calcium cylinder inserts was scanned with sequential non-overlapping collimation 4x2.5 mm, at 120 kV and 165 mAs. Fifty adults (37 men; mean age 46.2 years) were examined with the same settings using prospective electrocardiographic triggering to detect and quantify coronary artery calcifications. Calculations were performed with two methods: software-based automatic calcium mass measurement (ACM) and phantom-calibrated calcium mass measurement (CCM). Results: The total phantom calcium masses measured with ACM and CCM were 175.0±13.8 mg and 163.0±4.4 mg, respectively (P<0.0001), and ACM produced a higher mean error (4.5 vs. 3.2; P<0.05). Results of ACM and CCM were strongly correlated to each other (R=0.73-0.96; P<0.0001). Mean image noise in the patient study was 8.72±1.68 HU. Results of patient calcium scoring with ACM and CCM were significantly different (median 70.3 mg and 59.7 mg, respectively; P<0.0001), with a mean systematic error of 17.5% (limit of agreement between 14.6% and 20.4%). The use of ACM resulted in an altered quartile classification for 14% of patients, as compared to CCM; all of these patients were classified into a higher category. Conclusion: Our data indicate that multidetector-row computed tomography coronary calcium mass determination based on dedicated phantom calibration shows lower measurement error than an automatic software-based calculation method. The tested automatic software does not yet seem to be a reliable option for calcium mass measurement

  9. Increasing Update Rates in the Building Walkthrough System with Automatic Model-Space Subdivision and Potentially Visible Set Calculations

    Science.gov (United States)

    1990-07-01

    34 ACM Computing Surveys. 6(1): 1-55. [Syzmanski85] Syzmanski, T. G. and C. J. V. Wyk. (1985). "GOALIE: A Space Efficient System for VLSI Artwork ... this. Essentially we initialize a stack with the root. We then pull an element of this stack and, if it is a cell, we run the occlusion operation on the

  10. CLIC-ACM: generic modular rad-hard data acquisition system based on CERN GBT versatile link

    International Nuclear Information System (INIS)

    Bielawski, B.; Locci, F.; Magnoni, S.

    2015-01-01

    CLIC is a world-wide collaboration to study the next "terascale" lepton collider, relying upon a very innovative concept of two-beam acceleration. This accelerator, currently under study, will be composed of a sequence of 21,000 two-beam modules. Each module requires more than 300 analogue and digital signals which need to be acquired and controlled in a synchronous way. CLIC-ACM (Acquisition and Control Module) is the "generic" control and acquisition module developed to accommodate the controls of all these signals for the various sub-systems and their related specifications in terms of data bandwidth, triggering and timing synchronization. This paper describes the system architecture with respect to its radiation tolerance, power consumption and scalability.

  11. Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results

    Science.gov (United States)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.

  12. Report on the "Secure Vehicular Communications: Results and Challenges Ahead" Workshop

    OpenAIRE

    Papadimitratos, Panos; Hubaux, Jean-Pierre

    2008-01-01

    © ACM, (2008). This is the author’s version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM SIGMOBILE Mobile Computing and Communications Review. http://doi.acm.org/10.114510/1394555.1394567 . QC 20110712

  13. Building confidence and credibility amid growing model and computing complexity

    Science.gov (United States)

    Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.

    2017-12-01

    As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet are still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.

  14. Technology survey of computer software as applicable to the MIUS project

    Science.gov (United States)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  15. The computation of equating errors in international surveys in education.

    Science.gov (United States)

    Monseur, Christian; Berezner, Alla

    2007-01-01

    Since the IEA's Third International Mathematics and Science Study, one of the major objectives of international surveys in education has been to report trends in achievement. The names of the two current IEA surveys reflect this growing interest: Trends in International Mathematics and Science Study (TIMSS) and Progress in International Reading Literacy Study (PIRLS). Similarly a central concern of the OECD's PISA is with trends in outcomes over time. To facilitate trend analyses these studies link their tests using common item equating in conjunction with item response modelling methods. IEA and PISA policies differ in terms of reporting the error associated with trends. In IEA surveys, the standard errors of the trend estimates do not include the uncertainty associated with the linking step while PISA does include a linking error component in the standard errors of trend estimates. In other words, PISA implicitly acknowledges that trend estimates partly depend on the selected common items, while the IEA's surveys do not recognise this source of error. Failing to recognise the linking error leads to an underestimation of the standard errors and thus increases the Type I error rate, thereby resulting in reporting of significant changes in achievement when in fact these are not significant. The growing interest of policy makers in trend indicators and the impact of the evaluation of educational reforms appear to be incompatible with such underestimation. However, the procedure implemented by PISA raises a few issues about the underlying assumptions for the computation of the equating error. After a brief introduction, this paper will describe the procedure PISA implemented to compute the linking error. The underlying assumptions of this procedure will then be discussed. Finally an alternative method based on replication techniques will be presented, based on a simulation study and then applied to the PISA 2000 data.
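
    One simple way to make the linking-error idea concrete is to treat it as the standard error of the mean shift in the difficulty of the common (link) items between two administrations; the sketch below uses this simplification with made-up item difficulties and is not the full PISA procedure or the replication-based alternative discussed in the paper:

      # Simplified common-item linking-error estimate; illustrative only.
      import numpy as np

      def linking_error(delta_old, delta_new):
          """delta_old, delta_new: difficulty estimates of the same link items in two cycles."""
          shifts = np.asarray(delta_new) - np.asarray(delta_old)
          return shifts.std(ddof=1) / np.sqrt(len(shifts))    # standard error of the mean shift

      # Example with hypothetical link-item difficulties (logits).
      old = [-1.2, -0.4, 0.1, 0.6, 1.3, 0.9, -0.7, 0.2]
      new = [-1.0, -0.5, 0.3, 0.5, 1.4, 1.1, -0.6, 0.1]
      print(linking_error(old, new))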

  16. Computer vision in roadway transportation systems: a survey

    Science.gov (United States)

    Loce, Robert P.; Bernal, Edgar A.; Wu, Wencheng; Bala, Raja

    2013-10-01

    There is a worldwide effort to apply 21st century intelligence to evolving our transportation networks. The goals of smart transportation networks are quite noble and manifold, including safety, efficiency, law enforcement, energy conservation, and emission reduction. Computer vision is playing a key role in this transportation evolution. Video imaging scientists are providing intelligent sensing and processing technologies for a wide variety of applications and services. There are many interesting technical challenges including imaging under a variety of environmental and illumination conditions, data overload, recognition and tracking of objects at high speed, distributed network sensing and processing, energy sources, as well as legal concerns. This paper presents a survey of computer vision techniques related to three key problems in the transportation domain: safety, efficiency, and security and law enforcement. A broad review of the literature is complemented by detailed treatment of a few selected algorithms and systems that the authors believe represent the state-of-the-art.

  17. Infrared Testing of the Wide-field Infrared Survey Telescope Grism Using Computer Generated Holograms

    Science.gov (United States)

    Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.

    2017-01-01

    Infrared Computer Generated Holograms (CGHs) were designed, manufactured and used to measure the performance of the grism (grating prism) prototype which includes testing Diffractive Optical Elements (DOE). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.

  18. Enhanced delegated computing using coherence

    Science.gov (United States)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m . The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.

  19. Framework for developing realistic MANET simulations

    CSIR Research Space (South Africa)

    Burke, ID

    2010-04-01

    Full Text Available : Proceedings of the second ACM international workshop on Principles of mobile computing. Toulouse, France, 2002. ACM. Chin, K.-W., Judge, J., Williams, A. & Kermode, R., 2002. Implementation experience with MANET routing protocols. SIGCOMM Comput. Commun... & computing. Long Beach, CA, USA, 2001. ACM. Fall, K. & Varadhan, K., 2003. NS-Manual. [PDF] Available at: http://www.isi.edu/nsnam/ns/doc/ns_doc.pdf . Garrido, P.P., Malumbres, M.P. & Calafate, C.T., 2008. ns-2 vs. OPNET: a comparative study of the IEEE...

  20. US Geological Survey National Computer Technology Meeting; Proceedings, Phoenix, Arizona, November 14-18, 1988

    Science.gov (United States)

    Balthrop, Barbara H.; Terry, J.E.

    1991-01-01

    The U.S. Geological Survey National Computer Technology Meetings (NCTM) are sponsored by the Water Resources Division and provide a forum for the presentation of technical papers and the sharing of ideas or experiences related to computer technology. This report serves as a proceedings of the meeting held in November 1988 at the Crescent Hotel in Phoenix, Arizona. The meeting was attended by more than 200 technical and managerial people representing all Divisions of the U.S. Geological Survey. Scientists in every Division of the U.S. Geological Survey rely heavily upon state-of-the-art computer technology (both hardware and software). Today the goals of each Division are pursued in an environment where high-speed computers, distributed communications, distributed data bases, high-technology input/output devices, and very sophisticated simulation tools are used regularly. Therefore, information transfer and the sharing of advances in technology are very important issues that must be addressed regularly. This report contains complete papers and abstracts of papers that were presented at the 1988 NCTM. The report is divided into topical sections that reflect common areas of interest and application. In each section, papers are presented first, followed by abstracts. For these proceedings, the publication of a complete paper or only an abstract was at the discretion of the author, although complete papers were encouraged. Some papers presented at the 1988 NCTM are not published in these proceedings.

  1. Computer technology forecasting at the National Laboratories

    International Nuclear Information System (INIS)

    Peskin, A.M.

    1980-01-01

    The DOE Office of ADP Management organized a group of scientists and computer professionals, mostly from their own national laboratories, to prepare an annually updated technology forecast to accompany the Department's five-year ADP Plan. The activities of the task force were originally reported in an informal presentation made at the ACM Conference in 1978. This presentation represents an update of that report. It also deals with the process of applying the results obtained at a particular computing center, Brookhaven National Laboratory. Computer technology forecasting is a difficult and hazardous endeavor, but it can reap considerable advantage. The forecast performed on an industry-wide basis can be applied to the particular needs of a given installation, and thus give installation managers considerable guidance in planning. A beneficial side effect of this process is that it forces installation managers, who might otherwise tend to preoccupy themselves with immediate problems, to focus on longer term goals and means to their ends

  2. Wirelessly powered sensor networks and computational RFID

    CERN Document Server

    2013-01-01

    The Wireless Identification and Sensing Platform (WISP) is the first of a new class of RF-powered sensing and computing systems.  Rather than being powered by batteries, these sensor systems are powered by radio waves that are either deliberately broadcast or ambient.  Enabled by ongoing exponential improvements in the energy efficiency of microelectronics, RF-powered sensing and computing is rapidly moving along a trajectory from impossible (in the recent past), to feasible (today), toward practical and commonplace (in the near future). This book is a collection of key papers on RF-powered sensing and computing systems including the WISP.  Several of the papers grew out of the WISP Challenge, a program in which Intel Corporation donated WISPs to academic applicants who proposed compelling WISP-based projects.  The book also includes papers presented at the first WISP Summit, a workshop held in Berkeley, CA in association with the ACM Sensys conference, as well as other relevant papers. The book provides ...

  3. A review of small canned computer programs for survey research and demographic analysis.

    Science.gov (United States)

    Sinquefield, J C

    1976-12-01

    A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, CENTS-AIE II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. A description and evaluation for each program of uses, instruction manuals, computer requirements, and procedures for obtaining manuals and programs are provided. Such information is intended to facilitate and encourage the use of the computer by data processors in developing countries.

  4. Representing continuous t-norms in quantum computation with mixed states

    International Nuclear Information System (INIS)

    Freytes, H; Sergioli, G; Arico, A

    2010-01-01

    A model of quantum computation is discussed in (Aharanov et al 1997 Proc. 13th Annual ACM Symp. on Theory of Computation, STOC pp 20-30) and (Tarasov 2002 J. Phys. A: Math. Gen. 35 5207-35) in which quantum gates are represented by quantum operations acting on mixed states. It allows one to use a quantum computational model in which connectives of a four-valued logic can be realized as quantum gates. In this model, we give a representation of certain functions, known as t-norms (Menger 1942 Proc. Natl Acad. Sci. USA 37 57-60), that generalize the triangle inequality for the probability distribution-valued metrics. As a consequence an interpretation of the standard operations associated with the basic fuzzy logic (Hajek 1998 Metamathematics of Fuzzy Logic (Trends in Logic vol 4) (Dordrecht: Kluwer)) is provided in the frame of quantum computation.
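
    For readers unfamiliar with t-norms, the short sketch below is purely illustrative (it does not reproduce the paper's quantum-gate representation): it defines three standard continuous t-norms (minimum, product and Łukasiewicz) and spot-checks the defining properties (commutativity, monotonicity, and 1 as unit element) on a sample grid.

      # Illustrative definitions of three standard continuous t-norms on [0, 1].
      def t_min(x, y):      # Goedel / minimum t-norm
          return min(x, y)

      def t_prod(x, y):     # product t-norm
          return x * y

      def t_luk(x, y):      # Lukasiewicz t-norm
          return max(0.0, x + y - 1.0)

      def check_tnorm(t, samples):
          """Spot-check commutativity, unit element 1, and monotonicity on a grid."""
          for x in samples:
              assert abs(t(x, 1.0) - x) < 1e-12              # 1 is the unit element
              for y in samples:
                  assert abs(t(x, y) - t(y, x)) < 1e-12      # commutativity
                  for z in samples:
                      if y <= z:
                          assert t(x, y) <= t(x, z) + 1e-12  # monotone in each argument

      grid = [i / 10 for i in range(11)]
      for t in (t_min, t_prod, t_luk):
          check_tnorm(t, grid)
      print("all spot checks passed")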

  5. The survey of American college students computer technology preferences & purchasing plans

    CERN Document Server

    2009-01-01

    This report presents data from a survey of more than 400 American college students. The report presents data on student computer ownership of both PCs and laptops, purchasing plans for PCs and laptops, as well as purchasing plans for cell phones and digital cameras. The report also provides details on how students finance their computer purchases, how much money comes from parents or guardians, and how much from the students themselves or from other parties. In addition to data on PCs, the report provides detailed information on the use of popular word processing packages such as Word, WordPerfect and Open Office.

  6. Arecibo PALFA survey and Einstein@Home: binary pulsar discovery by volunteer computing

    NARCIS (Netherlands)

    Knispel, B.; Lazarus, P.; Allen, B.; Anderson, D.; Aulbert, C.; Bhat, N.D.R.; Bock, O.; Bogdanov, S.; Brazier, A.; Camilo, F.; Chatterjee, S.; Cordes, J.M.; Crawford, F.; Deneva, J.S.; Desvignes, G.; Fehrmann, H.; Freire, P.C.C.; Hammer, D.; Hessels, J.W.T.; Jenet, F.A.; Kaspi, V.M.; Kramer, M.; van Leeuwen, J.; Lorimer, D.R.; Lyne, A.G.; Machenschalk, B.; McLaughlin, M.A.; Messenger, C.; Nice, D.J.; Papa, M.A.; Pletsch, H.J.; Prix, R.; Ransom, S.M.; Siemens, X.; Stairs, I.H.; Stappers, B.W.; Stovall, K.; Venkataraman, A.

    2011-01-01

    We report the discovery of the 20.7 ms binary pulsar J1952+2630, made using the distributed computing project Einstein@Home in Pulsar ALFA survey observations with the Arecibo telescope. Follow-up observations with the Arecibo telescope confirm the binary nature of the system. We obtain a circular

  7. A Survey of Software Infrastructures and Frameworks for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Christoph Endres

    2005-01-01

    Full Text Available In this survey, we discuss 29 software infrastructures and frameworks which support the construction of distributed interactive systems. They range from small projects with one implemented prototype to large scale research efforts, and they come from the fields of Augmented Reality (AR), Intelligent Environments, and Distributed Mobile Systems. In their own way, they can all be used to implement various aspects of the ubiquitous computing vision as described by Mark Weiser [60]. This survey is meant as a starting point for new projects, in order to choose an existing infrastructure for reuse, or to get an overview before designing a new one. It tries to provide a systematic, relatively broad (and necessarily not very deep) overview, while pointing to relevant literature for in-depth study of the systems discussed.

  8. Survey on Security Issues in Cloud Computing and Associated Mitigation Techniques

    Science.gov (United States)

    Bhadauria, Rohit; Sanyal, Sugata

    2012-06-01

    Cloud computing holds the potential to eliminate the requirement to set up high-cost computing infrastructure for the IT-based solutions and services that industry uses. It promises to provide a flexible IT architecture, accessible through the internet from lightweight portable devices. This would allow a multi-fold increase in the capacity and capabilities of existing and new software. In a cloud computing environment, the entire data set resides on networked resources, enabling the data to be accessed through virtual machines. Since these data centers may lie in any corner of the world, beyond the reach and control of users, there are multifarious security and privacy challenges that need to be understood and addressed. Also, one can never deny the possibility of a server breakdown, which has been witnessed rather often in recent times. There are various issues that need to be dealt with regarding security and privacy in a cloud computing scenario. This extensive survey paper aims to elaborate and analyze the numerous unresolved issues threatening cloud computing adoption and diffusion, affecting the various stakeholders linked to it.

  9. Survey of computed tomography doses in head and chest protocols

    International Nuclear Information System (INIS)

    Souza, Giordana Salvi de; Silva, Ana Maria Marques da

    2016-01-01

    Computed tomography is a clinical tool for the diagnosis of patients. However, the patient is subjected to a complex dose distribution. The aim of this study was to survey dose indicators for head and chest CT protocols, in terms of Dose-Length Product (DLP) and effective dose, for adult and pediatric patients, comparing them with diagnostic reference levels in the literature. Patients were divided into age groups and the following image acquisition parameters were collected: age, kV, mAs, Volumetric Computed Tomography Dose Index (CTDIvol) and DLP. The effective dose was found by multiplying DLP by correction factors. The results were obtained from the third quartile and showed the importance of determining kV and mAs values for each patient depending on the studied region, age and thickness. (author)
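
    The arithmetic behind such dose surveys is straightforward. The sketch below is a hedged illustration only: the DLP values are invented, and the region-specific conversion coefficients are commonly cited adult values used here as placeholders rather than the factors actually applied in the study. It converts DLP to effective dose and reports the third quartile used for comparison with reference levels.

      # Illustrative only: effective dose from DLP and third-quartile reporting.
      # DLP values (mGy*cm) are invented; the k-coefficients (mSv per mGy*cm) are
      # commonly cited adult values and serve only as placeholders.
      import statistics

      K_FACTOR = {"head": 0.0021, "chest": 0.014}

      surveyed_dlp = {
          "head":  [850.0, 920.0, 790.0, 1010.0, 880.0, 940.0],
          "chest": [380.0, 410.0, 350.0, 460.0, 400.0, 430.0],
      }

      for region, dlp_values in surveyed_dlp.items():
          effective_doses = [dlp * K_FACTOR[region] for dlp in dlp_values]   # mSv
          q3_dlp = statistics.quantiles(dlp_values, n=4)[2]                  # third quartile
          q3_dose = statistics.quantiles(effective_doses, n=4)[2]
          print(f"{region:5s}  DLP Q3 = {q3_dlp:7.1f} mGy*cm   E Q3 = {q3_dose:5.2f} mSv")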

  10. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  11. Methods for computing water-quality loads at sites in the U.S. Geological Survey National Water Quality Network

    Science.gov (United States)

    Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.

    2017-10-24

    The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.
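
    As a rough illustration of regression-based load estimation of the kind referenced above, the sketch below fits ln(concentration) against time, ln(discharge) and seasonal harmonics and converts the prediction to a daily load. It is a minimal sketch only: WRTDS itself uses weighted, locally fitted versions of a similar model, and the data, coefficients and site are invented for the example.

      # Minimal sketch of a load-estimation regression (not the full WRTDS method).
      # Model: ln(C) = b0 + b1*t + b2*ln(Q) + b3*sin(2*pi*t) + b4*cos(2*pi*t)
      import math
      import numpy as np

      # Invented sample data: decimal time (years), discharge Q (m^3/s), concentration C (mg/L).
      t = np.array([2015.1, 2015.4, 2015.7, 2016.0, 2016.3, 2016.6, 2016.9, 2017.2])
      Q = np.array([42.0, 15.0, 8.0, 55.0, 38.0, 12.0, 9.5, 60.0])
      C = np.array([3.1, 1.2, 0.8, 4.0, 2.9, 1.1, 0.9, 4.4])

      X = np.column_stack([np.ones_like(t), t, np.log(Q),
                           np.sin(2 * math.pi * t), np.cos(2 * math.pi * t)])
      beta, *_ = np.linalg.lstsq(X, np.log(C), rcond=None)

      def daily_load_kg(time, q):
          """Predicted load for one day in kg/day (1 mg/L * m^3/s = 86.4 kg/day)."""
          x = np.array([1.0, time, math.log(q),
                        math.sin(2 * math.pi * time), math.cos(2 * math.pi * time)])
          c_pred = math.exp(x @ beta)      # mg/L; retransformation bias ignored in this sketch
          return c_pred * q * 86.4

      print(f"estimated load at t=2016.5, Q=20 m^3/s: {daily_load_kg(2016.5, 20.0):.1f} kg/day")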

  12. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    Full Text Available We present a survey of fractional differential equations and in particular of the computational cost of their numerical solutions from the view of computer science. The computational complexities of time fractional, space fractional, and space-time fractional equations are O(N²M), O(NM²), and O(NM(M + N)), compared with O(MN) for the classical partial differential equations with finite difference methods, where M, N are the number of space grid points and time steps. The potential solutions for this challenge include, but are not limited to, parallel computing, memory access optimization (fractional precomputing operator), the short memory principle, fast Fourier transform (FFT) based solutions, the alternating direction implicit method, the multigrid method, and preconditioner technology. The relationships of these solutions for both the space fractional derivative and the time fractional derivative are discussed. The authors point out that the technologies of parallel computing should be regarded as a basic method to overcome this challenge, and that some attention should be paid to fractional killer applications, high performance iteration methods, high order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from the areas of mathematics and computer science have an opportunity to invent cornerstones in the area of fractional calculus.
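
    To make the cost argument concrete, the sketch below shows an illustrative Grünwald-Letnikov discretization of a time-fractional derivative (it is not code from the survey, and the parameters are invented). Because every new time step revisits the whole solution history, the work per spatial point grows quadratically with the number of time steps, which is where the extra factor in the complexities above comes from; the short memory principle truncates that history to trade accuracy for speed.

      # Illustrative Gruenwald-Letnikov weights and history sum for a time-fractional
      # derivative of order alpha in (0, 1); parameters and data are invented.
      import numpy as np

      alpha = 0.6
      dt = 0.01
      N = 500                                   # number of time steps

      # Recursive GL weights: w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k)
      w = np.empty(N + 1)
      w[0] = 1.0
      for k in range(1, N + 1):
          w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)

      u = np.sin(np.linspace(0.0, 5.0, N + 1))  # stand-in solution history at one grid point

      def gl_derivative(n, memory=None):
          """GL approximation of the alpha-order derivative at time step n.
          memory=None uses the full history (O(n) work per step, O(N^2) overall);
          a finite memory applies the short memory principle."""
          kmax = n if memory is None else min(n, memory)
          acc = sum(w[k] * u[n - k] for k in range(kmax + 1))
          return acc / dt ** alpha

      full = gl_derivative(N)
      short = gl_derivative(N, memory=100)
      print(f"full history : {full:.6f}")
      print(f"short memory : {short:.6f}  (relative error {abs(short - full) / abs(full):.2e})")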

  13. The Asilomar Survey: Stakeholders' Opinions on Ethical Issues Related to Brain-Computer Interfacing

    NARCIS (Netherlands)

    Nijboer, Femke; Clausen, Jens; Allison, Brendan Z.; Haselager, Pim

    2013-01-01

    Brain-Computer Interface (BCI) research and (future) applications raise important ethical issues that need to be addressed to promote societal acceptance and adequate policies. Here we report on a survey we conducted among 145 BCI researchers at the 4th International BCI conference, which took place

  14. The Asilomar Survey: Stakeholders’ Opinions on Ethical Issues Related to Brain-Computer Interfacing

    NARCIS (Netherlands)

    Nijboer, F.; Clausen, J.; Allison, B.Z.; Haselager, W.F.G.

    2013-01-01

    Brain-Computer Interface (BCI) research and (future) applications raise important ethical issues that need to be addressed to promote societal acceptance and adequate policies. Here we report on a survey we conducted among 145 BCI researchers at the 4th International BCI conference, which took place

  15. Experimental determination of the partitioning coefficient and volatility of important BVOC oxidation products using the Aerosol Collection Module (ACM) coupled to a PTR-ToF-MS

    Science.gov (United States)

    Gkatzelis, G.; Hohaus, T.; Tillmann, R.; Schmitt, S. H.; Yu, Z.; Schlag, P.; Wegener, R.; Kaminski, M.; Kiendler-Scharr, A.

    2015-12-01

    Atmospheric aerosol can alter the Earth's radiative budget and global climate but can also affect human health. A dominant contributor to the submicrometer particulate matter (PM) is organic aerosol (OA). OA can be either directly emitted through e.g. combustion processes (primary OA) or formed through the oxidation of organic gases (secondary organic aerosol, SOA). A detailed understanding of SOA formation is of importance as it constitutes a major contribution to the total OA. The partitioning between the gas and particle phase, as well as the volatility of individual components of SOA, is as yet poorly understood, adding uncertainty and complicating climate modelling. In this work, a new experimental methodology was used for compound-specific analysis of organic aerosol. The Aerosol Collection Module (ACM) is a newly developed instrument that deploys an aerodynamic lens to separate the gas and particle phase of an aerosol. The particle phase is directed to a cooled sampling surface. After collection, particles are thermally desorbed and transferred to a detector for further analysis. In the present work, the ACM was coupled to a Proton Transfer Reaction-Time of Flight-Mass Spectrometer (PTR-ToF-MS) to detect and quantify organic compounds partitioning between the gas and particle phase. This experimental approach was used in a set of experiments at the atmosphere simulation chamber SAPHIR to investigate SOA formation. Ozone oxidation with subsequent photochemical aging of β-pinene, limonene and real plant emissions from Pinus sylvestris (Scots pine) was studied. Simultaneous measurement of the gas and particle phase using the ACM-PTR-ToF-MS allows partitioning coefficients of important BVOC oxidation products to be reported. Additionally, volatility trends and changes of the SOA with photochemical aging are investigated and compared for all systems studied.

  16. Fabrication of 93.7 m long PLD-EuBCO + BaHfO_3 coated conductors with 103 A/cm W at 77 K under 3 T

    International Nuclear Information System (INIS)

    Yoshida, T.; Ibi, A.; Takahashi, T.; Yoshizumi, M.; Izumi, T.; Shiohara, Y.

    2015-01-01

    Highlights: • A 93.7 m long EuBCO + BHO CC with 103 A/cm W at 77 K under 3 T was obtained. • The 93.7 m long CC showed high I_c values and high n-values with high uniformity. • The average I_c value at 77 K under 3 T was estimated by that at 77 K under 0.3 T. - Abstract: Introduction of artificial pinning centers such as BaHfO_3 (BHO), BaZrO_3 (BZO) and BaSnO_3 (BSO) into REBa_2Cu_3O_7−δ (REBCO) coated conductor (CC) layers could improve the in-field critical currents (I_c) in wide ranges of temperatures and magnetic fields. In particular, a combination of EuBCO + BHO has been found to be effective for attaining high in-field I_c performance by means of the IBAD/PLD process in short length samples. In this work, we have successfully fabricated a 93.7 m long EuBCO + BHO CC with 103 A/cm W at 77 K under a magnetic field (B) of 3 T applied perpendicular to the CC (B//c). The 93.7 m long EuBCO + BHO CC had high uniformity of I_c values and n-values without any trend of fluctuations, independent of the external field up to 0.3 T. I_c–B–applied angle (θ) profiles of the 93.7 m long EuBCO + BHO CC sample showed high in-field I_c values in all directions of applied magnetic fields, especially B//c (at θ ∼ 180°, I_c = 157 A/cm W), at 77 K under 3 T. The profiles were about the same as those in a short length sample.

  17. Evidence for heterogeneity of astrocyte de-differentiation in vitro: astrocytes transform into intermediate precursor cells following induction of ACM from scratch-insulted astrocytes.

    Science.gov (United States)

    Yang, Hao; Qian, Xin-Hong; Cong, Rui; Li, Jing-wen; Yao, Qin; Jiao, Xi-Ying; Ju, Gong; You, Si-Wei

    2010-04-01

    Our previous study definitely demonstrated that the mature astrocytes could undergo a de-differentiation process and further transform into pluripotential neural stem cells (NSCs), which might well arise from the effect of diffusible factors released from scratch-insulted astrocytes. However, these neurospheres passaged from one neurosphere-derived from de-differentiated astrocytes possessed a completely distinct characteristic in the differentiation behavior, namely heterogeneity of differentiation. The heterogeneity in cell differentiation has become a crucial but elusive issue. In this study, we show that purified astrocytes could de-differentiate into intermediate precursor cells (IPCs) with addition of scratch-insulted astrocyte-conditioned medium (ACM) to the culture, which can express NG2 and A2B5, the IPCs markers. Apart from the number of NG2(+) and A2B5(+) cells, the percentage of proliferative cells as labeled with BrdU progressively increased with prolonged culture period ranging from 1 to 10 days. Meanwhile, the protein level of A2B5 in cells also increased significantly. These results revealed that not all astrocytes could de-differentiate fully into NSCs directly when induced by ACM, rather they generated intermediate or more restricted precursor cells that might undergo progressive de-differentiation to generate NSCs.

  18. Automatic 2D segmentation of airways in thorax computed tomography images; Segmentacao automatica 2D de vias aereas em imagens de tomografia computadorizada do torax

    Energy Technology Data Exchange (ETDEWEB)

    Cavalcante, Tarique da Silveira; Cortez, Paulo Cesar; Almeida, Thomaz Maia de, E-mail: tarique@lesc.ufc.br [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Dept. de Engenharia de Teleinformatica; Felix, John Hebert da Silva [Universidade da Integracao Internacional da Lusofonia Afro-Brasileira (UNILAB), Redencao, CE (Brazil). Departamento de Energias; Holanda, Marcelo Alcantara [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Fac. de Medicina

    2013-07-01

    Introduction: much of the world population is affected by pulmonary diseases, such as bronchial asthma, bronchitis and bronchiectasis. Diagnosis of bronchial disease is based on the state of the airways. In this sense, the automatic segmentation of the airways in Computed Tomography (CT) scans is a critical step in the aid to diagnosis of these diseases. Methods: this paper evaluates algorithms for automatic airway segmentation, using a Multilayer Perceptron (MLP) neural network and Lung Densities Analysis (LDA) for detecting airways, along with Region Growing (RG), the Active Contour Method (ACM) Balloon and ACM Topology Adaptive to segment them. Results: we obtained results in three stages: comparative analysis of the detection algorithms MLP and LDA against a gold standard acquired from three physicians with expertise in CT imaging of the chest; comparative analysis of the segmentation algorithms ACM Balloon, ACM Topology Adaptive, MLP and RG; and evaluation of possible combinations between segmentation and detection algorithms, resulting in a complete method for automatic 2D segmentation of the airways. Conclusion: the low incidence of false negatives and the significant reduction of false positives result in a similarity coefficient and sensitivity exceeding 91% and 87%, respectively, for a combination of algorithms with satisfactory segmentation quality. (author)
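
    As a toy illustration of the region-growing step mentioned above, the sketch below grows a region from a seed placed inside an air-filled structure in a synthetic slice, accepting 4-connected neighbours whose intensity stays below an airway-like threshold. The image, the seed and the Hounsfield-unit threshold are invented for the example; this is not the authors' implementation.

      # Minimal 2D region growing on a synthetic "CT slice"; the threshold is an
      # illustrative stand-in for airway-like Hounsfield units, not the paper's value.
      from collections import deque
      import numpy as np

      # Synthetic slice: soft tissue around 40 HU with an air-filled (-1000 HU) branch.
      slice_hu = np.full((64, 64), 40.0)
      slice_hu[20:44, 30:34] = -1000.0

      def region_grow(image, seed, threshold=-500.0):
          """Return a boolean mask of pixels 4-connected to seed with HU below threshold."""
          mask = np.zeros(image.shape, dtype=bool)
          queue = deque([seed])
          while queue:
              r, c = queue.popleft()
              if mask[r, c] or image[r, c] >= threshold:
                  continue
              mask[r, c] = True
              for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < image.shape[0] and 0 <= nc < image.shape[1] and not mask[nr, nc]:
                      queue.append((nr, nc))
          return mask

      airway_mask = region_grow(slice_hu, seed=(30, 31))
      print("segmented airway pixels:", int(airway_mask.sum()))   # 24 rows x 4 columns = 96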

  19. Using the Superpopulation Model for Imputations and Variance Computation in Survey Sampling

    Directory of Open Access Journals (Sweden)

    Petr Novák

    2012-03-01

    Full Text Available This study is aimed at variance computation techniques for estimates of population characteristics based on survey sampling and imputation. We use the superpopulation regression model, which means that the target variable values for each statistical unit are treated as random realizations of a linear regression model with weighted variance. We focus on regression models with one auxiliary variable and no intercept, which have many applications and straightforward interpretation in business statistics. Furthermore, we deal with cases where the estimates are not independent and thus the covariance must be computed. We also consider chained regression models with auxiliary variables as random variables instead of constants.
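
    A minimal numerical sketch of this model class may be useful (the data are invented and the estimator shown is a textbook ratio-type case, not necessarily the exact procedure of the paper): with one auxiliary variable, no intercept and variance proportional to the auxiliary, the weighted least-squares slope reduces to a ratio of totals, which is then used to impute missing target values and to form a simple model-based variance for the estimated total.

      # Illustrative superpopulation-model imputation: y_i = beta * x_i + e_i with
      # Var(e_i) = sigma^2 * x_i.  Data are invented; this is a sketch, not the
      # authors' estimator.
      import numpy as np

      x = np.array([12.0, 30.0, 7.0, 55.0, 21.0, 40.0, 18.0, 65.0])         # auxiliary, known for all units
      y = np.array([25.0, 61.0, np.nan, 118.0, 44.0, np.nan, 37.0, 131.0])  # target, some values missing

      resp = ~np.isnan(y)
      # Weighted least squares with weights 1/x_i and no intercept collapses to a ratio of totals.
      beta_hat = y[resp].sum() / x[resp].sum()

      y_imp = np.where(resp, y, beta_hat * x)       # model-based imputation of missing values
      total_hat = y_imp.sum()

      # Residual variance under the model, estimated from respondents.
      resid = y[resp] - beta_hat * x[resp]
      sigma2_hat = np.sum(resid ** 2 / x[resp]) / (resp.sum() - 1)

      # Simple model-based variance of the total: uncertainty in beta_hat propagated to the
      # imputed units plus their own residual variance (finite-population details omitted).
      x_miss_sum = x[~resp].sum()
      var_total = (sigma2_hat / x[resp].sum()) * x_miss_sum ** 2 + sigma2_hat * x_miss_sum

      print(f"beta_hat = {beta_hat:.3f}, estimated total = {total_hat:.1f}, "
            f"SE(total) approx {np.sqrt(var_total):.1f}")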

  20. Survey of patient dose in computed tomography in Syria 2009.

    Science.gov (United States)

    Kharita, M H; Khazzam, S

    2010-09-01

    The radiation doses to patients in computed tomography (CT) in Syria have been investigated and compared with similar studies in different countries. This work surveyed 30 CT scanners from six different manufacturers distributed all over Syria. Some of the results in this paper were part of a project launched by the International Atomic Energy Agency in different regions of the world covering Asia, Africa and Eastern Europe. The dose quantities covered are CT dose index (CTDI(w)), dose-length product (DLP), effective dose (E) and collective dose. It was found that most CTDI(w) and DLP values were similar to the European reference levels and in line with the results of similar surveys in the world. The results were in good agreement with the UNSCEAR Report 2007. This study concluded with a recommendation for national diagnostic reference levels for the most common CT protocols in Syria. The results can be used as a baseline for future optimisation studies in the country.

  1. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included

  2. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded through every aspect of Information technology in past decade. It has become easier to process plethora of data, generated by various devices in real time, with the advent of cloud networks. The privacy of users data is maintained by data centers around the world and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer either internally within cloud or externally from one cloud network to another. File management is central to cloud computing and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and analyze the ramifications of using each protocol.

  3. Improving simulated spatial distribution of productivity and biomass in Amazon forests using the ACME land model

    Science.gov (United States)

    Yang, X.; Thornton, P. E.; Ricciuto, D. M.; Shi, X.; Xu, M.; Hoffman, F. M.; Norby, R. J.

    2017-12-01

    Tropical forests play a crucial role in the global carbon cycle, accounting for one third of the global NPP and containing about 25% of global vegetation biomass and soil carbon. This is particularly true for tropical forests in the Amazon region, as it comprises approximately 50% of the world's tropical forests. It is therefore important for us to understand and represent the processes that determine the fluxes and storage of carbon in these forests. In this study, we show that the implementation of phosphorus (P) cycle and P limitation in the ACME Land Model (ALM) improves simulated spatial pattern of NPP. The P-enabled ALM is able to capture the west-to-east gradient of productivity, consistent with field observations. We also show that by improving the representation of mortality processes, ALM is able to reproduce the observed spatial pattern of above ground biomass across the Amazon region.

  4. Can Pearlite form Outside of the Hultgren Extrapolation of the Ae3 and Acm Phase Boundaries?

    Science.gov (United States)

    Aranda, M. M.; Rementeria, R.; Capdevila, C.; Hackenberg, R. E.

    2016-02-01

    It is usually assumed that ferrous pearlite can form only when the average austenite carbon concentration C 0 lies between the extrapolated Ae3 ( γ/ α) and Acm ( γ/ θ) phase boundaries (the "Hultgren extrapolation"). This "mutual supersaturation" criterion for cooperative lamellar nucleation and growth is critically examined from a historical perspective and in light of recent experiments on coarse-grained hypoeutectoid steels which show pearlite formation outside the Hultgren extrapolation. This criterion, at least as interpreted in terms of the average austenite composition, is shown to be unnecessarily restrictive. The carbon fluxes evaluated from Brandt's solution are sufficient to allow pearlite growth both inside and outside the Hultgren Extrapolation. As for the feasibility of the nucleation events leading to pearlite, the only criterion is that there are some local regions of austenite inside the Hultgren Extrapolation, even if the average austenite composition is outside.

  5. Use of Computer Imaging in Rhinoplasty: A Survey of the Practices of Facial Plastic Surgeons.

    Science.gov (United States)

    Singh, Prabhjyot; Pearlman, Steven

    2017-08-01

    The objective of this study was to quantify the use of computer imaging by facial plastic surgeons. AAFPRS Facial plastic surgeons were surveyed about their use of computer imaging during rhinoplasty consultations. The survey collected information about surgeon demographics, practice settings, practice patterns, and rates of computer imaging (CI) for primary and revision rhinoplasty. For those surgeons who used CI, additional information was also collected, which included who performed the imaging and whether the patient was given the morphed images after the consultation. A total of 238 out of 1200 (19.8%) facial plastic surgeons responded to the survey. Out of those who responded, 195 surgeons (83%) were board certified by the American Board of Facial Plastic and Reconstructive Surgeons (ABFPRS). The majority of respondents (150 surgeons, 63%) used CI during rhinoplasty consultation. Of the surgeons who use CI, 92% performed the image morphing themselves. Approximately two-thirds of surgeons who use CI gave their patient a printout of the morphed images after the consultation. Computer imaging (CI) is a frequently utilized tool for facial plastic surgeons during cosmetic consultations with patients. Based on these results of this study, it can be suggested that the majority of facial plastic surgeons who use CI do so for both primary and revision rhinoplasty. As more sophisticated systems become available, it is possible that utilization of CI modalities will increase. This provides the surgeon with further tools to use at his or her disposal during discussion of aesthetic surgery. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .

  6. Cloud Computing Security: A Survey

    OpenAIRE

    Khalil, Issa; Khreishah, Abdallah; Azeem, Muhammad

    2014-01-01

    Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing a...

  7. Enterococcus faecium biofilm formation: identification of major autolysin AtlAEfm, associated Acm surface localization, and AtlAEfm-independent extracellular DNA Release.

    Science.gov (United States)

    Paganelli, Fernanda L; Willems, Rob J L; Jansen, Pamela; Hendrickx, Antoni; Zhang, Xinglin; Bonten, Marc J M; Leavis, Helen L

    2013-04-16

    Enterococcus faecium is an important multidrug-resistant nosocomial pathogen causing biofilm-mediated infections in patients with medical devices. Insight into E. faecium biofilm pathogenesis is pivotal for the development of new strategies to prevent and treat these infections. In several bacteria, a major autolysin is essential for extracellular DNA (eDNA) release in the biofilm matrix, contributing to biofilm attachment and stability. In this study, we identified and functionally characterized the major autolysin of E. faecium E1162 by a bioinformatic genome screen followed by insertional gene disruption of six putative autolysin genes. Insertional inactivation of locus tag EfmE1162_2692 resulted in resistance to lysis, reduced eDNA release, deficient cell attachment, decreased biofilm, decreased cell wall hydrolysis, and significant chaining compared to that of the wild type. Therefore, locus tag EfmE1162_2692 was considered the major autolysin in E. faecium and renamed atlAEfm. In addition, AtlAEfm was implicated in cell surface exposure of Acm, a virulence factor in E. faecium, and thereby facilitates binding to collagen types I and IV. This is a novel feature of enterococcal autolysins not described previously. Furthermore, we identified (and localized) autolysin-independent DNA release in E. faecium that contributes to cell-cell interactions in the atlAEfm mutant and is important for cell separation. In conclusion, AtlAEfm is the major autolysin in E. faecium and contributes to biofilm stability and Acm localization, making AtlAEfm a promising target for treatment of E. faecium biofilm-mediated infections. IMPORTANCE Nosocomial infections caused by Enterococcus faecium have rapidly increased, and treatment options have become more limited. This is due not only to increasing resistance to antibiotics but also to biofilm-associated infections. DNA is released in biofilm matrix via cell lysis, caused by autolysin, and acts as a matrix stabilizer. In this study

  8. CHI '13 Extended Abstracts on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    ... a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...

  9. Optimizing Human Input in Social Network Analysis

    Science.gov (United States)

    2018-01-23

    [7] T. L. Lai and H. Robbins, “Asymptotically efficient adaptive allocation rules,” Advances in Applied Mathematics, vol. 6, no. 1, pp. 4–22, 1985. [8] W. Whitt, “Heavy traffic limit theorems for queues: a survey,” in Mathematical Methods in Queueing Theory. Springer, 1974, pp. 307–350. [9] H... “Regret lower bounds and optimal algorithms,” in Proceedings of the 2015 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer...

  10. Security and health protection while working with a computer. Survey into the knowledge of users about legal and other requirements.

    OpenAIRE

    Šmejkalová, Petra

    2005-01-01

    This bachelor thesis is aimed at the knowledge of general computer users with regards to work security and health protection. It summarizes the relevant legislation and recommendations of ergonomic specialists. The practical part analyses results of a survey, which examined the computer workplaces and user habits when working with a computer.

  11. Additive Construction with Mobile Emplacement (ACME) / Automated Construction of Expeditionary Structures (ACES) Materials Delivery System (MDS)

    Science.gov (United States)

    Mueller, R. P.; Townsend, I. I.; Tamasy, G. J.; Evers, C. J.; Sibille, L. J.; Edmunson, J. E.; Fiske, M. R.; Fikes, J. C.; Case, M.

    2018-01-01

    The purpose of the Automated Construction of Expeditionary Structures, Phase 3 (ACES 3) project is to incorporate the Liquid Goods Delivery System (LGDS) into the Dry Goods Delivery System (DGDS) structure to create an integrated and automated Materials Delivery System (MDS) for 3D printing structures with ordinary Portland cement (OPC) concrete. ACES 3 is a prototype for 3-D printing barracks for soldiers in forward bases, here on Earth. The LGDS supports ACES 3 by storing liquid materials, mixing recipe batches of liquid materials, and working with the Dry Goods Feed System (DGFS) previously developed for ACES 2, combining the materials that are eventually extruded out of the print nozzle. Automated Construction of Expeditionary Structures, Phase 3 (ACES 3) is a project led by the US Army Corps of Engineers (USACE) and supported by NASA. The equivalent 3D printing system for construction in space is designated Additive Construction with Mobile Emplacement (ACME) by NASA.

  12. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  13. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    A presentation is made of the content of program software for the automated processing of gravity and magnetic survey data for the small Mir-1 and Mir-2 computers, as worked out by the Voronezh geophysical expedition.

  14. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    ... a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...

  15. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  16. Computer Augmented Learning; A Survey.

    Science.gov (United States)

    Kindred, J.

    The report contains a description and summary of computer augmented learning devices and systems. The devices are of two general types programed instruction systems based on the teaching machines pioneered by Pressey and developed by Skinner, and the so-called "docile" systems that permit greater user-direction with the computer under student…

  17. The Asilomar Survey: Stakeholders’ Opinions on Ethical Issues Related to Brain-Computer Interfacing

    OpenAIRE

    Nijboer, Femke; Clausen, Jens; Allison, Brendan Z.; Haselager, Pim

    2011-01-01

    Brain-Computer Interface (BCI) research and (future) applications raise important ethical issues that need to be addressed to promote societal acceptance and adequate policies. Here we report on a survey we conducted among 145 BCI researchers at the 4th International BCI conference, which took place in May–June 2010 in Asilomar, California. We assessed respondents’ opinions about a number of topics. First, we investigated preferences for terminology and definitions relating to BCIs. Second, w...

  18. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights of the future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide effectiveness analysis of the proposed countermeasures.

  19. A State-Wide Survey of South Australian Secondary Schools to Determine the Current Emphasis on Ergonomics and Computer Use

    Science.gov (United States)

    Sawyer, Janet; Penman, Joy

    2012-01-01

    This study investigated the pattern of teaching of healthy computing skills to high school students in South Australia. A survey approach was used to collect data, specifically to determine the emphasis placed by schools on ergonomics that relate to computer use. Participating schools were recruited through the Department for Education and Child…

  20. Research Article Special Issue

    African Journals Online (AJOL)

    2016-05-15

    May 15, 2016 ... ... such as [14,17]. In [19] this is introduced as atomic verification capability. Here .... 26th ACM Symposium on Theory of Computing, pages 544–553. ACM, 1994. ... Self-tallying elections and perfect ballot secrecy. In Proc. 5th.

  1. Audio computer-assisted self interview compared to traditional interview in an HIV-related behavioral survey in Vietnam.

    Science.gov (United States)

    Le, Linh Cu; Vu, Lan T H

    2012-10-01

    Globally, population surveys on HIV/AIDS and other sensitive topics have been using audio computer-assisted self interview for many years. This interview technique, however, is still new to Vietnam and little is known about its application and impact in general population surveys. One plausible hypothesis is that residents of Vietnam interviewed using this technique may provide a higher response rate and be more willing to reveal their true behaviors than if interviewed with traditional methods. This study aims to compare audio computer-assisted self interview with traditional face-to-face personal interview and self-administered interview with regard to rates of refusal and affirmative responses to questions on sensitive topics related to HIV/AIDS. In June 2010, a randomized study was conducted in three cities (Ha Noi, Da Nang and Can Tho), using a sample of 4049 residents aged 15 to 49 years. Respondents were randomly assigned to one of three interviewing methods: audio computer-assisted self interview, personal face-to-face interview, and self-administered paper interview. Instead of providing answers directly to interviewer questions as with traditional methods, audio computer-assisted self-interview respondents read the questions displayed on a laptop screen, while listening to the questions through audio headphones, then entered responses using a laptop keyboard. A MySQL database was used for data management and the SPSS statistical package version 18 was used for data analysis with bivariate and multivariate statistical techniques. Rates of high risk behaviors and mean values of continuous variables were compared for the three data collection methods. Audio computer-assisted self interview showed advantages over comparison techniques, achieving lower refusal rates and reporting higher prevalence of some sensitive and risk behaviors (perhaps an indication of more truthful answers). Premarital sex was reported by 20.4% in the audio computer-assisted self-interview survey

  2. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    Science.gov (United States)

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents and to study the association between screen time and perceived problems. Methods Cross-sectional school-survey of 11-, 13-, and 15-year old students in thirteen schools in the City of Aarhus, Denmark, participation rate 89%, n = 2100. The main exposure was time spend on weekdays on computer- and console-gaming and internet use for communication and surfing. The outcome measures were three indexes on perceived problems related to computer and console gaming and internet use. Results The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (only boys) and internet use, odds ratios ranging from 6.90 to 10.23. Conclusion The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people’s everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore has no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent with gaming and internet use. Nevertheless, most schoolchildren who spent much time with gaming and internet use did not experience problems. PMID:24731270
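
    For readers unfamiliar with the measure reported above, the sketch below computes an odds ratio and its 95% confidence interval from a 2x2 table of screen time against perceived problems. The counts are invented for the example and are not the study's data.

      # Illustrative odds-ratio computation from a 2x2 table (counts are invented).
      import math

      # rows: high / low screen time; columns: perceived problems yes / no
      a, b = 120, 380    # high screen time: with / without perceived problems
      c, d = 40, 960     # low screen time:  with / without perceived problems

      odds_ratio = (a * d) / (b * c)
      se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)      # Woolf's method
      ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
      ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

      print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")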

  3. OJPOT: Online Judge & Practice Oriented Teaching Idea in Programming Courses

    Science.gov (United States)

    Wang, Gui Ping; Chen, Shu Yu; Yang, Xin; Feng, Rui

    2016-01-01

    Practical abilities are important for students from majors including Computer Science and Engineering, and Electrical Engineering. Along with the popularity of ACM International Collegiate Programming Contest (ACM/ICPC) and other programming contests, online judge (OJ) websites achieve rapid development, thus providing a new kind of programming…

  4. Temperature profiles from mechanical bathythermograph (MBT) casts from the USS ACME in the North Pacific Ocean in support of the Fleet Observations of Oceanographic Data (FLOOD) project from 1968-04-05 to 1968-04-25 (NODC Accession 6800642)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — MBT data were collected from the USS ACME in support of the Fleet Observations of Oceanographic Data (FLOOD) project. Data were collected by US Navy; Ships of...

  5. The Snackbot: Documenting the Design of a Robot for Long-term Human-Robot Interaction

    Science.gov (United States)

    2009-03-01

    distributed robots. Proceedings of the Computer Supported Cooperative Work Conference’02. NY: ACM Press. [18] Kanda, T., Takayuki, H., Eaton, D., and... humanoid robots. Proceedings of HRI’06. New York, NY: ACM Press, 351-352. [23] Nabe, S., Kanda, T., Hiraki, K., Ishiguro, H., Kogure, K., and Hagita

  6. Survey of Education, Engineering, and Information Technology Students Knowledge of Green Computing in Nigerian University

    Directory of Open Access Journals (Sweden)

    Tajudeen Ahmed Shittu

    2016-02-01

    Full Text Available The use of computer systems is growing rapidly, and there is growing concern about the environmental hazards associated with their use. Every user therefore needs the knowledge to use computers in an environmentally friendly manner. This study investigated the knowledge of green computing possessed by university students in Nigeria. A survey method was employed to carry out the study, involving students from three schools (Computer Science, Engineering, and Education). Purposive sampling was used to draw three hundred (300) respondents who volunteered to answer the questionnaire administered to gather the data for the study. The instrument used was adapted but modified, and was subjected to pilot testing to ascertain its validity and internal consistency; the reliability of the instrument showed a .75 Cronbach alpha level. The first research question was answered with descriptive statistics (percentages), and t-tests and ANOVA were used to answer questions two and three. The findings showed that the students do not possess adequate knowledge of the conscious use of computing systems. The study also showed that there is no significant difference in the green computing knowledge possessed by male and female students, or among students from the three schools. Based on these findings, the study suggested, among other measures, an aggressive campaign on green computing within university communities.

  7. On the Integration of Computer Algebra Systems (CAS) by Canadian Mathematicians: Results of a National Survey

    Science.gov (United States)

    Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2014-01-01

    In this article, we outline the findings of a Canadian survey study (N = 302) that focused on the extent of computer algebra systems (CAS)-based technology use in postsecondary mathematics instruction. Results suggest that a considerable number of Canadian mathematicians use CAS in research and teaching. CAS use in research was found to be the…

  8. Computers in Academic Architecture Libraries.

    Science.gov (United States)

    Willis, Alfred; And Others

    1992-01-01

    Computers are widely used in architectural research and teaching in U.S. schools of architecture. A survey of libraries serving these schools sought information on the emphasis placed on computers by the architectural curriculum, accessibility of computers to library staff, and accessibility of computers to library patrons. Survey results and…

  9. Theoretical and experimental analysis of amplitude control ablation and bipolar ablation in creating linear lesion and discrete lesions for treating atrial fibrillation.

    Science.gov (United States)

    Yan, Shengjie; Wu, Xiaomei; Wang, Weiqi

    2017-09-01

    Radiofrequency (RF) energy is often used to create a linear lesion or discrete lesions for blocking the accessory conduction pathways when treating atrial fibrillation. Using finite element analysis, we study the ablation effect of the amplitude control ablation mode (AcM) and the bipolar ablation mode (BiM) in creating a linear lesion and discrete lesions in a 5-mm-thick atrial wall; in particular, the characteristics of the lesion shape have been investigated for amplitude control ablation. Computer models of a multipolar catheter were developed to study the lesion dimensions in atrial walls created through AcM, BiM and ablation methods with specially activated electrodes in AcM and BiM. To validate the theoretical results in this study, an in vitro experiment with porcine cardiac tissue was performed. At 40 V/20 V root mean squared (RMS) RF voltage for AcM, a continuous and transmural lesion was created by AcM-15s, AcM-5s and AcM-ad-20V ablation in the 5-mm-thick atrial wall. At 20 V RMS for BiM, a continuous but not transmural lesion was created. AcM ablation yielded asymmetrical and discrete lesion shapes, whereas the lesions became more symmetrical and continuous as the period of alternating electrode activation decreased from 15 s to 5 s. Two discrete lesions were created when using AcM, AcM-ad-40V, BiM-ad-20V and BiM-ad-40V. The experimental and computational thermal lesion shapes created in cardiac tissue were in agreement. Amplitude control ablation and bipolar ablation are feasible methods to create continuous or discrete lesions for pulmonary vein isolation.

  10. PEP surveying procedures and equipment

    International Nuclear Information System (INIS)

    Linker, F.

    1982-06-01

    The PEP Survey and Alignment System, which employs both laser-based and optical survey methods, is described. The laser is operated in conjunction with the Tektronix 4051 computer and surveying instruments such as ARM and SAM, a system designed to automate data input, data reduction, and the production of alignment instructions. The laser system is used when surveying ring quadrupoles, main bend magnets, and sextupoles, and is optional when surveying RF cavities and insertion quadrupoles. Optical methods usually require that data be manually entered into the computer for alignment, but in some cases an element can be aligned using nominal values of fiducial locations without use of the computer. Optical surveying is used in the alignment of NIT and SIT, low field bend magnets, wigglers, RF cavities, and insertion quadrupoles.

  11. IT Barometer survey, Denmark

    DEFF Research Database (Denmark)

    Howard, Rob

    1998-01-01

    Survey results from Danish architects, engineers, contractors and property managers in the construction industry concerning their use of computers, communications, problems and needs.

  12. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined

  13. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  14. Patient grouping for dose surveys and establishment of diagnostic reference levels in paediatric computed tomography

    International Nuclear Information System (INIS)

    Vassileva, J.; Rehani, M.

    2015-01-01

    There has been confusion in the literature on whether paediatric patients should be grouped according to age, weight or other parameters when dealing with dose surveys. The present work aims to suggest a pragmatic approach to achieve reasonable accuracy for performing patient dose surveys in countries with limited resources. The analysis is based on a subset of data collected within the IAEA survey of paediatric computed tomography (CT) doses, involving 82 CT facilities from 32 countries in Asia, Europe, Africa and Latin America. Data for 6115 patients were collected, for 34.5% of whom weight data were available. The present study suggests that using four age groups, <1, >1-5, >5-10 and >10-15 y, is realistic and pragmatic for dose surveys in less-resourced countries and for the establishment of DRLs. To ensure relevant accuracy of results, data for >30 patients in a particular age group should be collected if patient weight is not known. If a smaller sample is used, patient weight should be recorded and the median weight in the sample should be within 5-10% of the median weight of the sample for which the DRLs were established. Comparison of results from different surveys should always be performed with caution, taking into consideration the way paediatric patients were grouped. Dose results can be corrected for differences in patient weight/age group. (authors)
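    A minimal sketch of the grouping step described above is given below: patients are binned into the four recommended age groups and a reference value is derived per group only when the sample is large enough. The 75th percentile is used because DRLs are conventionally set at the third quartile of the surveyed dose distribution; that choice, the dose quantity and all data are illustrative assumptions, not values from the IAEA survey.

        import numpy as np
        from collections import defaultdict

        def age_group(age_years):
            """Return the age-group label recommended in the record, or None if out of range."""
            if age_years < 1:
                return "<1 y"
            if age_years <= 5:
                return ">1-5 y"
            if age_years <= 10:
                return ">5-10 y"
            if age_years <= 15:
                return ">10-15 y"
            return None

        # Hypothetical records of (age in years, CTDIvol in mGy) from one CT facility.
        patients = [(0.5, 2.1), (3.0, 3.4), (7.5, 4.0), (12.0, 5.6), (4.2, 3.1), (9.9, 4.4)]

        doses_by_group = defaultdict(list)
        for age, dose in patients:
            label = age_group(age)
            if label is not None:
                doses_by_group[label].append(dose)

        for label, doses in doses_by_group.items():
            if len(doses) > 30:   # the record recommends >30 patients per age group
                print(label, round(np.percentile(doses, 75), 1), "mGy")
            else:
                print(label, "sample too small for a robust reference level (n =", len(doses), ")")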

  15. Factors influencing the organizational adoption of cloud computing: a survey among cloud workers

    Directory of Open Access Journals (Sweden)

    Mark Stieninger

    2018-01-01

    Full Text Available Cloud computing presents an opportunity for organizations to leverage affordable, scalable, and agile technologies. However, even with the demonstrated value of cloud computing, organizations have been hesitant to adopt such technologies. Based on a multi-theoretical research model, this paper provides an empirical study targeted to better understand the adoption of cloud services. An online survey addressing the factors derived from literature for three specific popular cloud application types (cloud storage, cloud mail and cloud office) was undertaken. The research model was analyzed by using variance-based structural equation modelling. Results show that the factors of compatibility, relative advantage, security and trust, as well as a lower level of complexity, lead to a more positive attitude towards cloud adoption. Complexity, compatibility, image and security and trust have direct and indirect effects on relative advantage. These factors further explain a large part of the attitude towards cloud adoption but not of its usage.

  16. Computer Series, 3: Computer Graphics for Chemical Education.

    Science.gov (United States)

    Soltzberg, Leonard J.

    1979-01-01

    Surveys the current scene in computer graphics from the point of view of a chemistry educator. Discusses the scope of current applications of computer graphics in chemical education, and provides information about hardware and software systems to promote communication with vendors of computer graphics equipment. (HM)

  17. National survey on dose data analysis in computed tomography.

    Science.gov (United States)

    Heilmaier, Christina; Treier, Reto; Merkle, Elmar Max; Alkhadi, Hatem; Weishaupt, Dominik; Schindera, Sebastian

    2018-05-28

    A nationwide survey was performed assessing current practice of dose data analysis in computed tomography (CT). All radiological departments in Switzerland were asked to participate in the on-line survey composed of 19 questions (16 multiple choice, 3 free text). It consisted of four sections: (1) general information on the department, (2) dose data analysis, (3) use of a dose management software (DMS) and (4) radiation protection activities. In total, 152 out of 241 Swiss radiological departments filled in the whole questionnaire (return rate, 63%). Seventy-nine per cent of the departments (n = 120/152) analyse dose data on a regular basis, with considerable heterogeneity in the frequency (1-2 times per year, 45%, n = 54/120; every month, 35%, n = 42/120) and method of analysis. Manual analysis is carried out by 58% (n = 70/120) compared with 42% (n = 50/120) of departments using a DMS. Purchase of a DMS is planned by 43% (n = 30/70) of the departments with manual analysis. Real-time analysis of dose data is performed by 42% (n = 21/50) of the departments with a DMS; however, residents can access the DMS in clinical routine only in 20% (n = 10/50) of the departments. An interdisciplinary dose team, which among other things communicates dose data internally (63%, n = 76/120) and externally, is already implemented in 57% (n = 68/120) of departments. Swiss radiological departments are committed to radiation safety. However, there is high heterogeneity among them regarding the frequency and method of dose data analysis as well as the use of DMS and radiation protection activities. • Swiss radiological departments are committed to and interested in radiation safety, as shown by the 63% return rate of the survey. • Seventy-nine per cent of departments analyse dose data on a regular basis with differences in the frequency and method of analysis: 42% use a dose management software, while 58% currently perform manual dose data analysis. Of the latter, 43% plan to buy a dose

  18. A study of Computing doctorates in South Africa from 1978 to 2014

    Directory of Open Access Journals (Sweden)

    Ian D Sanders

    2015-12-01

    Full Text Available This paper studies the output of South African universities in terms of computing-related doctorates in order to determine trends in numbers of doctorates awarded and to identify strong doctoral study research areas. Data collected from a variety of sources relating to Computing doctorates conferred since the late 1970s was used to compare the situation in Computing with that of all doctorates. The number of Computing doctorates awarded has increased considerably over the period of study. Nearly three times as many doctorates were awarded in the period 2010–2014 as in 2000–2004. The universities producing the most Computing doctorates were either previously “traditional” universities or comprehensive universities formed by amalgamating a traditional research university with a technikon. Universities of technology have not yet produced many doctorates as they do not have a strong research tradition. The analysis of topic keywords using ACM Computing classifications is preliminary but shows that professional issues are dominant in Information Systems, models are often built in Computer Science and several topics, including computing in education, are evident in both IS and CS. The relevant data is in the public domain but access is difficult as record keeping was generally inconsistent and incomplete. In addition, electronic databases at universities are not easily searchable and access to HEMIS data is limited. The database built for this paper is more inclusive in terms of discipline-related data than others.

  19. A Survey on Data Storage and Information Discovery in the WSANs-Based Edge Computing Systems.

    Science.gov (United States)

    Ma, Xingpo; Liang, Junbin; Liu, Renping; Ni, Wei; Li, Yin; Li, Ran; Ma, Wenpeng; Qi, Chuanda

    2018-02-10

    In the post-Cloud era, the proliferation of Internet of Things (IoT) has pushed the horizon of Edge computing, which is a new computing paradigm with data processed at the edge of the network. As the important systems of Edge computing, wireless sensor and actuator networks (WSANs) play an important role in collecting and processing the sensing data from the surrounding environment as well as taking actions on the events happening in the environment. In WSANs, in-network data storage and information discovery schemes with high energy efficiency, high load balance and low latency are needed because of the limited resources of the sensor nodes and the real-time requirement of some specific applications, such as putting out a big fire in a forest. In this article, the existing schemes of WSANs on data storage and information discovery are surveyed with detailed analysis on their advancements and shortcomings, and possible solutions are proposed on how to achieve high efficiency, good load balance, and perfect real-time performances at the same time, hoping that it can provide a good reference for the future research of the WSANs-based Edge computing systems.

  20. A Survey on Data Storage and Information Discovery in the WSANs-Based Edge Computing Systems

    Science.gov (United States)

    Liang, Junbin; Liu, Renping; Ni, Wei; Li, Yin; Li, Ran; Ma, Wenpeng; Qi, Chuanda

    2018-01-01

    In the post-Cloud era, the proliferation of Internet of Things (IoT) has pushed the horizon of Edge computing, which is a new computing paradigm with data processed at the edge of the network. As the important systems of Edge computing, wireless sensor and actuator networks (WSANs) play an important role in collecting and processing the sensing data from the surrounding environment as well as taking actions on the events happening in the environment. In WSANs, in-network data storage and information discovery schemes with high energy efficiency, high load balance and low latency are needed because of the limited resources of the sensor nodes and the real-time requirement of some specific applications, such as putting out a big fire in a forest. In this article, the existing schemes of WSANs on data storage and information discovery are surveyed with detailed analysis on their advancements and shortcomings, and possible solutions are proposed on how to achieve high efficiency, good load balance, and perfect real-time performances at the same time, hoping that it can provide a good reference for the future research of the WSANs-based Edge computing systems. PMID:29439442

  1. A Survey on Data Storage and Information Discovery in the WSANs-Based Edge Computing Systems

    Directory of Open Access Journals (Sweden)

    Xingpo Ma

    2018-02-01

    Full Text Available In the post-Cloud era, the proliferation of Internet of Things (IoT) has pushed the horizon of Edge computing, which is a new computing paradigm with data processed at the edge of the network. As the important systems of Edge computing, wireless sensor and actuator networks (WSANs) play an important role in collecting and processing the sensing data from the surrounding environment as well as taking actions on the events happening in the environment. In WSANs, in-network data storage and information discovery schemes with high energy efficiency, high load balance and low latency are needed because of the limited resources of the sensor nodes and the real-time requirement of some specific applications, such as putting out a big fire in a forest. In this article, the existing schemes of WSANs on data storage and information discovery are surveyed with detailed analysis on their advancements and shortcomings, and possible solutions are proposed on how to achieve high efficiency, good load balance, and perfect real-time performances at the same time, hoping that it can provide a good reference for the future research of the WSANs-based Edge computing systems.

  2. Wheeze sound analysis using computer-based techniques: a systematic review.

    Science.gov (United States)

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.
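    As a flavour of the kind of spectral analysis the reviewed articles apply, the sketch below computes a spectrogram of a synthetic respiratory sound and flags time frames whose dominant frequency lies above 400 Hz, a commonly cited lower bound for wheezes. The threshold, the signal and the parameters are illustrative assumptions, not values taken from the review.

        import numpy as np
        from scipy.signal import spectrogram

        fs = 4000                                    # sampling rate in Hz
        t = np.arange(0, 5, 1 / fs)
        # Synthetic stand-in for a lung sound: a 600 Hz tone buried in noise.
        sound = np.sin(2 * np.pi * 600 * t) + 0.5 * np.random.randn(t.size)

        freqs, frames, power = spectrogram(sound, fs=fs, nperseg=256, noverlap=128)
        dominant = freqs[np.argmax(power, axis=0)]   # dominant frequency in each time frame
        wheeze_like = dominant > 400                 # assumed wheeze threshold in Hz
        print(f"{wheeze_like.mean():.0%} of frames look wheeze-like")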

  3. A primary care physician perspective survey on the limited use of handwriting and pen computing in the electronic medical record

    Directory of Open Access Journals (Sweden)

    Gary Arvary

    2002-09-01

    The use of handwriting in the EMR was broadly supported by this group of PCPs in private practice. Likewise, wireless pen computers were the overwhelming choice of computer for use during a consultation. In this group, older and lower volume physicians were less likely to desire a computer for use during a consultation. User acceptance of the EMR may be related to how closely it resembles the processes that are being automated. More surveys are required to determine the needs and expectations of physicians. The data also support other research studies that demonstrate the preference for handwriting and wireless computers, and the need for a limited, standardised and controlled vocabulary.

  4. Assessment of survey radiography and comparison with x-ray computed tomography for detection of hyperfunctioning adrenocortical tumors in dogs

    International Nuclear Information System (INIS)

    Voorhout, G.; Stolp, R.; Rijnberk, A.; Waes, P.F.G.M. van

    1990-01-01

    Results of abdominal survey radiography and x-ray computed tomography (CT) were compared in 13 dogs with hyperadrenocorticism histologically attributed to adrenocortical tumors. X-ray computed tomography enabled accurate localization of the tumor in all 13 dogs. Apart from 2 poorly demarcated irregular-shaped and mineralized carcinomas, there were no differences between adenoma (n = 3) and carcinoma (n = 10) on CT images. In 1 dog, invasion of the caudal vena cava by the tumor was suggested on CT images and was confirmed during surgery. Suspicion of adhesions between tumors of the right adrenal gland and the caudal vena cava on the basis of CT images was confirmed during surgery in only 2 of 6 dogs. Survey radiography allowed accurate localization of the tumor in 7 dogs (4 on the right side and 3 on the left). In 6 of these dogs, the tumor was visible as a well-demarcated soft tissue mass and, in the other dog, as a poorly demarcated mineralized mass. The smallest tumor visualized on survey radiographs had a diameter of 20 mm on CT images. Six tumors with diameter less than or equal to 20 mm were not visualized on survey radiographs. In 1 of these dogs, a mineralized nodule was found in the left adrenal region, without evidence of a mass. In a considerable number of cases, survey radiography can provide presurgical localization of adrenocortical tumors in dogs with hyperadrenocorticism; CT is redundant in these instances. In the absence of positive radiographic findings, CT is valuable for localization of adrenocortical tumors

  5. Clinical Computer Systems Survey (CLICS): learning about health information technology (HIT) in its context of use.

    Science.gov (United States)

    Lichtner, Valentina; Cornford, Tony; Klecun, Ela

    2013-01-01

    Successful health information technology (HIT) implementations need to be informed on the context of use and on users' attitudes. To this end, we developed the CLinical Computer Systems Survey (CLICS) instrument. CLICS reflects a socio-technical view of HIT adoption, and is designed to encompass all members of the clinical team. We used the survey in a large English hospital as part of its internal evaluation of the implementation of an electronic patient record system (EPR). The survey revealed extent and type of use of the EPR; how it related to and integrated with other existing systems; and people's views on its use, usability and emergent safety issues. Significantly, participants really appreciated 'being asked'. They also reminded us of the wider range of administrative roles engaged with EPR. This observation reveals pertinent questions as to our understanding of the boundaries between administrative tasks and clinical medicine - what we propose as the field of 'administrative medicine'.

  6. Wireless data collection of self-administered surveys using tablet computers.

    Science.gov (United States)

    Singleton, Kyle W; Lan, Mars; Arnold, Corey; Vahidi, Mani; Arangua, Lisa; Gelberg, Lillian; Bui, Alex A T

    2011-01-01

    The accurate and expeditious collection of survey data by coordinators in the field is critical in the support of research studies. Early methods that used paper documentation have slowly evolved into electronic capture systems. Indeed, tools such as REDCap and others illustrate this transition. However, many current systems are tailored to web browsers running on desktop/laptop computers, requiring keyboard and mouse input. We present a system that utilizes a touch screen interface running on a tablet PC with consideration for portability, limited screen space, wireless connectivity, and potentially inexperienced and low literacy users. The system was developed using C#, ASP.net, and SQL Server by multiple programmers over the course of a year. The system was developed in coordination with UCLA Family Medicine and is currently deployed for the collection of data in a group of Los Angeles area clinics of community health centers for a study on drug addiction and intervention.

  7. A multicopper oxidase is essential for manganese oxidation and laccase-like activity in Pedomicrobium sp. ACM 3067.

    Science.gov (United States)

    Ridge, Justin P; Lin, Marianne; Larsen, Eloise I; Fegan, Mark; McEwan, Alastair G; Sly, Lindsay I

    2007-04-01

    Pedomicrobium sp. ACM 3067 is a budding-hyphal bacterium belonging to the alpha-Proteobacteria which is able to oxidize soluble Mn2+ to insoluble manganese oxide. A cosmid, from a whole-genome library, containing the putative genes responsible for manganese oxidation was identified and a primer-walking approach yielded 4350 bp of novel sequence. Analysis of this sequence showed the presence of a predicted three-gene operon, moxCBA. The moxA gene product showed homology to multicopper oxidases (MCOs) and contained the characteristic four copper-binding motifs (A, B, C and D) common to MCOs. An insertion mutation of moxA showed that this gene was essential for both manganese oxidation and laccase-like activity. The moxB gene product showed homology to a family of outer membrane proteins which are essential for Type I secretion in Gram-negative bacteria. moxBA has not been observed in other manganese-oxidizing bacteria but homologues were identified in the genomes of several bacteria including Sinorhizobium meliloti 1021 and Agrobacterium tumefaciens C58. These results suggest that moxBA and its homologues constitute a family of genes encoding an MCO and a predicted component of the Type I secretion system.

  8. A Survey of Exemplar Teachers' Perceptions, Use, and Access of Computer-Based Games and Technology for Classroom Instruction

    Science.gov (United States)

    Proctor, Michael D.; Marks, Yaela

    2013-01-01

    This research reports and analyzes for archival purposes surveyed perceptions, use, and access by 259 United States based exemplar Primary and Secondary educators of computer-based games and technology for classroom instruction. Participating respondents were considered exemplary as they each won the Milken Educator Award during the 1996-2009…

  9. Pair Programming as a Modern Method of Teaching Computer Science

    Directory of Open Access Journals (Sweden)

    Irena Nančovska Šerbec

    2008-10-01

    Full Text Available At the Faculty of Education, University of Ljubljana, we educate future computer science teachers. Besides didactical, pedagogical, mathematical and other interdisciplinary knowledge, students gain programming knowledge and skills that are crucial for computer science teachers. For all courses, the main emphasis is the absorption of professional competences related to the teaching profession and the programming profile. The latter are selected according to the well-known document, the ACM Computing Curricula. The professional knowledge is therefore associated and combined with the teaching knowledge and skills. In the paper we present how to achieve competences related to programming by using different didactical models (semiotic ladder, cognitive objectives taxonomy, problem solving) and the modern teaching method "pair programming". Pair programming differs from standard methods (individual work, seminars, projects, etc.). It belongs to extreme programming as a discipline of software development and is known to have positive effects on teaching a first programming language. We have experimentally observed pair programming in the introductory programming course. The paper presents and analyzes the results of using this method: the aspects of satisfaction during programming and the level of gained knowledge. The results are in general positive and demonstrate the promise of this teaching method.

  10. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    Science.gov (United States)

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  11. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  12. Knowledge of computer among healthcare professionals of India: a key toward e-health.

    Science.gov (United States)

    Gour, Neeraj; Srivastava, Dhiraj

    2010-11-01

    Information technology has radically changed the way that many people work and think. Over the years, technology has reached a new acme and is no longer confined to developed countries. Developing countries such as India have kept pace with the world in modern technology. Healthcare professionals can no longer ignore the application of information technology to healthcare because they are key to e-health. This study was conducted to explore healthcare professionals' perspectives on computers and the implications of their use, with the objective of assessing the knowledge, use, and need of computers among healthcare professionals. A cross-sectional study of 240 healthcare professionals, including doctors, nurses, lab technicians, and pharmacists, was conducted. Each participant was interviewed using a pretested, semistructured format. Of 240 healthcare professionals, 57.91% were knowledgeable about computers. Of them, 22.08% had extensive knowledge and 35.83% had partial knowledge. Computer knowledge was greater among the age group 20-25 years (high knowledge, 43.33%; partial knowledge, 46.66%). Of 99 males, 21.21% were found to have good knowledge and 42.42% had partial knowledge. A majority of doctors and nurses used computers for study purposes. The remaining healthcare professionals used them mainly for entertainment, the Internet, and e-mail. A majority of all healthcare professionals (95.41%) requested computer training, which they felt would improve their career prospects and enhance their knowledge of computers.

  13. Computational imaging with multi-camera time-of-flight systems

    KAUST Repository

    Shrestha, Shikhar

    2016-07-11

    Depth cameras are a ubiquitous technology used in a wide range of applications, including robotic and machine vision, human computer interaction, autonomous vehicles as well as augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows for the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.

  14. Computation of deformations and stresses in graphite blocks for HTR core survey purposes

    International Nuclear Information System (INIS)

    Besdo, Dieter; Theymann, W.

    1975-01-01

    Stresses and deformations in graphite fuel elements for HTRs are caused by the temperature distribution and by irradiation under the influence of creep, shrinkage, thermal strains, and elastic deformations. The global deformations and the stress distribution in a prismatic fuel element containing regularly distributed axial holes for the coolant flow and the fuel sticks can be computed in the following manner: the block with its holes is treated as an effective homogeneous continuum with an equivalent global behaviour. Assuming that the fourth-order tensor of the elastic constants is proportional to the corresponding tensor in the constitutive equations for creep, only the effective strains are of interest. The values of temperature and dose may be given at n points of the block at certain points in time. Then, the inelastic non-thermal strains are integrated by a Runge-Kutta procedure at the n points. When interpolated and combined with thermal strains, they are incompatible. Hence, they produce elastic deformations which cause creep and can be computed using a Ritz polynomial series with the help of a specific principle of the minimum of potential energy. Excessive computing time can be avoided easily since the influence of the local variation of the elastic constants within the block is almost negligible and, therefore, of practically no importance for the determination of the elastic strains. For this reason some matrices can be calculated a priori, and the elastic deformations are obtained by multiplications with these matrices rather than by inversions. Therefore, this method is particularly suited for the computation of deformations and stresses for reactor core survey purposes, where a large number of blocks (up to 7000) have to be treated.
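    The integration step described above, advancing the inelastic non-thermal strains at n sample points with a Runge-Kutta procedure, can be sketched as follows. The classical fourth-order scheme is shown; the creep-plus-shrinkage rate law, the temperatures and the dose rates are purely hypothetical placeholders, not the constitutive equations of the report.

        import numpy as np

        def rk4_step(rate, eps, t, dt, *args):
            """One classical 4th-order Runge-Kutta step for d(eps)/dt = rate(eps, t, ...)."""
            k1 = rate(eps, t, *args)
            k2 = rate(eps + 0.5 * dt * k1, t + 0.5 * dt, *args)
            k3 = rate(eps + 0.5 * dt * k2, t + 0.5 * dt, *args)
            k4 = rate(eps + dt * k3, t + dt, *args)
            return eps + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

        def strain_rate(eps, t, temperature, dose_rate):
            # Hypothetical creep-plus-shrinkage law, used only to make the sketch runnable.
            creep = 1.0e-6 * dose_rate * (1.0 + 5.0e-4 * (temperature - 600.0))
            shrinkage = -2.0e-7 * dose_rate
            return creep + shrinkage - 1.0e-3 * eps   # crude saturation term

        # Inelastic strain tracked at n sample points of one block (hypothetical inputs).
        eps = np.zeros(4)
        temperature = np.array([650.0, 700.0, 750.0, 800.0])   # degrees C
        dose_rate = np.array([1.0, 1.2, 1.4, 1.6])             # arbitrary units
        t, dt = 0.0, 10.0
        for _ in range(100):
            eps = rk4_step(strain_rate, eps, t, dt, temperature, dose_rate)
            t += dt
        print(eps)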

  15. Current Trends in Cloud Computing A Survey of Cloud Computing Systems

    OpenAIRE

    Harjit Singh

    2012-01-01

    Cloud computing that has become an increasingly important trend, is a virtualization technology that uses the internet and central remote servers to offer the sharing of resources that include infrastructures, software, applications and business processes to the market environment to fulfill the elastic demand. In today’s competitive environment, the service vitality, elasticity, choices and flexibility offered by this scalable technology are too attractive that makes the cloud computing to i...

  16. A Survey of Denial-of-Service and Distributed Denial of Service Attacks and Defenses in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Adrien Bonguet

    2017-08-01

    Full Text Available Cloud Computing is a computing model that allows ubiquitous, convenient and on-demand access to a shared pool of highly configurable resources (e.g., networks, servers, storage, applications and services). Denial-of-Service (DoS) and Distributed Denial-of-Service (DDoS) attacks are serious threats to the Cloud services’ availability due to numerous new vulnerabilities introduced by the nature of the Cloud, such as multi-tenancy and resource sharing. In this paper, new types of DoS and DDoS attacks in Cloud Computing are explored, especially the XML-DoS and HTTP-DoS attacks, and some possible detection and mitigation techniques are examined. This survey also provides an overview of the existing defense solutions and investigates the experiments and metrics that are usually designed and used to evaluate their performance, which is helpful for the future research in the domain.

  17. Active contour modes Crisp: new technique for segmentation of the lungs in CT images

    International Nuclear Information System (INIS)

    Reboucas Filho, Pedro Pedrosa; Cortez, Paulo Cesar; Holanda, Marcelo Alcantara

    2011-01-01

    This paper proposes a new active contour model (ACM), called ACM Crisp, and evaluates it for the segmentation of lungs in computed tomography (CT) images. An ACM draws a curve around or within the object of interest. This curve changes its shape when some energy acts on it and moves it towards the edges of the object. This process is performed by successive iterations of minimization of a given energy associated with the curve. The ACMs described in the literature have limitations when used for segmentation of CT lung images. The ACM Crisp model overcomes these limitations, since it proposes automatic initialization and a new external energy based on rules and radiological pulmonary densities. The paper compares other ACMs with the proposed method, which is shown to be superior. In order to validate the algorithm, a medical expert in Pulmonology at the Walter Cantidio University Hospital of the Federal University of Ceara carried out a qualitative analysis. In these analyses 100 CT lung images were used. The segmentation efficiency was evaluated in 5 categories, with the following results for the ACM Crisp: 73% excellent (without errors), 20% acceptable (with small errors), 7% reasonable (with large errors), 0% poor (covering only a small part of the lung), and 0% very bad (totally incorrect segmentation). In conclusion, the ACM Crisp is considered a useful algorithm for segmenting CT lung images, with potential for integration into medical diagnosis systems. (author)
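    The iterative energy minimization that drives any ACM can be sketched in a few lines of Python. The toy snake below moves each contour point down the gradient of an external energy image while a smoothing term pulls it toward its neighbours; it is a generic gradient-descent contour, not the Crisp energy, initialization rules or pulmonary-density terms proposed in the paper.

        import numpy as np

        def evolve_contour(points, energy, alpha=0.5, step=1.0, iterations=200):
            """Move contour points downhill on an external energy image, with smoothing."""
            gy, gx = np.gradient(energy)                       # external energy gradient
            pts = points.astype(float)
            for _ in range(iterations):
                xi = np.clip(pts[:, 0].round().astype(int), 0, energy.shape[1] - 1)
                yi = np.clip(pts[:, 1].round().astype(int), 0, energy.shape[0] - 1)
                external = np.stack([gx[yi, xi], gy[yi, xi]], axis=1)
                smoothing = 0.5 * (np.roll(pts, 1, axis=0) + np.roll(pts, -1, axis=0)) - pts
                pts += step * (-external + alpha * smoothing)  # descend energy, stay smooth
            return pts

        # Toy image: a dark disc (low energy) on a bright background stands in for a lung field.
        yy, xx = np.mgrid[0:128, 0:128]
        image = ((xx - 64) ** 2 + (yy - 64) ** 2 > 40 ** 2).astype(float)
        angles = np.linspace(0, 2 * np.pi, 60, endpoint=False)
        initial = np.stack([64 + 55 * np.cos(angles), 64 + 55 * np.sin(angles)], axis=1)
        final = evolve_contour(initial, image)
        print(final.mean(axis=0))   # contour centre stays near (64, 64)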

  18. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We will introduce various models that were developed during the past decades. According to their targeting architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever-increasing speed gap between the CPU and memory systems, incorporating a non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  19. Improving simulated long-term responses of vegetation to temperature and precipitation extremes using the ACME land model

    Science.gov (United States)

    Ricciuto, D. M.; Warren, J.; Guha, A.

    2017-12-01

    While carbon and energy fluxes in current Earth system models generally have reasonable instantaneous responses to extreme temperature and precipitation events, they often do not adequately represent the long-term impacts of these events. For example, simulated net primary productivity (NPP) may decrease during an extreme heat wave or drought, but may recover rapidly to pre-event levels following the conclusion of the extreme event. However, field measurements indicate that long-lasting damage to leaves and other plant components often occur, potentially affecting the carbon and energy balance for months after the extreme event. The duration and frequency of such extreme conditions is likely to shift in the future, and therefore it is critical for Earth system models to better represent these processes for more accurate predictions of future vegetation productivity and land-atmosphere feedbacks. Here we modify the structure of the Accelerated Climate Model for Energy (ACME) land surface model to represent long-term impacts and test the improved model against observations from experiments that applied extreme conditions in growth chambers. Additionally, we test the model against eddy covariance measurements that followed extreme conditions at selected locations in North America, and against satellite-measured vegetation indices following regional extreme events.

  20. The ground support computer and in-orbit survey data analysis program for the SEEP experiment

    International Nuclear Information System (INIS)

    Voss, H.D.; Datlowe, D.W.; Mobilia, J.; Roselle, S.N.

    1985-01-01

    The ground support computer equipment (GSE) and the production survey plot and analysis software are described for the Stimulated Emissions of Energetic Particles (SEEP) experiment on the S81-1 satellite. A general purpose satellite data acquisition circuit was developed based on a Z-80 portable microcomputer. By simply changing instrument control software and electrical connectors, automatic testing and control of the various SEEP instruments was accomplished. A new feature incorporated into the SEEP data analysis phase was the development of a correlative data base for all of the SEEP instruments. A CPU-efficient survey plot program (with ephemeris) was developed to display the approximately 3100 hours of data, with a time resolution of 0.5 sec, from the ten instrument sensors. The details of the general purpose multigraph algorithms and plot formats are presented. New associations among simultaneous particle, X-ray, optical and plasma density satellite measurements are being investigated for the first time.

  1. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    Directory of Open Access Journals (Sweden)

    Riccardo Barzaghi

    2016-07-01

    Full Text Available Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoted by ξ and η, the two mutually-perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations.
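    For readers unfamiliar with the two components, the standard relations of physical geodesy (textbook definitions, not formulas taken from this paper) express ξ and η as the negative horizontal gradients of the geoid undulation N and combine them into the deflection felt along a survey line of azimuth α:

        \xi  = -\frac{1}{R}\,\frac{\partial N}{\partial \varphi}, \qquad
        \eta = -\frac{1}{R\cos\varphi}\,\frac{\partial N}{\partial \lambda}, \qquad
        \varepsilon_{\alpha} = \xi\cos\alpha + \eta\sin\alpha

    where R is a mean Earth radius and (φ, λ) are geodetic latitude and longitude.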

  2. Computer self-efficacy - is there a gender gap in tertiary level introductory computing classes?

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available This paper explores the relationship between introductory computing students, self-efficacy, and gender. Since the use of computers has become more common there has been speculation that the confidence and ability to use them differs between genders. Self-efficacy is an important and useful concept used to describe how a student may perceive their own ability or confidence in using and learning new technology. A survey of students in an introductory computing class has been completed intermittently since the late 1990\\'s. Although some questions have been adapted to meet the changing technology the aim of the survey has remain unchanged. In this study self-efficacy is measured using two self-rating questions. Students are asked to rate their confidence using a computer and also asked to give their perception of their computing knowledge. This paper examines these two aspects of a person\\'s computer self-efficacy in order to identify any differences that may occur between genders in two introductory computing classes, one in 1999 and the other in 2012. Results from the 1999 survey are compared with those from the survey completed in 2012 and investigated to ascertain if the perception that males were more likely to display higher computer self-efficacy levels than their female classmates does or did exist in a class of this type. Results indicate that while overall there has been a general increase in self-efficacy levels in 2012 compared with 1999, there is no significant gender gap.

  3. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    Science.gov (United States)

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  4. Computer analysis of digital sky surveys using citizen science and manual classification

    Science.gov (United States)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis, however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by the present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While these citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, and will work either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features such as the number of spiral arms, and provided accuracy of just ~36%.

  5. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  6. Leading survey and research report for fiscal 1999. Survey and research on supercompiler technology; 1999 nendo supercompiler technology no chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Surveys and research were conducted on global computing technology and on compiler and programming-environment technology for next-generation parallel computers, in order to prepare the basic key technologies needed to realize next-generation high-performance computing; efforts were made to extract and define technological problems and to consider a research system for achieving this goal. This fiscal year's achievements are described below. Two areas were covered, respectively, by a Parallel Compiler Working Group and a Global Computing Working Group, whose activities centered on overseas surveys and the short-term reception of researchers from abroad. The Parallel Compiler Working Group was engaged in (1) a technological survey of the latest parallel compiler technology and, in its effort to execute research under the project, in (2) giving concrete form to the contents of technology research and development and (3) giving concrete form to a technology research and development system. The Global Computing Working Group was engaged in (1) a technological survey of the latest high-performance global computing and (2) a survey of fields that could adopt global computing applications. (NEDO)

  8. The diffractive achromat full spectrum computational imaging with diffractive optics

    KAUST Repository

    Peng, Yifan

    2016-07-11

    Diffractive optical elements (DOEs) have recently drawn great attention in computational imaging because they can drastically reduce the size and weight of imaging devices compared to their refractive counterparts. However, the inherent strong dispersion is a tremendous obstacle that limits the use of DOEs in full spectrum imaging, causing unacceptable loss of color fidelity in the images. In particular, metamerism introduces a data dependency in the image blur, which has been neglected in computational imaging methods so far. We introduce a diffractive achromat based on computational optimization, as well as a corresponding algorithm for the correction of residual aberrations. Using this approach, we demonstrate high-fidelity color diffractive-only imaging over the full visible spectrum. In the optical design, the height profile of a diffractive lens is optimized to balance the focusing contributions of different wavelengths for a specific focal length. The spectral point spread functions (PSFs) become nearly identical to each other, creating approximately spectrally invariant blur kernels. This property guarantees good color preservation in the captured image and facilitates the correction of residual aberrations in our fast two-step deconvolution without additional color priors. We demonstrate our diffractive achromat design on a 0.5 mm ultrathin substrate fabricated by photolithography techniques. Experimental results show that our achromatic diffractive lens produces high color fidelity and better image quality in the full visible spectrum. © 2016 ACM.
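    Because the optimized DOE yields nearly identical blur kernels across wavelengths, the captured image can be restored with a single spatially invariant deconvolution per channel. The sketch below uses a plain frequency-domain Wiener filter as a stand-in; the paper's actual two-step deconvolution and its regularizers are not reproduced here, and the PSF, the SNR value and the test image are assumptions.

        import numpy as np

        def wiener_deconvolve(blurred, psf, snr=200.0):
            """Generic frequency-domain Wiener deconvolution with a scalar SNR regularizer."""
            H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
            G = np.fft.fft2(blurred)
            W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
            return np.real(np.fft.ifft2(W * G))

        # Hypothetical example: blur a random test image with a small Gaussian PSF and restore it.
        rng = np.random.default_rng(0)
        sharp = rng.random((128, 128))
        y, x = np.mgrid[-64:64, -64:64]
        psf = np.exp(-(x ** 2 + y ** 2) / (2 * 2.0 ** 2))
        psf /= psf.sum()
        blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(np.fft.ifftshift(psf))))
        restored = wiener_deconvolve(blurred, psf)
        print(np.abs(restored - sharp).mean())   # small residual error after restoration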

  9. A Survey of Urban Reconstruction

    KAUST Repository

    Musialski, P.

    2013-05-10

    This paper provides a comprehensive overview of urban reconstruction. While there exists a considerable body of literature, this topic is still under active research. The work reviewed in this survey stems from the following three research communities: computer graphics, computer vision and photogrammetry and remote sensing. Our goal is to provide a survey that will help researchers to better position their own work in the context of existing solutions, and to help newcomers and practitioners in computer graphics to quickly gain an overview of this vast field. Further, we would like to bring the mentioned research communities to even more interdisciplinary work, since the reconstruction problem itself is by far not solved. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  10. A Survey of Urban Reconstruction

    KAUST Repository

    Musialski, P.; Wonka, Peter; Aliaga, D. G.; Wimmer, M.; van Gool, L.; Purgathofer, W.

    2013-01-01

    This paper provides a comprehensive overview of urban reconstruction. While there exists a considerable body of literature, this topic is still under active research. The work reviewed in this survey stems from the following three research communities: computer graphics, computer vision and photogrammetry and remote sensing. Our goal is to provide a survey that will help researchers to better position their own work in the context of existing solutions, and to help newcomers and practitioners in computer graphics to quickly gain an overview of this vast field. Further, we would like to bring the mentioned research communities to even more interdisciplinary work, since the reconstruction problem itself is by far not solved. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  11. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)
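
    To make the flavor of such applications concrete, the short SymPy sketch below reproduces a classic celestial-mechanics calculation that computer algebra systems automate: a low-order series solution of Kepler's equation E = M + e·sin(E). The example is generic and illustrative only; it is not drawn from the work reviewed above.

        import sympy as sp

        M, e = sp.symbols('M e')
        E = M
        # Fixed-point iteration followed by a series expansion in the eccentricity e;
        # the result is the standard expansion E = M + e*sin(M) + (e**2/2)*sin(2*M) + O(e**3).
        for _ in range(3):
            E = M + e * sp.sin(E)
        print(sp.simplify(sp.series(E, e, 0, 3).removeO()))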

  12. A Model of Onion Routing With Provable Anonymity

    Science.gov (United States)

    2006-08-30

    Lysyanskaya. “A Formal Treatment of Onion Routing.” CRYPTO 2005, pp. 169-187, 2005. [4] David Chaum. “The dining cryptographers problem...1988. [5] David Chaum. “Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms.” Communications of the ACM, 24(2), pp. 84-88, 1981...network layer.” ACM Conference on Computer and Communications Security, pp. 193-206, 2002. [11] David Goldschlag, Michael Reed, and Paul Syverson

  13. A Study on How Conference Papers are Extended into Journal Articles in the Fields of Information Management

    Directory of Open Access Journals (Sweden)

    Yu-Hui Lu

    2016-12-01

    Full Text Available In recent years, some scholarly journals have indicated that they accept a rewritten or extended version of a conference paper as long as new content is added to the journal article manuscript and the original conference paper is cited in the references. The extended publishing of conference papers has thus become a topic worthy of investigation. This study analyzed the extended works of conference papers from three major conferences in the field of information management held by the Association for Computing Machinery (ACM): the ACM Conference on Electronic Commerce (ACM EC), the ACM Conference on Knowledge Discovery and Data Mining (ACM KDD), and the ACM International Conference on Information and Knowledge Management (ACM CIKM). Papers from the 2011 meetings were used as the sample to understand to what extent conference papers have been extended into journal articles and other forms, and the lag between conference and journal publishing. It also examined the differences between the conference papers and their extended versions (i.e., journal papers) by comparing changes in authorship, references, article length, tables, and figures. The study reveals the current practices of extending conference papers for journal publishing in the information management field and may help us understand contemporary scholarly publishing behavior. [Article content in Chinese]

  14. Supplier–customer relationships: Weaknesses in South African automotive supply chains

    Directory of Open Access Journals (Sweden)

    M. J. Naude

    2012-11-01

    Full Text Available The South African automotive industry, which is an important sector in the South African economy, needs to function efficiently if it is to compete internationally. However, South African automotive components manufacturers (ACMs) are not internationally competitive, and automotive assemblers, also known as original equipment manufacturers (OEMs), often import cheaper components from abroad. All parties in the South African automotive supply chains need each other to ensure optimal efficiency and competitiveness. Furthermore, it is vital that good relationships exist between customers and suppliers in the automotive supply chains in South Africa. ACMs are central to automotive supply chains. A survey was conducted among ACMs to determine the nature of relationships that exist between buyers and suppliers in South Africa’s automotive supply chains. The results showed that collaborative relationships do indeed exist between members of the supply chain, but that communication, understanding of the parties’ situations and cooperation can improve this relationship and so create a true alliance between OEMs and ACMs.

  15. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    Science.gov (United States)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection - such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model to run on heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results and lessons learned, as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, wherein each node has 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Super-parameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and explore its full potential to scientifically and computationally advance climate simulation and prediction.

  16. The relationship between computer gaming hours and depression or social phobia in adults. An international online survey.

    OpenAIRE

    Tobias, Radeke

    2016-01-01

    Background: In the past decades, there has been a worldwide increase in the number of people playing video games. Researchers have started to conduct studies and have identified positive and negative associations with video gaming. Comparable studies have been done.   Aim: The aim is to analyse whether there is an association between the average hours an adult participant has played computer games per day and depression or social phobia.   Methods: Data from 4,936 adults who voluntarily participated in an online survey ...

  17. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  18. Mobile-Only Web Survey Respondents

    NARCIS (Netherlands)

    Lugtig, P.J.|info:eu-repo/dai/nl/304824658; Toepoel, V.|info:eu-repo/dai/nl/304576034; amin, alerk

    2016-01-01

    Web surveys are no longer completed on just a desktop or laptop computer. Respondents increasingly use mobile devices, such as tablets and smartphones to complete web surveys. In this article, we study how respondents in the American Life Panel complete surveys using varying devices. We show that

  19. Climate Modeling: Ocean Cavities below Ice Shelves

    Energy Technology Data Exchange (ETDEWEB)

    Petersen, Mark Roger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Computer, Computational, and Statistical Sciences Division

    2016-09-12

    The Accelerated Climate Model for Energy (ACME), a new initiative by the U.S. Department of Energy, includes unstructured-mesh ocean, land-ice, and sea-ice components using the Model for Prediction Across Scales (MPAS) framework. The ability to run coupled high-resolution global simulations efficiently on large, high-performance computers is a priority for ACME. Sub-ice shelf ocean cavities are a significant new capability in ACME, and will be used to better understand how changing ocean temperature and currents influence glacial melting and retreat. These simulations take advantage of the horizontal variable-resolution mesh and adaptive vertical coordinate in MPAS-Ocean, in order to place high resolution below ice shelves and near grounding lines.

  20. The status of training and education in information and computer technology of Australian nurses: a national survey.

    Science.gov (United States)

    Eley, Robert; Fallon, Tony; Soar, Jeffrey; Buikstra, Elizabeth; Hegney, Desley

    2008-10-01

    A study was undertaken of the current knowledge and future training requirements of nurses in information and computer technology to inform policy to meet national goals for health. The role of the modern clinical nurse is intertwined with information and computer technology and adoption of such technology forms an important component of national strategies in health. The majority of nurses are expected to use information and computer technology during their work; however, the full extent of their knowledge and experience is unclear. Self-administered postal survey. A 78-item questionnaire was distributed to 10,000 Australian Nursing Federation members to identify the nurses' use of information and computer technology. Eighteen items related to nurses' training and education in information and computer technology. Response rate was 44%. Computers were used by 86.3% of respondents as part of their work-related activities. Between 4-17% of nurses had received training in each of 11 generic computer skills and software applications during their preregistration/pre-enrolment and between 12-30% as continuing professional education. Nurses who had received training believed that it was adequate to meet the needs of their job and was given at an appropriate time. Almost half of the respondents indicated that they required more training to better meet the information and computer technology requirements of their jobs and a quarter believed that their level of computer literacy was restricting their career development. Nurses considered that the vast majority of employers did not encourage information and computer technology training and, for those for whom training was available, workload was the major barrier to uptake. Nurses favoured introduction of a national competency standard in information and computer technology. For the considerable benefits of information and computer technology to be incorporated fully into the health system, employers must pay more attention

  1. Recent high precision surveys at PEP

    International Nuclear Information System (INIS)

    Sah, R.C.

    1980-12-01

    The task of surveying and aligning the components of PEP has provided an opportunity to develop new instruments and techniques for the purpose of high precision surveys. The new instruments are quick and easy to use, and they automatically encode survey data and read them into the memory of an on-line computer. When measurements of several beam elements have been taken, the on-line computer analyzes the measured data, compares them with desired parameters, and calculates the required adjustments to beam element support stands
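
    The analysis step described above (comparing measured fiducial positions against desired values and computing the moves required) is simple enough to sketch. The NumPy snippet below uses invented positions and an invented tolerance purely for illustration; it is not the PEP on-line software.

        import numpy as np

        # Hypothetical measured vs. nominal transverse offsets (mm) of four beam elements.
        nominal = np.zeros(4)
        measured = np.array([0.12, -0.05, 0.30, 0.02])
        tolerance = 0.10  # mm, invented alignment tolerance

        adjustment = nominal - measured   # move needed to restore each element to nominal
        for i, adj in enumerate(adjustment):
            if abs(adj) > tolerance:
                print(f"element {i}: adjust by {adj:+.2f} mm")
            else:
                print(f"element {i}: within tolerance")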

  2. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses from existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data of small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  3. Analysis of Security Techniques in Future Cloud Computing vs Current Cloud Computing: Survey Paper

    Directory of Open Access Journals (Sweden)

    Beny Nugraha

    2016-08-01

    Full Text Available Cloud computing is one of the network technologies that is currently developing rapidly, because it can dynamically increase the flexibility and capability of computing processes without requiring large expenditure on new infrastructure; improving the security of cloud computing networks is therefore essential. This study examines the security techniques present in current cloud computing and in a future cloud computing architecture, NEBULA. These security techniques are compared in terms of their ability to handle the security attacks that may occur in cloud computing. The method used in this study is attack-centric: the characteristics of each security attack are analyzed, and the security mechanisms able to handle it are then examined. Four security attacks are studied in this work; by understanding how a security attack works, it also becomes clear which security mechanisms can counter it. The study finds that NEBULA provides the highest level of security. NEBULA offers three new techniques, namely Proof of Consent (PoC), Proof of Path (PoP), and the ICING cryptographic technique. These three techniques, combined with onion routing, can counter the security attacks analyzed in this study.

  4. Evaluating Modern Defenses Against Control Flow Hijacking

    Science.gov (United States)

    2015-09-01

    instructions go bad: Generalizing return-oriented programming to RISC. In Proceedings of the 15th ACM conference on Computer and communications se... Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science and Engineering at the MASSACHUSETTS

  5. Dynamic MRI-based computer aided diagnostic systems for early detection of kidney transplant rejection: A survey

    Science.gov (United States)

    Mostapha, Mahmoud; Khalifa, Fahmi; Alansary, Amir; Soliman, Ahmed; Gimel'farb, Georgy; El-Baz, Ayman

    2013-10-01

    Early detection of renal transplant rejection is important to implement appropriate medical and immune therapy in patients with transplanted kidneys. In literature, a large number of computer-aided diagnostic (CAD) systems using different image modalities, such as ultrasound (US), magnetic resonance imaging (MRI), computed tomography (CT), and radionuclide imaging, have been proposed for early detection of kidney diseases. A typical CAD system for kidney diagnosis consists of a set of processing steps including: motion correction, segmentation of the kidney and/or its internal structures (e.g., cortex, medulla), construction of agent kinetic curves, functional parameter estimation, diagnosis, and assessment of the kidney status. In this paper, we survey the current state-of-the-art CAD systems that have been developed for kidney disease diagnosis using dynamic MRI. In addition, the paper addresses several challenges that researchers face in developing efficient, fast and reliable CAD systems for the early detection of kidney diseases.
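
    The processing chain listed above can be pictured as a sequence of stages feeding into one another. The Python skeleton below is only a structural sketch with placeholder bodies and an invented decision rule; real CAD systems replace each stage with image registration, segmentation models, pharmacokinetic fitting, and trained classifiers.

        import numpy as np

        def motion_correct(frames):
            return frames                                    # placeholder: image registration goes here

        def segment_kidney(frames):
            return np.ones_like(frames[0], dtype=bool)       # placeholder: kidney/cortex segmentation

        def kinetic_curve(frames, mask):
            return np.array([f[mask].mean() for f in frames])  # mean agent uptake per time point

        def estimate_parameters(curve):
            return {"peak": float(curve.max()), "time_to_peak": int(curve.argmax())}

        def diagnose(params, peak_threshold=0.5):            # invented threshold, illustration only
            return "non-rejection" if params["peak"] > peak_threshold else "possible rejection"

        def cad_pipeline(frames):
            frames = motion_correct(frames)
            mask = segment_kidney(frames)
            return diagnose(estimate_parameters(kinetic_curve(frames, mask)))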

  6. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of capabilities of the principal available systems and highlight one or two trends. The reference to the most recent full survey of computer algebra in relativity and brief descriptions of the Maple, REDUCE and SHEEP and other applications are given. (author)

  7. Quantum computing with trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  8. Equity and Computers for Mathematics Learning: Access and Attitudes

    Science.gov (United States)

    Forgasz, Helen J.

    2004-01-01

    Equity and computer use for secondary mathematics learning was the focus of a three year study. In 2003, a survey was administered to a large sample of grade 7-10 students. Some of the survey items were aimed at determining home access to and ownership of computers, and students' attitudes to mathematics, computers, and computer use for…

  9. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  10. Survey of Canadian Myotonic Dystrophy Patients' Access to Computer Technology.

    Science.gov (United States)

    Climans, Seth A; Piechowicz, Christine; Koopman, Wilma J; Venance, Shannon L

    2017-09-01

    Myotonic dystrophy type 1 is an autosomal dominant condition affecting distal hand strength, energy, and cognition. Increasingly, patients and families are seeking information online. An online neuromuscular patient portal under development can help patients access resources and interact with each other regardless of location. It is unknown how individuals living with myotonic dystrophy interact with technology and whether barriers to access exist. We aimed to characterize technology use among participants with myotonic dystrophy and to determine whether there is interest in a patient portal. Surveys were mailed to 156 participants with myotonic dystrophy type 1 registered with the Canadian Neuromuscular Disease Registry. Seventy-five participants (60% female) responded; almost half were younger than 46 years. Most (84%) used the internet; almost half of the responders (47%) used social media. The complexity and cost of technology were commonly cited reasons not to use technology. The majority of responders (76%) were interested in a myotonic dystrophy patient portal. Patients in a Canada-wide registry of myotonic dystrophy have access to and use technology such as computers and mobile phones. These patients expressed interest in a portal that would provide them with an opportunity to network with others with myotonic dystrophy and to access information about the disease.

  11. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  12. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  13. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    Science.gov (United States)

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  14. Sci—Thur PM: Imaging — 06: Canada's National Computed Tomography (CT) Survey

    Energy Technology Data Exchange (ETDEWEB)

    Wardlaw, GM; Martel, N [Medical Imaging Division, Consumer and Clinical Radiation Protection Bureau, Healthy Environments and Consumer Safety Branch, Health Canada (Canada); Blackler, W; Asselin, J-F [Data Analysis and Information Systems, Applied Research and Analysis Directorate, Strategic Policy Branch, Health Canada (Canada)

    2014-08-15

    The value of computed tomography (CT) in medical imaging is reflected in its increased use and availability since the early 1990s; however, given CT's relatively larger exposures (vs. planar x-ray), greater care must be taken to ensure that CT procedures are optimised in terms of providing the smallest dose possible while maintaining sufficient diagnostic image quality. The development of CT Diagnostic Reference Levels (DRLs) supports this process. DRLs have been suggested/supported by international/national bodies since the early 1990s and widely adopted elsewhere, but not on a national basis in Canada. Essentially, CT DRLs provide guidance on what is considered good practice for common CT exams, but require a representative sample of CT examination data to make any recommendations. Canada's National CT Survey project, in collaboration with provincial/territorial authorities, has collected a large national sample of CT practice data for 7 common examinations (with associated clinical indications) of both adult and pediatric patients. Following completion of data entry into a common database, a survey summary report and recommendations will be made on CT DRLs from this data. It is hoped that these can then be used by local regions to promote CT practice optimisation and support any dose reduction initiatives.

  15. Geophex Airborne Unmanned Survey System

    International Nuclear Information System (INIS)

    Won, I.L.; Keiswetter, D.

    1995-01-01

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results

  16. Geophex Airborne Unmanned Survey System

    Energy Technology Data Exchange (ETDEWEB)

    Won, I.L.; Keiswetter, D.

    1995-12-31

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results.

  17. Computing for particle physics. Report of the HEPAP subpanel on computer needs for the next decade

    International Nuclear Information System (INIS)

    1985-08-01

    The increasing importance of computation to the future progress in high energy physics is documented. Experimental computing demands are analyzed for the near future (four to ten years). The computer industry's plans for the near term and long term are surveyed as they relate to the solution of high energy physics computing problems. This survey includes large processors and the future role of alternatives to commercial mainframes. The needs for low speed and high speed networking are assessed, and the need for an integrated network for high energy physics is evaluated. Software requirements are analyzed. The role to be played by multiple processor systems is examined. The computing needs associated with elementary particle theory are briefly summarized. Computing needs associated with the Superconducting Super Collider are analyzed. Recommendations are offered for expanding computing capabilities in high energy physics and for networking between the laboratories

  18. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Korfiati, Aigli; Theofilatos, Konstantinos A.; Likothanassis, Spiridon D.; Tsakalidis, Athanasios K.; Mavroudi, Seferina P.

    2013-01-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are going to be described in the present survey. Our work is differentiated from existing review papers by updating the methodologies list and emphasizing the computational issues that arise from the miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim and scope are to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers even if they work on just a single step. © 2013 Elsevier Inc.

  19. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2013-06-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are going to be described in the present survey. Our work is differentiated from existing review papers by updating the methodologies list and emphasizing the computational issues that arise from the miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim and scope are to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers even if they work on just a single step. © 2013 Elsevier Inc.

  20. Assessment of occupational exposure to asbestos fibers: Contribution of analytical transmission electron microscopy analysis and comparison with phase-contrast microscopy.

    Science.gov (United States)

    Eypert-Blaison, Céline; Romero-Hariot, Anita; Clerc, Frédéric; Vincent, Raymond

    2018-03-01

    From November 2009 to October 2010, the French general directorate for labor organized a large field-study using analytical transmission electron microscopy (ATEM) to characterize occupational exposure to asbestos fibers during work on asbestos containing materials (ACM). The primary objective of this study was to establish a method and to validate the feasibility of using ATEM for the analysis of airborne asbestos of individual filters sampled in various occupational environments. For each sampling event, ATEM data were compared to those obtained by phase-contrast optical microscopy (PCOM), the WHO-recommended reference technique. A total of 265 results were obtained from 29 construction sites where workers were in contact with ACM. Data were sorted depending on the combination of the ACM type and the removal technique. For each "ACM-removal technique" combination, ATEM data were used to compute statistical indicators on short, fine and WHO asbestos fibers. Moreover, exposure was assessed taking into account the use of respiratory protective devices (RPD). As in previous studies, no simple relationship was found between results by PCOM and ATEM counting methods. Some ACM, such as asbestos-containing plasters, generated very high dust levels, and some techniques generated considerable levels of dust whatever the ACM treated. On the basis of these observations, recommendations were made to measure and control the occupational exposure limit. General prevention measures to be taken during work with ACM are also suggested. Finally, it is necessary to continue acquiring knowledge, in particular regarding RPD and the dust levels measured by ATEM for the activities not evaluated during this study.

  1. Student Opinion Survey On Delivery Of ECE431: Computer ...

    African Journals Online (AJOL)

    2017-09-10

    [Garbled table excerpt in the source record: survey response labels (e.g. "tidak": no) and tabulated reasons for interest or non-interest in computer programming, such as successfully solving a problem or creating something new, and its potential applications.]

  2. Computer-science guest-lecture series at Langston University sponsored by the U.S. Geological Survey; abstracts, 1992-93

    Science.gov (United States)

    Steele, K. S.

    1994-01-01

    Langston University, a Historically Black University located at Langston, Oklahoma, has a computing and information science program within the Langston University Division of Business. Since 1984, Langston University has participated in the Historically Black College and University program of the U.S. Department of Interior, which provided education, training, and funding through a combined earth-science and computer-technology cooperative program with the U.S. Geological Survey (USGS). USGS personnel have presented guest lectures at Langston University since 1984. Students have been enthusiastic about the lectures, and as a result of this program, 13 Langston University students have been hired by the USGS on a part-time basis while they continued their education at the University. The USGS expanded the offering of guest lectures in 1992 by increasing the number of visits to Langston University, and by inviting participation of speakers from throughout the country. The objectives of the guest-lecture series are to assist Langston University in offering state-of-the-art education in the computer sciences, to provide students with an opportunity to learn from and interact with skilled computer-science professionals, and to develop a pool of potential future employees for part-time and full-time employment. This report includes abstracts for guest-lecture presentations during 1992-93 school year.

  3. Exploiting stock data: a survey of state of the art computational techniques aimed at producing beliefs regarding investment portfolios

    Directory of Open Access Journals (Sweden)

    Mario Linares Vásquez

    2008-01-01

    Full Text Available Selecting an investment portfolio has inspired several models aimed at optimising the set of securities which an investor may select according to a number of specific decision criteria such as risk, expected return and planning horizon. The classical approach has been developed for supporting the two stages of portfolio selection and is supported by disciplines such as econometrics, technical analysis and corporate finance. However, with the emerging field of computational finance, new and interesting techniques have arisen in line with the need for the automatic processing of vast volumes of information. This paper surveys such new techniques which belong to the body of knowledge concerning computing and systems engineering, focusing on techniques particularly aimed at producing beliefs regarding investment portfolios.
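
    For readers unfamiliar with the classical stage mentioned above, the NumPy sketch below shows a minimal unconstrained mean-variance (Markowitz-style) allocation over invented expected returns and covariances; it is a textbook illustration, not one of the computational techniques surveyed in the paper.

        import numpy as np

        # Invented expected annual returns and covariance matrix for three securities.
        mu = np.array([0.08, 0.12, 0.10])
        cov = np.array([[0.04, 0.01, 0.00],
                        [0.01, 0.09, 0.02],
                        [0.00, 0.02, 0.06]])

        # Classical mean-variance weights are proportional to inv(cov) @ mu;
        # normalizing them to sum to one gives a simple long/short allocation.
        raw = np.linalg.solve(cov, mu)
        weights = raw / raw.sum()

        expected_return = float(weights @ mu)
        risk = float(np.sqrt(weights @ cov @ weights))
        print(weights, expected_return, risk)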

  4. Quantum computing for physics research

    International Nuclear Information System (INIS)

    Georgeot, B.

    2006-01-01

    Quantum computers hold great promises for the future of computation. In this paper, this new kind of computing device is presented, together with a short survey of the status of research in this field. The principal algorithms are introduced, with an emphasis on the applications of quantum computing to physics. Experimental implementations are also briefly discussed

  5. Using electronic surveys in nursing research.

    Science.gov (United States)

    Cope, Diane G

    2014-11-01

    Computer and Internet use in businesses and homes in the United States has dramatically increased since the early 1980s. In 2011, 76% of households reported having a computer, compared with only 8% in 1984 (File, 2013). A similar increase in Internet use has also been seen, with 72% of households reporting access of the Internet in 2011 compared with 18% in 1997 (File, 2013). This emerging trend in technology has prompted use of electronic surveys in the research community as an alternative to previous telephone and postal surveys. Electronic surveys can offer an efficient, cost-effective method for data collection; however, challenges exist. An awareness of the issues and strategies to optimize data collection using web-based surveys is critical when designing research studies. This column will discuss the different types and advantages and disadvantages of using electronic surveys in nursing research, as well as methods to optimize the quality and quantity of survey responses.

  6. Operating System Concepts for Reconfigurable Computing: Review and Survey

    OpenAIRE

    Marcel Eckert; Dominik Meyer; Jan Haase; Bernd Klauer

    2016-01-01

    One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way to use reconfigurable computing systems for users who are unfamiliar with the underlying concepts. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of ideas and key concepts to include reconfigurable computing aspects in operating systems. The article also presents an overview of published and available operating systems targeting the area of reconfigurable computing.

  7. Immature osteoblastic MG63 cells possess two calcitonin gene-related peptide receptor subtypes that respond differently to [Cys(Acm)(2,7)] calcitonin gene-related peptide and CGRP(8-37).

    Science.gov (United States)

    Kawase, Tomoyuki; Okuda, Kazuhiro; Burns, Douglas M

    2005-10-01

    Calcitonin gene-related peptide (CGRP) is clearly an anabolic factor in skeletal tissue, but the distribution of CGRP receptor (CGRPR) subtypes in osteoblastic cells is poorly understood. We previously demonstrated that the CGRPR expressed in osteoblastic MG63 cells does not match exactly the known characteristics of the classic subtype 1 receptor (CGRPR1). The aim of the present study was to further characterize the MG63 CGRPR using a selective agonist of the putative CGRPR2, [Cys(Acm)(2,7)]CGRP, and a relatively specific antagonist of CGRPR1, CGRP(8-37). [Cys(Acm)(2,7)]CGRP acted as a significant agonist only upon ERK dephosphorylation, whereas this analog effectively antagonized CGRP-induced cAMP production and phosphorylation of cAMP response element-binding protein (CREB) and p38 MAPK. Although it had no agonistic action when used alone, CGRP(8-37) potently blocked CGRP actions on cAMP, CREB, and p38 MAPK but had less of an effect on ERK. Schild plot analysis of the latter data revealed that the apparent pA2 value for ERK is clearly distinguishable from those of the other three plots as judged using the 95% confidence intervals. Additional assays using 3-isobutyl-1-methylxanthine or the PKA inhibitor N-(2-[p-bromocinnamylamino]ethyl)-5-isoquinolinesulfonamide hydrochloride (H-89) indicated that the cAMP-dependent pathway was predominantly responsible for CREB phosphorylation, partially involved in ERK dephosphorylation, and not involved in p38 MAPK phosphorylation. Considering previous data from Scatchard analysis of [125I]CGRP binding in connection with these results, these findings suggest that MG63 cells possess two functionally distinct CGRPR subtypes that show almost identical affinity for CGRP but different sensitivity to CGRP analogs: one is best characterized as a variation of CGRPR1, and the second may be a novel variant of CGRPR2.

  8. A Survey of Comics Research in Computer Science

    Directory of Open Access Journals (Sweden)

    Olivier Augereau

    2018-06-01

    Full Text Available Graphic novels such as comic books and mangas are well known all over the world. The digital transition has started to change the way people read comics: more and more on smartphones and tablets, and less and less on paper. In recent years, a wide variety of research about comics has been proposed and might change the way comics are created, distributed and read in the future. Early work focuses on low-level document image analysis. Comic books are complex; they contain text, drawings, balloons, panels, onomatopoeia, etc. Different fields of computer science, such as multimedia, artificial intelligence and human–computer interaction, have covered research about user interaction and content generation, each with different sets of values. We review the previous research about comics in computer science to state what has been done and give some insights about the main outlooks.

  9. A Survey Paper on Privacy Issue in Cloud Computing

    OpenAIRE

    Yousra Abdul Alsahib S. Aldeen; Mazleena Salleh; Mohammad Abdur Razzaque

    2015-01-01

    In the past few years, cloud computing has become one of the most popular paradigms for hosting and delivering services over the Internet. It has gained popularity by offering multiple computing services, such as cloud storage, cloud hosting and cloud servers, to various types of businesses as well as academia. Although cloud computing offers several benefits, it suffers from security and privacy challenges. The privacy of cloud systems is a serious concern for customers. Considering the privacy within the cloud th...

  10. Sci-Thur PM – Colourful Interactions: Highlights 07: Canadian Computed Tomography Survey: National Diagnostic Reference Levels

    Energy Technology Data Exchange (ETDEWEB)

    Wardlaw, Graeme M; Martel, Narine [Consumer & Clinical Radiation Protection Bureau / Health Canada (Canada)

    2016-08-15

    Purpose: The Canadian Computed Tomography (CT) Survey sought to collect CT technology and dose index data (CTDI and DLP) at the national level in order to establish national diagnostic reference levels (DRLs) for seven common CT examinations of standard-sized adults and pediatric patients. Methods: A single survey booklet (consisting of four sections) was mailed to and completed for each participating CT scanner. Survey sections collected data on (i) general facility and scanner information, (ii) routine protocols (as available), (iii) individual patient data (as applied) and (iv) manual CTDI measurements. Results: Dose index (CTDIvol and DLP) and associated patient data from 24 280 individual patient exam sequences were analyzed for seven common CT examinations performed in Canada: Adult Head, Chest, Abdomen/Pelvis, and Chest/Abdomen/Pelvis, and Pediatric Head, Chest, and Abdomen. Pediatric examination data were sub-divided into three age ranges: 0–3, 3–7 and 7–13 years. DRLs (75th percentile of dose index distributions) were found for all thirteen groups. Further analysis also permitted segmentation of examination data into 8 sub-groups, whose dose index data were displayed along with group histograms – showing the relative contribution of axial vs. helical scanning, contrast use (C+ vs. C-), and application of fixed current vs. dose reduction (DR) – 75th percentiles of DR sub-groups were, in almost all cases, lower than whole group (examination) DRLs. Conclusions: The analysis and summaries presented in the pending survey report can serve to aid local CT imaging optimization efforts within Canada and also contribute further to international efforts in radiation protection of patients.
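
    Since a DRL is defined here as the 75th percentile of the surveyed dose-index distribution, the computation itself is short; the NumPy snippet below uses a small invented sample of DLP values rather than the survey data.

        import numpy as np

        # Invented DLP values (mGy.cm) collected for one examination type.
        dlp = np.array([640, 720, 810, 555, 930, 700, 680, 760, 890, 605])

        drl = np.percentile(dlp, 75)      # DRL = 75th percentile of the dose-index distribution
        median = np.percentile(dlp, 50)   # median, often reported alongside the DRL
        print(f"DRL: {drl:.0f} mGy.cm, median: {median:.0f} mGy.cm")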

  11. Microstructure and Mechanical Properties of Porous Mullite

    Science.gov (United States)

    Hsiung, Chwan-Hai Harold

    two doped intergranular glasses and their interfaces with mullite were quite similar. The reductions in strength and toughness were traced to differences in the ACM network structure and mass-distribution that are hypothesized to result from dopant-altered ACM nucleation and growth kinetics. X-ray computed tomography, a non-destructive 3-D imaging technique, played a key role in this work, enabling the measurement of needle diameters, quantification of the ACM structural network, and finite element analysis of ACM's mechanical response.

  12. PEP Laser Surveying System

    International Nuclear Information System (INIS)

    Lauritzen, T.; Sah, R.C.

    1979-03-01

    A Laser Surveying System has been developed to survey the beam elements of the PEP storage ring. This system provides automatic data acquisition and analysis in order to increase survey speed and to minimize operator error. Two special instruments, the Automatic Readout Micrometer and the Small Automatic Micrometer, have been built for measuring the locations of fiducial points on beam elements with respect to the light beam from a laser. These instruments automatically encode offset distances and read them into the memory of an on-line computer. Distances along the beam line are automatically encoded with a third instrument, the Automatic Readout Tape Unit. When measurements of several beam elements have been taken, the on-line computer analyzes the measured data, compares them with desired parameters, and calculates the required adjustments to beam element support stands

  13. Operating System Concepts for Reconfigurable Computing: Review and Survey

    Directory of Open Access Journals (Sweden)

    Marcel Eckert

    2016-01-01

    Full Text Available One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way to use reconfigurable computing systems for users who are unfamiliar with the underlying concepts. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of ideas and key concepts to include reconfigurable computing aspects in operating systems. The article also presents an overview of published and available operating systems targeting the area of reconfigurable computing. The purpose of this article is to identify and summarize common patterns among those systems that can be seen as de facto standard. Furthermore, open problems, not covered by these already available systems, are identified.

  14. Pilot survey of patient dose from computed tomography in Bulgaria

    International Nuclear Information System (INIS)

    Vassileva, Jenia; Stoyanov, Desislav

    2008-01-01

    The number of computed tomography (CT) scanners in Bulgaria increased from 22 in 1996 to more than 160 in 2007. A wide variety of scanners from different manufacturers and generations exists in the country, with single-slice scanners predominating. A significant share of the scanners is more than 10 years old. This work presents pilot results from measurements of CT dose quantities started in 2005, with the aim of extending them into a national survey of CT practice in the country. It was found that different clinical protocols are used for similar examinations, resulting in large variations in dose quantities: the CT air kerma index varied from 27.1 to 78.4 mGy for head examinations, from 8.7 to 28.3 mGy for chest, from 11.8 to 30.7 mGy for abdomen and from 9.1 to 41.3 mGy for pelvis. The CT air kerma-length product for a complete examination was found to vary from 310 to 1254 mGy.cm for head examinations, from 215 to 893 mGy.cm for chest, from 265 to 615 mGy.cm for abdomen and from 220 to 761 mGy.cm for pelvis. The analysis demonstrated that the main reasons for the observed variations are differences in scanning geometry, beam quality, exposure parameters and scanning length. (author)

  15. Cryptography, quantum computation and trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  16. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    textabstractWe present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  17. Survey of computed tomography doses in head and chest protocols; Levantamento de doses em tomografia computadorizada em protocolos de cranio e torax

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Giordana Salvi de; Froner, Ana Paula Pastre; Silva, Ana Maria Marques da, E-mail: giordana.souza@acad.pucrs.br [Pontificia Universidade Catolica do Rio Grande do Sul (PUC-RS), Porto Alegre, RS (Brazil). Faculdade de Fisica. Nucleo de Pesquisa em Imagens Medicas

    2016-07-01

    Surveys of dose estimates from different medical imaging modalities highlight significant variations between health care facilities for the same examination and similar patient groups. To ensure that doses are optimized, diagnostic reference levels (DRL) must be determined, allowing the identification of procedures that fall outside optimal standards. The aim of this study is to present a retrospective survey of dose indicators in head and chest CT scans, in terms of Dose-Length Product (DLP) and effective dose for adult and pediatric patients, comparing them with DRL in the literature. The study was performed with data from 293 patients submitted to computed tomography of the skull and 146 of the thorax, divided into age groups (0-1; 1-5; 5-10; 10-15; >15 years). For head, third-quartile DLP values and effective doses are mostly higher than DRL in the literature, mainly for children of 10-15 years (800 mGy.cm; 3.3 mSv) and 5-10 years (969 mGy.cm; 2.6 mSv). For thorax examinations, although DLP values are close to the DRL in the literature, high variability is revealed, and the effective doses are higher for all ages. In conclusion, the survey of dose indicators in computed tomography examinations and their comparison with DRL constitute an important tool for the optimization of doses in patients. (author)

  18. A mechanistic diagnosis of the simulation of soil CO2 efflux of the ACME Land Model

    Science.gov (United States)

    Liang, J.; Ricciuto, D. M.; Wang, G.; Gu, L.; Hanson, P. J.; Mayes, M. A.

    2017-12-01

    Accurate simulation of the CO2 efflux from soils (i.e., soil respiration) to the atmosphere is critical to project global biogeochemical cycles and the magnitude of climate change in Earth system models (ESMs). Currently, the soil respiration simulated by ESMs still has large uncertainty. In this study, a mechanistic diagnosis of soil respiration in the Accelerated Climate Model for Energy (ACME) Land Model (ALM) was conducted using long-term observations at the Missouri Ozark AmeriFlux (MOFLUX) forest site in the central U.S. The results showed that the ALM default run significantly underestimated annual soil respiration and gross primary production (GPP), while incorrectly estimating soil water potential. Improved simulations of soil water potential with site-specific data significantly improved the modeled annual soil respiration, primarily because annual GPP was simultaneously improved. Therefore, accurate simulations of soil water potential must be carefully calibrated in ESMs. Despite improved annual soil respiration, the ALM continued to underestimate soil respiration during peak growing seasons, and to overestimate soil respiration during non-peak growing seasons. Simulations involving increased GPP during peak growing seasons increased soil respiration, while neither improved plant phenology nor increased temperature sensitivity affected the simulation of soil respiration during non-peak growing seasons. One potential reason for the overestimation of the soil respiration during non-peak growing seasons may be that the current model structure is substrate-limited, while microbial dormancy under stress may cause the system to become decomposer-limited. Further studies with more microbial data are required to provide adequate representation of soil respiration and to understand the underlying reasons for inaccurate model simulations.
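
    As background for the temperature-sensitivity experiment mentioned above, heterotrophic respiration in land models is often written as a base rate scaled by temperature and moisture response functions; the sketch below uses a generic Q10 form with invented parameter values and is not the ALM's actual formulation.

        import numpy as np

        def soil_respiration(r_base, temp_c, moisture_scalar, q10=2.0, t_ref=25.0):
            # Generic empirical form: base rate * Q10 temperature response * moisture limitation.
            return r_base * q10 ** ((temp_c - t_ref) / 10.0) * moisture_scalar

        # Invented example: respiration at 15 C with moderate moisture limitation.
        print(soil_respiration(r_base=5.0, temp_c=15.0, moisture_scalar=0.7))  # 1.75 (illustrative units)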

  19. The Asilomar Survey: Stakeholders' Opinions on Ethical Issues Related to Brain-Computer Interfacing.

    Science.gov (United States)

    Nijboer, Femke; Clausen, Jens; Allison, Brendan Z; Haselager, Pim

    2013-01-01

    Brain-Computer Interface (BCI) research and (future) applications raise important ethical issues that need to be addressed to promote societal acceptance and adequate policies. Here we report on a survey we conducted among 145 BCI researchers at the 4th International BCI Conference, which took place in May-June 2010 in Asilomar, California. We assessed respondents' opinions about a number of topics. First, we investigated preferences for terminology and definitions relating to BCIs. Second, we assessed respondents' expectations on the marketability of different BCI applications (BCIs for healthy people, BCIs for assistive technology, BCI-controlled neuroprostheses and BCIs as therapy tools). Third, we investigated opinions about ethical issues related to BCI research for the development of assistive technology: informed consent process with locked-in patients, risk-benefit analyses, team responsibility, consequences of BCI on patients' and families' lives, liability and personal identity and interaction with the media. Finally, we asked respondents which issues are urgent in BCI research.

  20. Exploring digenic inheritance in arrhythmogenic cardiomyopathy.

    Science.gov (United States)

    König, Eva; Volpato, Claudia Béu; Motta, Benedetta Maria; Blankenburg, Hagen; Picard, Anne; Pramstaller, Peter; Casella, Michela; Rauhe, Werner; Pompilio, Giulio; Meraviglia, Viviana; Domingues, Francisco S; Sommariva, Elena; Rossini, Alessandra

    2017-12-08

    Arrhythmogenic cardiomyopathy (ACM) is an inherited genetic disorder, characterized by the substitution of heart muscle with fibro-fatty tissue and severe ventricular arrhythmias, often leading to heart failure and sudden cardiac death. ACM is considered a monogenic disorder, but the low penetrance of mutations identified in patients suggests the involvement of additional genetic or environmental factors. We used whole exome sequencing to investigate digenic inheritance in two ACM families where previous diagnostic tests have revealed a PKP2 mutation in all affected and some healthy individuals. In family members with PKP2 mutations we determined all genes that harbor variants in affected but not in healthy carriers or vice versa. We computationally prioritized the most likely candidates, focusing on known ACM genes and genes related to PKP2 through protein interactions, functional relationships, or shared biological processes. We identified four candidate genes in family 1, namely DAG1, DAB2IP, CTBP2 and TCF25, and eleven candidate genes in family 2. The most promising gene in the second family is TTN, a gene previously associated with ACM, in which the affected individual harbors two rare deleterious-predicted missense variants, one of which is located in the protein's only serine kinase domain. In this study we report genes that might act as digenic players in ACM pathogenesis, on the basis of co-segregation with PKP2 mutations. Validation in larger cohorts is still required to prove the utility of this model.
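
    The co-segregation filter described here can be pictured as simple set operations over per-individual variant calls; a minimal sketch (the gene and variant identifiers below are made up, not the families' data):

```python
# Rare, deleterious-predicted variants observed in PKP2-mutation carriers.
affected_carriers = {
    "II-1": {"TTN:p.R123C", "DAG1:p.S45L", "CTBP2:c.88+2T>C"},
    "II-3": {"TTN:p.R123C", "DAB2IP:p.G77V"},
}
healthy_carriers = {
    "I-2": {"DAG1:p.S45L"},
    "III-4": {"CTBP2:c.88+2T>C"},
}

# Candidate second hits: present in every affected carrier, absent from all healthy carriers.
shared_by_affected = set.intersection(*affected_carriers.values())
seen_in_healthy = set.union(*healthy_carriers.values())
candidates = shared_by_affected - seen_in_healthy

print(candidates)  # {'TTN:p.R123C'}
```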

  1. Human Computing and Machine Understanding of Human Behavior: A Survey

    NARCIS (Netherlands)

    Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas; Quek, F.; Yang, Yie

    2006-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should

  2. Effect of External Disturbing Gravity Field on Spacecraft Guidance and Surveying Line Layout for Marine Gravity Survey

    Directory of Open Access Journals (Sweden)

    HUANG Motao

    2016-11-01

    Full Text Available Centred on the requirement to support flight track control for a long-range spacecraft, a detailed study is made of the computation of the external disturbing gravity field, the survey accuracy of gravity anomalies on the earth's surface, and the design of surveying line layouts for marine gravity surveys. Firstly, the solution expression of the navigation error for a long-range spacecraft is analyzed and modified, and the influence of the earth's gravity field on the spacecraft's flight track is evaluated. Then, given a limited quota for the bias error of the spacecraft drop point, the accuracy requirement for calculating the external disturbing gravity field is discussed. Secondly, the data truncation error and the propagated data error are studied and estimated, and the quotas of survey resolution and computation accuracy for gravity anomalies on the earth's surface are determined. Finally, based on the above quotas, a corresponding program of surveying line layout for marine gravity surveys is proposed. A numerical test has been made to demonstrate the reasonableness and validity of the suggested program.

  3. PRIVACY IN CLOUD COMPUTING: A SURVEY

    OpenAIRE

    Arockiam L; Parthasarathy G; Monikandan S

    2012-01-01

    Various cloud computing models are used to increase the profit of an organization. The cloud provides a convenient environment and many advantages for business organizations to run their business. But it has some issues related to the privacy of data. Users' data are stored and maintained outside the users' premises. The failure of data protection causes many issues like data theft, which affects the individual organization. The cloud users may be satisfied, if their data are protected p...

  4. Nationwide radiation dose survey of computed tomography for fetal skeletal dysplasias

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, Osamu [National Center for Child Health and Development, Department of Radiology, Setagaya-ku, Tokyo (Japan); Sawai, Hideaki [Hyogo College of Medicine, Department of Obstetrics and Gynecology, Nishinomiya-shi, Hyogo (Japan); Murotsuki, Jun [Miyagi Children's Hospital, Department of Maternal and Fetal Medicine, Sendai-shi, Miyagi (Japan); Tohoku University Graduate School of Medicine, Department of Advanced Fetal and Developmental Medicine, Sendai-shi, Miyagi (Japan); Nishimura, Gen [Tokyo Metropolitan Children's Medical Center, Department of Pediatric Imaging, Fuchu-shi, Tokyo (Japan); Horiuchi, Tetsuya [National Center for Child Health and Development, Department of Radiology, Setagaya-ku, Tokyo (Japan); Osaka University, Department of Medical Physics and Engineering, Division of Medical Technology and Science, Course of Health Science, Graduate School of Medicine, Suita, Osaka (Japan)

    2014-08-15

    Recently, computed tomography (CT) has been used to diagnose fetal skeletal dysplasia. However, no surveys have been conducted to determine the radiation exposure dose and the diagnostic reference level (DRL). To collect CT dose index volume (CTDIvol) and dose length product (DLP) data from domestic hospitals implementing fetal skeletal 3-D CT and to establish DRLs for Japan. Scan data of 125 cases of 20 protocols from 16 hospitals were analyzed. The minimum, first-quartile, median, third-quartile and maximum values of CTDIvol and DLP were determined. The time-dependent change in radiation dose setting in hospitals with three or more cases with scans was also examined. The minimum, first-quartile, median, third-quartile and maximum CTDIvol values were 2.1, 3.7, 7.7, 11.3 and 23.1 mGy, respectively, and these values for DLP were 69.0, 122.3, 276.8, 382.6 and 1025.6 mGy.cm, respectively. Six of the 12 institutions reduced the dose setting during the implementation period. The DRLs of CTDIvol and DLP for fetal CT were 11.3 mGy and 382.6 mGy.cm, respectively. Institutions implementing fetal CT should use these established DRLs as the standard and make an effort to reduce radiation exposure by voluntarily decreasing the dose. (orig.)

  5. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles … -Christoph Meinel, Zentralblatt MATH

  6. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  7. Computer-aided discovery of debris disk candidates: A case study using the Wide-Field Infrared Survey Explorer (WISE) catalog

    Science.gov (United States)

    Nguyen, T.; Pankratius, V.; Eckman, L.; Seager, S.

    2018-04-01

    Debris disks around stars other than the Sun have received significant attention in studies of exoplanets, specifically exoplanetary system formation. Since debris disks are major sources of infrared emissions, infrared survey data such as the Wide-Field Infrared Survey (WISE) catalog potentially harbor numerous debris disk candidates. However, it is currently challenging to perform disk candidate searches for over 747 million sources in the WISE catalog due to the high probability of false positives caused by interstellar matter, galaxies, and other background artifacts. Crowdsourcing techniques have thus started to harness citizen scientists for debris disk identification since humans can be easily trained to distinguish between desired artifacts and irrelevant noise. With a limited number of citizen scientists, however, increasing data volumes from large surveys will inevitably lead to analysis bottlenecks. To overcome this scalability problem and push the current limits of automated debris disk candidate identification, we present a novel approach that uses citizen science results as a seed to train machine learning based classification. In this paper, we detail a case study with a computer-aided discovery pipeline demonstrating such feasibility based on WISE catalog data and NASA's Disk Detective project. Our approach to debris disk candidate classification was shown to be robust under a wide range of image quality and features. Our hybrid approach of citizen science with algorithmic scalability can facilitate big data processing for future detections as envisioned in future missions such as the Transiting Exoplanet Survey Satellite (TESS) and the Wide-Field Infrared Survey Telescope (WFIRST).
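
    A minimal sketch of the seed-and-train idea (assuming scikit-learn; the photometric features and labels below are synthetic stand-ins rather than WISE or Disk Detective data):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-source features, e.g. WISE band magnitudes and infrared colours.
n_labeled = 500
X = rng.normal(size=(n_labeled, 6))
# Citizen-science votes used as the training seed (1 = disk candidate, 0 = contaminant).
y = (X[:, 4] + 0.5 * X[:, 5] + rng.normal(scale=0.5, size=n_labeled) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# The trained model can then score the much larger unlabeled part of the catalogue.
unlabeled = rng.normal(size=(5, 6))
print("disk candidate probabilities:", clf.predict_proba(unlabeled)[:, 1])
```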

  8. A Learning Research Informed Design and Evaluation of a Web-Enhanced Object Oriented Programming Seminar

    Science.gov (United States)

    Georgantaki, Stavroula C.; Retalis, Symeon D.

    2007-01-01

    "Object-Oriented Programming" subject is included in the ACM Curriculum Guidelines for Undergraduate and Graduate Degree Programs in Computer Science as well as in Curriculum for K-12 Computer Science. In a few research studies learning problems and difficulties have been recorded, and therefore, specific pedagogical guidelines and…

  9. Computers for lattice field theories

    International Nuclear Information System (INIS)

    Iwasaki, Y.

    1994-01-01

    Parallel computers dedicated to lattice field theories are reviewed with emphasis on the three recent projects, the Teraflops project in the US, the CP-PACS project in Japan and the 0.5-Teraflops project in the US. Some new commercial parallel computers are also discussed. Recent development of semiconductor technologies is briefly surveyed in relation to possible approaches toward Teraflops computers. (orig.)

  10. Site surveying and levelling

    CERN Document Server

    Clancy, John

    2013-01-01

    This popular and useful text has been completely revised and updated so that it forms an indispensable handbook for any student of surveying. An additional chapter on modern developments is included and the text has also been extended to cover ordnance survey; calculation of areas; computation of true horizontal length; measurement of vertical angles; Code of Measuring Practice; curve ranging and calculations of volumes for earthworks.

  11. Design of the Digital Sky Survey DA and online system: A case history in the use of computer aided tools for data acquisition system design

    Science.gov (United States)

    Petravick, D.; Berman, E.; Nicinski, T.; Rechenmacher, R.; Oleynik, G.; Pordes, R.; Stoughton, C.

    1991-06-01

    As part of its expanding Astrophysics program, Fermilab is participating in the Digital Sky Survey (DSS). Fermilab is part of a collaboration involving University of Chicago, Princeton University, and the Institute of Advanced Studies (at Princeton). The DSS main results will be a photometric imaging survey and a redshift survey of galaxies and color-selected quasars over pi steradians of the Northern Galactic Cap. This paper focuses on our use of Computer Aided Software Engineering (CASE) in specifying the data system for DSS. Extensions to standard methodologies were necessary to compensate for tool shortcomings and to improve communication amongst the collaboration members. One such important extension was the incorporation of CASE information into the specification document.

  12. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  13. Survey of computer vision technology for UAV navigation

    Science.gov (United States)

    Xie, Bo; Fan, Xiang; Li, Sijian

    2017-11-01

    Navigation based on computer vision technology, which is highly autonomous, precise and not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aerial vehicles, deep space probes and underwater robots, which further stimulates research on integrated navigation algorithms based on computer vision. In China, with the development of many types of UAV and the lunar exploration project, now in its third phase, there has been significant progress in the study of visual navigation. The paper reviews the development of vision-based navigation in the field of UAV navigation research and concludes that visual navigation is mainly applied to three aspects. (1) Acquisition of UAV navigation parameters: the parameters, including UAV attitude, position and velocity, can be obtained from the relationship between sensor images and the carrier's attitude, the relationship between instantly matched images and reference images, and the relationship between the carrier's velocity and the characteristics of sequential images. (2) Autonomous obstacle avoidance: among the many ways to achieve obstacle avoidance in UAV navigation, vision-based methods, including feature matching, template matching, image frames and so on, are mainly introduced. (3) Target tracking and positioning: using the acquired images, UAV position is calculated with the optical flow method, the MeanShift and CamShift algorithms, Kalman filtering and particle filter algorithms. The paper also describes three kinds of mainstream visual systems. (1) High-speed visual systems use a parallel structure in which image detection and processing are
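
    As one concrete instance of the optical-flow idea mentioned under (3), a hedged sketch assuming OpenCV (cv2), with two synthetic frames standing in for consecutive camera images:

```python
import cv2
import numpy as np

# Two synthetic grayscale frames; the second is the first shifted by (5, 3) pixels.
frame1 = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame1, (100, 100), (140, 140), 255, -1)
frame2 = np.roll(np.roll(frame1, 3, axis=0), 5, axis=1)

# Track corner features from frame1 into frame2 (pyramidal Lucas-Kanade optical flow).
p0 = cv2.goodFeaturesToTrack(frame1, maxCorners=50, qualityLevel=0.01, minDistance=5)
p1, status, _err = cv2.calcOpticalFlowPyrLK(frame1, frame2, p0, None)

# The mean displacement of successfully tracked points approximates apparent image motion,
# which, combined with altitude and camera calibration, relates to vehicle velocity.
good = status.ravel() == 1
displacement = (p1[good] - p0[good]).reshape(-1, 2).mean(axis=0)
print("mean pixel displacement (dx, dy):", displacement)
```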

  14. A Massively Parallel Tensor Contraction Framework for Coupled-Cluster Computations

    Science.gov (United States)

    2014-08-02


  15. An Evaluation of Relevance of Computing Curricula to Industry Needs

    Directory of Open Access Journals (Sweden)

    Ioana Chan Mow

    2015-02-01

    Full Text Available The research documented in this paper attempted to answer the question of how relevant the content of the Computing courses offered within programs of the Computing Department at the National University of Samoa (NUS) was to the needs of industry and the workforce. The RINCCII study, conducted in 2013 to 2014, surveyed 13 institutions and 19 graduates of the Computing programs. Findings from the survey indicated that the current course offerings within the Computing Department are relevant to the needs of industry and the workplace. However, there are aspects or topics which need inclusion or better coverage. The study also recommended regular surveys to gauge the relevance of curricula to the needs of industry.

  16. A survey of variational principles

    International Nuclear Information System (INIS)

    Lewins, J.D.

    1993-01-01

    This article gives a survey of variational principles. Variational principles play a significant role in mathematical theory, with emphasis on the physical aspects. There are two principal uses: to represent the equations of a system in a succinct way, and to enable a particular computation in the system to be carried out with greater accuracy. The survey of variational principles ranges widely, from its starting point in the Lagrange multiplier to optimisation principles. In an age of digital computation, these classic methods can be adapted to improve such calculations. We emphasize particularly the advantage of basing finite element methods on variational principles. (A.B.)

  17. Ingenious Snake: An Adaptive Multi-Class Contours Extraction

    Science.gov (United States)

    Li, Baolin; Zhou, Shoujun

    2018-04-01

    Active contour models (ACM) play an important role in computer vision and medical image applications. Traditional ACMs were used to extract a single class of object contours, while the simultaneous extraction of multiple classes of contours of interest (i.e., various contours, closed or open-ended) has not been solved so far. Therefore, a novel ACM named “Ingenious Snake” is proposed to adaptively extract these contours of interest. First, ridge points are extracted based on local phase measurement of the gradient vector flow field, and the subsequent ridgeline initialization is automated and fast. Second, contour deformation and evolution are implemented with the Ingenious Snake. In the experiments, the results of initialization, deformation and evolution are compared with existing methods; the quantitative evaluation of the structure extraction is satisfactory with respect to effectiveness and accuracy.
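
    For reference, the classical single closed contour that this work generalizes can be fitted with scikit-image's active_contour; a minimal sketch on a synthetic blob (not the proposed Ingenious Snake):

```python
import numpy as np
from skimage.draw import disk
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic image containing one bright, smoothed disk.
img = np.zeros((200, 200))
rr, cc = disk((100, 100), 40)
img[rr, cc] = 1.0
img = gaussian(img, sigma=3)

# Circular initial contour placed around the object (row, column coordinates).
s = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 60 * np.sin(s), 100 + 60 * np.cos(s)])

# Classical closed snake; the surveyed work extends this to multiple open or closed contours.
snake = active_contour(img, init, alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)  # (200, 2): contour points after energy minimization
```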

  18. Computer science education for medical informaticians.

    Science.gov (United States)

    Logan, Judith R; Price, Susan L

    2004-03-18

    The core curriculum in the education of medical informaticians remains a topic of concern and discussion. This paper reports on a survey of medical informaticians with Master's level credentials that asked about computer science (CS) topics or skills that they need in their employment. All subjects were graduates or "near-graduates" of a single medical informatics Master's program that they entered with widely varying educational backgrounds. The survey instrument was validated for face and content validity prior to use. All survey items were rated as having some degree of importance in the work of these professionals, with retrieval and analysis of data from databases, database design and web technologies deemed most important. Least important were networking skills and object-oriented design and concepts. These results are consistent with other work done in the field and suggest that strong emphasis on technical skills, particularly databases, data analysis, web technologies, computer programming and general computer science are part of the core curriculum for medical informatics.

  19. Gsolve, a Python computer program with a graphical user interface to transform relative gravity survey measurements to absolute gravity values and gravity anomalies

    Science.gov (United States)

    McCubbine, Jack; Tontini, Fabio Caratori; Stagpoole, Vaughan; Smith, Euan; O'Brien, Grant

    2018-01-01

    A Python program (Gsolve) with a graphical user interface has been developed to assist with routine data processing of relative gravity measurements. Gsolve calculates the gravity at each measurement site of a relative gravity survey, which is referenced to at least one known gravity value. The tidal effects of the sun and moon, gravimeter drift and tares in the data are all accounted for during the processing of the survey measurements. The calculation is based on a least squares formulation where the difference between the absolute gravity at each surveyed location and parameters relating to the dynamics of the gravimeter are minimized with respect to the relative gravity observations, and some supplied gravity reference site values. The program additionally allows the user to compute free air gravity anomalies, with respect to the GRS80 and GRS67 reference ellipsoids, from the determined gravity values and calculate terrain corrections at each of the surveyed sites using a prism formula and a user supplied digital elevation model. This paper reviews the mathematical framework used to reduce relative gravimeter survey observations to gravity values. It then goes on to detail how the processing steps can be implemented using the software.
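
    A hedged sketch of that least-squares adjustment (not the Gsolve source; the sites, readings and values below are invented): each reading is modeled as the site's gravity offset from a fixed reference site plus a meter bias and a linear drift term, and the overdetermined system is solved with numpy:

```python
import numpy as np

# Relative gravimeter readings: (site_index, time_hours, reading_mGal). Values are invented.
obs = [(0, 0.0, 1000.000), (1, 0.5, 1000.850), (2, 1.0, 1001.420),
       (1, 1.5, 1000.862), (0, 2.0, 1000.020)]
n_sites = 3
known_site, known_g = 0, 981234.500  # reference site with known absolute gravity (mGal)

# Unknowns: gravity offsets of sites 1..n-1 relative to the reference, meter bias, drift rate.
# Model: reading ~ offset(site) + bias + drift * t, with offset(reference) = 0.
A, b = [], []
for site, t, reading in obs:
    row = np.zeros(n_sites - 1 + 2)
    if site != known_site:
        row[site - 1] = 1.0   # coefficient for that site's gravity offset
    row[-2] = 1.0             # meter bias
    row[-1] = t               # drift rate * elapsed time
    A.append(row)
    b.append(reading)

x, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
gravity = {known_site: known_g}
for s in range(1, n_sites):
    gravity[s] = known_g + x[s - 1]

print("estimated drift rate (mGal/h):", round(x[-1], 4))
print("adjusted gravity values (mGal):", {k: round(v, 3) for k, v in gravity.items()})
```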

  20. ACMS-Data

    Data.gov (United States)

    Department of Homeland Security — The Records of CBP training activities in the academies and in-service field training. This data is for processing by COTS Application Acadis Readiness Suite and is...

  1. Fault tolerance in computational grids: perspectives, challenges, and issues.

    Science.gov (United States)

    Haider, Sajjad; Nazir, Babar

    2016-01-01

    Computational grids are established with the intention of providing shared access to hardware- and software-based resources, with special reference to increased computational capabilities. Fault tolerance is one of the most important issues faced by computational grids. The main contribution of this survey is the creation of an extended classification of problems that occur in computational grid environments. The proposed classification will help researchers, developers, and maintainers of grids to understand the types of issues to be anticipated. Moreover, different types of problems, such as omission, interaction, and timing-related problems, have been identified that need to be handled on various layers of the computational grid. In this survey, an analysis and examination is also performed pertaining to fault tolerance and fault detection mechanisms. Our conclusion is that a dependable and reliable grid can only be established when more emphasis is placed on fault identification. Moreover, our survey reveals that adaptive and intelligent fault identification and tolerance techniques can improve the dependability of grid working environments.

  2. SuML: A Survey Markup Language for Generalized Survey Encoding

    Science.gov (United States)

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.

  3. Autonomous mobile robot for radiologic surveys

    International Nuclear Information System (INIS)

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-01-01

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures

  4. Team Automata for Security (A Survey)

    NARCIS (Netherlands)

    ter Beek, Maurice H.; Lenzini, Gabriele; Petrocchi, Marinella

    Kleijn presented a survey of the use of team automata for the specification and analysis of phenomena from the field of computer supported cooperative work, in particular notions related to groupware systems. We present a survey of the use of team automata for the specification and analysis of some

  5. Who is downloading the free AIDA v4.3a interactive educational diabetes computer software? A 1-year survey of 3864 downloads.

    Science.gov (United States)

    Lehmann, Eldon D

    2003-01-01

    AIDA is a free diabetes computer program that permits the interactive simulation of plasma insulin and blood glucose profiles for educational, demonstration, self-learning, and research purposes. To date over 70000 copies of the software have been downloaded from the AIDA Website, www.2aida.org. This column documents a survey of downloaders of the latest release of the program (AIDA v4.3a). The Internet-based survey methodology was confirmed to be robust and reliable. Over a 1-year period (from March 2001 to February 2002) in total 3864 responses were received. During the corresponding period some 8578 actual downloads of the software were independently logged via the same route at the AIDA Website, giving a response rate for this survey of 45%. Responses were received from participants in 66 countries - over half of these (n = 2,137; 55.3%) were from the United States and the United Kingdom. There were 2318 responses (60.0%) received from patients with diabetes and 443 (11.5%) from relatives of patients, with fewer responses from doctors, students, diabetes educators, nurses, pharmacists, and other end users. This study highlights considerable interest amongst patients and their relatives to learn more about balancing insulin and diet in diabetes, as well as possibly to get more involved in self-management of insulin dosages. More computer applications that can cater for this interest in diabetes patient self-care need to be developed and made available. The Internet provides an ideal medium for the distribution of such educational tools.

  6. User's manual for computer program BASEPLOT

    Science.gov (United States)

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  7. A study of computer-related upper limb discomfort and computer vision syndrome.

    Science.gov (United States)

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems that are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores of the wrist and neck suggesting increased risk of developing OOS, which needed further intervention. Many (64%) were using refractive corrections and still had high scores of CVS commonly including eye fatigue, headache and burning sensation. The increase of CVS scores (suggesting more subjective symptoms) correlated with increase in computer usage spells. It was concluded that further onsite studies are needed, to follow up this survey to decrease the risks of developing CVS and OOS amongst young computer users.

  8. Conventional Microscopy vs. Computer Imagery in Chiropractic Education.

    Science.gov (United States)

    Cunningham, Christine M; Larzelere, Elizabeth D; Arar, Ilija

    2008-01-01

    As human tissue pathology slides become increasingly difficult to obtain, other methods of teaching microscopy in educational laboratories must be considered. The purpose of this study was to evaluate our students' satisfaction with newly implemented computer imagery based laboratory instruction and to obtain input from their perspective on the advantages and disadvantages of computerized vs. traditional microscope laboratories. This undertaking involved the creation of a new computer laboratory. Robbins and Cotran Pathologic Basis of Disease, 7th ed., was chosen as the required text which gave students access to the Robbins Pathology website, including complete content of text, Interactive Case Study Companion, and Virtual Microscope. Students had experience with traditional microscopes in their histology and microbiology laboratory courses. Student satisfaction with computer based learning was assessed using a 28 question survey which was administered to three successive trimesters of pathology students (n=193) using the computer survey website Zoomerang. Answers were given on a scale of 1-5 and statistically analyzed using weighted averages. The survey data indicated that students were satisfied with computer based learning activities during pathology laboratory instruction. The most favorable aspect to computer imagery was 24-7 availability (weighted avg. 4.16), followed by clarification offered by accompanying text and captions (weighted avg. 4.08). Although advantages and disadvantages exist in using conventional microscopy and computer imagery, current pathology teaching environments warrant investigation of replacing traditional microscope exercises with computer applications. Chiropractic students supported the adoption of computer-assisted instruction in pathology laboratories.

  9. Computational mathematics in China

    CERN Document Server

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  10. Applications of interval computations

    CERN Document Server

    Kreinovich, Vladik

    1996-01-01

    Primary Audience for the Book • Specialists in numerical computations who are interested in algorithms with automatic result verification. • Engineers, scientists, and practitioners who desire results with automatic verification and who would therefore benefit from the experience of successful applications. • Students in applied mathematics and computer science who want to learn these methods. Goal Of the Book This book contains surveys of applications of interval computations, i.e., applications of numerical methods with automatic result verification, that were presented at an international workshop on the subject in El Paso, Texas, February 23-25, 1995. The purpose of this book is to disseminate detailed and surveyed information about existing and potential applications of this new growing field. Brief Description of the Papers At the most fundamental level, interval arithmetic operations work with sets: The result of a single arithmetic operation is the set of all possible results as the o...
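
    To make the set-based idea concrete, a toy interval type (a self-contained sketch that ignores outward rounding, not any of the packages surveyed in the book):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

# An uncertain input propagated through an expression: the resulting interval encloses
# every value the exact computation could take (real packages additionally round the
# endpoints outward so the enclosure survives floating-point error).
x = Interval(1.9, 2.1)   # x = 2 +/- 0.1
y = Interval(-0.5, 0.5)
print(x * x - y)         # approximately Interval(lo=3.11, hi=4.91)
```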

  11. Computational Methods and Function Theory

    CERN Document Server

    Saff, Edward; Salinas, Luis; Varga, Richard

    1990-01-01

    The volume is devoted to the interaction of modern scientific computation and classical function theory. Many problems in pure and more applied function theory can be tackled using modern computing facilities: numerically as well as in the sense of computer algebra. On the other hand, computer algorithms are often based on complex function theory, and dedicated research on their theoretical foundations can lead to great enhancements in performance. The contributions - original research articles, a survey and a collection of problems - cover a broad range of such problems.

  12. Human computing and machine understanding of human behavior: A survey

    NARCIS (Netherlands)

    Pentland, Alex; Huang, Thomas S.; Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing should be about anticipatory user interfaces

  13. Structural dynamics in LMFBR containment analysis. A brief survey of computational methods and codes

    International Nuclear Information System (INIS)

    Chang, Y.W.

    1977-01-01

    This paper gives a brief survey of the computational methods and codes available for LMFBR containment analysis. The various numerical methods commonly used in the computer codes are compared. It provides reactor engineers with up-to-date information on the development of structural dynamics in LMFBR containment analysis. It can also be used as a basis for the selection of the numerical method in the future code development. First, the commonly used finite-difference expressions in the Lagrangian codes will be compared. Sample calculations will be used as a basis for discussing and comparing the accuracy of the various finite-difference representations. The distortion of the meshes will also be compared; the techniques used for eliminating the numerical instabilities will be discussed and compared using examples. Next, the numerical methods used in the Eulerian formulation will be compared, first among themselves and then with the Lagrangian formulations. Special emphasis is placed on the effect of mass diffusion of the Eulerian calculation on the propagation of discontinuities. Implicit and explicit numerical integrations will be discussed and results obtained from these two techniques will be compared. Then, the finite-element methods are compared with the finite-difference methods. The advantages and disadvantages of the two methods will be discussed in detail, together with the versatility and ease of application of the method to containment analysis having complex geometries. It will also be shown that the finite-element equations for a constant-pressure fluid element are identical to the finite-difference equations using contour integrations. Finally, conclusions based on this study will be given
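
    To make the explicit/implicit contrast concrete, a minimal single-degree-of-freedom sketch (not taken from any of the surveyed containment codes; parameters are arbitrary) compares the two update rules:

```python
# Single degree of freedom: m*u'' + k*u = 0, integrated with time step dt.
m, k, dt, steps = 1.0, 100.0, 0.01, 500
u_exp, v_exp = 1.0, 0.0   # state for the explicit scheme
u_imp, v_imp = 1.0, 0.0   # state for the implicit scheme

for _ in range(steps):
    # Explicit (symplectic Euler): acceleration from the current displacement.
    # Cheap per step, but stable only when dt is small compared with the period.
    a = -k / m * u_exp
    v_exp += a * dt
    u_exp += v_exp * dt

    # Implicit (backward Euler): solve for the new state.
    # Unconditionally stable, at the cost of a (here trivial) linear solve per step.
    denom = 1.0 + (dt ** 2) * k / m
    u_new = (u_imp + dt * v_imp) / denom
    v_imp = (u_new - u_imp) / dt
    u_imp = u_new

# The implicit result shows the artificial damping typical of backward Euler.
print("explicit displacement after 5 s:", round(u_exp, 4))
print("implicit displacement after 5 s:", round(u_imp, 4))
```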

  14. Computational Biology Support: RECOMB Conference Series (Conference Support)

    Energy Technology Data Exchange (ETDEWEB)

    Michael Waterman

    2006-06-15

    This funding provided support for student and postdoctoral attendance at the annual RECOMB Conference from 2001 to 2005. The RECOMB Conference series was founded in 1997 to provide a scientific forum for theoretical advances in computational biology and their applications in molecular biology and medicine. The conference series aims at attracting research contributions in all areas of computational molecular biology. Typical, but not exclusive, topics of interest are: Genomics, Molecular sequence analysis, Recognition of genes and regulatory elements, Molecular evolution, Protein structure, Structural genomics, Gene Expression, Gene Networks, Drug Design, Combinatorial libraries, Computational proteomics, and Structural and functional genomics. The origins of the conference lie on the mathematical and computational side of the field, and there remains a certain focus on computational advances. However, the effective application of computational techniques to biological innovation is also an important aspect of the conference. The conference has had a growing number of attendees, topping 300 in recent years and at times exceeding 500. The conference program includes between 30 and 40 contributed papers, which are selected by an international program committee of around 30 experts during a rigorous review process rivaling the editorial procedure for top-rate scientific journals. In previous years, paper selection has been made from up to 130-200 submissions from well over a dozen countries. 10-page extended abstracts of the contributed papers are collected in a volume published by ACM Press and Springer, and are available at the conference. Full versions of a selection of the papers are published annually in a special issue of the Journal of Computational Biology devoted to the RECOMB Conference. A further point in the program is a lively poster session. From 120 to 300 posters have been presented each year since RECOMB 2000. One of the highlights of each RECOMB conference is a

  15. Academic Computing Facilities and Services in Higher Education--A Survey.

    Science.gov (United States)

    Warlick, Charles H.

    1986-01-01

    Presents statistics about academic computing facilities based on data collected over the past six years from 1,753 institutions in the United States, Canada, Mexico, and Puerto Rico for the "Directory of Computing Facilities in Higher Education." Organizational, functional, and financial characteristics are examined as well as types of…

  16. Applications of symbolic algebraic computation

    International Nuclear Information System (INIS)

    Brown, W.S.; Hearn, A.C.

    1979-01-01

    This paper is a survey of applications of systems for symbolic algebraic computation. In most successful applications, calculations that can be taken to a given order by hand are then extended one or two more orders by computer. Furthermore, with a few notable exceptions, these applications also involve numerical computation in some way. Therefore the authors emphasize the interface between symbolic and numerical computation, including: 1. Computations with both symbolic and numerical phases. 2. Data involving both the unpredictable size and shape that typify symbolic computation and the (usually inexact) numerical values that characterize numerical computation. 3. Applications of one field to the other. It is concluded that the fields of symbolic and numerical computation can advance most fruitfully in harmony rather than in competition. (Auth.)
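
    One small example of that symbolic/numerical interface (a sketch assuming SymPy and NumPy, not drawn from the paper): an expression is differentiated exactly in the symbolic phase, then compiled into a vectorized numerical function:

```python
import numpy as np
import sympy as sp

x = sp.symbols('x')
expr = sp.sin(x) * sp.exp(-x**2)

# Symbolic phase: exact differentiation, free of truncation error.
dexpr = sp.diff(expr, x)
print(dexpr)

# Numerical phase: compile the symbolic result into a NumPy-vectorized function.
f = sp.lambdify(x, dexpr, modules='numpy')
xs = np.linspace(0.0, 2.0, 5)
print(f(xs))
```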

  17. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundancy components in series and/or parallel systems and alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundancy components and optimal selection of alternative designs to meet system requirement. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic method, Lagrangean multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approach for various reliability optimization problems, such as reliability optimization of redundant system, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce the hybrid approaches for combining GA with fuzzy logic, neural network and other conventional search techniques. Finally, we have some experiments with an example of various reliability optimization problems using hybrid GA approach
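
    As a compact illustration of the GA-based approach to redundancy allocation (a sketch only; the series-parallel system, component reliabilities, costs and GA settings below are invented):

```python
import random

random.seed(1)

# Series system of 4 subsystems; each may hold 1..4 redundant parallel components.
p = [0.90, 0.85, 0.95, 0.80]   # component reliability per subsystem
cost = [3.0, 4.0, 2.0, 5.0]    # cost per component
budget = 40.0

def system_reliability(n):
    r = 1.0
    for pi, ni in zip(p, n):
        r *= 1.0 - (1.0 - pi) ** ni   # parallel redundancy within each subsystem
    return r

def fitness(n):
    total_cost = sum(c * ni for c, ni in zip(cost, n))
    return 0.0 if total_cost > budget else system_reliability(n)  # penalize infeasible designs

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(n):
    child = list(n)
    i = random.randrange(len(child))
    child[i] = max(1, min(4, child[i] + random.choice((-1, 1))))
    return child

# Simple generational GA: elitist selection, one-point crossover, single-gene mutation.
pop = [[random.randint(1, 4) for _ in p] for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(20)]

best = max(pop, key=fitness)
print("best allocation:", best, "reliability:", round(system_reliability(best), 4))
```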

  18. An introduction to digital computing

    CERN Document Server

    George, F H

    2014-01-01

    An Introduction to Digital Computing provides information pertinent to the fundamental aspects of digital computing. This book represents a major step towards the universal availability of programmed material.Organized into four chapters, this book begins with an overview of the fundamental workings of the computer, including the way it handles simple arithmetic problems. This text then provides a brief survey of the basic features of a typical computer that is divided into three sections, namely, the input and output system, the memory system for data storage, and a processing system. Other c

  19. Data structures, computer graphics, and pattern recognition

    CERN Document Server

    Klinger, A; Kunii, T L

    1977-01-01

    Data Structures, Computer Graphics, and Pattern Recognition focuses on the computer graphics and pattern recognition applications of data structures methodology.This book presents design related principles and research aspects of the computer graphics, system design, data management, and pattern recognition tasks. The topics include the data structure design, concise structuring of geometric data for computer aided design, and data structures for pattern recognition algorithms. The survey of data structures for computer graphics systems, application of relational data structures in computer gr

  20. Who am I? ~ Undergraduate Computer Science Student

    OpenAIRE

    Ferris, Jane

    2012-01-01

    As part of a school review process a survey of the students was designed to gain insight into who the students of the school were. The survey was a voluntary anonymous online survey. Students were able to skip questions and select more than one option in some questions. This was to reduce frustration with participation in the survey and ensure that the survey was completed. This contribution details the average undergraduate Computer Science student of a large third level institute.

  1. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    Science.gov (United States)

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-08-11

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  2. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    Directory of Open Access Journals (Sweden)

    Minkyu Ahn

    2014-08-01

    Full Text Available In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  3. Toward using games to teach fundamental computer science concepts

    Science.gov (United States)

    Edgington, Jeffrey Michael

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college level introductory computer science course. The recursion game was evaluated with students in a college level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.

  4. Emotion in reinforcement learning agents and robots: A survey

    OpenAIRE

    Moerland, T.M.; Broekens, D.J.; Jonker, C.M.

    2018-01-01

    This article provides the first survey of computational models of emotion in reinforcement learning (RL) agents. The survey focuses on agent/robot emotions, and mostly ignores human user emotions. Emotions are recognized as functional in decision-making by influencing motivation and action selection. Therefore, computational emotion models are usually grounded in the agent's decision making architecture, of which RL is an important subclass. Studying emotions in RL-based agents is useful for ...

  5. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    Full Text Available I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  6. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    Directory of Open Access Journals (Sweden)

    Bundschuh Bettina B

    2011-11-01

    Full Text Available Abstract Background Due to the increasing functionality of medical information systems, it is hard to imagine day to day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare-IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical-IT according to the design principles of EN ISO 9241-10 the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for task, training effort and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human computer engineering in clinical health IT context for future studies.

  7. General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Roč. 15, č. 12 (2003), s. 2727-2778 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords: computational power * computational complexity * perceptrons * radial basis functions * spiking neurons * feedforward networks * recurrent networks * probabilistic computation * analog computation Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  8. PRIMARY SCHOOL PRINCIPALS’ ATTITUDES TOWARDS COMPUTER TECHNOLOGY IN THE USE OF COMPUTER TECHNOLOGY IN SCHOOL ADMINISTRATION

    OpenAIRE

    GÜNBAYI, İlhan; CANTÜRK, Gökhan

    2011-01-01

    The aim of the study is to determine the usage of computer technology in school administration, primary school administrators’ attitudes towards computer technology, and administrators’ and teachers’ computer literacy levels. The study was designed as a survey. The population of the study consists of primary school principals and assistant principals in public primary schools in the center of Antalya. The data were collected from 161 (51%) administrator questionnaires in 68 of 129 public primary s...

  9. Survey of computed tomography doses in head and chest protocols; Levantamento de doses em tomografia computadorizada em protocolos de cranio e torax

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Giordana Salvi de; Silva, Ana Maria Marques da, E-mail: giordana.souza@acad.pucrs.br [Pontificia Universidade Catolica do Rio Grande do Sul (PUC-RS), Porto Alegre, RS (Brazil). Faculdade de Fisica. Nucleo de Pesquisa em Imagens Medicas

    2016-07-01

    Computed tomography is a clinical tool for the diagnosis of patients. However, the patient is subjected to a complex dose distribution. The aim of this study was to survey dose indicators in head and chest CT protocols, in terms of Dose-Length Product (DLP) and effective dose for adult and pediatric patients, comparing them with diagnostic reference levels in the literature. Patients were divided into age groups and the following image acquisition parameters were collected: age, kV, mAs, Volumetric Computed Tomography Dose Index (CTDIvol) and DLP. The effective dose was found by multiplying DLP by correction factors. The results were obtained from the third quartile and showed the importance of determining kV and mAs values for each patient depending on the studied region, age and thickness. (author)

  10. Internet Use for Health-Related Information via Personal Computers and Cell Phones in Japan: A Cross-Sectional Population-Based Survey

    Science.gov (United States)

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro

    2011-01-01

    Background The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. Objectives This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. Methods We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15–79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. Results The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Conclusions Japanese moderately used the Internet via

  11. Internet use for health-related information via personal computers and cell phones in Japan: a cross-sectional population-based survey.

    Science.gov (United States)

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro; Nakayama, Takeo

    2011-12-14

    The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15-79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Japanese moderately used the Internet via personal computers for health purposes, and rarely

  12. Les Africaines Et Les Tic

    International Development Research Centre (IDRC) Digital Library (Canada)

    Survey on technologies, gender issues and empowerment ...... Polachek, S. (1981) «Occupational self-selection: a human capital approach to sex ...... in the Computer Science major», Communications of the ACM, 44(5): 108–14.

  13. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  14. Training School Administrators in Computer Use.

    Science.gov (United States)

    Spuck, Dennis W.; Bozeman, William C.

    1988-01-01

    Presents results of a survey of faculty members in doctoral-level educational administration programs that examined the use of computers in administrative training programs. The present status and future directions of technological training of school administrators are discussed, and a sample curriculum for a course in technology and computing is…

  15. Applied technology center business plan and market survey

    Science.gov (United States)

    Hodgin, Robert F.; Marchesini, Roberto

    1990-01-01

    A business plan and market survey for the Applied Technology Center (ATC), a non-profit corporation for computer technology transfer and development, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.

  16. Environmental computing compendium - background and motivation

    Science.gov (United States)

    Heikkurinen, Matti; Kranzlmüller, Dieter

    2017-04-01

    The emerging discipline of environmental computing brings together experts in applied, advanced environmental modelling. The application domains address several fundamental societal challenges, ranging from disaster risk reduction to sustainability issues (such as food security on the global scale). The community has used an intuitive, pragmatic approach when determining which initiatives are considered to "belong to the discipline". The community's growth is based on the sharing of experiences and tools, which provides opportunities for reusing solutions or applying knowledge in new settings. Thus, limiting possible synergies by applying an arbitrary, formal definition to exclude some of the sources of solutions and knowledge would be counterproductive. However, the number of individuals and initiatives involved has grown to the level where a survey of initiatives and the sub-themes they focus on is of interest. By surveying the project landscape, identifying common themes and building a shared vocabulary to describe them, we can both communicate the relevance of the new discipline to the general public more easily and make it easier for new members of the community to find the most promising collaboration partners. This talk presents the methodology and initial findings of the survey of environmental computing initiatives and organisations, as well as approaches that could lead to an environmental computing compendium that would be a collaboratively maintained shared resource of the environmental computing community.

  17. FPS scientific computers and supercomputers in chemistry

    International Nuclear Information System (INIS)

    Curington, I.J.

    1987-01-01

    FPS Array Processors, scientific computers, and highly parallel supercomputers are used in nearly all aspects of compute-intensive computational chemistry. A survey is made of work utilizing this equipment, both published and current research. The relationship of the computer architecture to computational chemistry is discussed, with specific reference to Molecular Dynamics, Quantum Monte Carlo simulations, and Molecular Graphics applications. Recent installations of the FPS T-Series are highlighted, and examples of Molecular Graphics programs running on the FPS-5000 are shown

  18. Individual radiation exposure from computed tomography. A survey of paediatric practice in French university hospitals, 2010-2013

    Energy Technology Data Exchange (ETDEWEB)

    Journy, Neige M.Y. [Institut de Radioprotection et de Surete Nucleaire, Fontenay-aux-Roses (France). Lab. d'Epidemiologie des Rayonnements Ionisants; National Institutes of Health, Bethesda, MD (United States). Div. of Cancer Epidemiology and Genetics; Dreuil, Serge [Institut de Radioprotection et de Surete Nucleaire, Fontenay-aux-Roses (France). Unite d'Expertise en Radioprotection Medicale; Boddaert, Nathalie [Hopital Universitaire Necker Enfants Malades, Paris (France). Service de Radiologie Pediatrique, INSERM U1000, UMR 1163; Cite Univ. Rene Descartes, Paris (France). PRES Sorbonne Paris; Chateil, Jean-Francois [Centre Hospitalier Universitaire de Bordeaux (France). Service de Radiologie et d'Imagerie Antenatale, de l'Enfant et de la Femme; Defez, Didier [Centre Hospitalier Lyon Sud, Pierre-Benite (France). Service de Physique Medicale; Ducou-le-Pointe, Hubert [Hopital d'Enfants Armand-Trousseau, Paris (France). Service de Radiologie; Garcier, Jean-Marc [Centre Hospitalier Universitaire Estaing, Clermont-Ferrand (France). Service de Radiologie; Guersen, Joel [Centre Hospitalier Universitaire Gabriel Montpied, Clermont-Ferrand (France). Pole Imagerie et Radiologie Interventionnelle; Habib Geryes, Bouchra [Hopital Universitaire Necker Enfants Malades, Paris (France). Direction des Affaires Medicales, de la Qualite et la Relation avec les Usagers; Jahnen, Andreas [Luxembourg Institute of Science and Technology (LIST), Esch/Alzette (Luxembourg); Lee, Choonsik [National Institutes of Health, Bethesda, MD (United States). Div. of Cancer Epidemiology and Genetics; Payen-de-la-Garanderie, Jacqueline; Pracros, Jean-Pierre [Hopital Femme Mere Enfants, Bron (France). Service d'Imagerie Pediatrique; Sirinelli, Dominique [Centre Hospitalier Regional Universitaire de Tours (France). Service de Radiologie Pediatrique, Hopital Clocheville; Thierry-Chef, Isabelle [International Agency for Research on Cancer, Lyon (France). Section of Environment and Cancer; Bernier, Marie-Odile [Institut de Radioprotection et de Surete Nucleaire, Fontenay-aux-Roses (France). Lab. d'Epidemiologie des Rayonnements Ionisants

    2018-02-15

    To describe computed tomography (CT) scanning parameters, volume CT dose index (CTDIvol) and dose-length product (DLP) in paediatric practice and compare them to current diagnostic reference levels (DRLs). The survey was conducted in radiology departments of six major university hospitals in France in 2010-2013. Data collection was automatised to extract and standardise information on scanning parameters from DICOM-header files. CTDIvol and DLP were estimated based on Monte Carlo transport simulation and computational reference phantoms. CTDIvol and DLP were derived for 4,300 studies, four age groups and 18 protocols. CTDIvol was lower in younger patients for non-head scans, but did not vary with age for routine head scans. Ratios of 95th to 5th percentile CTDIvol values were 2-4 for most body parts, but 5-7 for abdominal examinations and 4-14 for mediastinum CT with contrast, depending on age. The 75th percentile CTDIvol values were below the national DRLs for chest (all ages) and head and abdominal scans (≥10 years). The results suggest the need for a better optimisation of scanning parameters for routine head scans and infrequent protocols with patient age, enhanced standardisation of practices across departments and revision of current DRLs for children. (orig.)

  19. Individual radiation exposure from computed tomography. A survey of paediatric practice in French university hospitals, 2010-2013

    International Nuclear Information System (INIS)

    Journy, Neige M.Y.; Lee, Choonsik; Payen-de-la-Garanderie, Jacqueline; Pracros, Jean-Pierre; Sirinelli, Dominique; Thierry-Chef, Isabelle; Bernier, Marie-Odile

    2018-01-01

    To describe computed tomography (CT) scanning parameters, volume CT dose index (CTDIvol) and dose-length product (DLP) in paediatric practice and compare them to current diagnostic reference levels (DRLs). The survey was conducted in radiology departments of six major university hospitals in France in 2010-2013. Data collection was automatised to extract and standardise information on scanning parameters from DICOM-header files. CTDIvol and DLP were estimated based on Monte Carlo transport simulation and computational reference phantoms. CTDIvol and DLP were derived for 4,300 studies, four age groups and 18 protocols. CTDIvol was lower in younger patients for non-head scans, but did not vary with age for routine head scans. Ratios of 95th to 5th percentile CTDIvol values were 2-4 for most body parts, but 5-7 for abdominal examinations and 4-14 for mediastinum CT with contrast, depending on age. The 75th percentile CTDIvol values were below the national DRLs for chest (all ages) and head and abdominal scans (≥10 years). The results suggest the need for a better optimisation of scanning parameters for routine head scans and infrequent protocols with patient age, enhanced standardisation of practices across departments and revision of current DRLs for children. (orig.)

  20. Lytr, a phage-derived amidase is most effective in induced lysis of Lactococcus lactis compared with other lactococcal amidases and glucosaminidases

    NARCIS (Netherlands)

    Steen, Anton; van Schalkwijk, Saskia; Buist, Girbe; Twigt, Marja; Szeliga, Monika; Meijer, Wilco; Kuipers, Oscar P.; Kok, Jan; Hugenholtz, Jeroen

    In the genome of Lactococcus lactis IL1403 five genes encoding peptidoglycan hydrolases are present: four glucosaminidases (acmA, acmB, acmC and acmD) and an endopeptidase (yjgB). Genes for six prophage lysins have also been identified. The genes acmB, acmC, acmD, yjgB and the lysin lytR of prophage

  1. A computationally efficient approach for template matching-based ...

    Indian Academy of Sciences (India)

    In this paper, a new computationally efficient image registration method is ...... the proposed method requires less computational time as compared to traditional methods. ... Zitová B and Flusser J 2003 Image registration methods: A survey.

  2. What is Genselfdual?

    OpenAIRE

    Bouyukliev, Iliya; Bouyuklieva, Stefka; Dzhumalieva-Stoeva, Maria; Monev, Venelin

    2016-01-01

    This paper presents software developed in the area of Coding Theory. Using it, all binary self-dual codes with given properties can be classified. The programs have sequential and parallel implementations. ACM Computing Classification System (1998): G.4, E.4.

  3. Computer finds ore

    Science.gov (United States)

    Bell, Peter M.

    Artificial intelligence techniques are being used for the first time to evaluate geophysical, geochemical, and geologic data and theory in order to locate ore deposits. After several years of development, an intelligent computer code has been formulated and applied to the Mount Tolman area in Washington state. In a project funded by the United States Geological Survey and the National Science Foundation, a set of computer programs, under the general title Prospector, was used successfully to locate a previously unknown ore-grade porphyry molybdenum deposit in the vicinity of Mount Tolman (Science, Sept. 3, 1982). The general area of the deposit had been known to contain exposures of porphyry mineralization. Between 1964 and 1978, exploration surveys had been run by the Bear Creek Mining Company, and later exploration was done in the area by the Amax Corporation. Some of the geophysical data and geochemical and other prospecting surveys were incorporated into the programs, and mine exploration specialists contributed to a set of rules for Prospector. The rules were encoded as ‘inference networks’ to form the ‘expert system’ on which the artificial intelligence codes were based. The molybdenum ore deposit discovered by the test is large, located subsurface, and has an areal extent of more than 18 km2.
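
    The record above describes rules encoded as probabilistic inference networks. As a purely illustrative sketch (the rule names, likelihood ratios and prior below are invented, not Prospector's actual knowledge base), the following Python fragment shows the kind of subjective-Bayesian, likelihood-ratio updating such expert systems apply to field evidence:

        # Toy inference network: each piece of field evidence updates the odds of a
        # hypothesis ("porphyry molybdenum deposit present") via likelihood ratios.
        def to_odds(p):
            return p / (1.0 - p)

        def to_prob(o):
            return o / (1.0 + o)

        # Hypothetical rules: (LS, LN) multiply the odds when the evidence is
        # observed (LS) or absent (LN). Numbers are made up for illustration.
        RULES = {
            "hydrothermal alteration mapped": (20.0, 0.5),
            "anomalous Mo in stream sediments": (10.0, 0.3),
            "favourable magnetic signature": (3.0, 0.8),
        }

        def evaluate(prior, observations):
            """Combine evidence multiplicatively in odds space."""
            odds = to_odds(prior)
            for rule, (ls, ln) in RULES.items():
                odds *= ls if observations.get(rule, False) else ln
                print(f"{rule:40s} -> P = {to_prob(odds):.3f}")
            return to_prob(odds)

        if __name__ == "__main__":
            evaluate(prior=0.01, observations={
                "hydrothermal alteration mapped": True,
                "anomalous Mo in stream sediments": True,
                "favourable magnetic signature": False,
            })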

  4. Speech-enabled Computer-aided Translation

    DEFF Research Database (Denmark)

    Mesa-Lao, Bartolomé

    2014-01-01

    The present study has surveyed post-editor trainees’ views and attitudes before and after the introduction of speech technology as a front end to a computer-aided translation workbench. The aim of the survey was (i) to identify attitudes and perceptions among post-editor trainees before performing...... a post-editing task using automatic speech recognition (ASR); and (ii) to assess the degree to which post-editors’ attitudes and expectations to the use of speech technology changed after actually using it. The survey was based on two questionnaires: the first one administered before the participants...

  5. Development of carborne gamma-ray survey system

    International Nuclear Information System (INIS)

    Sakamoto, Ryuichi; Tsutsumi, Masahiro

    1999-01-01

    A carborne γ-ray survey system was developed with the aim of carrying out environmental radiation investigations in and around the Chernobyl area after the accident. The system consists of a dose rate meter, a GPS and a personal computer, and acquires γ-ray dose rates and positional data using two serial ports of the computer. Acquiring these two kinds of data simultaneously is indispensable for carborne γ-ray surveys over a large area. A feature of the system is that it can acquire these data continuously by means of simple programming. The system has been used in and around the Chernobyl area for a few years and has worked without any trouble. An outline of this survey system, which is applicable to the configuration of environmental radiation data acquisition systems, is reported together with some results from Chernobyl. (author)
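
    As a minimal sketch of the acquisition loop described above, the following Python fragment pairs readings from a dose rate meter and a GPS receiver attached to two serial ports and logs them together. Port names, baud rates and message formats are assumptions for illustration, not the authors' software:

        import csv
        import time
        import serial  # pyserial

        DOSE_PORT = "/dev/ttyUSB0"   # hypothetical port for the dose rate meter
        GPS_PORT = "/dev/ttyUSB1"    # hypothetical port for the GPS receiver

        def parse_gga(line):
            """Extract latitude/longitude from a NMEA GGA sentence (simplified)."""
            fields = line.split(",")
            if not line.startswith("$GPGGA") or len(fields) < 6:
                return None
            return fields[2] + fields[3], fields[4] + fields[5]

        def main():
            dose_meter = serial.Serial(DOSE_PORT, 9600, timeout=1)
            gps = serial.Serial(GPS_PORT, 4800, timeout=1)
            with open("carborne_survey.csv", "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["time", "lat", "lon", "dose_rate"])
                while True:
                    # Read one line from each instrument and pair them.
                    dose_line = dose_meter.readline().decode(errors="ignore").strip()
                    gps_line = gps.readline().decode(errors="ignore").strip()
                    position = parse_gga(gps_line)
                    if dose_line and position:
                        writer.writerow([time.time(), position[0], position[1], dose_line])
                    time.sleep(1.0)  # one paired record per second

        if __name__ == "__main__":
            main()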

  6. Borehole television survey

    International Nuclear Information System (INIS)

    Lau, J.S.O.

    1980-01-01

    The borehole television survey can provide a measure of the orientation, depth, width and aperture of any planar discontinuity intersected by a borehole and a technique is in an advanced stage of development by the Geological Survey of Canada (GSC) to make such measurements. Much of its practical application to date has been in crystalline rocks (plutons) at research areas pertaining to the Nuclear Waste Disposal Program in Canada. It also has many other engineering applications where bedrock stability is of particular concern. The equipment required to carry out the survey can be readily transported by two panel trucks with trailers. The components consist of a camera probe, control unit, cable storage reel, cable drive, video-tape recorder, TV monitor and two electrical generators. An inclined planar structure intersected by a borehole appears as an elliptical trace on the wall of the borehole. Such an intersection line shows on the TV monitor as a sinusoidal curve with a high point and a low point as the camera rotates through an angle of 360 degrees. The azimuth of the low point, measured by a compass in the camera probe, represents the direction of the dip of the planar structure. The angle of dip is measured midway between the high and low points or is computed from the maximum-to-minimum distance of the sinusoid and the hole diameter. These observations provide the true orientation of the planar structure if the borehole is vertical. However, if the borehole is inclined, direct observations will only provide the apparent orientation. The true orientation must thus be obtained either by means of stereographic projection or spherical trigonometry. A computer program has been written to calculate the true orientation from the apparent orientation. In the field, observation data are recorded directly on a data record sheet for keypunching and input into the computer
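
    The geometry described above can be illustrated with a short Python sketch for the vertical-borehole case (the abstract notes that inclined holes require a further stereographic or trigonometric correction, which is omitted here); the numbers in the example are hypothetical:

        import math

        def dip_from_sinusoid(peak_to_trough, hole_diameter):
            """Dip angle in degrees from the max-to-min height of the sinusoidal
            trace and the borehole diameter (same length units for both)."""
            return math.degrees(math.atan2(peak_to_trough, hole_diameter))

        def orientation_vertical_hole(low_point_azimuth_deg, peak_to_trough, hole_diameter):
            """For a vertical hole, the azimuth of the low point of the sinusoid is
            the dip direction; the dip follows from the amplitude and diameter."""
            dip = dip_from_sinusoid(peak_to_trough, hole_diameter)
            return low_point_azimuth_deg % 360.0, dip

        if __name__ == "__main__":
            # Hypothetical fracture: low point at azimuth 135 deg, 0.13 m
            # peak-to-trough on the unrolled image of a 76 mm diameter hole.
            dip_direction, dip = orientation_vertical_hole(135.0, 0.13, 0.076)
            print(f"dip direction {dip_direction:.0f} deg, dip {dip:.1f} deg")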

  7. Computers in technical information transfer

    Energy Technology Data Exchange (ETDEWEB)

    Price, C.E.

    1978-01-01

    The use of computers in transferring scientific and technical information from its creation to its use is surveyed. The traditional publication and distribution processes for S and T literature in past years have been the vehicle for transfer, but computers have altered the process in phenomenal ways. Computers are used in literature publication through text editing and photocomposition applications. Abstracting and indexing services use computers for preparing their products, but the machine-readable document descriptions created for this purpose are input to a rapidly growing computerized information retrieval service industry. Computer use is making many traditional processes passé, and may eventually lead to a largely "paperless" information utility.

  8. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  9. From Monte Carlo to Quantum Computation

    OpenAIRE

    Heinrich, Stefan

    2001-01-01

    Quantum computing was so far mainly concerned with discrete problems. Recently, E. Novak and the author studied quantum algorithms for high dimensional integration and dealt with the question, which advantages quantum computing can bring over classical deterministic or randomized methods for this type of problem. In this paper we give a short introduction to the basic ideas of quantum computing and survey recent results on high dimensional integration. We discuss connections to the Monte Carl...

  10. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  11. Evaluation of the Presentation of Network Data via Visualization Tools for Network Analysts

    Science.gov (United States)

    2014-03-01

    distortion presentation techniques based on a tabular layout (10) such as Document Lens (11), fish-eye views (12, 13), stretching rubber sheets (14...User Interface Software and Technology, pages 29–38, ACM Press, 1998. 36. Schmitz, C. Limesurvey-the Open Source Survey Application. Hamburg [cited

  12. Future Field Programmable Gate Array (FPGA) Design Methodologies and Tool Flows

    Science.gov (United States)

    2008-07-01

    Cruickshank, J. E. Gaffney and R. D. Melbourne, Australia : ACM, 1992. Proceedings of the 14th International Conference on Software Engineering. pp. 327-337... Ridge Compiler Collection Stone Ridge Technology 48 A.3 FPGA Architecture Survey Company Niche 3P plus 1 Technology Coarse-grain configurable IP

  13. Computing layouts with deformable templates

    KAUST Repository

    Peng, Chi-Han

    2014-07-22

    In this paper, we tackle the problem of tiling a domain with a set of deformable templates. A valid solution to this problem completely covers the domain with templates such that the templates do not overlap. We generalize existing specialized solutions and formulate a general layout problem by modeling important constraints and admissible template deformations. Our main idea is to break the layout algorithm into two steps: a discrete step to lay out the approximate template positions and a continuous step to refine the template shapes. Our approach is suitable for a large class of applications, including floorplans, urban layouts, and arts and design. Copyright © ACM.

  14. Computing layouts with deformable templates

    KAUST Repository

    Peng, Chi-Han; Yang, Yongliang; Wonka, Peter

    2014-01-01

    In this paper, we tackle the problem of tiling a domain with a set of deformable templates. A valid solution to this problem completely covers the domain with templates such that the templates do not overlap. We generalize existing specialized solutions and formulate a general layout problem by modeling important constraints and admissible template deformations. Our main idea is to break the layout algorithm into two steps: a discrete step to lay out the approximate template positions and a continuous step to refine the template shapes. Our approach is suitable for a large class of applications, including floorplans, urban layouts, and arts and design. Copyright © ACM.

  15. Exposure characteristics of positive tone electron beam resist containing p-chloro-α-methylstyrene

    Science.gov (United States)

    Ochiai, Shunsuke; Takayama, Tomohiro; Kishimura, Yukiko; Asada, Hironori; Sonoda, Manae; Iwakuma, Minako; Hoshino, Ryoichi

    2017-07-01

    The positive tone resist consisting of methyl-α-chloroacrylate (ACM) and α-methylstyrene (MS) has higher sensitivity and higher dry etching resistance than poly(methyl methacrylate) (PMMA) due to the presence of a chlorine atom and a phenyl group. Copolymers consisting of ACM and p-chloro-α-methylstyrene (PCMS), in which an additional chlorine atom is introduced into the phenyl group compared with the ACM-MS resist, are synthesized and their exposure characteristics are investigated. The ACM-PCMS resist with an ACM:PCMS composition ratio of 49:51 shows high solubility in the amyl acetate developer. As the ACM composition ratio increases, the solubility of the ACM-PCMS resist is suppressed. In both ACM-PCMS and ACM-MS resists, the sensitivity decreases while the contrast increases with increasing ACM ratio. When the composition ratio of ACM:PCMS is 69:31, a 100/100 nm line and space pattern having a good shape is obtained at 120 μC/cm2, which is comparable to the required exposure dose for the conventional ACM-MS resist with ACM:MS = 50:50. Dry etching resistance of ACM-PCMS resists to Ar gas is also presented.

  16. The role of parents and related factors on adolescent computer use

    Directory of Open Access Journals (Sweden)

    Jennifer A. Epstein

    2012-02-01

    Full Text Available Background. Research has suggested the importance of parents in their adolescents' computer activity. Spending too much time on the computer for recreational purposes in particular has been found to be related to areas of public health concern in children/adolescents, including obesity and substance use. Design and Methods. The goal of the research was to determine the association between recreational computer use and potentially linked factors (parental monitoring, social influences to use computers including parents, age of first computer use, self-control, and particular internet activities). Participants (aged 13-17 years and residing in the United States) were recruited via the Internet to complete an anonymous survey online using a survey tool. The target sample of 200 participants who completed the survey was achieved. The sample's average age was 16 and 63% were girls. Results. A set of regressions with recreational computer use as dependent variables was run. Conclusions. Less parental monitoring, younger age at first computer use, listening to or downloading music from the internet more frequently, using the internet for educational purposes less frequently, and parents' use of the computer for pleasure were related to spending a greater percentage of time on non-school computer use. These findings suggest the importance of parental monitoring and parental computer use for their children's own computer use, and the influence of some internet activities on adolescent computer use. Finally, programs aimed at parents to help them increase the age when their children start using computers and learn how to place limits on recreational computer use are needed.

  17. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  18. Multiplayer computer games as youth's leisure phenomenon

    OpenAIRE

    HADERKOVÁ, Barbora

    2016-01-01

    The thesis is dedicated to multiplayer computer games as a present-day youth leisure phenomenon. The theoretical part focuses on the history of computer games, multiplayer computer games and their types, gaming platforms, the community of multiplayer game players, and the potential negatives and positives that follow from playing this type of game. The practical part contains a qualitative survey using interviews with multiplayer computer game players aged from 15 to 26 years from the city of České Bud...

  19. Consécration pour les Inventeurs du World-Wide Web

    CERN Multimedia

    CERN Press Office. Geneva

    1996-01-01

    Nearly seven years after it was invented at CERN, the World-Wide Web has woven its way into every corner of the Internet. On Saturday, 17 February, the inventors of the Web, Tim Berners-Lee, now at Massachusetts Institute of Technology (MIT), and Robert Cailliau of CERN's Electronics and Computing for Physics (ECP) Division, will be honoured with one of computing's highest distinctions: the Association for Computing Machinery (ACM) Software System Award 1995.

  20. A survey of common habits of computer users as indicators of ...

    African Journals Online (AJOL)

    Yomi

    2012-01-31

    Jan 31, 2012 ... Hygiene has been recognized as an infection control strategy and the extent of the problems of environmental contamination largely depends on personal hygiene. With the development of several computer applications in recent times, the uses of computer systems have greatly expanded. And with.

  1. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students' music learning, the authors proposed a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. To this end, we conducted an in-depth analysis of computer-enabled music learning and the status of music learning in secondary schools, obtaining specific analytical data. Survey data show that students have many cognitive problems in the current music classroom, and yet teach...

  2. Transuranic Computational Chemistry.

    Science.gov (United States)

    Kaltsoyannis, Nikolas

    2018-02-26

    Recent developments in the chemistry of the transuranic elements are surveyed, with particular emphasis on computational contributions. Examples are drawn from molecular coordination and organometallic chemistry, and from the study of extended solid systems. The role of the metal valence orbitals in covalent bonding is a particular focus, especially the consequences of the stabilization of the 5f orbitals as the actinide series is traversed. The fledgling chemistry of transuranic elements in the +II oxidation state is highlighted. Throughout, the symbiotic interplay of experimental and computational studies is emphasized; the extraordinary challenges of experimental transuranic chemistry afford computational chemistry a particularly valuable role at the frontier of the periodic table. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Quantum chromodynamics with advanced computing

    International Nuclear Information System (INIS)

    Kronfeld, A S

    2008-01-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists

  4. An Informal Survey of the CTI Backup System

    Directory of Open Access Journals (Sweden)

    Joseph Covino

    1981-06-01

    Full Text Available In order to help decide whether or not to purchase computer backup systems from Computer Translation, Inc. (CTI), for use when the CLSI LIBS 100 automated circulation system is not operating, Great Neck Library conducted an informal survey of libraries using both systems

  5. 40 CFR Appendix A to Subpart M of... - Interpretive Rule Governing Roof Removal Operations

    Science.gov (United States)

    2010-07-01

    ...-containing material (ACM) is material containing more than one percent asbestos as determined using the... NESHAP classifies ACM as either “friable” or “nonfriable”. Friable ACM is ACM that, when dry, can be crumbled, pulverized or reduced to powder by hand pressure. Nonfriable ACM is ACM that, when dry, cannot be...

  6. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  7. A Survey on the Relation between the Rate of Watching and Playing Computer Game and Weight Disorder Regarding first Grade Students in Elementary School in Karaj in 2012

    Directory of Open Access Journals (Sweden)

    K. Kabir

    2013-10-01

    Full Text Available Background: Lifestyle changes have been considered in different ways in urban communities. Apartment housing, lack of physical activity, both parents being employed, and entertainment devices with display screens being within easy reach lead our children to a sedentary lifestyle. Obesity is just one of the side effects of this kind of living, while children's first exposure to computers is occurring at ever younger ages. There have been many surveys on the time high school students spend on these entertainments, but no research has been carried out on elementary school students in Iran. Methods: This cross-sectional study was carried out on a sample of 450 male and female students attending the assessment center to register for the first grade in Karaj city. Results & conclusion: In this survey, the average time newcomers spent in front of a display screen was taken into account. The results showed that part of a child's daily activities was allocated to electronic devices with a display screen. Our findings showed that the time allocated to watching TV was 2.6 hours per day, satellite programs 0.49 hours per day, computer operation 0.9 hours per day, computer games 0.38 hours per day, and PlayStation 0.14 hours per day; in total, they used electronic devices 4.6 hours per day. Moreover, in this study, the BMI of each case was calculated and the prevalence of weight disorders was studied. Regarding weight disorders, we found that 15.8% of students in this survey were underweight, 69.8% were in the normal range, 8.9% were overweight and 5.4% were obese. The relation between weight disorders and the amount of time spent using electronic display screens was also studied; however, we could not find any association between the two variables, probably because weight disorders are affected by many other factors. We considered demographic variables as well as other variables which may affect weight
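
    As a small illustration of the BMI step mentioned above (BMI = weight in kg divided by height in metres squared), the sketch below uses simplified adult-style cut-offs; a paediatric study such as this one would normally classify children by age- and sex-specific BMI percentiles instead:

        def bmi(weight_kg, height_m):
            # Body mass index: weight divided by the square of height.
            return weight_kg / height_m ** 2

        def adult_style_category(value):
            # Simplified thresholds for illustration only (not paediatric percentiles).
            if value < 18.5:
                return "underweight"
            if value < 25.0:
                return "normal"
            if value < 30.0:
                return "overweight"
            return "obese"

        if __name__ == "__main__":
            value = bmi(22.0, 1.18)  # hypothetical child: 22 kg, 1.18 m
            print(round(value, 1), adult_style_category(value))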

  8. Novel derivatives of aclacinomycin A block cancer cell migration through inhibition of farnesyl transferase.

    Science.gov (United States)

    Magi, Shigeyuki; Shitara, Tetsuo; Takemoto, Yasushi; Sawada, Masato; Kitagawa, Mitsuhiro; Tashiro, Etsu; Takahashi, Yoshikazu; Imoto, Masaya

    2013-03-01

    In the course of screening for an inhibitor of farnesyl transferase (FTase), we identified two compounds, N-benzyl-aclacinomycin A (ACM) and N-allyl-ACM, which are new derivatives of ACM. N-benzyl-ACM and N-allyl-ACM inhibited FTase activity with IC50 values of 0.86 and 2.93 μM, respectively. Not only ACM but also C-10 epimers of each ACM derivative failed to inhibit FTase. The inhibition of FTase by N-benzyl-ACM and N-allyl-ACM seems to be specific, because these two compounds did not inhibit geranylgeranyltransferase or geranylgeranyl pyrophosphate (GGPP) synthase up to 100 μM. In cultured A431 cells, N-benzyl-ACM and N-allyl-ACM also blocked both the membrane localization of H-Ras and activation of the H-Ras-dependent PI3K/Akt pathway. In addition, they inhibited epidermal growth factor (EGF)-induced migration of A431 cells. Thus, N-benzyl-ACM and N-allyl-ACM inhibited EGF-induced migration of A431 cells by inhibiting the farnesylation of H-Ras and subsequent H-Ras-dependent activation of the PI3K/Akt pathway.

  9. Problems experienced by people with arthritis when using a computer.

    Science.gov (United States)

    Baker, Nancy A; Rogers, Joan C; Rubinstein, Elaine N; Allaire, Saralynn H; Wasko, Mary Chester

    2009-05-15

    To describe the prevalence of computer use problems experienced by a sample of people with arthritis, and to determine differences in the magnitude of these problems among people with rheumatoid arthritis (RA), osteoarthritis (OA), and fibromyalgia (FM). Subjects were recruited from the Arthritis Network Disease Registry and asked to complete a survey, the Computer Problems Survey, which was developed for this study. Descriptive statistics were calculated for the total sample and the 3 diagnostic subgroups. Ordinal regressions were used to determine differences between the diagnostic subgroups with respect to each equipment item while controlling for confounding demographic variables. A total of 359 respondents completed a survey. Of the 315 respondents who reported using a computer, 84% reported a problem with computer use attributed to their underlying disorder, and approximately 77% reported some discomfort related to computer use. Equipment items most likely to account for problems and discomfort were the chair, keyboard, mouse, and monitor. Of the 3 subgroups, significantly more respondents with FM reported more severe discomfort, more problems, and greater limitations related to computer use than those with RA or OA for all 4 equipment items. Computer use is significantly affected by arthritis. This could limit the ability of a person with arthritis to participate in work and home activities. Further study is warranted to delineate disease-related limitations and develop interventions to reduce them.

  10. 78 FR 15014 - Change in Bank Control Notices; Acquisitions of Shares of a Bank or Bank Holding Company

    Science.gov (United States)

    2013-03-08

    ... Meyer, Jr., individually and as trustee of the: ACM, Jr. 2010 3Y GRAT A, the ACM, Jr. 2010 3Y GRAT B, the ACM, Jr. 2010 3Y GRAT C, the ACM, Jr. 2013 2Y GRAT A, the ACM, Jr. 2013 2Y GRAT B, the ACM, Jr. 2013 2Y GRAT C, the ACM, Jr. 2013 2Y GRAT D, the Katharine Clara Kimmel Non- Exempt Trust C/U Elisabeth...

  11. Marvels of modern electronics a survey

    CERN Document Server

    Lunt, Barry

    2013-01-01

    This reader-friendly survey focuses on innovations of the past 40 years, including computers, integrated circuits, the Internet, cell phones, GPS, optical fibers, and more. Engaging, mildly technical, authoritative treatment. 2013 edition.

  12. The LLL algorithm survey and applications

    CERN Document Server

    Nguyen, Phong Q

    2010-01-01

    The first book to offer a comprehensive view of the LLL algorithm, this text surveys computational aspects of Euclidean lattices and their main applications. It includes many detailed motivations, explanations and examples.

  13. Emotion in reinforcement learning agents and robots : A survey

    NARCIS (Netherlands)

    Moerland, T.M.; Broekens, D.J.; Jonker, C.M.

    2018-01-01

    This article provides the first survey of computational models of emotion in reinforcement learning (RL) agents. The survey focuses on agent/robot emotions, and mostly ignores human user emotions. Emotions are recognized as functional in decision-making by influencing motivation and action

  14. The situation of computer utilization in radiation therapy in Japan and other countries and problems

    International Nuclear Information System (INIS)

    Onai, Yoshio

    1981-01-01

    The uses of computers in radiation therapy are various, such as radiation dose calculation, clinical history management, radiotherapy instrument automation and biological modelling. To grasp the situation in this field, a survey by questionnaire was carried out internationally at the time of the 7th International Conference on the Use of Computers in Radiation Therapy held in Kawasaki and Tokyo in September, 1980. The surveyed nations totaled 21 including Japan; the number of facilities that answered was 203 in Japan and 111 in other countries, and the period concerned is December, 1979, to September, 1980. The results of the survey are described as follows: areas of use of computers in hospitals, computer utilization in radiation departments, computer uses in radiation therapy, and evaluation of radiotherapy computer uses and problems. (J.P.N.)

  15. Diagnostic reference levels for common computed tomography (CT) examinations: results from the first Nigerian nationwide dose survey.

    Science.gov (United States)

    Ekpo, Ernest U; Adejoh, Thomas; Akwo, Judith D; Emeka, Owujekwe C; Modu, Ali A; Abba, Mohammed; Adesina, Kudirat A; Omiyi, David O; Chiegwu, Uche H

    2018-01-29

    To explore doses from common adult computed tomography (CT) examinations and propose national diagnostic reference levels (nDRLs) for Nigeria. This retrospective study was approved by the Nnamdi Azikiwe University and University Teaching Hospital Institutional Review Boards (IRB: NAUTH/CS/66/Vol8/84) and involved dose surveys of adult CT examinations across the six geographical regions of Nigeria and Abuja from January 2016 to August 2017. Dose data of adult head, chest and abdomen/pelvis CT examinations were extracted from patient folders. The median, 75th and 25th percentile CT dose index volume (CTDIvol) and dose-length-product (DLP) were computed for each of these procedures. Effective doses (E) for these examinations were estimated using the k conversion factor as described in the ICRP publication 103 (E = k × DLP). The proposed 75th percentile CTDIvol for head, chest, and abdomen/pelvis are 61 mGy, 17 mGy, and 20 mGy, respectively. The corresponding DLPs are 1310 mGy.cm, 735 mGy.cm, and 1486 mGy.cm respectively. The effective doses were 2.75 mSv (head), 10.29 mSv (chest), and 22.29 mSv (abdomen/pelvis). Findings demonstrate wide dose variations within and across centres in Nigeria. The results also show CTDIvol comparable to international standards, but considerably higher DLP and effective doses.
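
    The conversion E = k × DLP quoted above can be checked with a few lines of Python. The k coefficients below are the commonly used ICRP-103-era adult values and are assumed here rather than taken from the record, but they reproduce the effective doses reported in the abstract from the proposed DLPs:

        K_FACTORS = {           # assumed conversion factors, mSv per mGy.cm
            "head": 0.0021,
            "chest": 0.014,
            "abdomen/pelvis": 0.015,
        }

        PROPOSED_DLP = {        # 75th percentile DLPs from the abstract, mGy.cm
            "head": 1310,
            "chest": 735,
            "abdomen/pelvis": 1486,
        }

        for region, dlp in PROPOSED_DLP.items():
            effective_dose = K_FACTORS[region] * dlp
            print(f"{region:15s} DLP = {dlp:5d} mGy.cm -> E = {effective_dose:.2f} mSv")
        # Prints 2.75 mSv (head), 10.29 mSv (chest) and 22.29 mSv (abdomen/pelvis),
        # matching the values reported in the record.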

  16. Neuromorphic Computing for Very Large Test and Evaluation Data Analysis

    Science.gov (United States)

    2014-05-01

    Conference on Neural Networks IEEE , (To be published in IEEE Xplore ) Liu, B., Chen, Y., Wysocki, B., Huang, T., “The Circuit Realization of a...A. "Compact Method for Modeling and Simulation of Memristor Devices: Ion Conductor Chalcogenide-Based Memristor Devices," IEEE /ACM International...R.E.; Rogers, S.; , "A Memristor Device Model," Electron Device Letters, IEEE , Vol. 32, no.10, Oct. 2011, pp.1436-1438 Approved for Public

  17. Data mining in e-commerce: A survey

    Indian Academy of Sciences (India)

    Data mining has matured as a field of basic and applied research in computer science in general and e-commerce in particular. In this paper, we survey some of the recent approaches and architectures where data mining has been applied in the fields of e-commerce and e-business. Our intent is not to survey the plethora ...

  18. Adolescent computer use and alcohol use: what are the role of quantity and content of computer use?

    Science.gov (United States)

    Epstein, Jennifer A

    2011-05-01

    The purpose of this study was to examine the relationship between computer use and alcohol use among adolescents. In particular, the goal of the research was to determine the role of lifetime drinking and past month drinking on quantity as measured by amount of time on the computer (for school work and excluding school work) and on content as measured by the frequency of a variety of activities on the internet (e.g., e-mail, searching for information, social networking, listen to/download music). Participants (aged 13-17 years and residing in the United States) were recruited via the internet to complete an anonymous survey online using a popular survey tool (N=270). Their average age was 16 and the sample was predominantly female (63% girls). A series of analyses was conducted with the computer use measures as dependent variables (hours on the computer per week for school work and excluding school work; various internet activities including e-mail, searching for information, social networking, listen to/download music) controlling for gender, age, academic performance and age of first computer use. Based on the results, past month drinkers used the computer more hours per week excluding school work than those who did not. As expected, there were no differences in hours based on alcohol use for computer use for school work. Drinking also had relationships with more frequent social networking and listening to/downloading music. These findings suggest that both quantity and content of computer use were related to adolescent drinking. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. RMIT at the TREC 2015 LiveQA Track

    Science.gov (United States)

    2015-11-20

    recipient of an Australian Research Council DECRA Research Fellowship (DE140100275). REFERENCES [1] Michael Collins. Head-driven statistical models for...18th Australasian Document Computing Symposium, pages 58–65. ACM, 2013. [4] Matthias Petri, Alistair Moffat, and J Shane Culpepper. Score-safe term

  20. Acemetacin cocrystals and salts: structure solution from powder X-ray data and form selection of the piperazine salt.

    Science.gov (United States)

    Sanphui, Palash; Bolla, Geetha; Nangia, Ashwini; Chernyshev, Vladimir

    2014-03-01

    Acemetacin (ACM) is a non-steroidal anti-inflammatory drug (NSAID), which causes reduced gastric damage compared with indomethacin. However, acemetacin has a tendency to form a less soluble hydrate in the aqueous medium. We noted difficulties in the preparation of cocrystals and salts of acemetacin by mechanochemical methods, because this drug tends to form a hydrate during any kind of solution-based processing. With the objective to discover a solid form of acemetacin that is stable in the aqueous medium, binary adducts were prepared by the melt method to avoid hydration. The coformers/salt formers reported are pyridine carboxamides [nicotinamide (NAM), isonicotinamide (INA), and picolinamide (PAM)], caprolactam (CPR), p-aminobenzoic acid (PABA), and piperazine (PPZ). The structures of an ACM-INA cocrystal and a binary adduct ACM-PABA were solved using single-crystal X-ray diffraction. Other ACM cocrystals, ACM-PAM and ACM-CPR, and the piperazine salt ACM-PPZ were solved from high-resolution powder X-ray diffraction data. The ACM-INA cocrystal is sustained by the acid⋯pyridine heterosynthon and N-H⋯O catemer hydrogen bonds involving the amide group. The acid⋯amide heterosynthon is present in the ACM-PAM cocrystal, while ACM-CPR contains carboxamide dimers of caprolactam along with acid-carbonyl (ACM) hydrogen bonds. The cocrystals ACM-INA, ACM-PAM and ACM-CPR are three-dimensional isostructural. The carboxyl⋯carboxyl synthon in ACM-PABA posed difficulty in assigning the position of the H atom, which may indicate proton disorder. In terms of stability, the salts were found to be relatively stable in pH 7 buffer medium over 24 h, but the cocrystals dissociated to give ACM hydrate during the same time period. The ACM-PPZ salt and ACM-nicotinamide cocrystal dissolve five times faster than the stable hydrate form, whereas the ACM-PABA adduct has 2.5 times faster dissolution rate. The pharmaceutically acceptable piperazine salt of acemetacin exhibits superior

  1. MiR-320a as a Potential Novel Circulating Biomarker of Arrhythmogenic CardioMyopathy.

    Science.gov (United States)

    Sommariva, Elena; D'Alessandra, Yuri; Farina, Floriana Maria; Casella, Michela; Cattaneo, Fabio; Catto, Valentina; Chiesa, Mattia; Stadiotti, Ilaria; Brambilla, Silvia; Dello Russo, Antonio; Carbucicchio, Corrado; Vettor, Giulia; Riggio, Daniela; Sandri, Maria Teresa; Barbuti, Andrea; Vernillo, Gianluca; Muratori, Manuela; Dal Ferro, Matteo; Sinagra, Gianfranco; Moimas, Silvia; Giacca, Mauro; Colombo, Gualtiero Ivanoe; Pompilio, Giulio; Tondo, Claudio

    2017-07-06

    Diagnosis of Arrhythmogenic CardioMyopathy (ACM) is challenging and often late after disease onset. No circulating biomarkers are available to date. Given their involvement in several cardiovascular diseases, plasma microRNAs warranted investigation as potential non-invasive diagnostic tools in ACM. We sought to identify circulating microRNAs differentially expressed in ACM with respect to Healthy Controls (HC) and Idiopathic Ventricular Tachycardia patients (IVT), often in differential diagnosis. ACM and HC subjects were screened for plasmatic expression of 377 microRNAs and validation was performed in 36 ACM, 53 HC, 21 IVT. Variable importance in data partition was estimated through Random Forest analysis and accuracy by Receiver Operating Curves. Plasmatic miR-320a showed 0.53 ± 0.04 fold expression difference in ACM vs. HC (p ACM (n = 13) and HC (n = 17) with athletic lifestyle, an ACM precipitating factor. Importantly, ACM patients' miR-320a showed 0.78 ± 0.05 fold expression change vs. IVT (p = 0.03). When compared to non-invasive ACM diagnostic parameters, miR-320a ranked highly in discriminating ACM vs. IVT and it increased their accuracy. Finally, miR-320a expression did not correlate with ACM severity. Our data suggest that miR-320a may be considered a novel potential biomarker of ACM, specifically useful in ACM vs. IVT differentiation.
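
    As a self-contained sketch of the two analysis steps named above (variable importance from a Random Forest and accuracy from an ROC curve), the following Python fragment uses synthetic stand-in data rather than the study's measurements:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 60
        # Hypothetical features: a circulating miRNA level plus two clinical covariates.
        mirna = np.concatenate([rng.normal(0.5, 0.1, n // 2),   # "cases"
                                rng.normal(1.0, 0.1, n // 2)])  # "controls"
        covariates = rng.normal(size=(n, 2))
        X = np.column_stack([mirna, covariates])
        y = np.array([1] * (n // 2) + [0] * (n // 2))

        # Random Forest gives a ranking of variable importance.
        forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        print("feature importances:", forest.feature_importances_)

        # ROC AUC summarises discrimination accuracy of the predicted probabilities.
        scores = forest.predict_proba(X)[:, 1]
        print("in-sample ROC AUC:", roc_auc_score(y, scores))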

  2. Computer aided materials design; Keisanki zairyo sekkei

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    A questionnaire survey on computer-aided materials design (CAMD) and a survey of current domestic and overseas software were carried out to clarify development issues. The current elementary technology of CAMD was also surveyed to study problems arising as materials design technology progresses with the rapid diffusion of CAMD. This project aims at the establishment of newly demanded software for computational chemistry, focusing attention on functional materials such as catalysts, polymers and non-linear electronic materials. Microscopic simulation technology was mainly surveyed in fiscal 1996. Although some fruitful results have been obtained in the fields of medical and agricultural chemicals, organic compounds, proteins, catalysts and electronic materials, problems are pointed out such as 'CAMD cannot handle the actual size of the target system' and 'commercially available software is very expensive.' Reliable tool development as elementary technology, and the verification of its applications, are thus required. Meso-dynamics, polymers, surface reactions and integrated technological environments attract users' attention. 27 refs., 16 figs., 2 tabs.

  3. Introducing Principles of Land Surveying by Assigning a Practical Project

    OpenAIRE

    Introducing Principles of Land Surveying by Assigning a Practical Project

    2016-01-01

    A practical project is used in an engineering surveying course to expose sophomore and junior civil engineering students to several important issues related to the use of basic principles of land surveying. The project, which is the design of a two-lane rural highway to connect between two arbitrary points, requires students to draw the profile of the proposed highway along with the existing ground level. Areas of all cross-sections are then computed to enable quantity computations between th...
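
    The quantity computations the project leads to can be illustrated with the average end-area method, V = L × (A1 + A2) / 2, applied between successive cross-sections; the station spacing and cross-section areas below are hypothetical:

        def average_end_area_volume(areas, spacing):
            """Total volume between consecutive cross-sections of known area (m^2),
            separated by a constant station spacing (m)."""
            return sum(spacing * (a1 + a2) / 2.0 for a1, a2 in zip(areas, areas[1:]))

        if __name__ == "__main__":
            cut_areas = [12.4, 15.1, 9.8, 4.2]   # m^2 at stations 20 m apart
            print(f"cut volume = {average_end_area_volume(cut_areas, 20.0):.1f} m^3")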

  4. Computer integration in the curriculum: promises and problems

    NARCIS (Netherlands)

    Plomp, T.; van den Akker, Jan

    1988-01-01

    This discussion of the integration of computers into the curriculum begins by reviewing the results of several surveys conducted in the Netherlands and the United States which provide insight into the problems encountered by schools and teachers when introducing computers in education. Case studies

  5. Optical character recognition systems for different languages with soft computing

    CERN Document Server

    Chaudhuri, Arindam; Badelia, Pratixa; K Ghosh, Soumya

    2017-01-01

    The book offers a comprehensive survey of soft-computing models for optical character recognition systems. The various techniques, including fuzzy and rough sets, artificial neural networks and genetic algorithms, are tested using real texts written in different languages, such as English, French, German, Latin, Hindi and Gujrati, which have been extracted from publicly available datasets. The simulation studies, which are reported in detail here, show that soft-computing based modeling of OCR systems performs consistently better than traditional models. Mainly intended as a state-of-the-art survey for postgraduates and researchers in pattern recognition, optical character recognition and soft computing, this book will be useful for professionals in computer vision and image processing alike, dealing with different issues related to optical character recognition.

  6. If I Survey You Again Today, Will You Still Love Me Tomorrow?

    Science.gov (United States)

    Webster, Sarah P.

    1989-01-01

    Description of academic computing services at Syracuse University focuses on surveys of students and faculty that have identified hardware and software use, problems encountered, prior computer experience, and attitudes toward computers. Advances in microcomputers, word processing, and graphics are described; resource allocation is discussed; and…

  7. A survey of cost accounting in service-oriented computing

    NARCIS (Netherlands)

    de Medeiros, Robson W.A.; Rosa, Nelson S.; Campos, Glaucia M.M.; Ferreira Pires, Luis

    Nowadays, companies are increasingly offering their business services through computational services on the Internet in order to attract more customers and increase their revenues. However, these services have financial costs that need to be managed in order to maximize profit. Several models and

  8. Survey of Storage and Fault Tolerance Strategies Used in Cloud Computing

    Science.gov (United States)

    Ericson, Kathleen; Pallickara, Shrideep

    Cloud computing has gained significant traction in recent years. Companies such as Google, Amazon and Microsoft have been building massive data centers over the past few years. Spanning geographic and administrative domains, these data centers tend to be built out of commodity desktops with the total number of computers managed by these companies being in the order of millions. Additionally, the use of virtualization allows a physical node to be presented as a set of virtual nodes resulting in a seemingly inexhaustible set of computational resources. By leveraging economies of scale, these data centers can provision cpu, networking, and storage at substantially reduced prices which in turn underpins the move by many institutions to host their services in the cloud.

  9. Student Computer Use: Its Organizational Structure and Institutional Support.

    Science.gov (United States)

    Juska, Arunas; Paris, Arthur E.

    1993-01-01

    Examines the structure of undergraduate computing at a large private university, including patterns of use, impact of computer ownership and gender, and the bureaucratic structure in which usage is embedded. The profile of computer use uncovered in a survey is compared with reports offered by the institution and the trade press. (10 references)…

  10. Joint Labeling Of Multiple Regions of Interest (Rois) By Enhanced Auto Context Models.

    Science.gov (United States)

    Kim, Minjeong; Wu, Guorong; Guo, Yanrong; Shen, Dinggang

    2015-04-01

    Accurate segmentation of a set of regions of interest (ROIs) in brain images is a key step in many neuroscience studies. Due to the complexity of image patterns, many learning-based segmentation methods have been proposed, including the auto context model (ACM), which can capture high-level contextual information for guiding segmentation. However, since the current ACM can only handle one ROI at a time, neighboring ROIs have to be labeled separately with different ACMs that are trained independently without communicating with each other. To address this, we enhance the current single-ROI learning ACM to a multi-ROI learning ACM for joint labeling of multiple neighboring ROIs (called eACM). First, we extend the current independently trained single-ROI ACMs to a set of jointly trained cross-ROI ACMs, by simultaneously training ACMs for all spatially connected ROIs to let them share their respective intermediate outputs for coordinated labeling of each image point. Then, the context features in each ACM can capture cross-ROI dependence information from the outputs of the other ACMs designed for neighboring ROIs. Second, we upgrade the output labeling map of each ACM with a multi-scale representation, so that both local and global context information can be effectively used to increase robustness in characterizing the geometric relationship among neighboring ROIs. Third, we integrate the ACM into a multi-atlas segmentation paradigm to encompass high variations among subjects. Experiments on the LONI LPBA40 dataset show much better performance by our eACM, compared to the conventional ACM.
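
    The cascaded training idea behind an auto context model can be sketched in a few lines of Python; the snippet below is a single-ROI sketch only, and the classifier choice, the flat 1-D voxel ordering and the fixed context offsets are illustrative assumptions rather than the authors' implementation.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def context_features(prob_map, offsets):
        # Sample the current probability map at fixed voxel offsets
        # (voxels are treated as a flat 1-D array for simplicity).
        idx = np.arange(prob_map.shape[0])
        return np.stack([prob_map[np.clip(idx + o, 0, len(idx) - 1)] for o in offsets], axis=1)

    def train_acm(appearance, labels, n_stages=3, offsets=(-5, -1, 1, 5)):
        # appearance: (n_voxels, n_features); labels: binary 0/1 per voxel.
        # Each stage sees appearance features plus context features taken
        # from the previous stage's probability map.
        prob = np.full(len(labels), 0.5)              # uninformative start
        stages = []
        for _ in range(n_stages):
            X = np.hstack([appearance, context_features(prob, offsets)])
            clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
            prob = clf.predict_proba(X)[:, 1]         # feed output into next stage
            stages.append(clf)
        return stages

    In the joint eACM setting described in the record, the per-ROI cascades would additionally exchange their intermediate probability maps at each stage rather than each working in isolation.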

  11. Computer-Aided Engineering Education at the K.U. Leuven.

    Science.gov (United States)

    Snoeys, R.; Gobin, R.

    1987-01-01

    Describes some recent initiatives and developments in the computer-aided design program in the engineering faculty of the Katholieke Universiteit Leuven (Belgium). Provides a survey of the engineering curriculum, the computer facilities, and the main software packages available. (TW)

  12. Statistical properties of dynamical systems – Simulation and abstract computation

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Hoyrup, Mathieu; Rojas, Cristóbal

    2012-01-01

    Highlights: ► A survey on results about computation and computability on the statistical properties of dynamical systems. ► Computability and non-computability results for invariant measures. ► A short proof for the computability of the convergence speed of ergodic averages. ► A kind of “constructive” version of the pointwise ergodic theorem. - Abstract: We survey an area of recent development, relating dynamics to theoretical computer science. We discuss some aspects of the theoretical simulation and computation of the long term behavior of dynamical systems. We will focus on the statistical limiting behavior and invariant measures. We present a general method allowing the algorithmic approximation at any given accuracy of invariant measures. The method can be applied in many interesting cases, as we shall explain. On the other hand, we exhibit some examples where the algorithmic approximation of invariant measures is not possible. We also explain how it is possible to compute the speed of convergence of ergodic averages (when the system is known exactly) and how this entails the computation of arbitrarily good approximations of points of the space having typical statistical behaviour (a sort of constructive version of the pointwise ergodic theorem).
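
    As a small concrete instance of the ergodic averages discussed in the abstract, the snippet below computes a Birkhoff average along an orbit of an irrational circle rotation; the choice of map and observable is illustrative only and is not taken from the paper.

    import numpy as np

    def birkhoff_average(x0, observable, alpha, n_iter=100_000):
        # Average an observable along the orbit of the rotation
        # T(x) = (x + alpha) mod 1, which is uniquely ergodic for
        # irrational alpha, so the average tends to the integral of
        # the observable over [0, 1).
        x, total = x0, 0.0
        for _ in range(n_iter):
            total += observable(x)
            x = (x + alpha) % 1.0
        return total / n_iter

    f = lambda x: np.sin(2 * np.pi * x) ** 2                 # space average is 1/2
    print(birkhoff_average(0.1, f, alpha=np.sqrt(2) - 1))    # close to 0.5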

  13. Computer applications in water conservancy and hydropower engineering

    Energy Technology Data Exchange (ETDEWEB)

    Chen, J

    1984-09-20

    The use of computers in China's water conservancy and hydropower construction began in the 1960s for exploration surveys, planning, design, construction, operation, and scientific research. Despite the positive results, and the formation of a 1000-person computer computation contingent, computer development among different professions is not balanced. The weaknesses and disparities in computer applications include an overall low level of application relative to the rest of the world, which is partly due to inadequate hardware and programs. The report suggests five ways to improve applications and popularize microcomputers which emphasize leadership and planning.

  14. Computer representation of molecular surfaces

    International Nuclear Information System (INIS)

    Max, N.L.

    1981-01-01

    This review article surveys recent work on computer representation of molecular surfaces. Several different algorithms are discussed for producing vector or raster drawings of space-filling models formed as the union of spheres. Other smoother surfaces are also considered

  15. Enhanced Survey and Proposal to secure the data in Cloud Computing Environment

    OpenAIRE

    MR.S.SUBBIAH; DR.S.SELVA MUTHUKUMARAN; DR.T.RAMKUMAR

    2013-01-01

    Cloud computing has the power to eliminate the cost of setting up high-end computing infrastructure. It is a promising design that provides a very flexible architecture, accessible through the Internet. In the cloud computing environment the data may reside at any of the data centers. Because of that, a data center may leak the data stored there, beyond the reach and control of the users. For this kind of misbehaving data center, the service providers should take care of the security and...

  16. The Role of Computer Networks in Aerospace Engineering.

    Science.gov (United States)

    Bishop, Ann Peterson

    1994-01-01

    Presents selected results from an empirical investigation into the use of computer networks in aerospace engineering based on data from a national mail survey. The need for user-based studies of electronic networking is discussed, and a copy of the questionnaire used in the survey is appended. (Contains 46 references.) (LRW)

  17. Survey of computer systems usage in southeastern Nigeria | Opara ...

    African Journals Online (AJOL)

    The shift from the industrial age (17th century) to the information age (21st century) has led to an information explosion in the 21st century. This has resulted in tremendous advancement in Computer Systems Technology (CST), software engineering and telecommunications. Also, the resultant radical changes as well as ...

  18. Expanding Usability of Virtual Network Laboratory in IT Engineering Education

    Directory of Open Access Journals (Sweden)

    Dalibor M Dobrilovic

    2013-02-01

    Full Text Available This paper deals with the importance of using virtual network laboratories in IT engineering education. It also presents a particular virtual network laboratory model developed for use in a Computer Networks course. This virtual network laboratory, called VNLab, is based on virtualization technology. It has been successfully tested in the educational process of the Computer Networks course for IT undergraduate students. Its usability for network-related courses is analyzed by comparison with the curricula recommended by world organizations such as IEEE, ACM and AIS. This paper focuses on expanding the usability of this virtual network laboratory to other, non-network-related courses. The primary expansion fields are IT System Administration, IT Systems and Data Security, and Operating Systems. The possible learning scenarios, learning tools and concepts for making this system applicable in these three additional fields are presented through an analysis of compatibility with the learning topics and outcomes recommended by IEEE, ACM and AIS.

  19. Assessment of Examinations in Computer Science Doctoral Education

    Science.gov (United States)

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for program examination requirement categorization, and categorizes these programs by the type or types…

  20. Computational physics problem solving with Python

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2015-01-01

    The use of computation and simulation has become an essential part of the scientific process. Being able to transform a theory into an algorithm requires significant theoretical insight, detailed physical and mathematical understanding, and a working level of competency in programming. This upper-division text provides an unusually broad survey of the topics of modern computational physics from a multidisciplinary, computational science point of view. Its philosophy is rooted in learning by doing (assisted by many model programs), with new scientific materials as well as with the Python progr

  1. Green and sustainable computing pt.I

    CERN Document Server

    Hurson, Ali

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  2. Arithmetically Cohen-Macaulay sets of points in P^1 x P^1

    CERN Document Server

    Guardo, Elena

    2015-01-01

    This brief presents a solution to the interpolation problem for arithmetically Cohen-Macaulay (ACM) sets of points in the multiprojective space P^1 x P^1.  It collects the various current threads in the literature on this topic with the aim of providing a self-contained, unified introduction while also advancing some new ideas.  The relevant constructions related to multiprojective spaces are reviewed first, followed by the basic properties of points in P^1 x P^1, the bigraded Hilbert function, and ACM sets of points.  The authors then show how, using a combinatorial description of ACM points in P^1 x P^1, the bigraded Hilbert function can be computed and, as a result, solve the interpolation problem.  In subsequent chapters, they consider fat points and double points in P^1 x P^1 and demonstrate how to use their results to answer questions and problems of interest in commutative algebra.  Throughout the book, chapters end with a brief historical overview, citations of related results, and, where relevan...
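
    For reference, the bigraded Hilbert function computed in this brief can be written in the usual notation (the standard definition, not quoted from the book): for a set of points X in P^1 x P^1 with bihomogeneous coordinate ring R = k[x_0, x_1, y_0, y_1] and defining ideal I_X,

    % bigraded Hilbert function of X \subseteq \mathbb{P}^1 \times \mathbb{P}^1
    H_X(i,j) \;=\; \dim_k \bigl( R / I_X \bigr)_{(i,j)}, \qquad (i,j) \in \mathbb{N}^2 .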

  3. Development of aerial gamma radiation survey system, 1

    International Nuclear Information System (INIS)

    Tsutsumi, Masahiro; Saito, Komei; Sakamoto, Ryuichi; Nagaoka, Toshi; Moriuchi, Shigeru

    1986-05-01

    JAERI started to develop an aerial gamma radiation survey system using a helicopter in 1980. The development of measuring instruments, field experiments with natural and artificial radiation sources, and a simulated emergency survey at a real site were carried out. This report mainly describes the hardware and software of the system. The system consists of gamma-ray measuring instruments with NaI(Tl) scintillation detectors, microwave positioning instruments, and a data processing system for post-flight data. A foreign-made geological survey system was adapted for radiation measurements. To cover a wide radiation range, detectors of various shapes and sizes are prepared, from a large-volume detector, DET-1024 - an assembly of four 4'' x 4'' x 16'' crystals - to a 2'' diameter x 2'' NaI(Tl) crystal. Radiation and position data are recorded on magnetic tape and computer-processed afterwards. Moreover, the scene below the flight course and internal communication are recorded on video tape with clock and position information superimposed. As a result of five field experiments, basic radiation data for evaluating airborne data were accumulated and flight survey procedures were established. For more practical use, a more compact and functional system has been produced. Exposure rates (> 1 mR/h), energy distribution spectra, and energy window counts are obtained as radiation data. Using the Spectrum-Dose Conversion Method, accurate exposure rates are calculated directly from pulse height spectra. Numerical tables of the G(E) function converting pulse height spectra into exposures are given in this report. As regards the analysis of survey data, processing codes have been completed for either a large computer or a minicomputer. (author)
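
    The spectrum-dose conversion step amounts to weighting the measured pulse-height spectrum by the G(E) function; a minimal numeric sketch follows, in which the three energy bins and the G(E) values are placeholders for illustration, not the report's tables.

    import numpy as np

    def exposure_rate(counts_per_bin, g_of_e):
        # Exposure rate = sum over energy bins of n(E) * G(E).
        return float(np.dot(counts_per_bin, g_of_e))

    counts = np.array([1200.0, 800.0, 150.0])   # counts/s per energy bin (placeholder)
    g = np.array([2.1e-6, 5.4e-6, 1.3e-5])      # G(E) in (mR/h) per (count/s) (placeholder)
    print(exposure_rate(counts, g))             # exposure rate in mR/h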

  4. Spatiotemporal Data Mining: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shashi Shekhar

    2015-10-01

    Full Text Available Explosive growth in geospatial and temporal data as well as the emergence of new technologies emphasize the need for automated discovery of spatiotemporal knowledge. Spatiotemporal data mining studies the process of discovering interesting and previously unknown, but potentially useful patterns from large spatiotemporal databases. It has broad application domains including ecology and environmental management, public safety, transportation, earth science, epidemiology, and climatology. The complexity of spatiotemporal data and intrinsic relationships limits the usefulness of conventional data science techniques for extracting spatiotemporal patterns. In this survey, we review recent computational techniques and tools in spatiotemporal data mining, focusing on several major pattern families: spatiotemporal outlier, spatiotemporal coupling and tele-coupling, spatiotemporal prediction, spatiotemporal partitioning and summarization, spatiotemporal hotspots, and change detection. Compared with other surveys in the literature, this paper emphasizes the statistical foundations of spatiotemporal data mining and provides comprehensive coverage of computational approaches for various pattern families. We also list popular software tools for spatiotemporal data analysis. The survey concludes with a look at future research needs.

  5. 75 FR 52508 - Proposed Information Collection; Comment Request; Information and Communication Technology Survey

    Science.gov (United States)

    2010-08-26

    ... conduct the 2010 through 2012 Information and Communication Technology Survey (ICTS). The annual survey... payments) for four types of information and communication technology equipment and software (computers and... and Communication Technology Survey AGENCY: U.S. Census Bureau. ACTION: Notice. SUMMARY: The...

  6. 78 FR 50374 - Proposed Information Collection; Comment Request; Information and Communication Technology Survey

    Science.gov (United States)

    2013-08-19

    ... and Communication Technology Survey AGENCY: U.S. Census Bureau, Commerce. ACTION: Notice. SUMMARY: The... Communication Technology Survey (ICTS). The annual survey collects data on two categories of non-capitalized... communication technology equipment and software (computers and peripheral equipment; ICT equipment, excluding...

  7. Computer tablet or telephone? A randomised controlled trial exploring two methods of collecting data from drug and alcohol outpatients.

    Science.gov (United States)

    Hobden, Breanne; Bryant, Jamie; Carey, Mariko; Sanson-Fisher, Rob; Oldmeadow, Christopher

    2017-08-01

    Both computerised and telephone surveys have potential advantages for research data collection. The current study aimed to determine the: (i) feasibility, (ii) acceptability, and (iii) cost per completed survey of computer tablet versus telephone data collection for clients attending an outpatient drug and alcohol treatment clinic. Two-arm randomised controlled trial. Clients attending a drug and alcohol outpatient clinic in New South Wales, Australia, were randomised to complete a baseline survey via computer tablet in the clinic or via telephone interview within two weeks of their appointment. All participants completed a three-month follow-up survey via telephone. Consent and completion rates for the baseline survey were significantly higher in the computer tablet condition. The time taken to complete the computer tablet survey was lower (11min) than the telephone condition (17min). There were no differences in the proportion of consenters or completed follow-up surveys between the two conditions at the 3-month follow-up. Acceptability was high across both modes of data collection. The cost of the computer tablet condition was $67.52 greater per completed survey than the telephone condition. There is a trade-off between computer tablet and telephone data collection. While both data collection methods were acceptable to participants, the computer tablet condition resulted in higher consent and completion rates at baseline, therefore yielding greater external validity, and was quicker for participants to complete. Telephone data collection was however, more cost-effective. Researchers should carefully consider the mode of data collection that suits individual study needs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. New Pedagogies on Teaching Science with Computer Simulations

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  9. An automated radiological survey method for performing site remediation and decommissioning

    International Nuclear Information System (INIS)

    Handy, R.G.; Bolch, W.E.; Harder, G.F.; Tolaymat, T.M.

    1994-01-01

    A portable, computer-based method of performing environmental monitoring and assessment for site remediation and decommissioning has been developed. The integrated system has been developed to provide for survey time reductions and real-time data analysis. The technique utilizes a notebook 486 computer with the necessary hardware and software components that make it possible to use it in an almost unlimited number of environmental monitoring and assessment scenarios. The results from a pilot "hide-and-seek" gamma survey and an actual alpha decontamination survey were elucidated. It was found that a "hide-and-seek" survey could come up with timely and accurate conclusions about the position of the source. The use of the automated system in a Th-232 alpha survey resulted in a reduction in the standard time necessary to do a radiological survey. In addition, the ability to analyze the data on-site allowed for identification and location of areas which needed further decontamination. Finally, a discussion on possible future improvements and field conclusions was made.

  10. The survey of academic libraries

    CERN Document Server

    2014-01-01

    The Survey of Academic Libraries, 2014-15 Edition looks closely at key benchmarks for academic libraries in areas such as spending for books and e-books, deployment and pay rates for student workers, use of tablet computers, cloud computing and other new technologies, database licensing practices, and much more. The study includes detailed data on overall budgets, capital budgets, salaries and materials spending, and much more of interest to academic librarians and their suppliers. Data in this 200+ page report is broken out by size and type of library for easy benchmarking.

  11. Results of the First National Assessment of Computer Competence (The Printout).

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Discusses the findings of the National Assessment of Educational Progress 1985-86 survey of American students' computer competence, focusing on findings of interest to reading teachers who use computers. (MM)

  12. Peer production & peer support at the Free Technology Academy

    NARCIS (Netherlands)

    Potters, Hanneke; Berlanga, Adriana; Bijlsma, Lex

    2012-01-01

    Potters, H., Berlanga, A. J., & Lex, B. (2011). Peer Production & Peer Support at the Free Technology Academy. In G. van de Veer, P. B. Sloep, & M. van Eekelen (Eds.), Proceedings Computer Science Education Research Conference (CSERC '11) (pp. 49-58). April, 7-8, 2011, Heerlen, The Netherlands: ACM.

  13. International workshop on multimodal virtual and augmented reality (workshop summary)

    NARCIS (Netherlands)

    Hürst, W.O.; Iwai, Daisuke; Balakrishnan, Prabhakaran

    2016-01-01

    Virtual reality (VR) and augmented reality (AR) are expected by many to become the next wave of computing with significant impacts on our daily lives. Motivated by this, we organized a workshop on “Multimodal Virtual and Augmented Reality (MVAR)” at the 18th ACM International Conference on

  14. The Cronus Distributed DBMS (Database Management System) Project

    Science.gov (United States)

    1989-10-01

    projects, e.g., HiPAC [Dayal 88] and Postgres [Stonebraker 86]. Although we expect to use these techniques, they have been developed for centralized...Computing Systems, June 1989. (To appear). [Stonebraker 86] Stonebraker, M. and Rowe, L. A., "The Design of POSTGRES ," Proceedings ACM SIGMOD Annual

  15. modelling the behaviour of interface surfaces using the finite eleme

    African Journals Online (AJOL)

    user

    Norwell, M.A. 36. Wingo, et al., Hardware assisted self-collision for rigid and deformable surfaces, Journal of Tele-operators and Virtual Environments, Dec. 2004, Vol. 13, No. 6, pp. 681-691. 37. Brian Von Herzen, et al., Geometric Collisions for Time-dependent parametric surfaces, ACM SIGGRAPH Computer Graphics, Aug.,

  16. A survey of process control computers at the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Dahl, C.A.

    1989-01-01

    The Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering Laboratory is charged with the safe processing of spent nuclear fuel elements for the United States Department of Energy. The ICPP was originally constructed in the late 1950s and used state-of-the-art technology for process control at that time. The state of process control instrumentation at the ICPP has steadily improved to keep pace with emerging technology. Today, the ICPP is a collage of emerging computer technology in process control, with some systems as simple as standalone measurement computers while others are state-of-the-art distributed control systems controlling the operations of an entire facility within the plant. The ICPP has made maximal use of process computer technology aimed at increasing surety, safety, and efficiency of the process operations. Many benefits have been derived from the use of the computers for minimal costs, including decreased misoperations in the facility, and more benefits are expected in the future

  17. [Musculoskeletal disorders among university student computer users].

    Science.gov (United States)

    Lorusso, A; Bruno, S; L'Abbate, N

    2009-01-01

    Musculoskeletal disorders are a common problem among computer users. Many epidemiological studies have shown that ergonomic factors and aspects of work organization play an important role in the development of these disorders. We carried out a cross-sectional survey to estimate the prevalence of musculoskeletal symptoms among university students using personal computers and to investigate the features of occupational exposure and the prevalence of symptoms throughout the study course. Another objective was to assess the students' level of knowledge of computer ergonomics and the relevant health risks. A questionnaire was distributed to 183 students attending the lectures for the second and fourth year courses of the Faculty of Architecture. Data concerning personal characteristics, ergonomic and organizational aspects of computer use, and the presence of musculoskeletal symptoms in the neck and upper limbs were collected. Exposure to risk factors such as daily duration of computer use, time spent at the computer without breaks, duration of mouse use and poor workstation ergonomics was significantly higher among students of the fourth year course. Neck pain was the most commonly reported symptom (69%), followed by hand/wrist (53%), shoulder (49%) and arm (8%) pain. The prevalence of symptoms in the neck and hand/wrist area was significantly higher in the students of the fourth year course. In our survey we found a high prevalence of musculoskeletal symptoms among university students using computers for long time periods on a daily basis. Exposure to computer-related ergonomic and organizational risk factors and the prevalence of musculoskeletal symptoms both seem to increase significantly throughout the study course. Furthermore, we found that the level of perception of computer-related health risks among the students was low. Our findings suggest the need for preventive intervention consisting of education in computer ergonomics.

  18. Industrial applications of computed tomography

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Carmignato, S.; Kruth, J. -P.

    2014-01-01

    The number of industrial applications of Computed Tomography(CT) is large and rapidly increasing. After a brief market overview, the paper gives a survey of state of the art and upcoming CT technologies, covering types of CT systems, scanning capabilities, and technological advances. The paper...

  19. The Role of Visualization in Computer Science Education

    Science.gov (United States)

    Fouh, Eric; Akbar, Monika; Shaffer, Clifford A.

    2012-01-01

    Computer science core instruction attempts to provide a detailed understanding of dynamic processes such as the working of an algorithm or the flow of information between computing entities. Such dynamic processes are not well explained by static media such as text and images, and are difficult to convey in lecture. The authors survey the history…

  20. Short assessment of the Big Five: robust across survey methods except telephone interviewing

    OpenAIRE

    Lang, Frieder R.; John, Dennis; Lüdtke, Oliver; Schupp, Jürgen; Wagner, Gert G.

    2011-01-01

    We examined measurement invariance and age-related robustness of a short 15-item Big Five Inventory (BFI–S) of personality dimensions, which is well suited for applications in large-scale multidisciplinary surveys. The BFI–S was assessed in three different interviewing conditions: computer-assisted or paper-assisted face-to-face interviewing, computer-assisted telephone interviewing, and a self-administered questionnaire. Randomized probability samples from a large-scale German panel survey a...

  1. HOMOMORPHIC ENCRYPTION: CLOUD COMPUTING SECURITY AND OTHER APPLICATIONS (A SURVEY)

    Directory of Open Access Journals (Sweden)

    A. I. Trubei

    2015-01-01

    Full Text Available Homomorphic encryption is a form of encryption which allows specific types of computations to be carried out on cipher text and to obtain an encrypted result which matches the result of operations performed on the plain text. The article presents a basic concept of the homomorphic encryption and various encryption algorithms in accordance with the fundamental properties of the homomorphic encryption. The examples of various principles and properties of homomorphic encryption, some homomorphic algorithms using asymmetric key systems such as RSA, ElGamal, Paillier algorithms as well as various homomorphic encryption schemes are given. Prospects of homomorphic encryption application in the field of secure cloud computing, electronic voting, cipher text searching, encrypted mail filtering, mobile cipher and secure feedback systems are considered.
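
    As a concrete illustration of the multiplicative homomorphic property that RSA possesses, the toy example below multiplies two ciphertexts and decrypts the product; the tiny parameters are for demonstration only, and padded, production RSA does not behave this way.

    # Textbook RSA is multiplicatively homomorphic:
    # Enc(m1) * Enc(m2) mod n decrypts to (m1 * m2) mod n.
    n, e, d = 3233, 17, 2753          # toy keypair: n = 61 * 53

    enc = lambda m: pow(m, e, n)
    dec = lambda c: pow(c, d, n)

    m1, m2 = 42, 55
    c_product = (enc(m1) * enc(m2)) % n       # operate on ciphertexts only
    assert dec(c_product) == (m1 * m2) % n
    print(dec(c_product))                     # 2310 == 42 * 55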

  2. Computer code determination of tolerable accel current and voltage limits during startup of an 80 kV MFTF sustaining neutral beam source

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Eckard, R.D.

    1979-01-01

    We have used a Lawrence Livermore Laboratory (LLL) version of the WOLF ion source extractor design computer code to determine tolerable accel current and voltage limits during startup of a prototype 80 kV Mirror Fusion Test Facility (MFTF) sustaining neutral beam source. Arc current limits are also estimated. The source extractor has gaps of 0.236, 0.721, and 0.155 cm. The effective ion mass is 2.77 AMU. The measured optimum accel current density is 0.266 A/cm². The gradient grid electrode runs at 5/6 V_a (accel voltage). The suppressor electrode voltage is zero for V_a < 3 kV and -3 kV for V_a ≥ 3 kV. The accel current density for optimum beam divergence is obtained for 1 ≤ V_a ≤ 80 kV, as are the beam divergence and emittance

  3. X-ray machine vision and computed tomography

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    This survey examines how 2-D x-ray machine vision and 3-D computed tomography will be used in industry in the 1988-1995 timeframe. Specific applications are described and rank-ordered in importance. The types of companies selling and using 2-D and 3-D systems are profiled, and markets are forecast for 1988 to 1995. It is known that many machine vision and automation companies are now considering entering this field. This report looks at the potential pitfalls and whether recent market problems similar to those recently experienced by the machine vision industry will likely occur in this field. FTS will publish approximately 100 other surveys in 1988 on emerging technology in the fields of AI, manufacturing, computers, sensors, photonics, energy, bioengineering, and materials

  4. Incorporating Colour Information for Computer-Aided Diagnosis of Melanoma from Dermoscopy Images: A Retrospective Survey and Critical Analysis

    Directory of Open Access Journals (Sweden)

    Ali Madooei

    2016-01-01

    Full Text Available Cutaneous melanoma is the most life-threatening form of skin cancer. Although advanced melanoma is often considered as incurable, if detected and excised early, the prognosis is promising. Today, clinicians use computer vision in an increasing number of applications to aid early detection of melanoma through dermatological image analysis (dermoscopy images, in particular. Colour assessment is essential for the clinical diagnosis of skin cancers. Due to this diagnostic importance, many studies have either focused on or employed colour features as a constituent part of their skin lesion analysis systems. These studies range from using low-level colour features, such as simple statistical measures of colours occurring in the lesion, to availing themselves of high-level semantic features such as the presence of blue-white veil, globules, or colour variegation in the lesion. This paper provides a retrospective survey and critical analysis of contributions in this research direction.
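
    As an example of the 'low-level colour features' category mentioned above, the sketch below computes simple per-channel colour statistics over a lesion mask; the feature set and the random stand-in image are assumptions for illustration, not the paper's pipeline.

    import numpy as np

    def colour_features(rgb_image, lesion_mask):
        # rgb_image: H x W x 3 array; lesion_mask: H x W boolean array.
        pixels = rgb_image[lesion_mask]              # N x 3 lesion pixels
        means = pixels.mean(axis=0)                  # average colour
        stds = pixels.std(axis=0)                    # per-channel spread
        variegation = float(stds.sum())              # crude colour-variety score
        return np.concatenate([means, stds, [variegation]])

    img = np.random.rand(64, 64, 3)                  # stand-in for a dermoscopy image
    mask = np.zeros((64, 64), dtype=bool)
    mask[16:48, 16:48] = True                        # stand-in lesion region
    print(colour_features(img, mask))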

  5. Use of computed tomography and computed tomographic myelography for assessment of spinal tumoral calcinosis in a dog

    International Nuclear Information System (INIS)

    Ham, L.M. van; Bree, H.J. van; Tshamala, M.; Thoonen, H.

    1995-01-01

    Spinal tumoral calcinosis is reported in a Berner sennenhund puppy. The condition was manifested clinically as a non-ambulatory tetraparesis associated with neck pain. On survey radiographs there was a focal calcified mass at the atlantoaxial articulation. Computed tomography and computed tomographic myelography gave additional information on the extent of the mass and on the degree of spinal cord compression. The mass was removed surgically and the dog made a complete recovery

  6. The Danish Youth Survey 2002

    DEFF Research Database (Denmark)

    Helweg-Larsen, Karin; Sundaram, Vanita; Curtis, Tine

    2004-01-01

    OBJECTIVES: To explore ethical, legal and practical issues related to conducting a youth survey in Denmark on sexual experiences before the age of 15 and thereby achieve reliable data on child sexual abuse. STUDY DESIGN AND METHODS: The relevant authorities were consulted on possible legal...... of the accompanying offer of counselling. CONCLUSION: An anonymous youth survey based on computer-assisted self-interview (CASI) would increase the validity of youth surveys on child sexual abuse to which no ethical or legal objections were found....... obtaining parental consent. The Central Scientific Ethical Committee had no objections. In a number of fields, Danish legislation accords 15-to-18-year-olds the competence to make independent decisions regarding their personal circumstances, and the UN Convention of Children's Rights states that a child...

  7. Computed tomography in Alexander's disease

    Energy Technology Data Exchange (ETDEWEB)

    Holland, I M; Kendall, B E

    1980-10-01

    Two cases of biopsy-proven Alexander's disease are described with computed tomographic changes which, in our experience and on survey of the literature, have not occurred in any other condition. Such changes in a child with a progressive condition consistent with Alexander's disease, strongly support the diagnosis.

  8. Trust models in ubiquitous computing.

    Science.gov (United States)

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  9. A Survey of Quantum Learning Theory

    OpenAIRE

    Arunachalam, Srinivasan; de Wolf, Ronald

    2017-01-01

    This paper surveys quantum learning theory: the theoretical aspects of machine learning using quantum computers. We describe the main results known for three models of learning: exact learning from membership queries, and Probably Approximately Correct (PAC) and agnostic learning from classical or quantum examples.

  10. Designs 2002 further computational and constructive design theory

    CERN Document Server

    2003-01-01

    This volume is a sequel to the 1996 compilation, Computational and Constructive Design Theory. It contains research papers and surveys of recent research work on two closely related aspects of the study of combinatorial designs: design construction and computer-aided study of designs. Audience: This volume is suitable for researchers in the theory of combinatorial designs

  11. Some gender issues in educational computer use: results of an international comparative survey

    OpenAIRE

    Janssen Reinen, I.A.M.; Plomp, T.

    1993-01-01

    In the framework of the Computers in Education international study of the International Association for the Evaluation of Educational Achievement (IEA), data have been collected concerning the use of computers in 21 countries. This article examines some results regarding the involvement of women in the implementation and use of computers in the educational practice of elementary, lower secondary and upper secondary education in participating countries. The results show that in many countries ...

  12. How Far Can You Trust A Computer?

    National Research Council Canada - National Science Library

    Landwehr, Carl E

    1993-01-01

    The history of attempts to secure computer systems against threats to confidentiality, integrity, and availability of data is briefly surveyed, and the danger of repeating a portion of that history is noted...

  13. Application of GPS in a high precision engineering survey network

    International Nuclear Information System (INIS)

    Ruland, R.; Leick, A.

    1985-04-01

    A GPS satellite survey was carried out with the Macrometer to support construction at the Stanford Linear Accelerator Center (SLAC). The network consists of 16 stations, of which 9 stations were part of the Macrometer network. The horizontal and vertical accuracy of the GPS survey is estimated to be 1 to 2 mm and 2 to 3 mm, respectively. The horizontal accuracy of the terrestrial survey, consisting of angles and distances, equals that of the GPS survey only in the "loop" portion of the network. All stations are part of a precise level network. The ellipsoidal heights obtained from the GPS survey and the orthometric heights of the level network are used to compute geoid undulations. A geoid profile along the linac was computed by the National Geodetic Survey in 1963. This profile agreed with the observed geoid within the standard deviation of the GPS survey. Angles and distances were adjusted together (TERRA), and all terrestrial observations were combined with the GPS vector observations in a combination adjustment (COMB). A comparison of COMB and TERRA revealed systematic errors in the terrestrial solution. A scale factor of 1.5 ppm ± 0.8 ppm was estimated. This value is of the same magnitude as the over-all horizontal accuracy of both networks. 10 refs., 3 figs., 5 tabs
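
    The geoid undulations mentioned above follow from the standard relation between the GPS ellipsoidal height and the levelled orthometric height (standard geodetic notation, not quoted from the report):

    % geoid undulation N from GPS ellipsoidal height h and levelled orthometric height H
    N \;=\; h - H .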

  14. Physical Realizations of Quantum Computing

    CERN Document Server

    Kanemitsu, Shigeru; Salomaa, Martti; Takagi, Shin; Are the DiVincenzo Criteria Fulfilled in 2004 ?

    2006-01-01

    The contributors of this volume are working at the forefront of various realizations of quantum computers. They survey the recent developments in each realization, in the context of the DiVincenzo criteria, including nuclear magnetic resonance, Josephson junctions, quantum dots, and trapped ions. There are also some theoretical contributions which have relevance in the physical realizations of a quantum computer. This book fills the gap between elementary introductions to the subject and highly specialized research papers to allow beginning graduate students to understand the cutting-edge of r

  15. Universal computer interfaces

    CERN Document Server

    Dheere, RFBM

    1988-01-01

    Presents a survey of the latest developments in the field of the universal computer interface, resulting from a study of the world patent literature. Illustrating the state of the art today, the book ranges from basic interface structure, through parameters and common characteristics, to the most important industrial bus realizations. Recent technical enhancements are also included, with special emphasis devoted to the universal interface adapter circuit. Comprehensively indexed.

  16. Survey of Energy Computing in the Smart Grid Domain

    OpenAIRE

    Rajesh Kumar; Arun Agarwala

    2013-01-01

    Resource optimization with advanced computing tools improves the efficient use of energy resources. Renewable energy resources are instantaneous and need to be conserved at the same time. Optimizing this process in real time requires a complex design that includes resource planning and control for effective utilization. Advances in information and communication technology tools enable data formatting and analysis, resulting in optimized use of renewable resources for sustainable energy solutions on s...

  17. Acquisition Information Management system telecommunication site survey results

    Energy Technology Data Exchange (ETDEWEB)

    Hake, K.A. [Oak Ridge National Lab., TN (United States); Key, B.G. [COR, Inc., Oak Ridge, TN (United States)

    1993-09-01

    The Army acquisition community currently uses a dedicated, point-to-point secure computer network for the Army Material Plan Modernization (AMPMOD). It must transition to the DOD supplied Defense Secure Network 1 (DSNET1). This is one of the first networks of this size to begin the transition. The type and amount of computing resources available at individual sites may or may not meet the new network requirements. This task surveys these existing telecommunications resources available in the Army acquisition community. It documents existing communication equipment, computer hardware, associated software, and recommends appropriate changes.

  18. Influence of miscibility phenomenon on crystalline polymorph transition in poly(vinylidene fluoride)/acrylic rubber/clay nanocomposite hybrid.

    Science.gov (United States)

    Abolhasani, Mohammad Mahdi; Naebe, Minoo; Jalali-Arani, Azam; Guo, Qipeng

    2014-01-01

    In this paper, intercalation of nanoclay in the miscible polymer blend of poly(vinylidene fluoride) (PVDF) and acrylic rubber (ACM) was studied. X-ray diffraction was used to investigate the formation of a nanoscale polymer blend/clay hybrid. Infrared spectroscopy and X-ray analysis revealed the coexistence of β and γ crystalline forms in the PVDF/Clay nanocomposite, while the α crystalline form was found to be dominant in PVDF/ACM/Clay miscible hybrids. The Flory-Huggins interaction parameter (B) was used to further explain the miscibility phenomenon observed. The B parameter was determined by combining the melting point depression and the binary interaction model. The estimated B values for the ternary PVDF/ACM/Clay and PVDF/ACM pairs were all negative, showing both proper intercalation of the polymer melt into the nanoclay galleries and the good miscibility of the PVDF/ACM blend. The B value for the PVDF/ACM blend was almost the same as that measured for the PVDF/ACM/Clay hybrid, suggesting that PVDF chains in nanocomposite hybrids interact with ACM chains and that nanoclay in hybrid systems is wrapped by ACM molecules.
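
    The melting-point-depression route to the interaction parameter mentioned above is usually written in the Nishi-Wang form; the expression below is the standard textbook version and is given only as a plausible reading of the method, not as the exact equation used in the paper:

    % Nishi-Wang melting point depression: crystalline polymer (2) blended with an
    % amorphous diluent polymer (1); T_m and T_m^0 are the blend and pure-crystal
    % equilibrium melting points, and B is the interaction energy density.
    \frac{1}{T_m} - \frac{1}{T_m^0}
      = -\,\frac{R\,V_{2u}}{\Delta H_{2u}\,V_{1u}}\,\chi_{12}\,\phi_1^{2},
    \qquad B = \frac{\chi_{12}\,R\,T}{V_{1u}} .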

  19. Retweeting Activity on Twitter: Signs of Deception

    Science.gov (United States)

    2015-05-22

    for computing the likelihood of an unknown user being a human, bot or cyborg. [16] shows the strong classification and prediction performance of...ACM (2013) 2. Chu, Z., et al.: Who is Tweeting on Twitter: Human, Bot, or Cyborg? ACSAC, 21–30 (2010) 3. Derrida, B., et al.: Statistical Properties

  20. Availability, Indications, and Technical Performance of Computed Tomographic Colonography: A National Survey

    International Nuclear Information System (INIS)

    Fisichella, V.; Hellstroem, M.

    2006-01-01

    Purpose: To determine the availability, indications, and technique of computed tomographic colonography (CTC) in Sweden and to investigate opinions on its future role in colon imaging. Material and Methods: In May 2004, a questionnaire on CTC was mailed to all Departments of Radiology in Sweden, and one year later a telephone interview was conducted with the departments that intended to start a CTC service. Results: Ninety-nine departments (83%) answered the questionnaire, indicating that 23/99 (23.2%) offered a CTC service. Reasons for non-implementation of CTC were lack of CTC training in 34/73 (46.6%) and non-availability of multi-detector row CT scanners in 33/73 (45.2%), while 26% were awaiting further scientific documentation on CTC. Incomplete colonoscopy was the main indication for CTC in 21/23 (91.3%) departments performing CTC. Dual positioning, room air insufflation, and thin-slice collimation were used in all the responding departments. The number of CTC studies performed varied from 1-5 (26.1%) to more than 200 (17.4%). Intravenous contrast material was routinely administered by 9/23 (39.1%) departments. Out of 30 (39.5%) departments that in 2004 intended to start CTC, 9 (30%) had done so by June 2005. A total of 32/99 (32.3%) departments had therefore started CTC by June 2005. Half of the departments that replied believed that CTC would absolutely or probably replace barium enema in the future. Conclusion: The survey shows relatively limited diffusion of CTC practice in Sweden, with approximately one-third of radiology departments offering a CTC service, mostly on a small scale. A wider dissemination of CTC requires further scientific documentation of its capability, intensified educational efforts, and additional funding

  1. Ketonization of Proline Residues in the Peptide Chains of Actinomycins by a 4-Oxoproline Synthase.

    Science.gov (United States)

    Semsary, Siamak; Crnovčić, Ivana; Driller, Ronja; Vater, Joachim; Loll, Bernhard; Keller, Ullrich

    2018-04-04

    X-type actinomycins (Acms) contain 4-hydroxyproline (Acm X0) or 4-oxoproline (Acm X2) in their β-pentapeptide lactone rings, whereas their α ring contains proline. We demonstrate that these Acms are formed through asymmetric condensation of Acm half molecules (Acm halves) containing proline with 4-hydroxyproline- or 4-oxoproline-containing Acm halves. In turn, we show, using an artificial Acm half analogue (PPL 1) with proline in its peptide chain, their conversion into the 4-hydroxyproline- and 4-oxoproline-containing Acm halves, PPL 0 and PPL 2, in mycelial suspensions of Streptomyces antibioticus. Two responsible genes of the Acm X biosynthetic gene cluster of S. antibioticus, saacmM and saacmN, encoding a cytochrome P450 monooxygenase (Cyp) and a ferredoxin, were identified. After coexpression in Escherichia coli, their gene products converted PPL 1 into PPL 0 and PPL 2 in vivo as well as in situ in permeabilized cells of the transformed E. coli strain, in conjunction with the host-encoded ferredoxin reductase, in an NADH (NADPH)-dependent manner. saAcmM has high sequence similarity to the Cyp107Z (Ema) family of Cyps, which can convert avermectin B1 into its keto derivative, 4''-oxoavermectin B1. Determination of the structure of saAcmM reveals high similarity to the Ema structure but with significant differences in residues decorating their active sites, which defines saAcmM and its orthologues as a distinct new family of peptidyl-proline-ketonizing Cyps. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Astronomical Surveys and Big Data

    Directory of Open Access Journals (Sweden)

    Mickaelian Areg M.

    2016-03-01

    Full Text Available Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  3. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  4. Simulation of trickle irrigation, an extension to the US Geological Survey's computer program VS2D

    Science.gov (United States)

    Healy, R.W.

    1987-01-01

    A method is presented for simulating water movement through unsaturated porous media in response to a constant rate of application from a surface source. Because the rate at which water can be absorbed by soil is limited, the water will pond; therefore the actual surface area over which the water is applied may change with time and in general will not be known beforehand. An iterative method is used to determine the size of this ponded area at any time. This method will be most useful for simulating trickle irrigation, but may also be of value for simulating movement of water in soils as the result of an accidental spill. The method is an extension to the finite difference computer program VS2D developed by the U.S. Geological Survey, which simulates water movement through variably saturated porous media. The simulated region can be a vertical, 2-dimensional cross section for treatment of a surface line source or an axially symmetric, 3-dimensional cylinder for a point source. Five test problems, obtained from the literature, are used to demonstrate the ability of the method to accurately match analytical and experimental results. (Author's abstract)
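
    VS2D, which this extension builds on, simulates variably saturated flow governed by Richards' equation; the mixed form below is a common way to write it and may differ in detail from the exact formulation used in the USGS code:

    % Richards' equation (mixed form): volumetric moisture content theta,
    % pressure head h, unsaturated hydraulic conductivity K(h), elevation z.
    \frac{\partial \theta}{\partial t}
      = \nabla \cdot \bigl[ K(h)\, \nabla (h + z) \bigr] .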

  5. ResourceGate: A New Solution for Cloud Computing Resource Allocation

    OpenAIRE

    Abdullah A. Sheikh

    2012-01-01

    Cloud computing has become a focus of educational and business communities. Their concerns include the need to improve the Quality of Service (QoS) provided, as well as reliability, performance, and cost reduction. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring these benefits is considered to be the major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...

  6. National Survey of Computer Aided Manufacturing in Industrial Technology Programs.

    Science.gov (United States)

    Heidari, Farzin

    The current status of computer-aided manufacturing in the 4-year industrial technology programs in the United States was studied. All industrial technology department chairs were mailed a questionnaire divided into program information, equipment information, and general comments sections. The questionnaire was designed to determine the subjects…

  7. Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes

    Science.gov (United States)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A and R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Here, appendixes are provided.

  8. Academic Research Equipment in the Physical and Computer Sciences and Engineering. An Analysis of Findings from Phase I of the National Science Foundation's National Survey of Academic Research Instruments and Instrumentation Needs.

    Science.gov (United States)

    Burgdorf, Kenneth; White, Kristine

    This report presents information from phase I of a survey designed to develop quantitative indicators of the current national stock, cost/investment, condition, obsolescence, utilization, and need for major research instruments in academic settings. Data for phase I (which focused on the physical and computer sciences and engineering) were…

  9. The Use of PCs, Smartphones, and Tablets in a Probability-Based Panel Survey : Effects on Survey Measurement Error

    NARCIS (Netherlands)

    Lugtig, Peter; Toepoel, Vera

    2016-01-01

    Respondents in an Internet panel survey can often choose which device they use to complete questionnaires: a traditional PC, laptop, tablet computer, or a smartphone. Because all these devices have different screen sizes and modes of data entry, measurement errors may differ between devices. Using

  10. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    Science.gov (United States)

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  11. A Survey on Mobile Edge Networks: Convergence of Computing, Caching and Communications

    OpenAIRE

    Wang, Shuo; Zhang, Xing; Zhang, Yan; Wang, Lin; Yang, Juwo; Wang, Wenbo

    2017-01-01

    With the explosive growth of smart devices and the advent of many new applications, traffic volume has been growing exponentially. The traditional centralized network architecture cannot accommodate such user demands due to the heavy burden on the backhaul links and long latency. Therefore, new architectures which bring network functions and contents to the network edge are proposed, i.e., mobile edge computing and caching. Mobile edge networks provide cloud computing and caching capabilities at th...

  12. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers.

    Science.gov (United States)

    Singh, Tarundeep; Roy, Pritam; Jamir, Limalemla; Gupta, Saurav; Kaur, Navpreet; Jain, D K; Kumar, Rajesh

    2016-01-01

    A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Using multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data was transmitted daily to a central server using wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as more than 10% of the total annual household expenditure on healthcare. Chi square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. Average expenditure in government health facilities was 16.6% less
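
    For orientation, a minimal sketch of the two analytical steps described above follows: flagging catastrophic healthcare expenditure at the 10% threshold and testing for a trend across expenditure quartiles. The function names, the Cochran-Armitage form of the trend test, and the example counts are assumptions for illustration, not the authors' code or data.

```python
# Sketch: flag catastrophic health expenditure (>10% of annual household
# expenditure) and test for a trend in its prevalence across ordered
# expenditure quartiles (Cochran-Armitage chi-square test for trend).
# All numbers below are hypothetical.
import numpy as np
from scipy.stats import chi2

def catastrophic_flag(annual_health_spend, annual_household_spend, threshold=0.10):
    """True if health spending exceeds 10% of total annual household expenditure."""
    return annual_health_spend > threshold * annual_household_spend

def chi2_trend(cases, totals, scores=None):
    """Cochran-Armitage chi-square test for trend in proportions across ordered groups."""
    cases = np.asarray(cases, float)
    totals = np.asarray(totals, float)
    scores = np.arange(len(cases)) if scores is None else np.asarray(scores, float)
    p = cases.sum() / totals.sum()                       # pooled proportion
    num = (cases - totals * p) @ scores                  # score-weighted departure from pooled rate
    den = p * (1 - p) * (totals @ scores**2 - (totals @ scores)**2 / totals.sum())
    stat = num**2 / den                                  # ~ chi-square with 1 df under H0
    return stat, chi2.sf(stat, df=1)

# Hypothetical counts of households with catastrophic hospitalization expenditure per quartile
stat, p_value = chi2_trend(cases=[12, 18, 25, 40], totals=[252, 252, 252, 252])
print(f"chi-square for trend = {stat:.2f}, p = {p_value:.4f}")
```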

  13. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers.

    Directory of Open Access Journals (Sweden)

    Tarundeep Singh

    Full Text Available A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Using multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data was transmitted daily to a central server using wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as more than 10% of the total annual household expenditure on healthcare. Chi square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. Average expenditure in government health facilities was

  14. Computers and Languages: Theory and Practice

    NARCIS (Netherlands)

    Nijholt, Antinus

    A global introduction to language technology and the areas of computer science where language technology plays a role. Surveyed in this volume are issues related to the parsing problem in the fields of natural languages, programming languages, and formal languages. Throughout the book attention is

  15. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students' music learning, the authors proposed a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. To this end, we conducted an in-depth analysis of computer-enabled music learning and of the status of music learning in secondary schools, obtaining specific analytical data. Survey data show that students have many cognitive problems in the current music classroom, and yet teachers have not found a reasonable countermeasure to them. Against this background, the introduction of computer music software to music learning is a new approach that can not only cultivate students' initiative in music learning, but also enhance their ability to learn music. Therefore, it is concluded that computer software-based music learning is of great significance for improving current music learning modes and means.

  16. Dense image correspondences for computer vision

    CERN Document Server

    Liu, Ce

    2016-01-01

    This book describes the fundamental building-block of many new computer vision systems: dense and robust correspondence estimation. Dense correspondence estimation techniques are now successfully being used to solve a wide range of computer vision problems, very different from the traditional applications such techniques were originally developed to solve. This book introduces the techniques used for establishing correspondences between challenging image pairs, the novel features used to make these techniques robust, and the many problems dense correspondences are now being used to solve. The book provides information to anyone attempting to utilize dense correspondences in order to solve new or existing computer vision problems. The editors describe how to solve many computer vision problems by using dense correspondence estimation. Finally, it surveys resources, code, and data necessary for expediting the development of effective correspondence-based computer vision systems.   ·         Provides i...
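
    As a purely illustrative example of dense correspondence estimation (not one of the book's own algorithms), the sketch below computes a dense flow field between two images with OpenCV's Farneback method and converts it into per-pixel correspondences; the file names are placeholders.

```python
# Illustrative only: dense pixel-to-pixel correspondences between two images
# via Farneback optical flow (OpenCV). File names are placeholders.
import cv2
import numpy as np

img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

# Positional arguments: prev, next, flow, pyr_scale, levels, winsize,
# iterations, poly_n, poly_sigma, flags.
# flow[y, x] = (dx, dy): estimated displacement of the pixel at (x, y) in img1.
flow = cv2.calcOpticalFlowFarneback(img1, img2, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Turn the flow field into an explicit correspondence map.
h, w = img1.shape
xs, ys = np.meshgrid(np.arange(w), np.arange(h))
corresp_x = xs + flow[..., 0]
corresp_y = ys + flow[..., 1]
print("mean displacement (px):", np.mean(np.hypot(flow[..., 0], flow[..., 1])))
```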

  17. Assessing asbestos exposure potential in nonindustrial settings.

    Science.gov (United States)

    Chang, S N; White, L E; Scott, W D

    1987-01-01

    The presence of asbestos containing materials (ACM) in office and commercial buildings is a significant environmental problem. Asbestosis, mesothelioma and lung cancer have been linked with industrial exposure to airborne asbestos. The extensive use of asbestos products in buildings has raised concerns about the widespread exposure of the general public to asbestos in nonoccupational settings. The presence of asbestos in a building does not necessarily mean that significant exposure of the occupants of the building has occurred, but it is important that the asbestos be monitored regularly to ensure that fibers do not become airborne. If ACM are contained within a matrix and not disturbed, exposure is unlikely. However, if the asbestos becomes friable (crumbling) or if building maintenance, repair, renovation or other activities disturb ACM, airborne asbestos fibers may be a source of exposure to the occupants of the building. Currently, asbestos exposure assessment is conducted by a phase contrast light microscope (PCM) technique. Due to its inherent limitation in resolution and the generic counting rules used, analysis by the PCM method underestimates the airborne asbestos fiber concentration as compared to analysis by transmission electron microscopy (TEM). It is important that the air monitoring results analyzed by PCM be interpreted carefully in conjunction with a survey by a professional to judge the physical condition of the ACM in buildings. Exposure levels to airborne asbestos fibers vary from day to day and depend on the physical condition of the material involved and the type of operating and maintenance program in place.(ABSTRACT TRUNCATED AT 250 WORDS)

  18. Computing discharge using the index velocity method

    Science.gov (United States)

    Levesque, Victor A.; Oberg, Kevin A.

    2012-01-01

    Application of the index velocity method for computing continuous records of discharge has become increasingly common, especially since the introduction of low-cost acoustic Doppler velocity meters (ADVMs) in 1997. Presently (2011), the index velocity method is being used to compute discharge records for approximately 470 gaging stations operated and maintained by the U.S. Geological Survey. The purpose of this report is to document and describe techniques for computing discharge records using the index velocity method. Computing discharge using the index velocity method differs from the traditional stage-discharge method by separating velocity and area into two ratings—the index velocity rating and the stage-area rating. The outputs from each of these ratings, mean channel velocity (V) and cross-sectional area (A), are then multiplied together to compute a discharge. For the index velocity method, V is a function of such parameters as streamwise velocity, stage, cross-stream velocity, and velocity head, and A is a function of stage and cross-section shape. The index velocity method can be used at locations where stage-discharge methods are used, but it is especially appropriate when more than one specific discharge can be measured for a specific stage. After the ADVM is selected, installed, and configured, the stage-area rating and the index velocity rating must be developed. A standard cross section is identified and surveyed in order to develop the stage-area rating. The standard cross section should be surveyed every year for the first 3 years of operation and thereafter at a lesser frequency, depending on the susceptibility of the cross section to change. Periodic measurements of discharge are used to calibrate and validate the index rating for the range of conditions experienced at the gaging station. Data from discharge measurements, ADVMs, and stage sensors are compiled for index-rating analysis. Index ratings are developed by means of regression
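
    To make the mechanics of the method concrete, a minimal sketch follows; the rating forms and every coefficient in it are hypothetical placeholders, not the ratings developed in the report.

```python
# Minimal sketch of the index velocity method: Q = V * A.
# Rating coefficients below are hypothetical; real ratings are fit to
# discharge measurements and a surveyed standard cross section.

def mean_velocity(v_index, stage, a=0.05, b=1.02, c=0.0):
    """Index velocity rating: mean channel velocity from ADVM index velocity.
    A simple linear form V = a + b*v_index + c*stage is assumed here."""
    return a + b * v_index + c * stage

def cross_section_area(stage, bottom_elev=1.0, width=25.0):
    """Stage-area rating for an idealized rectangular standard cross section."""
    depth = max(stage - bottom_elev, 0.0)
    return width * depth

def discharge(v_index, stage):
    """Discharge (m^3/s) as the product of mean channel velocity and area."""
    return mean_velocity(v_index, stage) * cross_section_area(stage)

print(discharge(v_index=0.85, stage=3.2))   # about 50 m^3/s for these made-up numbers
```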

  19. A SURVEY ON LOAD BALANCING IN CLOUD COMPUTING USING ARTIFICIAL INTELLIGENCE TECHNIQUES

    OpenAIRE

    Amandeep Kaur; Pooja Nagpal

    2016-01-01

    Since its inception, the cloud computing paradigm has gained widespread popularity in industry and academia. The economical, scalable, expedient, ubiquitous, and on-demand access to shared resources are some of the characteristics of the cloud that have resulted in shifting business processes to the cloud. Cloud computing attracts the attention of the research community due to its potential to provide tremendous benefits to industry and the community. But with the increasing d...

  20. Computer Programs for Obtaining and Analyzing Daily Mean Steamflow Data from the U.S. Geological Survey National Water Information System Web Site

    Science.gov (United States)

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
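
    For readers who only need the raw daily mean values, a minimal sketch of retrieving them from the public USGS water services follows. The endpoint, parameter codes, and JSON layout shown are assumptions about the current public API (they are not the programs on the report's CD-ROM) and should be checked against USGS documentation.

```python
# Sketch: retrieve daily mean streamflow from the public NWIS web service
# (the same data NWISWeb serves). Endpoint, parameter codes, and JSON layout
# are assumptions; verify against current USGS documentation before use.
import requests

def daily_mean_flow(site="01646500", start="2008-01-01", end="2008-12-31"):
    url = "https://waterservices.usgs.gov/nwis/dv/"
    params = {
        "format": "json",
        "sites": site,            # USGS streamflow-gaging station number
        "parameterCd": "00060",   # 00060 = discharge, cubic feet per second
        "statCd": "00003",        # 00003 = daily mean
        "startDT": start,
        "endDT": end,
    }
    data = requests.get(url, params=params, timeout=30).json()
    series = data["value"]["timeSeries"][0]["values"][0]["value"]
    return [(point["dateTime"], float(point["value"])) for point in series]

flows = daily_mean_flow()
print(len(flows), "daily values; first:", flows[0])
```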

  1. Computers and the internet: tools for youth empowerment.

    Science.gov (United States)

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. Constant comparison method and between-method triangulation were used in the analysis to satisfy the existence of themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  2. QUEST Hanford Site Computer Users - What do they do?

    Energy Technology Data Exchange (ETDEWEB)

    WITHERSPOON, T.T.

    2000-03-02

    The Fluor Hanford Chief Information Office requested that a computer-user survey be conducted to determine users' dependence on the computer and its importance to their ability to accomplish their work. Daily use trends and future needs of Hanford Site personal computer (PC) users were also to be defined. A primary objective was to use the data to determine how budgets should be focused toward providing those services that are truly needed by the users.

  3. Influence of Miscibility Phenomenon on Crystalline Polymorph Transition in Poly(Vinylidene Fluoride)/Acrylic Rubber/Clay Nanocomposite Hybrid

    Science.gov (United States)

    Abolhasani, Mohammad Mahdi; Naebe, Minoo; Jalali-Arani, Azam; Guo, Qipeng

    2014-01-01

    In this paper, intercalation of nanoclay in the miscible polymer blend of poly(vinylidene fluoride) (PVDF) and acrylic rubber (ACM) was studied. X-ray diffraction was used to investigate the formation of nanoscale polymer blend/clay hybrid. Infrared spectroscopy and X-ray analysis revealed the coexistence of β and γ crystalline forms in PVDF/Clay nanocomposite while α crystalline form was found to be dominant in PVDF/ACM/Clay miscible hybrids. Flory-Huggins interaction parameter (B) was used to further explain the miscibility phenomenon observed. The B parameter was determined by combining the melting point depression and the binary interaction model. The estimated B values for the ternary PVDF/ACM/Clay and PVDF/ACM pairs were all negative, showing both proper intercalation of the polymer melt into the nanoclay galleries and the good miscibility of PVDF and ACM blend. The B value for the PVDF/ACM blend was almost the same as that measured for the PVDF/ACM/Clay hybrid, suggesting that PVDF chains in nanocomposite hybrids interact with ACM chains and that nanoclay in hybrid systems is wrapped by ACM molecules. PMID:24551141
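
    For orientation, one common form of the melting-point-depression analysis referred to above is the Nishi-Wang treatment sketched below; the notation and the exact expressions used in the paper may differ, so this should be read as a reference form rather than the authors' equation.

```latex
% Nishi-Wang melting-point-depression form (reference only).
% Subscript 1: amorphous diluent (here ACM); subscript 2: crystalline polymer (PVDF).
% V_{iu}: molar volume per repeat unit; \Delta H_{2u}: heat of fusion per repeat unit.
\[
  \frac{1}{T_m} - \frac{1}{T_m^{0}}
    = -\frac{R\,V_{2u}}{\Delta H_{2u}\,V_{1u}}\,\chi_{12}\,\varphi_1^{2},
  \qquad
  B = \frac{R\,T\,\chi_{12}}{V_{1u}},
  \qquad\text{so}\qquad
  T_m^{0} - T_m \;\approx\; -\,B\,\frac{V_{2u}}{\Delta H_{2u}}\,\varphi_1^{2}\,T_m^{0}.
\]
```

    In this form a negative B produces a melting-point depression, which is consistent with the miscibility reported above.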

  4. Influence of miscibility phenomenon on crystalline polymorph transition in poly(vinylidene fluoride)/acrylic rubber/clay nanocomposite hybrid.

    Directory of Open Access Journals (Sweden)

    Mohammad Mahdi Abolhasani

    Full Text Available In this paper, intercalation of nanoclay in the miscible polymer blend of poly(vinylidene fluoride) (PVDF) and acrylic rubber (ACM) was studied. X-ray diffraction was used to investigate the formation of nanoscale polymer blend/clay hybrid. Infrared spectroscopy and X-ray analysis revealed the coexistence of β and γ crystalline forms in PVDF/Clay nanocomposite while α crystalline form was found to be dominant in PVDF/ACM/Clay miscible hybrids. Flory-Huggins interaction parameter (B) was used to further explain the miscibility phenomenon observed. The B parameter was determined by combining the melting point depression and the binary interaction model. The estimated B values for the ternary PVDF/ACM/Clay and PVDF/ACM pairs were all negative, showing both proper intercalation of the polymer melt into the nanoclay galleries and the good miscibility of PVDF and ACM blend. The B value for the PVDF/ACM blend was almost the same as that measured for the PVDF/ACM/Clay hybrid, suggesting that PVDF chains in nanocomposite hybrids interact with ACM chains and that nanoclay in hybrid systems is wrapped by ACM molecules.

  5. Genetic basis of arrhythmogenic cardiomyopathy.

    Science.gov (United States)

    Karmouch, Jennifer; Protonotarios, Alexandros; Syrris, Petros

    2018-05-01

    To date 16 genes have been associated with arrhythmogenic cardiomyopathy (ACM). Mutations in these genes can lead to a broad spectrum of phenotypic expression ranging from disease affecting predominantly the right or left ventricle, to biventricular subtypes. Understanding the genetic causes of ACM is important in diagnosis and management of the disorder. This review summarizes recent advances in molecular genetics and discusses the application of next-generation sequencing technology in genetic testing in ACM. Use of next-generation sequencing methods has resulted in the identification of novel causative variants and genes for ACM. The involvement of filamin C in ACM demonstrates the genetic overlap between ACM and other types of cardiomyopathy. Putative pathogenic variants have been detected in cadherin 2 gene, a protein involved in cell adhesion. Large genomic rearrangements in desmosome genes have been systematically investigated in a cohort of ACM patients. Recent studies have identified novel causes of ACM providing new insights into the genetic spectrum of the disease and highlighting an overlapping phenotype between ACM and dilated cardiomyopathy. Next-generation sequencing is a useful tool for research and genetic diagnostic screening but interpretation of identified sequence variants requires caution and should be performed in specialized centres.

  6. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    Science.gov (United States)

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  7. Osteoporosis imaging: effects of bone preservation on MDCT-based trabecular bone microstructure parameters and finite element models

    International Nuclear Information System (INIS)

    Baum, Thomas; Grande Garcia, Eduardo; Burgkart, Rainer; Gordijenko, Olga; Liebl, Hans; Jungmann, Pia M.; Gruber, Michael; Zahel, Tina; Rummeny, Ernst J.; Waldt, Simone; Bauer, Jan S.

    2015-01-01

    Osteoporosis is defined as a skeletal disorder characterized by compromised bone strength due to a reduction of bone mass and deterioration of bone microstructure predisposing an individual to an increased risk of fracture. Trabecular bone microstructure analysis and finite element models (FEM) have shown to improve the prediction of bone strength beyond bone mineral density (BMD) measurements. These computational methods have been developed and validated in specimens preserved in formalin solution or by freezing. However, little is known about the effects of preservation on trabecular bone microstructure and FEM. The purpose of this observational study was to investigate the effects of preservation on trabecular bone microstructure and FEM in human vertebrae. Four thoracic vertebrae were harvested from each of three fresh human cadavers (n = 12). Multi-detector computed tomography (MDCT) images were obtained at baseline, 3 and 6 month follow-up. In the intervals between MDCT imaging, two vertebrae from each donor were formalin-fixed and frozen, respectively. BMD, trabecular bone microstructure parameters (histomorphometry and fractal dimension), and FEM-based apparent compressive modulus (ACM) were determined in the MDCT images and validated by mechanical testing to failure of the vertebrae after 6 months. Changes of BMD, trabecular bone microstructure parameters, and FEM-based ACM in formalin-fixed and frozen vertebrae over 6 months ranged between 1.0–5.6 % and 1.3–6.1 %, respectively, and were not statistically significant (p > 0.05). BMD, trabecular bone microstructure parameters, and FEM-based ACM as assessed at baseline, 3 and 6 month follow-up correlated significantly with mechanically determined failure load (r = 0.89–0.99; p < 0.05). The correlation coefficients r were not significantly different for the two preservation methods (p > 0.05). Formalin fixation and freezing up to six months showed no significant effects on trabecular bone microstructure

  8. Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)

  9. 77 FR 30005 - Office of the Assistant Secretary for Health, Statement of Organization, Functions, and...

    Science.gov (United States)

    2012-05-21

    .... Specifically, it realigns these functions in the Office of the Surgeon General (ACM) and abolishes the Office... Section AC.20, Functions, delete Paragraph ``I. Office of Surgeon General (ACM),'' in its entirety and replace with the following: I. Office of the Surgeon General (ACM) Section ACM.00 Mission: The Office of...

  10. Student Engagement in a Computer Rich Science Classroom

    Science.gov (United States)

    Hunter, Jeffrey C.

    The purpose of this study was to examine students' lived experience when using computers in a rural science classroom. The overarching question the project sought to examine was: How do rural students relate to computers as a learning tool in comparison to a traditional science classroom? Participant data were collected using a pre-study survey, Experience Sampling during class and post-study interviews. Students want to use computers in their classrooms. Students shared that they overwhelmingly (75%) preferred a computer rich classroom to a traditional classroom (25%). Students reported a higher level of engagement in classes that use technology/computers (83%) versus those that do not use computers (17%). A computer rich classroom increased student control and motivation as reflected by a participant who shared: "by using computers I was more motivated to get the work done" (Maggie, April 25, 2014, survey). The researcher explored a rural school environment. Rural populations represent a large number of students and appear to be underrepresented in current research. The participants, tenth grade Biology students, were sampled in a traditional teacher-led class without computers for one week followed by a week using computers daily. Data supported that there is a new gap that separates students, a device divide. This divide separates those who have access to devices that are robust enough to do high level class work from those who do not. Although cellular phones have reduced the number of students who cannot access the Internet, they may have created a false feeling that access to a computer is no longer necessary at home. As this study shows, although most students have Internet access, fewer have access to a device that enables them to complete rigorous class work at home. Participants received little or no training at school in proper, safe use of a computer and the Internet. It is clear that the majority of students are self-taught or receive guidance

  11. Pre-Service Teachers, Computers, and ICT Courses: A Troubled Relationship

    Science.gov (United States)

    Fokides, Emmanuel

    2016-01-01

    The study presents the results of a four-year-long survey among pre-service teachers, examining factors which influence their computer knowledge and skills, as well as factors which contribute to shaping their perceived computer competency. Participants were seven hundred fifty-four senior students at the Department of Primary School…

  12. Subsurface fracture surveys using a borehole television camera and an acoustic televiewer

    International Nuclear Information System (INIS)

    Lau, J.S.O.; Auger, L.F.

    1987-01-01

    Borehole television survey and acoustic televiewer logging provide rapid, cost-effective, and accurate methods of surveying fractures and their characteristics within boreholes varying in diameter from 7.6 to 15.3 cm. In the television survey, a camera probe is used to inspect the borehole walls. Measurements of location, orientation, infilling width, and aperture of fractures are made on the television screen and recorded on computer data record sheets. All observations are recorded on video cassette tapes. With the acoustic televiewer, oriented images of fractures in the borehole walls are recorded on a strip-chart log and also on video cassette tapes. The images are displayed as if the walls were split vertically along magnetic north and spread out horizontally. Measurements of fracture characteristics are made on the strip-chart log, using a digitizing table and a microcomputer, and the data recorded on floppy diskettes. In both surveys, an inclined fracture is displayed as a sinusoidal curve, from which the apparent orientation of the fracture can be measured. Once the borehole orientation is known, the true orientation of the fracture can be computed from its apparent orientation. Computer analysis of the fracture data provides a rapid assessment of fracture occurrence, fracture aperture, and statistically significant concentrations of fracture orientations
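
    A minimal sketch of the apparent-to-true orientation computation described above is given below. The coordinate conventions (NED axes, image referenced to the borehole high side, angles in degrees) and the function names are assumptions for illustration, not the survey software's actual implementation.

```python
# Sketch: recover a fracture's true dip and dip direction from its apparent
# (borehole-referenced) orientation, given the borehole trend and plunge.
# Conventions assumed: NED coordinates, image angle measured from the
# borehole high side (north for a vertical hole), angles in degrees.
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def true_orientation(app_dip, theta_low, bh_trend, bh_plunge):
    """app_dip: fracture dip from the plane normal to the borehole axis,
       e.g. arctan(sinusoid peak-to-peak amplitude / borehole diameter).
       theta_low: image angle of the sinusoid's low point, clockwise from the reference.
       bh_trend, bh_plunge: borehole azimuth and plunge below horizontal.
       Returns (dip, dip_direction) in geographic coordinates, degrees."""
    t, p = np.radians([bh_trend, bh_plunge])
    w = np.array([np.cos(p) * np.cos(t), np.cos(p) * np.sin(t), np.sin(p)])  # downhole axis (NED)
    up = np.array([0.0, 0.0, -1.0])
    ref = up - np.dot(up, w) * w                       # high side of the hole
    if np.linalg.norm(ref) < 1e-9:                     # vertical hole: use north as reference
        ref = np.array([1.0, 0.0, 0.0]) - w[0] * w
    u = unit(ref)
    v = np.cross(w, u)                                 # completes the borehole frame
    d, th = np.radians(app_dip), np.radians(theta_low)
    low = np.cos(th) * u + np.sin(th) * v              # radial direction of the low point
    n = np.cos(d) * w - np.sin(d) * low                # downward-pointing fracture pole
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_dir = np.degrees(np.arctan2(-n[1], -n[0])) % 360.0
    return dip, dip_dir

# Vertical hole: the apparent orientation is already the true orientation.
print(true_orientation(app_dip=35.0, theta_low=120.0, bh_trend=0.0, bh_plunge=90.0))
```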

  13. Survey of advanced general-purpose software for robot manipulators

    International Nuclear Information System (INIS)

    Latombe, J.C.

    1983-01-01

    Computer-controlled sensor-based robots will become more and more common in industry. This paper attempts to survey the main trends of the development of advanced general-purpose software for robot manipulators. It is intended to make clear that robots are not only mechanical devices. They are truly programmable machines, and their programming, which occurs in an imperfectly modelled world, is somewhat different from conventional computer programming. (orig.)

  14. Computed radiography for breast cancer

    International Nuclear Information System (INIS)

    Yamada, Tatsuya; Muramatsu, Yukio

    1990-01-01

    In order to evaluate the possibility of using computed radiographic mammography in mass surveys of the breast, we have retrospectively examined 71 breast cancer lesions in 71 patients using computed radiographic and conventional non-screen mammographies and have carried out comparative studies on tumor detection rate and calcification. A 95.8% detection rate was obtained for the tumor image (n = 71) using computed radiography (CR) and one of 93.0% using non-screen techniques. Three lesions remained undetected by either study. A 100% detection rate was obtained for calcification associated with cancer (n = 33) from each method. No significant differences in either detection rate or calcification were seen between the two images. On the other hand, the ability to recognize tumor images (n = 66) was as follows: CR superior to non-screen radiography in 53 lesions (80.3%), equal in eight lesions (12.1%) and inferior in five lesions (7.6%). For the calcification images (n = 18), CR was superior to non-screen radiography in all 18 lesions. Obviously, CR gives better results than non-screen radiography. Furthermore, an adequate image can be obtained using CR even though the X-ray dosage is only a twentieth of that required for non-screen radiography. It can therefore be applied not only to mass surveys for breast cancer but also to routine clinical diagnoses. (author)

  15. Use of a radiation therapy treatment planning computer in a hospital health physics program

    International Nuclear Information System (INIS)

    Addison, S.J.

    1984-01-01

    An onsite treatment planning computer has become state of the art in the care of radiation therapy patients, but in most installations the computer is used for therapy planning for only a small fraction of the day. At St. Mary's Hospital, arrangements have been negotiated for part-time use of the treatment planning computer for health physics purposes. Computerized Medical Systems, Inc. (CMS) produces the Modulex radiotherapy planning system, which is programmed in MUMPS, a user-oriented language specially adapted for handling text string information. St. Mary's Hospital's CMS computer has currently been programmed to assist in data collection and write-up of diagnostic x-ray surveys, meter calibrations, and wipe/leak tests. The computer is set up to provide timely reminders of tests and surveys, and billing for consultation work. Programs are currently being developed for radionuclide inventories. Use of a therapy planning computer for health physics purposes can enhance the radiation safety program and provide additional grounds for the acquisition of such a computer system

  16. Advances in computers dependable and secure systems engineering

    CERN Document Server

    Hurson, Ali

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  17. Incentives and participation in a medical survey.

    Science.gov (United States)

    Gjøstein, Dagrun Kyte; Huitfeldt, Anders; Løberg, Magnus; Adami, Hans-Olov; Garborg, Kjetil; Kalager, Mette; Bretthauer, Michael

    2016-07-01

    BACKGROUND Questionnaire surveys are important for surveying the health and disease behaviour of the population, but recent years have seen a fall in participation. Our study tested whether incentives can increase participation in these surveys. MATERIAL AND METHOD We sent a questionnaire on risk factors for colorectal cancer (height, weight, smoking, self-reported diagnoses, family medical history) to non-screened participants in a randomised colonoscopy screening study for colorectal cancer: participants who were invited but did not attend for colonoscopy examination (screening-invited) and persons who were not offered colonoscopy (control group). The persons were randomised to three groups: no financial incentive, lottery scratch cards included with the form, or a prize draw for a tablet computer when they responded to the form. We followed up all the incentive groups with telephone reminder calls, and before the prize draw for the tablet computer. RESULTS Altogether 3 705 of 6 795 persons (54.5  %) responded to the questionnaire; 43.5  % of those invited for screening and 65.6  % of the control group (p  reminder calls, 39.2  % responded. A further 15.3  % responded following telephone reminder calls (14.1  % of the screening-invited and 16.5  % of the control group; p  increase participation in this medical questionnaire survey. Use of telephone reminder calls and telephone interviews increased participation, but whether this is more effective than other methods requires further study.

  18. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifier within the different gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a better-informed and more readily available choice of starting data sets for simulations. In addition to this, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve simulations performed. However, there have been fewer answers to the survey than expected/hoped for, which could have improved the database further. Nevertheless, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. In addition to that, the database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.
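
    As an illustration of the kind of record such a database holds, the sketch below defines a data structure with the fields listed above; all field names and example values are hypothetical, and the report's Excel sheet remains the authoritative layout.

```python
# Sketch of a record layout for a gasifier database of the kind described above.
# Field names and example values are hypothetical.
from dataclasses import dataclass, field
from enum import Enum

class BedType(Enum):
    FIXED_BED = "fixed bed"
    FLUIDISED_BED = "fluidised bed"
    ENTRAINED_FLOW = "entrained flow"

@dataclass
class GasifierRecord:
    name: str
    bed_type: BedType
    thermal_input_kw: float          # the report covers roughly 100 kW to 120 MW
    direct_heating: bool             # direct vs. indirect heating
    gasifier_temp_c: float
    pressure_bar: float
    syngas_vol_frac: dict = field(default_factory=dict)   # e.g. {"H2": 0.30, "CO": 0.22, ...}
    energy_consumption_kw: float = 0.0

example = GasifierRecord(
    name="Example fluidised-bed gasifier",
    bed_type=BedType.FLUIDISED_BED,
    thermal_input_kw=8000.0,
    direct_heating=False,
    gasifier_temp_c=850.0,
    pressure_bar=1.0,
    syngas_vol_frac={"H2": 0.30, "CO": 0.22, "CO2": 0.25, "CH4": 0.09, "N2": 0.14},
)
print(example.bed_type.value, example.thermal_input_kw, "kW")
```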

  19. 16 Cards to Get Into Computer Organization

    OpenAIRE

    Tabik, Siham; Romero, Luis F.

    2014-01-01

    This paper presents a novel educational activity for teaching computer architecture fundamentals. The activity is actually a game that uses 16 cards and involves about twenty actively participating students. Executing this activity in the first class of the course allows the student, in only 45 minutes, to acquire the fundamental concepts of computer organization. The results of the surveys that evaluate the proposed activity together with the grades obtained by the students at the end of course...

  20. Security and Privacy of Sensitive Data in Cloud Computing: A Survey of Recent Developments

    OpenAIRE

    Gholami, Ali; Laure, Erwin

    2016-01-01

    Cloud computing is revolutionizing many ecosystems by providing organizations with computing resources featuring easy deployment, connectivity, configuration, automation and scalability. This paradigm shift raises a broad range of security and privacy issues that must be taken into consideration. Multi-tenancy, loss of control, and trust are key challenges in cloud computing environments. This paper reviews the existing technologies and a wide array of both earlier and state-of...