WorldWideScience

Sample records for applied computing computer

  1. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  2. Applied computational physics

    CERN Document Server

    Boudreau, Joseph F; Bianchi, Riccardo Maria

    2018-01-01

    Applied Computational Physics is a graduate-level text stressing three essential elements: advanced programming techniques, numerical analysis, and physics. The goal of the text is to provide students with essential computational skills that they will need in their careers, and to increase the confidence with which they write computer programs designed for their problem domain. The physics problems give them an opportunity to reinforce their programming skills, while the acquired programming skills augment their ability to solve physics problems. The C++ language is used throughout the text. Physics problems include Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, quantum field theory, simulation of radiation transport, and data modeling. The book, the fruit of a collaboration between a theoretical physicist and an experimental physicist, covers a broad range of topics from both viewpoints. Examples, program libraries, and additional documentatio...

  3. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non- linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN. (0 refs).

  4. Applied computation and security systems

    CERN Document Server

    Saeed, Khalid; Choudhury, Sankhayan; Chaki, Nabendu

    2015-01-01

    This book contains the extended versions of the works presented and discussed at the First International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2014), held during April 18-20, 2014 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland and the University of Calcutta, India. Volume I of this double-volume book contains fourteen high-quality chapters in three parts. Part 1, on Pattern Recognition, presents four chapters. Part 2, on Imaging and Healthcare Applications, contains four more chapters. Part 3 of this volume, on Wireless Sensor Networking, includes six chapters. Volume II of the book has three parts presenting a total of eleven chapters. Part 4 consists of five excellent chapters on Software Engineering, ranging from cloud service design to transactional memory. Part 5 in Volume II is on Cryptography with two book...

  5. Applied computing in medicine and health

    CERN Document Server

    Al-Jumeily, Dhiya; Mallucci, Conor; Oliver, Carol

    2015-01-01

    Applied Computing in Medicine and Health is a comprehensive presentation of on-going investigations into current applied computing challenges and advances, with a focus on a particular class of applications, primarily artificial intelligence methods and techniques in medicine and health. Applied computing is the use of practical computer science knowledge to enable use of the latest technology and techniques in a variety of different fields ranging from business to scientific research. One of the most important and relevant areas in applied computing is the use of artificial intelligence (AI) in health and medicine. Artificial intelligence in health and medicine (AIHM) is assuming the challenge of creating and distributing tools that can support medical doctors and specialists in new endeavors. The material included covers a wide variety of interdisciplinary perspectives concerning the theory and practice of applied computing in medicine, human biology, and health care. Particular attention is given to AI-bas...

  6. Computer Labs | College of Engineering & Applied Science

    Science.gov (United States)

    Engineering Concentration on Ergonomics M.S. Program in Computer Science Interdisciplinary Concentration on Structural Engineering Laboratory Water Resources Laboratory Computer Science Department Computer Science Academic Programs Computer Science Undergraduate Programs Computer Science Major Computer Science Tracks

  7. Computer Resources | College of Engineering & Applied Science

    Science.gov (United States)

  8. Computer Science | Classification | College of Engineering & Applied Science

    Science.gov (United States)

  9. Applied Computational Intelligence for finance and economics

    OpenAIRE

    Isasi Viñuela, Pedro; Quintana Montero, David; Sáez Achaerandio, Yago; Mochón, Asunción

    2007-01-01

    This article introduces some relevant research works on computational intelligence applied to finance and economics. The objective is to offer an appropriate context and a starting point for those who are new to computational intelligence in finance and economics and to give an overview of the most recent works. A classification with five different main areas is presented. Those areas are related to different applications of the most modern computational intelligence techniques showing a ne...

  10. Computer simulations applied in materials

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  11. Computer simulations applied in materials

    International Nuclear Information System (INIS)

    2003-01-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  12. Applied Computational Mathematics in Social Sciences

    CERN Document Server

    Damaceanu, Romulus-Catalin

    2010-01-01

    Applied Computational Mathematics in Social Sciences adopts a modern scientific approach that combines knowledge from mathematical modeling with various aspects of social science. Special algorithms can be created to simulate an artificial society and a detailed analysis can subsequently be used to project social realities. This Ebook specifically deals with computations using the NetLogo platform, and is intended for researchers interested in advanced human geography and mathematical modeling studies.

  13. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it has become clear that mobile cloud computing, established by integrating mobile computing with cloud computing, gains in both storage space and processing speed. Integrating healthcare applications and services is one of the large-scale data approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach leads to integrating all of ...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  15. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann; P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  9. Interactive computer programs for applied nutrition education.

    Science.gov (United States)

    Wise, A

    1985-12-01

    DIET2 and DIET3 are programs written for a Dec2050 computer and intended for teaching applied nutrition to students of nutrition, dietetics, home economics, and hotel and institutional administration. DIET2 combines all the facilities of the separate dietary programs already available at Robert Gordon's Institute of Technology into a single package, and extends these to give students a large amount of relevant information about the nutritional balance of foods (including DHSS and NACNE recommendations) prior to choosing them for meals. Students are also helped by the inclusion of typical portion weights. They are presented with an analysis of nutrients and their balance in the menu created, with an easy mechanism for amendment of the menu and addition of foods which provide the nutrients that are lacking. At any stage the computer can give the proportion of total nutrient provided by each meal. DIET3 is a relatively simple program that displays the nutritional profile of foods and diets semigraphically.

  10. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  11. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  13. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts (monitoring the services and infrastructure as well as interfacing to the data operations tasks) are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  14. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and in the flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could then be cleaned up...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been instrumental in commissioning sites, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  19. Applied Educational Computing: Putting Skills to Practice.

    Science.gov (United States)

    Thomerson, J. D.

    The College of Education at Valdosta State University (Georgia) developed a followup course to their required entry-level educational computing course. The introductory course covers word processing, spreadsheet, database, presentation, Internet, electronic mail, and operating system software and basic computer concepts. Students expressed a need…

  20. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  1. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  2. Applying and evaluating computer-animated tutors

    Science.gov (United States)

    Massaro, Dominic W.; Bosseler, Alexis; Stone, Patrick S.; Connors, Pamela

    2002-05-01

    We have developed computer-assisted speech and language tutors for deaf, hard of hearing, and autistic children. Our language-training program utilizes our computer-animated talking head, Baldi, as the conversational agent, who guides students through a variety of exercises designed to teach vocabulary and grammar, to improve speech articulation, and to develop linguistic and phonological awareness. Baldi is an accurate three-dimensional animated talking head appropriately aligned with either synthesized or natural speech. Baldi has a tongue and palate, which can be displayed by making his skin transparent. Two specific language-training programs have been evaluated to determine if they improve word learning and speech articulation. The results indicate that the programs are effective in teaching receptive and productive language. Advantages of utilizing a computer-animated agent as a language tutor are the popularity of computers and embodied conversational agents with autistic children, the perpetual availability of the program, and individualized instruction. Students enjoy working with Baldi because he offers extreme patience, he doesn't become angry, tired, or bored, and he is in effect a perpetual teaching machine. The results indicate that the psychology and technology of Baldi holds great promise in language learning and speech therapy. [Work supported by NSF Grant Nos. CDA-9726363 and BCS-9905176 and Public Health Service Grant No. PHS R01 DC00236.]

  3. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    Science.gov (United States)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  4. [Research activities in applied mathematics, fluid mechanics, and computer science

    Science.gov (United States)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  5. Applying improved instrumentation and computer control systems

    International Nuclear Information System (INIS)

    Bevilacqua, F.; Myers, J.E.

    1977-01-01

    In-core and out-of-core instrumentation systems for the Cherokee-I reactor are described. The reactor has 61 in-core instrument assemblies. Continuous computer monitoring and processing of data from over 300 fixed detectors will be used to improve the manoeuvring of core power. The plant protection system is a standard package for the Combustion Engineering System 80, consisting of two independent systems, the reactor protection system and the engineering safety features activation system, both of which are designed to meet NRC, ANS and IEEE design criteria or standards. The plant protection system has its own computer, which provides plant monitoring, alarming, logging and performance calculations. (U.K.)

  6. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science, outstanding results are obtained with advanced simulation methods, based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environmental changes.

  7. Computational Tools applied to Urban Engineering

    OpenAIRE

    Filho, Armando Carlos de Pina; Lima, Fernando Rodrigues; Amaral, Renato Dias Calado do

    2010-01-01

    This chapter sought to present the main details of three technologies much used in Urban Engineering: CAD (Computer-Aided Design); GIS (Geographic Information System); and BIM (Building Information Modelling). As can be seen, each of them presents specific characteristics and diverse applications in urban projects, providing better results in relation to the planning, management and maintenance of the systems. In relation to the presented software, it is important to note that the...

  8. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  9. Computational optimization techniques applied to microgrids planning

    DEFF Research Database (Denmark)

    Gamarra, Carlos; Guerrero, Josep M.

    2015-01-01

    Microgrids are expected to become part of the next electric power system evolution, not only in rural and remote areas but also in urban communities. Since microgrids are expected to coexist with traditional power grids (much as district heating does with traditional heating systems), their planning process must address economic feasibility as a guarantee of long-term stability. Planning a microgrid is a complex process due to existing alternatives, goals, constraints and uncertainties. Usually planning goals conflict with each other and, as a consequence, different optimization problems appear along the planning process. In this context, the technical literature about optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility can be defined. Finally, some trending techniques and new...

  10. 4th International Conference on Applied Computing and Information Technology

    CERN Document Server

    2017-01-01

    This edited book presents scientific results of the 4th International Conference on Applied Computing and Information Technology (ACIT 2016) which was held on December 12–14, 2016 in Las Vegas, USA. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way. The aim of this conference was also to bring out the research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those papers accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the Program Committee, and underwent further rigorous rounds of review. Th...

  11. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  12. Three-dimensional integrated CAE system applying computer graphic technique

    International Nuclear Information System (INIS)

    Kato, Toshisada; Tanaka, Kazuo; Akitomo, Norio; Obata, Tokayasu.

    1991-01-01

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  13. Do flow principles of operations management apply to computing centres?

    CERN Document Server

    Abaunza, Felipe; Hameri, Ari-Pekka; Niemi, Tapio

    2014-01-01

    By analysing large data-sets on jobs processed in major computing centres, we study how operations management principles apply to these modern-day processing plants. We show that Little's Law on long-term performance averages holds for computing centres, i.e. work-in-progress equals the throughput rate multiplied by the process lead time. Contrary to traditional manufacturing principles, the law of variation does not hold for computing centres: the more variation in job lead times, the better the throughput and utilisation of the system. We also show that as the utilisation of the system increases, lead times and work-in-progress increase, which complies with traditional manufacturing. In comparison with current computing centre operations, these results imply that better allocation of jobs could increase throughput and utilisation while fewer computing resources are needed, thus increasing the overall efficiency of the centre. From a theoretical point of view, in a system with close to zero set-up times, as in the c...
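The Little's Law relation the abstract refers to, work-in-progress L equals throughput rate λ times lead time W, can be checked on any job log. A minimal Python sketch on a hypothetical log of (start, end) timestamps (illustrative, not the authors' data or code):

```python
import random

random.seed(42)

# Simulate a job log: Poisson arrivals, each job with a random lead time
# (queueing wait plus service), recorded as (start, end) timestamps.
jobs = []
t = 0.0
for _ in range(10_000):
    t += random.expovariate(1.0)          # next arrival time
    lead = random.uniform(0.5, 1.5)       # lead time of this job
    jobs.append((t, t + lead))

horizon = max(end for _, end in jobs)     # length of the observation window
throughput = len(jobs) / horizon          # lambda: jobs completed per unit time
avg_lead = sum(end - start for start, end in jobs) / len(jobs)   # W
avg_wip = sum(end - start for start, end in jobs) / horizon      # L (time-averaged WIP)

# Little's Law: L = lambda * W (exact for these long-run time averages)
print(abs(avg_wip - throughput * avg_lead))   # → 0.0 (up to float rounding)
```

With these long-run definitions the identity is exact by construction; the interesting empirical question in the paper is whether it holds for the measured averages of a real centre.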

  14. Global Conference on Applied Computing in Science and Engineering

    CERN Document Server

    2016-01-01

    The Global Conference on Applied Computing in Science and Engineering is organized by academics and researchers belonging to different scientific areas of the C3i/Polytechnic Institute of Portalegre (Portugal) and the University of Extremadura (Spain), with the technical support of ScienceKnow Conferences. The event has the objective of creating an international forum for academics, researchers and scientists from around the world to discuss results and proposals regarding the most pressing issues related to Applied Computing in Science and Engineering. This event will include the participation of renowned keynote speakers, oral presentations, poster sessions and technical conferences related to the topics dealt with in the Scientific Program, as well as an attractive social and cultural program. The papers will be published in the Proceedings e-books. The proceedings of the conference will be sent for possible indexing on Thomson Reuters (selective by Thomson Reuters, not all-inclusive) and Google Scholar...

  15. Computed neutron coincidence counting applied to passive waste assay

    Energy Technology Data Exchange (ETDEWEB)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R. [Nuclear Research Centre, Mol (Belgium)

    1997-11-01

    Neutron coincidence counting applied for the passive assay of fissile material is generally realised with dedicated electronic circuits. This paper presents a software based neutron coincidence counting method with data acquisition via a commercial PC-based Time Interval Analyser (TIA). The TIA is used to measure and record all time intervals between successive pulses in the pulse train up to count-rates of 2 Mpulses/s. Software modules are then used to compute the coincidence count-rates and multiplicity related data. This computed neutron coincidence counting (CNCC) offers full access to all the time information contained in the pulse train. This paper will mainly concentrate on the application and advantages of CNCC for the non-destructive assay of waste. An advanced multiplicity selective Rossi-alpha method is presented and its implementation via CNCC demonstrated. 13 refs., 4 figs., 2 tabs.
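The core of such software-based counting can be illustrated with a toy sketch: given a list of pulse timestamps, as a Time Interval Analyser would record, count the pulse pairs falling within a coincidence gate. This is an illustrative simplification, not the authors' multiplicity-selective Rossi-alpha implementation:

```python
def coincidence_count(timestamps, gate):
    """Count pulse pairs whose separation falls within the gate width."""
    ts = sorted(timestamps)
    pairs = 0
    j = 0                                  # start of the sliding gate window
    for i, t in enumerate(ts):
        while ts[j] < t - gate:            # drop pulses older than the gate
            j += 1
        pairs += i - j                     # earlier pulses still inside the gate
    return pairs

# Three pulses bunched near t = 1.0 give three coincident pairs.
pulses = [0.0, 0.4, 1.0, 1.05, 1.1, 3.0]
print(coincidence_count(pulses, gate=0.2))   # → 3
```

Because the full pulse train is retained in software, the same timestamps can be re-analysed with any gate width after the measurement, which is the flexibility the abstract highlights over dedicated shift-register electronics.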

  16. Computational modeling applied to stress gradient analysis for metallic alloys

    International Nuclear Information System (INIS)

    Iglesias, Susana M.; Assis, Joaquim T. de; Monine, Vladimir I.

    2009-01-01

    Nowadays, composite materials, including materials reinforced by particles, are the center of researchers' attention. There are problems with stress measurements in these materials, connected with the superficial stress gradient caused by the difference between the stress state of particles on the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by superficial layers of material makes it possible to simulate the diffraction experiment and to resolve the problem of stress measurements when the stress state is characterized by a strong gradient. The aim of this paper is the application of a computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation of composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)

  17. Applying Integrated Computer Assisted Media (ICAM) in Teaching Vocabulary

    Directory of Open Access Journals (Sweden)

    Opick Dwi Indah

    2015-02-01

    The objective of this research was to find out whether the use of integrated computer assisted media (ICAM) is effective in improving the vocabulary achievement of second-semester students of Cokroaminoto Palopo University. The population of this research was the second-semester students of the English department of Cokroaminoto Palopo University in the academic year 2013/2014. The samples of this research were 60 students, placed into two groups: an experimental and a control group of 30 students each. This research used a cluster random sampling technique. The research data was collected by applying a vocabulary test and was analyzed using descriptive and inferential statistics. The result of this research was that integrated computer assisted media (ICAM) can improve the vocabulary achievement of students of the English department of Cokroaminoto Palopo University. It can be concluded that the use of ICAM in teaching vocabulary is effective in improving the students’ vocabulary achievement.

  18. Computed neutron coincidence counting applied to passive waste assay

    International Nuclear Information System (INIS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1997-01-01

    Neutron coincidence counting applied for the passive assay of fissile material is generally realised with dedicated electronic circuits. This paper presents a software based neutron coincidence counting method with data acquisition via a commercial PC-based Time Interval Analyser (TIA). The TIA is used to measure and record all time intervals between successive pulses in the pulse train up to count-rates of 2 Mpulses/s. Software modules are then used to compute the coincidence count-rates and multiplicity related data. This computed neutron coincidence counting (CNCC) offers full access to all the time information contained in the pulse train. This paper will mainly concentrate on the application and advantages of CNCC for the non-destructive assay of waste. An advanced multiplicity selective Rossi-alpha method is presented and its implementation via CNCC demonstrated. 13 refs., 4 figs., 2 tabs

  19. Human computer confluence applied in healthcare and rehabilitation.

    Science.gov (United States)

    Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen

    2012-01-01

    Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual & augmented reality, and it provides great potential for applications in medicine and rehabilitation.

  20. Applying virtual and augmented reality in cultural computing

    NARCIS (Netherlands)

    Bartneck, C.; Hu, J.; Salem, B.I.; Cristescu, R.; Rauterberg, G.W.M.

    2008-01-01

    We are exploring a new application of virtual and augmented reality for a novel direction in human-computer interaction named 'cultural computing', which aims to provide a new medium for cultural translation and unconscious metamorphosis. In this application both virtual and robotic agents are

  1. Computational sieving applied to some classical number-theoretic problems

    NARCIS (Netherlands)

    H.J.J. te Riele (Herman)

    1998-01-01

    Many problems in computational number theory require the application of some sieve. Efficient implementation of these sieves on modern computers has extended our knowledge of these problems considerably. This is illustrated by three classical problems: the Goldbach conjecture, factoring
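As a concrete illustration of the sieve-based computations the abstract alludes to (a sketch, not te Riele's implementation), the following verifies the Goldbach conjecture for small even numbers using a sieve of Eratosthenes:

```python
def sieve(limit):
    """Sieve of Eratosthenes: primality table for 0..limit."""
    is_prime = [False, False] + [True] * (limit - 1)
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, limit + 1, p):   # strike out multiples of p
                is_prime[m] = False
    return is_prime

def goldbach_pair(n, is_prime):
    """Return primes (p, q) with p + q = n, or None if no pair exists."""
    for p in range(2, n // 2 + 1):
        if is_prime[p] and is_prime[n - p]:
            return p, n - p
    return None

is_prime = sieve(10_000)
# Goldbach: every even n > 2 should decompose into two primes.
assert all(goldbach_pair(n, is_prime) for n in range(4, 10_001, 2))
print(goldbach_pair(100, is_prime))   # → (3, 97)
```

The serious computations behind the paper push the same idea many orders of magnitude further with heavily optimised sieves.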

  2. Applying natural evolution for solving computational problems - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.

  3. Applying natural evolution for solving computational problems - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
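The evolutionary loop these lectures describe can be sketched in a few lines. This is a generic OneMax genetic algorithm for illustration (it is not ECJ, which is a Java framework): a population of bitstrings evolves via tournament selection, one-point crossover, and bit-flip mutation until an all-ones solution appears.

```python
import random

random.seed(1)
N, LEN = 40, 32
fitness = lambda ind: sum(ind)            # OneMax: count the 1-bits

pop = [[random.randint(0, 1) for _ in range(LEN)] for _ in range(N)]
for gen in range(200):
    if fitness(max(pop, key=fitness)) == LEN:      # perfect solution found
        break
    next_pop = []
    for _ in range(N):
        p1 = max(random.sample(pop, 3), key=fitness)   # tournament selection
        p2 = max(random.sample(pop, 3), key=fitness)
        cut = random.randrange(LEN)                    # one-point crossover
        child = p1[:cut] + p2[cut:]
        for i in range(LEN):                           # bit-flip mutation
            if random.random() < 1 / LEN:
                child[i] ^= 1
        next_pop.append(child)
    pop = next_pop

best = max(pop, key=fitness)
print(gen, fitness(best))
```

The population size, tournament size and mutation rate here are arbitrary textbook choices; real EC frameworks expose all of them as tunable parameters.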

  4. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  5. Discrete calculus applied analysis on graphs for computational science

    CERN Document Server

    Grady, Leo J

    2010-01-01

    This unique text brings together into a single framework current research in the three areas of discrete calculus, complex networks, and algorithmic content extraction. Many example applications from several fields of computational science are provided.

  6. Advanced computer graphics techniques as applied to the nuclear industry

    International Nuclear Information System (INIS)

    Thomas, J.J.; Koontz, A.S.

    1985-08-01

    Computer graphics is a rapidly advancing technological area in computer science. This is being motivated by increased hardware capability coupled with reduced hardware costs. This paper will cover six topics in computer graphics, with examples forecasting how each of these capabilities could be used in the nuclear industry. These topics are: (1) Image Realism with Surfaces and Transparency; (2) Computer Graphics Motion; (3) Graphics Resolution Issues and Examples; (4) Iconic Interaction; (5) Graphic Workstations; and (6) Data Fusion - illustrating data coming from numerous sources, for display through high dimensional, greater than 3-D, graphics. All topics will be discussed using extensive examples with slides, video tapes, and movies. Illustrations have been omitted from the paper due to the complexity of color reproduction. 11 refs., 2 figs., 3 tabs

  7. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  8. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and from principal component analysis to dimensionality reduction. Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  9. Applying Kitaev's algorithm in an ion trap quantum computer

    International Nuclear Information System (INIS)

    Travaglione, B.; Milburn, G.J.

    2000-01-01

    Full text: Kitaev's algorithm is a method of estimating eigenvalues associated with an operator. Shor's factoring algorithm, which enables a quantum computer to crack RSA encryption codes, is a specific example of Kitaev's algorithm. It has been proposed that the algorithm can also be used to generate eigenstates. We extend this proposal for small quantum systems, identifying the conditions under which the algorithm can successfully generate eigenstates. We then propose an implementation scheme based on an ion trap quantum computer. This scheme allows us to illustrate a simple example, in which the algorithm effectively generates eigenstates
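The statistical heart of Kitaev's eigenvalue estimation can be illustrated classically. In this hypothetical sketch (not an ion-trap implementation), a controlled-U^(2^k) circuit yields |0> on the control qubit with probability (1 + cos(2*pi*2^k*phi))/2 for eigenphase phi; a phase-shifted variant gives the corresponding sine, and sampling both lets a classical search recover phi:

```python
import math
import random

random.seed(0)
PHI = 21 / 64   # hidden eigenphase phi; the eigenvalue is exp(2*pi*i*phi)

def measured_fraction(k, use_sin=False, shots=4000):
    """Simulate the fraction of |0> outcomes after a controlled-U^(2^k)."""
    angle = 2 * math.pi * (2 ** k) * PHI
    p0 = (1 + (math.sin(angle) if use_sin else math.cos(angle))) / 2
    return sum(random.random() < p0 for _ in range(shots)) / shots

cos_f = [measured_fraction(k) for k in range(6)]
sin_f = [measured_fraction(k, use_sin=True) for k in range(6)]

def residual(phi):
    """Squared mismatch between the sampled statistics and a candidate phase."""
    r = 0.0
    for k in range(6):
        a = 2 * math.pi * (2 ** k) * phi
        r += (2 * cos_f[k] - 1 - math.cos(a)) ** 2
        r += (2 * sin_f[k] - 1 - math.sin(a)) ** 2
    return r

estimate = min((m / 64 for m in range(64)), key=residual)
print(estimate)   # the estimate should match PHI = 0.328125
```

A real quantum implementation obtains these fractions from repeated measurements of a physical control qubit; everything after the sampling step is classical post-processing, which is what makes the algorithm attractive for small ion-trap systems.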

  10. Trends in scientific computing applied to petroleum exploration and production

    International Nuclear Information System (INIS)

    Guevara, Saul E; Piedrahita, Carlos E; Arroyo, Elkin R; Soto Rodolfo

    2002-01-01

    Current trends of computational tools in the upstream of the petroleum industry are presented herein. Several results and images obtained through commercial programs and through in-house software developments illustrate the topics discussed. They include several types of problems and programming paradigms. Emphasis is placed on the future of parallel processing through the use of affordable, open systems such as Linux. This kind of technology will likely make possible new research and industry applications, since quite advanced computational resources will be available to many people working in the area.

  11. Quantitative computed tomography applied to interstitial lung diseases.

    Science.gov (United States)

    Obert, Martin; Kampschulte, Marian; Limburg, Rebekka; Barańczuk, Stefan; Krombach, Gabriele A

    2018-03-01

    To evaluate a new image marker that retrieves information from computed tomography (CT) density histograms, with respect to classification properties between different lung parenchyma groups, and to compare the new image marker with conventional markers. Density histograms from 220 different subjects (normal = 71; emphysema = 73; fibrotic = 76) were used to compare the conventionally applied emphysema index (EI), 15th percentile value (PV), mean value (MV), variance (V), skewness (S), and kurtosis (K) with a new histogram's functional shape (HFS) method. Multinomial logistic regression (MLR) analysis was performed to calculate predictions of lung parenchyma group membership using the individual methods, as well as combinations thereof, as covariates. Overall correctly assigned subjects (OCA), sensitivity (sens), specificity (spec), and Nagelkerke's pseudo R² (NR²) effect size were estimated. NR² was used to set up a ranking list of the different methods. MLR indicates the highest classification power (OCA 92%; sens 0.95; spec 0.89; NR² 0.95) when all histogram analysis methods were applied together. The highest classification power among individually applied methods was found using the HFS concept (OCA 86%; sens 0.93; spec 0.79; NR² 0.80). Conventional methods achieved lower classification potential on their own: EI (OCA 69%; sens 0.95; spec 0.26; NR² 0.52); PV (OCA 69%; sens 0.90; spec 0.37; NR² 0.57); MV (OCA 65%; sens 0.71; spec 0.58; NR² 0.61); V (OCA 66%; sens 0.72; spec 0.53; NR² 0.66); S (OCA 65%; sens 0.88; spec 0.26; NR² 0.55); and K (OCA 63%; sens 0.90; spec 0.16; NR² 0.48). The HFS method, which had so far been applied to CT bone density curve analysis, is also a remarkable information extraction tool for lung density histograms. Presumably, being a principled mathematical approach, the HFS method can extract valuable health-related information from histograms in completely different areas
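
The conventional markers that HFS is compared against can all be computed directly from the voxel density sample behind the histogram. A minimal sketch follows; the -950 HU emphysema threshold and the 15th percentile are common conventions assumed here, not values taken from the paper:

```python
import numpy as np

def histogram_markers(hu, ei_threshold=-950.0, pct=15):
    """Conventional CT density-histogram markers: emphysema index (EI),
    percentile value (PV), mean (MV), variance (V), skewness (S) and
    excess kurtosis (K), computed from a raw Hounsfield-unit sample."""
    hu = np.asarray(hu, dtype=float)
    mv = hu.mean()
    v = hu.var()
    z = (hu - mv) / np.sqrt(v)                    # standardized sample
    return {
        "EI": float((hu < ei_threshold).mean()),  # fraction below the threshold
        "PV": float(np.percentile(hu, pct)),
        "MV": float(mv),
        "V": float(v),
        "S": float((z ** 3).mean()),
        "K": float((z ** 4).mean() - 3.0),        # 0 for a normal distribution
    }
```

These scalar markers can then serve as covariates in a multinomial logistic regression, as in the study.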

  12. Quantum computing applied to calculations of molecular energies

    Czech Academy of Sciences Publication Activity Database

    Pittner, Jiří; Veis, L.

    2011-01-01

    Roč. 241, - (2011), 151-phys ISSN 0065-7727. [National Meeting and Exposition of the American Chemical Society (ACS) /241./. 27.03.2011-31.03.2011, Anaheim] Institutional research plan: CEZ:AV0Z40400503 Keywords: molecular energies * quantum computers Subject RIV: CF - Physical; Theoretical Chemistry

  13. Monte Carlo computation in the applied research of nuclear technology

    International Nuclear Information System (INIS)

    Xu Shuyan; Liu Baojie; Li Qin

    2007-01-01

    This article briefly introduces Monte Carlo methods and their properties. It surveys Monte Carlo methods with emphasis on their applications to several domains of nuclear technology. Monte Carlo simulation methods and several commonly used computer software packages that implement them are also introduced. The proposed methods are demonstrated by a real example. (authors)

  14. Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.

    Science.gov (United States)

    Edwards, Thomas O.

    The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's concept of connectionism, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…

  15. Microeconomic theory and computation applying the maxima open-source computer algebra system

    CERN Document Server

    Hammock, Michael R

    2014-01-01

    This book provides a step-by-step tutorial for using Maxima, an open-source multi-platform computer algebra system, to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques.

  16. Computational engineering applied to the concentrating solar power technology

    International Nuclear Information System (INIS)

    Giannuzzi, Giuseppe Mauro; Miliozzi, Adio

    2006-01-01

    Solar power plants based on parabolic-trough collectors present innumerable thermo-structural problems related on the one hand to the high temperatures of the heat transfer fluid, and on the other to the need for highly precise aiming and structural resistance. Devising an engineering response to these problems implies analysing generally unconventional solutions. At present, computational engineering is the principal investigating tool; it speeds the design of prototype installations and significantly reduces the necessary but costly experimental programmes

  17. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  18. Computed tomography scanner applied to soil compaction studies

    International Nuclear Information System (INIS)

    Vaz, C.M.P.

    1989-11-01

    The soil compaction problem was studied using a first-generation computed tomography (CT) scanner. This apparatus produces images of cross sections of soil samples, with a resolution of a few millimeters. We performed the following laboratory and field experiments: basic experiments on equipment calibration and resolution; measurements of thin layers of compacted soil; measurements of soil compaction caused by agricultural tools; stress-strain modelling in confined soil samples at several moisture levels; and characterization of the soil bulk density profile with samples collected in a trench, compared with a cone penetrometer technique. (author)

  19. Personal Computer (PC) based image processing applied to fluid mechanics

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
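
The interpolation step described above, a convolution with a Gaussian window, can be sketched as a Gaussian-weighted average of the scattered streak velocities at each grid point. This is a simplified sketch: the paper's adaptive window width is reduced here to a fixed sigma.

```python
import numpy as np

def gaussian_window_interp(points, values, grid, sigma):
    """Interpolate scattered velocity samples onto grid points using a
    Gaussian weighting window; points is (N, 2), values (N,), grid (G, 2)."""
    # Squared distance from every grid node to every scattered sample.
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # weight of each sample at each node
    return (w @ values) / w.sum(axis=1)    # normalized weighted average
```

A sanity check on the design: a spatially constant velocity field must be reproduced exactly at every grid point, since the weights are normalized.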

  20. Applying computer-based procedures in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Mauro V. de; Carvalho, Paulo V.R. de; Santos, Isaac J.A.L. dos; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Div. de Instrumentacao e Confiabilidade Humana], e-mail: mvitor@ien.gov.br, e-mail: paulov@ien.gov.br, e-mail: luquetti@ien.gov.br, e-mail: grecco@ien.gov.br; Bruno, Diego S. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Escola Politecnica. Curso de Engenharia de Controle e Automacao], e-mail: diegosalomonebruno@gmail.com

    2009-07-01

    Plant operation procedures are used to guide operators in coping with normal, abnormal or emergency situations in a process control system. Historically, plant procedures have been paper-based (PBP); with the digitalisation trend in these complex systems, computer-based procedures (CBPs) are being developed to support procedure use. This work briefly presents the research on CBPs at the Human-System Interface Laboratory (LABIHS). The emergency operation procedure EOP-0 of the LABIHS NPP simulator was implemented in the ImPRO CBP system. The ImPRO system was chosen for testing because it is available for download on the Internet. A preliminary operation test using the implemented procedure in the CBP system was carried out and the results were compared to operation through PBP use. (author)

  1. Parallel computation of automatic differentiation applied to magnetic field calculations

    International Nuclear Information System (INIS)

    Hinkins, R.L.; Lawrence Berkeley Lab., CA

    1994-09-01

    The author presents a parallelization of an accelerator physics application to simulate magnetic field in three dimensions. The problem involves the evaluation of high order derivatives with respect to two variables of a multivariate function. Automatic differentiation software had been used with some success, but the computation time was prohibitive. The implementation runs on several platforms, including a network of workstations using PVM, a MasPar using MPFortran, and a CM-5 using CMFortran. A careful examination of the code led to several optimizations that improved its serial performance by a factor of 8.7. The parallelization produced further improvements, especially on the MasPar with a speedup factor of 620. As a result a problem that took six days on a SPARC 10/41 now runs in minutes on the MasPar, making it feasible for physicists at Lawrence Berkeley Laboratory to simulate larger magnets

  2. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  3. Above the cloud computing: applying cloud computing principles to create an orbital services model

    Science.gov (United States)

    Straub, Jeremy; Mohammad, Atif; Berk, Josh; Nervold, Anders K.

    2013-05-01

    Large satellites and exquisite planetary missions are generally self-contained. They have, onboard, all of the computational, communications and other capabilities required to perform their designated functions. Because of this, the satellite or spacecraft carries hardware that may be utilized only a fraction of the time; however, the full cost of development and launch is still borne by the program. Small satellites do not have this luxury. Due to mass and volume constraints, they cannot afford to carry numerous pieces of barely utilized equipment or large antennas. This paper proposes a cloud-computing model for exposing satellite services in an orbital environment. Under this approach, each satellite with available capabilities broadcasts a service description for each service that it can provide (e.g., general computing capacity, DSP capabilities, specialized sensing capabilities, transmission capabilities, etc.) and its orbital elements. Consumer spacecraft retain a cache of service providers and select one utilizing decision-making heuristics (e.g., suitability of performance, opportunity to transmit instructions and receive results, based on the orbits of the two craft). The two craft negotiate service provisioning (e.g., when the service can be available and for how long) based on the operating rules prioritizing use of (and allowing access to) the service on the service provider craft, based on the credentials of the consumer. Service description, negotiation and sample service performance protocols are presented. The required components of each consumer or provider spacecraft are reviewed. These include fully autonomous control capabilities (for provider craft), a lightweight orbit determination routine (to determine when consumer and provider craft can see each other and, possibly, pointing requirements for craft with directional antennas) and an authentication and resource utilization priority-based access decision making subsystem (for provider craft).
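
The consumer-side cache and selection heuristic described above can be sketched as a small data model. All field names and the scoring rule below are hypothetical illustrations, not the paper's actual protocol:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ServiceAd:
    """A broadcast service description: who offers what, how well,
    and how long the mutual visibility window is (hypothetical fields)."""
    provider: str
    capability: str            # e.g. "dsp", "compute", "transmit"
    performance: float         # provider-advertised performance score
    visible_minutes: float     # contact window predicted from orbital elements

def select_provider(cache: List[ServiceAd], capability: str,
                    min_window: float) -> Optional[ServiceAd]:
    """Consumer-side heuristic: pick the best-performing provider for a
    capability among those visible long enough to deliver results back."""
    candidates = [ad for ad in cache
                  if ad.capability == capability
                  and ad.visible_minutes >= min_window]
    return max(candidates, key=lambda ad: ad.performance, default=None)
```

In the paper's fuller scheme, negotiation and credential checks would follow selection; this sketch covers only the cache-and-select step.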

  4. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
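
The special case the authors single out, where the graph-Fourier interpretation is exact, is easy to verify numerically: for an unweighted path graph, the Laplacian eigenvalues are 4 sin²(πk/(2n)), the DCT frequencies. This quick check is an illustration, not code from the article:

```python
import numpy as np

def path_laplacian(n):
    """Unweighted path-graph Laplacian L = D - A."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

n = 8
eigenvalues = np.linalg.eigvalsh(path_laplacian(n))            # ascending order
expected = 4.0 * np.sin(np.pi * np.arange(n) / (2 * n)) ** 2   # DCT-II frequencies
```

On a general weighted graph no such closed form exists, which is exactly why interpreting Laplacian eigenpairs as "frequencies" can mislead.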

  5. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  6. Object oriented business process modelling in RFID applied computing environment

    NARCIS (Netherlands)

    Zhao, X.; Liu, Chengfei; Lin, T.; Ranasinghe, D.C.; Sheng, Q.Z.

    2010-01-01

    As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With

  7. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  8. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  9. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    Science.gov (United States)

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  10. 3rd ACIS International Conference on Applied Computing and Information Technology

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 3rd International Conference on Applied Computing and Information Technology (ACIT 2015), which was held on July 12-16, 2015 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  11. Optical high-performance computing: introduction to the JOSA A and Applied Optics feature.

    Science.gov (United States)

    Caulfield, H John; Dolev, Shlomi; Green, William M J

    2009-08-01

    The feature issues in both Applied Optics and the Journal of the Optical Society of America A focus on topics of immediate relevance to the community working in the area of optical high-performance computing.

  12. Computing and Systems Applied in Support of Coordinated Energy, Environmental, and Climate Planning

    Science.gov (United States)

    This talk focuses on how Dr. Loughlin is applying Computing and Systems models, tools and methods to more fully understand the linkages among energy systems, environmental quality, and climate change. Dr. Loughlin will highlight recent and ongoing research activities, including: ...

  13. Intelligent Decisional Assistant that Facilitates the Choice of a Proper Computer System Applied in Business

    OpenAIRE

    Nicolae MARGINEAN

    2009-01-01

    The choice of a proper computer system is not an easy task for a decision maker. One reason could be the present development of the market for computer systems applied in business. The big number of players on the Romanian market determines a big number of computerized products, with a multitude of various properties. Our proposal tries to optimize and facilitate this decisional process within an e-shop where IT packages applied in business are sold, building an online decisional assistant, a special component ...

  14. Applied Computational Intelligence in Engineering and Information Technology Revised and Selected Papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011

    CERN Document Server

    Precup, Radu-Emil; Preitl, Stefan

    2012-01-01

    This book highlights the potential benefits of various applications of computational intelligence techniques. It is structured so as to include a set of selected and extended papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics, SACI 2011, held in Timisoara, Romania, from 19 to 21 May 2011. After a rigorous paper review performed by the Technical Program Committee, only 116 submissions were accepted, leading to a paper acceptance ratio of 65%. A further refinement was made after the symposium, based also on the assessment of the presentation quality. In conclusion, this book includes the extended and revised versions of the very best papers of SACI 2011 and a few invited papers authored by prominent specialists. The readers will benefit from gaining knowledge of computational intelligence and of what problems can be solved in several areas; they will learn what kinds of approaches are advised to use in order to solve these problems. A...

  15. The justification principle applied to Computed tomography exams

    International Nuclear Information System (INIS)

    Machado Tejeda, A.; Mora Machado, R. de la; Garcia Moreira, T.; Hing Perdomo, J.

    2008-01-01

    The increasing use of imaging technologies and the installation of more sophisticated equipment in radiology services, such as multi-slice CT scanners, have consequently increased the number of treated patients, as well as the collective doses to the population. Radiation doses received from CT exams are higher than those received in conventional radiology. The optimal use of CT equipment, considering optimized techniques, and the justification of examinations are imperative in order to minimize the undesirable effects of radiation. In this paper we set out to assess the justification criteria applied to CT exams in a Cuban hospital. The justification of the tests prescribed by physicians was analyzed, assessing its incidence depending on the kind of study and the percentage (%) of positive and negative cases. The study was carried out in a Clinical Surgical Hospital in Havana City. This hospital has a Shimadzu SCT-7800TC helical single-slice device installed. The sample is made up of 81 patients, between 24 and 80 years old, both men and women. For all of them, the pathology motivating the order of the exam, as well as the existence of other previous tests, was considered. As a result of the assessment, 56.8% of all cases turned out to be positive; 55.5% only confirmed the pathologies and 1.23% produced new evidence. On the other hand, the remaining 43.2% were negative, noting that for 65.3% of these patients there were no previous imaging tests. Skull exams had the highest incidence, comprising 67.7% of cases, and headache was the most frequent clinical indication, accounting for 41.1%. In terms of justification, the evaluation of prescriptions evidenced that CT exams were not justified in 43.2% of cases. As part of this last group, it was also found that 46.9% of clinical studies were negative. (author)

  16. Computer aided instrumented Charpy test applied dynamic fracture toughness evaluation system

    International Nuclear Information System (INIS)

    Kobayashi, Toshiro; Niinomi, Mitsuo

    1986-01-01

    A microcomputer-aided data treatment system and a personal-computer-aided data analysis system were applied to the traditional instrumented Charpy impact test system. The analysis of Charpy absorbed energy (E_i, E_p, E_t) and load (P_y, P_m), and the evaluation of dynamic toughness through the whole fracture process, i.e. J_Id, the J-R curve and T_mat, were examined using the newly developed computer-aided instrumented Charpy impact test system. E_i, E_p, E_t, P_y and P_m were effectively analyzed using a moving average method and printed out automatically by the microcomputer-aided data treatment system. J_Id, the J-R curve and T_mat could be measured by the stop block test method. Then, J_Id, the J-R curve and T_mat were effectively estimated using the compliance changing rate method and the key curve method on the load-load point displacement curve of a single fatigue-cracked specimen by the personal-computer-aided data analysis system. (author)
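
Two of the signal-processing steps named in the abstract, moving-average smoothing of the oscillatory load trace and splitting the absorbed energy into initiation and propagation parts, can be sketched as follows. Splitting E_i at maximum load is one common convention, assumed here; the paper's exact definitions may differ.

```python
import numpy as np

def moving_average(y, window=5):
    """Moving-average filter of the kind used to smooth the oscillatory
    instrumented-Charpy load signal."""
    return np.convolve(y, np.ones(window) / window, mode="same")

def charpy_energies(load, disp):
    """Split the area under the load-displacement curve at maximum load:
    E_i (initiation), E_p (propagation) and E_t = E_i + E_p (total)."""
    def area(y, x):  # trapezoidal rule
        return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)
    i_max = int(np.argmax(load))
    e_i = area(load[: i_max + 1], disp[: i_max + 1])
    e_t = area(load, disp)
    return e_i, e_t - e_i, e_t
```

For a symmetric triangular load pulse peaking at 1 (arbitrary units) over a displacement span of 2, the total area is 1 and it splits evenly between initiation and propagation.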

  17. Cloud computing technologies applied in the virtual education of civil servants

    Directory of Open Access Journals (Sweden)

    Teodora GHERMAN

    2016-03-01

    Full Text Available From the perspective of education, e-learning through the use of Cloud Computing technologies represents one of the most important directions of educational software development, because Cloud Computing is developing rapidly and applies to all areas of the Information Society, including education. Virtual education systems on web platforms (e-learning) require numerous hardware and software resources. The convenience of Internet learning and the creation of web-based learning environments have become strengths in virtual education research, including applying Cloud Computing technologies in the virtual education of civil servants. The article presents Cloud Computing technologies as a platform for virtual education on web platforms, and their advantages and disadvantages compared to other technologies.

  18. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  19. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics.

    Science.gov (United States)

    1987-10-01

    An instrumentation grant was used to purchase equipment to support research in neural networks, information science, artificial intelligence, and applied mathematics. Contract AFOSR 86-0282. Principal Investigator: Stephen

  20. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  1. First International Symposium on Applied Computing and Information Technology (ACIT 2013)

    CERN Document Server

    Applied Computing and Information Technology

    2014-01-01

    This book presents the selected results of the 1st International Symposium on Applied Computing and Information Technology (ACIT 2013), held on August 31 – September 4, 2013 in Matsue City, Japan, which brought together researchers, scientists, engineers, industry practitioners, and students to discuss all aspects of Applied Computing and Information Technology and its practical challenges. This book includes the best 12 papers presented at the conference, which were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review.

  2. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  3. International Conference on Applied Mathematics, Modeling and Computational Science & Annual meeting of the Canadian Applied and Industrial Mathematics

    CERN Document Server

    Bélair, Jacques; Kunze, Herb; Makarov, Roman; Melnik, Roderick; Spiteri, Raymond J

    2016-01-01

    Focusing on five main groups of interdisciplinary problems, this book covers a wide range of topics in mathematical modeling, computational science and applied mathematics. It presents a wealth of new results in the development of modeling theories and methods, advancing diverse areas of applications and promoting interdisciplinary interactions between mathematicians, scientists, engineers and representatives from other disciplines. The book offers a valuable source of methods, ideas, and tools developed for a variety of disciplines, including the natural and social sciences, medicine, engineering, and technology. Original results are presented on both the fundamental and applied level, accompanied by an ample number of real-world problems and examples emphasizing the interdisciplinary nature and universality of mathematical modeling, and providing an excellent outline of today’s challenges. Mathematical modeling, with applied and computational methods and tools, plays a fundamental role in modern science a...

  4. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking, which is dominant among educators, is described along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies and with the increase in the number of tasks for the effective solution of which computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  5. Current research activities: Applied and numerical mathematics, fluid mechanics, experiments in transition and turbulence and aerodynamics, and computer science

    Science.gov (United States)

    1992-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.

  6. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    Science.gov (United States)

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
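
As a sketch of the method the study applies, the core of the analytic hierarchy process is deriving a priority vector from a reciprocal pairwise-comparison matrix and checking judgment consistency. The criteria and comparison values below are invented for illustration, not taken from the paper's survey data:

```python
import math

# Toy pairwise-comparison matrix for three hypothetical criteria
# (cost effectiveness, software design, system architecture).
# Entry [i][j] states how strongly criterion i is preferred over
# criterion j on Saaty's 1-9 scale; the matrix is reciprocal.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
n = len(A)

# Priority vector via the geometric-mean method, then normalize.
w = [math.prod(row) ** (1.0 / n) for row in A]
s = sum(w)
w = [x / s for x in w]

# Consistency check: estimate lambda_max from A @ w, then compute the
# consistency index CI and ratio CR against Saaty's random index RI.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
CI = (lam_max - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]
CR = CI / RI

print("priorities:", [round(x, 3) for x in w])
print("consistency ratio:", round(CR, 3))  # CR < 0.1 is conventionally acceptable
```

With these judgments the first criterion dominates, mirroring the paper's finding that cost effectiveness ranks first.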

  7. A Delphi Study on Technology Enhanced Learning (TEL) Applied on Computer Science (CS) Skills

    Science.gov (United States)

    Porta, Marcela; Mas-Machuca, Marta; Martinez-Costa, Carme; Maillet, Katherine

    2012-01-01

    Technology Enhanced Learning (TEL) is a new pedagogical domain aiming to study the usage of information and communication technologies to support teaching and learning. The following study investigated how this domain is used to increase technical skills in Computer Science (CS). A Delphi method was applied, using three-rounds of online survey…

  8. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    Science.gov (United States)

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  9. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

    Science.gov (United States)

    Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

    2018-03-02

    The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. © 2018 Cognitive Science Society, Inc.
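
The abstract names the model families without giving equations. As a hedged sketch of the kind of model PPE draws on (an ACT-R style power-law activation, not PPE itself), the snippet below reproduces the qualitative spacing effect: at a long retention interval, spaced practice yields higher activation, and thus higher predicted recall, than massed practice. All parameter values are illustrative:

```python
import math

def activation(practice_times, t, d=0.5):
    # ACT-R style base-level activation: each past practice at time ti
    # contributes (t - ti)^(-d), so older traces decay as a power law.
    return math.log(sum((t - ti) ** (-d) for ti in practice_times))

def p_recall(m, tau=-0.5, s=0.4):
    # Logistic mapping from activation to recall probability.
    # tau and s are illustrative values, not fitted parameters.
    return 1.0 / (1.0 + math.exp((tau - m) / s))

test_time = 200.0
massed = [100, 101, 102, 103]   # four back-to-back practice events
spaced = [40, 80, 120, 160]     # same number of practices, spread out

for label, times in [("massed", massed), ("spaced", spaced)]:
    m = activation(times, test_time)
    print(f"{label}: activation={m:.3f}, p(recall)={p_recall(m):.3f}")
```

At this retention interval the spaced schedule wins, which is the empirical signature any candidate model of the spacing effect must capture.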

  10. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  11. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  12. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
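
As a minimal sketch of the signal-processing side the chapter covers, the following generates a short additively synthesized tone with a decaying envelope and writes it as 16-bit PCM using only the standard library. The frequency, partial weights, and envelope constant are arbitrary illustrative choices:

```python
import math, struct, wave

SR = 44100      # sample rate in Hz
DUR = 0.5       # duration in seconds
FREQ = 440.0    # fundamental frequency (A4)

# Additive synthesis: a fundamental plus two softer harmonics, shaped
# by an exponential decay envelope (a crude plucked-string feel).
samples = []
for n in range(int(SR * DUR)):
    t = n / SR
    env = math.exp(-4.0 * t)
    x = (1.0 * math.sin(2 * math.pi * FREQ * t)
         + 0.5 * math.sin(2 * math.pi * 2 * FREQ * t)
         + 0.25 * math.sin(2 * math.pi * 3 * FREQ * t))
    samples.append(env * x / 1.75)   # normalize by the sum of partial weights

with wave.open("tone.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)        # 16-bit PCM
    f.setframerate(SR)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
print("wrote", len(samples), "samples")
```

Real computer-music systems layer exactly this kind of oscillator/envelope primitive under higher-level composition and performance abstractions.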

  13. 2nd International Doctoral Symposium on Applied Computation and Security Systems

    CERN Document Server

    Cortesi, Agostino; Saeed, Khalid; Chaki, Nabendu

    2016-01-01

    The book contains the extended version of the works that have been presented and discussed in the Second International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2015) held during May 23-25, 2015 in Kolkata, India. The symposium has been jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy and University of Calcutta, India. The book is divided into volumes and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering.

  14. 3rd International Conference on Computer Science, Applied Mathematics and Applications

    CERN Document Server

    Nguyen, Ngoc; Do, Tien

    2015-01-01

    This volume contains the extended versions of papers presented at the 3rd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2015) held on 11-13 May, 2015 in Metz, France. The book contains 5 parts: 1. Mathematical programming and optimization: theory, methods and software; 2. Operational research and decision making; 3. Machine learning, data security, and bioinformatics; 4. Knowledge information systems; 5. Software engineering. All chapters in the book discuss theoretical and algorithmic as well as practical issues connected with computation methods & optimization methods for knowledge engineering and machine learning techniques.

  15. Intelligent Decisional Assistant that Facilitates the Choice of a Proper Computer System Applied in Business

    Directory of Open Access Journals (Sweden)

    Nicolae MARGINEAN

    2009-01-01

    Full Text Available The choice of a proper computer system is not an easy task for a decision maker. One reason is the current development of the market for computer systems applied in business. The large number of players on the Romanian market results in a large number of computerized products with a multitude of different properties. Our proposal tries to optimize and facilitate this decision process within an e-shop selling IT packages applied in business, by building an online decisional assistant: a special component conceived to facilitate the decision making needed to select the IT package that best fits the requirements of a certain business, as described by the decision maker. The user interacts with the system as an online buyer visiting an e-shop where IT packages applied in economy are sold.

  16. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  17. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    Full Text Available The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on this theoretical framework, the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and Learning Projects using ICT. These methodologies were applied to the teaching-learning process in the Algorithms and Complexity (A&C) course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, the Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application, supported by ICT, using learning objects. The aforementioned course was built on the Blackboard platform to evaluate the operation of the methodologies. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by the students, which verifies that the methodologies are functional.

  18. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of such systems' behavior is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A. "Non Overlapping Discretization Methods for Partial Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras, Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  19. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.
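
The canonical introductory example in analog computing is patching two integrators into a feedback loop so the machine itself solves a differential equation. A digital re-enactment of that patch for x'' = -x, with semi-implicit Euler standing in for the op-amp integrators (step size and run time are arbitrary choices):

```python
# Two "integrators" in a loop solving x'' = -x (harmonic oscillator),
# the hello-world program of electronic analog computers.
dt = 0.001
x, v = 1.0, 0.0                 # initial conditions set on the integrators
for step in range(int(3.14159 / dt)):   # run for ~pi seconds
    a = -x                       # summing junction: inverted feedback
    v += a * dt                  # first integrator: acceleration -> velocity
    x += v * dt                  # second integrator: velocity -> position
print(round(x, 2))               # after time pi, x should be near cos(pi) = -1
```

On a real analog machine the same loop runs continuously and in parallel, which is the source of the speed and power advantages the book highlights.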

  20. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  1. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: the theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  2. Applying Computational Scoring Functions to Assess Biomolecular Interactions in Food Science: Applications to the Estrogen Receptors

    Directory of Open Access Journals (Sweden)

    Francesca Spyrakis

    2016-10-01

    Thus, key computational medicinal chemistry methods like molecular dynamics can be used to decipher protein flexibility and to obtain stable models for docking and scoring in food-related studies, and virtual screening is increasingly being applied to identify molecules with potential to act as endocrine disruptors, food mycotoxins, and new nutraceuticals [3,4,5]. All of these methods and simulations are based on protein-ligand interaction phenomena, and represent the basis for any subsequent modification of the targeted receptor's or enzyme's physiological activity. We describe here the energetics of binding of biological complexes, providing a survey of the most common and successful algorithms used in evaluating these energetics, and we report case studies in which computational techniques have been applied to food science issues. In particular, we explore a handful of studies involving the estrogen receptors for which we have a long-term interest.
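
As a hedged sketch of what "the energetics of binding" means computationally, force-field style scoring functions sum pairwise terms such as Lennard-Jones and Coulomb contributions over receptor-ligand atom pairs. The coordinates, partial charges, and parameters below are invented toy values, not a real force field or a real receptor:

```python
import math

# Hypothetical atom records: (x, y, z, partial_charge), distances in angstroms.
receptor = [(0.0, 0.0, 0.0, -0.4), (1.5, 0.0, 0.0, 0.2)]
ligand   = [(0.0, 0.0, 3.5,  0.3), (1.5, 0.0, 3.8, -0.1)]

def pair_energy(a, b, eps=0.2, sigma=3.4, diel=4.0):
    # One Lennard-Jones term plus a distance-dependent-dielectric Coulomb
    # term: the classic skeleton of force-field scoring functions.
    dx, dy, dz = a[0] - b[0], a[1] - b[1], a[2] - b[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    lj = 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)
    coulomb = 332.0 * a[3] * b[3] / (diel * r * r)  # 332 scales to kcal/mol
    return lj + coulomb

score = sum(pair_energy(ra, la) for ra in receptor for la in ligand)
print(f"interaction score: {score:.3f} kcal/mol")
```

A docking program evaluates such a sum over many candidate poses and keeps the most favorable (most negative) ones; the surveyed scoring algorithms differ mainly in which terms they add and how they are parameterized.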

  3. [Geometry, analysis, and computation in mathematics and applied science]. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, D.

    1994-02-01

    The principal investigators' work on a variety of pure and applied problems in Differential Geometry, Calculus of Variations and Mathematical Physics has been done in a computational laboratory and been based on interactive scientific computer graphics and high speed computation created by the principal investigators to study geometric interface problems in the physical sciences. We have developed software to simulate various physical phenomena from constrained plasma flow to the electron microscope imaging of the microstructure of compound materials, techniques for the visualization of geometric structures that have been used to make significant breakthroughs in the global theory of minimal surfaces, and graphics tools to study evolution processes, such as flow by mean curvature, while simultaneously developing the mathematical foundation of the subject. An increasingly important activity of the laboratory is to extend this environment in order to support and enhance scientific collaboration with researchers at other locations. Toward this end, the Center developed the GANGVideo distributed video software system and software methods for running lab-developed programs simultaneously on remote and local machines. Further, the Center operates a broadcast video network, running in parallel with the Center's data networks, over which researchers can access stored video materials or view ongoing computations. The graphical front-end to GANGVideo can be used to make "multi-media mail" from both "live" computing sessions and stored materials without video editing. Currently, videotape is used as the delivery medium, but GANGVideo is compatible with future "all-digital" distribution systems. Thus as a byproduct of mathematical research, we are developing methods for scientific communication. But, most important, our research focuses on important scientific problems; the parallel development of computational and graphical tools is driven by scientific needs.

  4. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other…

  5. Symbol manipulation by computer applied to plasma physics. Technical progress report 2

    International Nuclear Information System (INIS)

    Rosen, B.

    1977-09-01

    Progress has been made in automating the calculation of parametric processes analytically by computer. The computations are performed automatically to lowest order quickly and efficiently. Work has started on a method for solving the nonlinear differential equations describing interacting modes.

  6. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  7. 3rd International Doctoral Symposium on Applied Computation and Security Systems

    CERN Document Server

    Saeed, Khalid; Cortesi, Agostino; Chaki, Nabendu

    2017-01-01

    This book presents extended versions of papers originally presented and discussed at the 3rd International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2016) held from August 12 to 14, 2016 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy; and the University of Calcutta, India. The book is divided into two volumes, Volumes 3 and 4, and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next-Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering. The first two volumes of the book published the works presented at the ACSS 2015, which was held from May 23 to 25, 2015 in Kolkata, India.

  8. Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    James (Jong Hyuk) Park

    2016-09-01

    Full Text Available Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers related to core theories based on graph theory for solving computational problems in cryptography and security, and practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; a switching generator with resistance against the algebraic and side channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modeling of cyber attacks with Petri Nets; and quantum flows for secret key distribution, etc.

  9. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  10. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  11. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)
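
The review's discussion of correcting quantum computing errors can be illustrated, in its classical core, by the 3-qubit repetition code: encode one bit in three and majority-vote after noise. This toy simulation handles only classical bit-flips; a real quantum code must also correct phase errors, and must do so without directly measuring the data qubits:

```python
import random

def encode(bit):
    # 3-qubit bit-flip repetition code, classical skeleton.
    return [bit, bit, bit]

def noisy_channel(codeword, p_flip):
    # Flip each bit independently with probability p_flip.
    return [b ^ 1 if random.random() < p_flip else b for b in codeword]

def decode(codeword):
    # Majority vote: the analogue of syndrome measurement + correction.
    return 1 if sum(codeword) >= 2 else 0

random.seed(42)
p = 0.1
trials = 20000
raw_errors = sum(random.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), p)) != 0
                   for _ in range(trials))
print(f"uncoded error rate ~{raw_errors / trials:.3f}")
print(f"coded error rate   ~{coded_errors / trials:.3f}")  # ~3p^2 - 2p^3
```

The coded logical error rate drops from p to about 3p² - 2p³ (0.028 for p = 0.1), which is the basic reason redundancy can suppress decoherence-induced errors when the physical error rate is low enough.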

  12. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  13. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  14. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  15. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Spatial Computation and today's microprocessors are compared with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break a relatively monolithic architecture into individual lightweight pieces, well specialized for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address space.

  16. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    Science.gov (United States)

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880). Significant differences in costs were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48), and between the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-by-case decision with respect to cost versus benefit. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  17. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general.

    Science.gov (United States)

    Zander, Thorsten O; Kothe, Christian

    2011-04-01

    Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.

  18. Parallel computing in cluster of GPU applied to a problem of nuclear engineering

    International Nuclear Information System (INIS)

    Moraes, Sergio Ricardo S.; Heimlich, Adino; Resende, Pedro

    2013-01-01

    Cluster computing has been widely used as a low-cost alternative for parallel processing in scientific applications. With the Message-Passing Interface (MPI) protocol, development became even more accessible and widespread in the scientific community. A more recent trend is the use of the Graphics Processing Unit (GPU), a powerful co-processor able to perform hundreds of instructions in parallel, reaching a processing capacity hundreds of times that of a CPU. However, a standard PC does not generally accommodate more than two GPUs. Hence, this work proposes the development and evaluation of a hybrid, low-cost parallel approach to the solution of a typical nuclear engineering problem. The idea is to use cluster parallelism technology (MPI) together with GPU programming techniques (CUDA - Compute Unified Device Architecture) to simulate neutron transport through a slab using the Monte Carlo method. Using a cluster comprising four quad-core computers with 2 GPUs each, programs were developed using MPI and CUDA technologies. Experiments applying different configurations, from 1 to 8 GPUs, were performed and results were compared with the sequential (non-parallel) version. A speed-up of about 2,000 times was observed when comparing the 8-GPU configuration with the sequential version. The results presented here are discussed and analyzed with the objective of outlining gains and possible limitations of the proposed approach. (author)
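
A sequential sketch of the kind of Monte Carlo slab-transport simulation the paper parallelizes (the cross-section and probabilities are illustrative, not real nuclear data). Each neutron history is independent of the others, which is exactly what makes distributing histories across MPI ranks and GPU threads effective:

```python
import math, random

SIGMA_T = 1.0      # total macroscopic cross-section (1/cm), illustrative
P_ABSORB = 0.3     # absorption probability per collision, illustrative
THICKNESS = 5.0    # slab thickness (cm)

def history(rng):
    # Follow one neutron entering the slab at x = 0, heading inward.
    x, mu = 0.0, 1.0                       # position, direction cosine
    while True:
        x += mu * (-math.log(rng.random()) / SIGMA_T)   # sampled free flight
        if x >= THICKNESS:
            return "transmitted"
        if x < 0:
            return "reflected"
        if rng.random() < P_ABSORB:
            return "absorbed"
        mu = 2.0 * rng.random() - 1.0      # isotropic scattering

rng = random.Random(1)
N = 50000
counts = {"transmitted": 0, "reflected": 0, "absorbed": 0}
for _ in range(N):
    counts[history(rng)] += 1
print({k: v / N for k, v in counts.items()})
```

In the hybrid scheme described in the abstract, the N histories would be partitioned across MPI ranks and, within each rank, across thousands of CUDA threads, with only the per-process tallies reduced at the end.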

  19. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  20. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  1. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the computers of the future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  2. Computational representation of Alzheimer's disease evolution applied to a cooking activity.

    Science.gov (United States)

    Serna, Audrey; Rialle, Vincent; Pigot, Hélène

    2006-01-01

    This article presents a computational model and a simulation of the decline in activities-of-daily-living performance due to Alzheimer's disease. The disease evolution is simulated with the cognitive architecture ACT-R. Activities are represented through the retrieval of semantic units in declarative memory and the triggering of rules in procedural memory. The decline caused by Alzheimer's disease is simulated through the variation of subsymbolic parameters. The model is applied to a cooking activity. A simulation of 100 subjects shows results similar to those obtained in a standardized assessment with human subjects.
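The subsymbolic parameters mentioned include ACT-R's base-level decay d in the activation equation B_i = ln(Σ_j t_j^{-d}). The sketch below is my illustration, not the authors' model; the use times, threshold, and noise values are arbitrary. It shows how raising the decay parameter degrades retrieval:

```python
import math

def base_level_activation(use_times, now, decay):
    """ACT-R base-level learning: B = ln(sum over past uses of (now - t)^(-d)).

    use_times : times at which the memory chunk was used/retrieved
    now       : current time (same units)
    decay     : the subsymbolic decay parameter d (0.5 is the usual default)
    """
    return math.log(sum((now - t) ** (-decay) for t in use_times))

def retrieval_probability(activation, threshold=0.0, noise=0.4):
    """Logistic probability that a chunk with this activation is retrieved."""
    return 1.0 / (1.0 + math.exp(-(activation - threshold) / noise))
```

Raising d above the default makes older uses count for less, so activation and retrieval probability fall; varying parameters like this is one way a model can mimic the progressive loss of steps in a cooking activity.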

  3. Applying Machine Learning and High Performance Computing to Water Quality Assessment and Prediction

    OpenAIRE

    Ruijian Zhang; Deren Li

    2017-01-01

    Water quality assessment and prediction is an increasingly important issue. Traditional approaches either take a lot of time or can only perform assessments. In this research, by applying a machine learning algorithm to a long time period of water-attribute data, we can generate a decision tree that predicts the next day's water quality in an easy and efficient way. The idea is to combine the traditional approaches and computer algorithms. Using machine learning algorithms, the ass...

  4. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  5. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  6. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  7. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erickson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes five key principles of symbiotic cognitive computing. We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.

  8. Computational Fluid Dynamics (CFD): Future role and requirements as viewed by an applied aerodynamicist. [computer systems design

    Science.gov (United States)

    Yoshihara, H.

    1978-01-01

    The problem of designing the wing-fuselage configuration of an advanced transonic commercial airliner and the optimization of a supercruiser fighter are sketched, pointing out the essential fluid mechanical phenomena that play an important role. Such problems suggest that for a numerical method to be useful, it must be able to treat highly three dimensional turbulent separations, flows with jet engine exhausts, and complex vehicle configurations. Weaknesses of the two principal tools of the aerodynamicist, the wind tunnel and the computer, suggest a complementing combined use of these tools, which is illustrated by the case of the transonic wing-fuselage design. The anticipated difficulties in developing an adequate turbulent transport model suggest that such an approach may have to suffice for an extended period. On a longer term, experimentation of turbulent transport in meaningful cases must be intensified to provide a data base for both modeling and theory validation purposes.

  9. Applying Ancestry and Sex Computation as a Quality Control Tool in Targeted Next-Generation Sequencing.

    Science.gov (United States)

    Mathias, Patrick C; Turner, Emily H; Scroggins, Sheena M; Salipante, Stephen J; Hoffman, Noah G; Pritchard, Colin C; Shirts, Brian H

    2016-03-01

    To apply techniques for ancestry and sex computation from next-generation sequencing (NGS) data as an approach to confirm sample identity and detect sample processing errors. We combined a principal component analysis method with k-nearest neighbors classification to compute the ancestry of patients undergoing NGS testing. By combining this calculation with X chromosome copy number data, we determined the sex and ancestry of patients for comparison with self-report. We also modeled the sensitivity of this technique in detecting sample processing errors. We applied this technique to 859 patient samples with reliable self-report data. Our k-nearest neighbors ancestry screen had an accuracy of 98.7% for patients reporting a single ancestry. Visual inspection of principal component plots was consistent with self-report in 99.6% of single-ancestry and mixed-ancestry patients. Our model demonstrates that approximately two-thirds of potential sample swaps could be detected in our patient population using this technique. Patient ancestry can be estimated from NGS data incidentally sequenced in targeted panels, enabling an inexpensive quality control method when coupled with patient self-report. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
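The pipeline described, principal component analysis followed by k-nearest-neighbors classification, can be sketched in a few lines. This is not the authors' implementation (their input is genotype data from targeted NGS panels; the synthetic features and population labels used below are stand-ins), but it shows the two steps:

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project the rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)                      # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, Vt[:n_components]

def knn_predict(train_pcs, train_labels, query_pcs, k=5):
    """Classify each query point by majority vote of its k nearest neighbors."""
    preds = []
    for q in query_pcs:
        dists = np.linalg.norm(train_pcs - q, axis=1)
        nearest = np.argsort(dists)[:k]
        votes = [train_labels[i] for i in nearest]
        preds.append(max(set(votes), key=votes.count))
    return preds
```

In practice the reference set would be samples of known ancestry, and a new patient sample would be projected with the same stored components before voting, so the check costs almost nothing on top of sequencing.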

  10. Applying Machine Learning and High Performance Computing to Water Quality Assessment and Prediction

    Directory of Open Access Journals (Sweden)

    Ruijian Zhang

    2017-12-01

    Full Text Available Water quality assessment and prediction is an increasingly important issue. Traditional approaches either take a lot of time or can only perform assessments. In this research, by applying a machine learning algorithm to a long time period of water-attribute data, we can generate a decision tree that predicts the next day's water quality in an easy and efficient way. The idea is to combine the traditional approaches and computer algorithms. Using machine learning algorithms, the assessment of water quality becomes far more efficient, and by generating the decision tree, the prediction is quite accurate. The drawback of machine learning modeling is that execution takes quite a long time, especially when we employ a more accurate but more time-consuming clustering algorithm. Therefore, we applied a high-performance computing (HPC) system to deal with this problem. Up to now, the pilot experiments have achieved very promising preliminary results. The visualized water quality assessment and prediction obtained from this project will be published on an interactive website so that the public and environmental managers can use the information for their decision making.
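The core of the approach is growing a decision tree over historical water-attribute records. As an illustration only (pure Python, not the authors' pipeline; the two features and six records below are invented), here is the impurity-based split search that a tree builder repeats recursively at every node:

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(rows, labels):
    """Find the (feature, threshold) split minimising weighted Gini impurity."""
    best = (None, None, float("inf"))
    for f in range(len(rows[0])):
        for threshold in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= threshold]
            right = [y for r, y in zip(rows, labels) if r[f] > threshold]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
            if score < best[2]:
                best = (f, threshold, score)
    return best
```

A full tree applies `best_split` recursively to each partition until a node is pure or a depth limit is hit; the HPC angle in the paper comes from the cost of repeating such searches over many records and candidate models.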

  11. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  12. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  13. Applied & Computational MathematicsChallenges for the Design and Control of Dynamic Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; Burns, J A; Collis, S; Grosh, J; Jacobson, C A; Johansen, H; Mezic, I; Narayanan, S; Wetter, M

    2011-03-10

    The Energy Independence and Security Act of 2007 (EISA) was passed with the goal 'to move the United States toward greater energy independence and security.' Energy security and independence cannot be achieved unless the United States addresses the issue of energy consumption in the building sector and significantly reduces energy consumption in buildings. Commercial and residential buildings account for approximately 40% of the U.S. energy consumption and emit 50% of CO{sub 2} emissions in the U.S. which is more than twice the total energy consumption of the entire U.S. automobile and light truck fleet. A 50%-80% improvement in building energy efficiency in both new construction and in retrofitting existing buildings could significantly reduce U.S. energy consumption and mitigate climate change. Reaching these aggressive building efficiency goals will not happen without significant Federal investments in areas of computational and mathematical sciences. Applied and computational mathematics are required to enable the development of algorithms and tools to design, control and optimize energy efficient buildings. The challenge has been issued by the U.S. Secretary of Energy, Dr. Steven Chu (emphasis added): 'We need to do more transformational research at DOE including computer design tools for commercial and residential buildings that enable reductions in energy consumption of up to 80 percent with investments that will pay for themselves in less than 10 years.' On July 8-9, 2010 a team of technical experts from industry, government and academia were assembled in Arlington, Virginia to identify the challenges associated with developing and deploying new computational methodologies and tools that will address building energy efficiency. These experts concluded that investments in fundamental applied and computational mathematics will be required to build enabling technology that can be used to realize the target of 80% reductions in energy

  14. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  15. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  16. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  17. General design methodology applied to the research domain of physical programming for computer illiterate

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2011-09-01

    Full Text Available The authors discuss the application of the 'general design methodology' in the context of a physical computing project. The aim of the project was to design and develop physical objects that could serve as metaphors for computer programming elements...

  18. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Science.gov (United States)

    2010-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  19. Design and fabrication of facial prostheses for cancer patient applying computer aided method and manufacturing (CADCAM)

    Science.gov (United States)

    Din, Tengku Noor Daimah Tengku; Jamayet, Nafij; Rajion, Zainul Ahmad; Luddin, Norhayati; Abdullah, Johari Yap; Abdullah, Abdul Manaf; Yahya, Suzana

    2016-12-01

    Facial defects are either congenital or caused by trauma or cancer, and most of them affect the person's appearance. Emotional pressure and low self-esteem are problems commonly associated with patients with facial defects. To overcome this problem, a silicone prosthesis is designed to cover the defective part. This study describes the techniques for designing and fabricating a facial prosthesis applying computer aided method and manufacturing (CADCAM). The steps of fabricating the facial prosthesis were based on a patient case. The patient was diagnosed with Gorlin-Goltz syndrome and came to Hospital Universiti Sains Malaysia (HUSM) for a prosthesis. A 3D image of the patient was reconstructed from CT data using MIMICS software. Based on the 3D image, the intercanthal and zygomatic measurements of the patient were compared with available data in the database to find a suitable nose shape. The normal nose shape for the patient was retrieved from the nasal digital library. A mirror-imaging technique was used to mirror the facial part. The final design of the facial prosthesis, including eye, nose and cheek, was superimposed to assess the result virtually. After the final design was confirmed, the mould was designed. The mould of the nasal prosthesis was printed using an Objet 3D printer. Silicone casting was done using the 3D-printed mould. The final prosthesis produced with the computer aided method was acceptable for facial rehabilitation, providing a better quality of life.

  20. Building Energy Assessment and Computer Simulation Applied to Social Housing in Spain

    Directory of Open Access Journals (Sweden)

    Juan Aranda

    2018-01-01

    Full Text Available The actual energy consumption and simulated energy performance of a building usually differ. This gap widens in social housing, owing to the characteristics of these buildings and the consumption patterns of economically vulnerable households affected by energy poverty. The aim of this work is to characterise the energy poverty of households that are representative of those residing in social housing, specifically in blocks of apartments in Southern Europe. The main variables that affect energy consumption and costs are analysed, and the models developed for software energy-performance simulations (which are applied to predict energy consumption in social housing) are validated against actual energy-consumption values. The results demonstrate that this type of household usually lives in surroundings at a temperature below the average thermal comfort level. We have taken into account that assuming a standard thermal comfort level may lead to significant differences between computer-aided building energy simulation and actual consumption data (which are 40–140% lower than simulated consumption). This fact is of integral importance, as we use computer simulation to predict building energy performance in social housing.

  1. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Cloud Computing and Internet of Things Concepts Applied on Buildings Data Analysis

    Directory of Open Access Journals (Sweden)

    Hebean Florin-Adrian

    2017-12-01

    Full Text Available Used and developed initially for the IT industry, the Cloud computing and Internet of Things concepts are now found in many sectors of activity, the building industry being one of them. These are defined as a global computing, monitoring and analysis network, composed of hardware and software resources, with the feature of allocating and dynamically relocating the shared resources in accordance with user requirements. Data analysis and process optimization techniques based on these new concepts are used increasingly often in the buildings industry, especially for the optimal operation of building installations and for increasing occupants' comfort. The multitude of building data taken from HVAC sensors, from automation and control systems, and from the other systems connected to the network is optimally managed by these new analysis techniques. Through these techniques, the issues that arise in the operation of building installations can be identified and managed, such as critical alarms, non-functional equipment, issues regarding occupants' comfort (for example, the upper and lower temperature deviations from the set point), and issues related to equipment maintenance. In this study, a new approach to building control is presented and a generalized methodology for applying data analysis to building-services data is described. This methodology is then demonstrated using two case studies.
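One of the issues mentioned, deviation of a temperature from its set point, is easy to make concrete. A minimal sketch (the function, readings, and thresholds are invented for illustration, not taken from the paper) of the kind of rule such an analysis layer applies to HVAC sensor streams:

```python
def flag_deviations(readings, set_point, deadband):
    """Flag sensor readings outside set_point +/- deadband.

    readings : iterable of (timestamp, value) pairs
    returns  : list of (timestamp, value, direction) alarm tuples
    """
    alarms = []
    for ts, value in readings:
        if value > set_point + deadband:
            alarms.append((ts, value, "above"))
        elif value < set_point - deadband:
            alarms.append((ts, value, "below"))
    return alarms
```

A cloud-hosted analysis service would run rules like this continuously over data streamed from many buildings, turning raw sensor values into the critical alarms and comfort issues the study describes.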

  3. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  4. APA Summit on Medical Student Education Task Force on Informatics and Technology: learning about computers and applying computer technology to education and practice.

    Science.gov (United States)

    Hilty, Donald M; Hales, Deborah J; Briscoe, Greg; Benjamin, Sheldon; Boland, Robert J; Luo, John S; Chan, Carlyle H; Kennedy, Robert S; Karlinsky, Harry; Gordon, Daniel B; Yager, Joel; Yellowlees, Peter M

    2006-01-01

    This article provides a brief overview of important issues for educators regarding medical education and technology. The literature describes key concepts, prototypical technology tools, and model programs. A work group of psychiatric educators was convened three times by phone conference to discuss the literature. Findings were presented to and input was received from the 2005 Summit on Medical Student Education by APA and the American Directors of Medical Student Education in Psychiatry. Knowledge of, skills in, and attitudes toward medical informatics are important to life-long learning and modern medical practice. A needs assessment is a starting place, since student, faculty, institution, and societal factors bear consideration. Technology needs to "fit" into a curriculum in order to facilitate learning and teaching. Learning about computers and applying computer technology to education and clinical care are key steps in computer literacy for physicians.

  5. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model. We also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, which tries to unify the GPGPU computing models.
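CUDA's data-parallel model maps one lightweight thread to one array element, with each thread's global index computed from its block and thread indices and a bounds guard for when the launch overshoots the array. A plain-Python emulation of that decomposition for SAXPY (a sketch of the programming model only, not real CUDA code):

```python
def saxpy_kernel(thread_id, a, x, y, out):
    """Per-thread body: compute one element of out = a*x + y."""
    i = thread_id
    if i < len(x):                # CUDA-style bounds guard
        out[i] = a * x[i] + y[i]

def launch(kernel, grid_dim, block_dim, *args):
    """Sequentially emulate a CUDA <<<grid_dim, block_dim>>> launch."""
    for block in range(grid_dim):
        for thread in range(block_dim):
            global_id = block * block_dim + thread
            kernel(global_id, *args)
```

Launch configurations usually round up (grid_dim * block_dim >= n), which is why the guard inside the kernel body is idiomatic; on real hardware the per-thread bodies run concurrently rather than in this sequential loop.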

  6. Quantum Computing

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 5; Issue 9. Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay Vishal Gupta. General Article Volume 5 Issue 9 September 2000 pp 69-81. Fulltext. Click here to view fulltext PDF. Permanent link:

  7. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  8. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  9. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  10. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  11. Work flow management systems applied in nuclear power plants management system to a new computer platform

    International Nuclear Information System (INIS)

    Rodriguez Lorite, M.; Martin Lopez-Suevos, C.

    1996-01-01

    Activities performed in most companies are based on the flow of information between their different departments and personnel. Most of this information is on paper (delivery notes, invoices, reports, etc). The percentage of information transmitted electronically (electronic transactions, spread sheets, files from word processors, etc) is usually low. The implementation of systems to control and speed up this work flow is the aim of work flow management systems. This article presents a prototype for applying work flow management systems to a specific area: the basic life cycle of a purchase order in a nuclear power plant, which requires the involvement of various computer applications: purchase order management, warehouse management, accounting, etc. Once implemented, work flow management systems allow optimisation of the execution of different tasks included in the managed life cycles and provide parameters to, if necessary, control work cycles, allowing their temporary or definitive modification. (Author)

  12. Computer Sciences Applied to Management at Open University of Catalonia: Development of Competences of Teamworks

    Science.gov (United States)

    Pisa, Carlos Cabañero; López, Enric Serradell

    Teamwork is considered one of the most important professional skills in today's business environment. More specifically, collaborative work between professionals and information technology managers from various functional areas is a strategic key to business competitiveness. Several university-level programs focus on developing these skills. This article presents the case of the course Computer Science Applied to Management (hereafter CSAM), designed to develop the ability to work cooperatively in interdisciplinary teams. Its design and development address the key elements of efficiency identified in the literature, most notably the establishment of shared objectives and a feedback system, and the management of the team's harmony, autonomy, independence, diversity and level of supervision. The final result is a course in which interdisciplinary teams, working on a virtual platform, solve a problem raised by a case study.

  13. Energy saving during bulb storage applying modeling with computational fluid dynamics (CFD)

    Energy Technology Data Exchange (ETDEWEB)

    Sapounas, A.A.; Campen, J.B.; Wildschut, J.; Bot, G.P. [Wageningen UR Greenhouse Horticutlure and Applied Plant Research, Wageningen (Netherlands)

    2010-07-01

    Tulip bulbs are stored in ventilated containers to avoid high ethylene concentration between the bulbs. A commercial computational fluid dynamics (CFD) code was used in this study to examine the distribution of air flow between the containers and the potential energy saving by applying simple solutions concerning the design of the air inlet area and the adjustment of the ventilation rate. The variation in container ventilation was calculated to be between 60 and 180 per cent, with 100 per cent being the average flow through the containers. Various improvement measures were examined. The study showed that 7 per cent energy can be saved by smoothing the sharp corners of the entrance channels of the ventilation wall. The most effective and simple improvement was to cover the open top containers. In this case, the variation was between 80 and 120 per cent. The energy saving was about 38 per cent by adjusting the overall ventilation to the container with the minimal acceptable air flow.

  14. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  15. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  16. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  17. How should Fitts' Law be applied to human-computer interaction?

    Science.gov (United States)

    Gillan, D. J.; Holden, K.; Adam, S.; Rudisill, M.; Magee, L.

    1992-01-01

    The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.
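    The models compared in this study all instantiate the same underlying relation; what varies is which dimension of the target is taken as its width. A minimal sketch of the Shannon formulation of Fitts' Law (the constants and target dimensions below are illustrative assumptions, not values from the experiments):

```python
import math

def movement_time(a, b, distance, width):
    """Shannon formulation of Fitts' Law: MT = a + b * log2(D/W + 1).

    `width` is the target size along the axis relevant to the movement;
    the study's point is that the right choice (horizontal extent,
    diagonal in the direction of pointing, or vertical size) depends on
    the movement sequence being modeled.
    """
    return a + b * math.log2(distance / width + 1)

# Hypothetical constants; real values come from regression on timing data.
a, b = 0.10, 0.15  # intercept (s) and slope (s/bit)
t_horizontal = movement_time(a, b, distance=120, width=10)  # width = horizontal extent
t_diagonal = movement_time(a, b, distance=120, width=25)    # width = diagonal size
```

    A larger effective width lowers the index of difficulty, which is why fitting the same timing data under different width definitions yields models of different quality.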

  18. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  19. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  20. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also, we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
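    The policing function that the abstract refers to can be sketched as a continuous-rate token bucket (a simplified, illustrative model; the class and parameter names are mine, not the paper's):

```python
class TokenBucket:
    """Continuous-rate token bucket policer (illustrative sketch).

    Tokens accrue at `rate` units per second up to `capacity` (the bucket
    depth); a packet of `size` units conforms only if enough tokens remain.
    """

    def __init__(self, rate, capacity):
        self.rate = rate          # token refill rate (e.g. bytes/second)
        self.capacity = capacity  # bucket depth (burst tolerance)
        self.tokens = capacity    # start with a full bucket
        self.last = 0.0           # time of the last refill

    def conforms(self, size, now):
        # Refill in proportion to elapsed time, clipped at the bucket depth.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if size <= self.tokens:
            self.tokens -= size
            return True   # packet admitted by the policer
        return False      # packet dropped or marked non-conforming
```

    The two parameters capture the trade-off the paper studies: `rate` bounds the long-term average throughput, while `capacity` bounds the burst size a source may inject at once.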

  1. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson draws on the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given its unexpected convergence of visual and technical abilities with linguistic abilities.

  2. The 39th CERN School of Computing goes to Belgium: apply now!

    CERN Multimedia

    Alberto Pace, CSC Director

    2016-01-01

    Applications are now open for CERN’s 39th School of Computing. The CSC:2016 (see here) will take place from 28 August to 10 September 2016 in Mol, Belgium, in collaboration with SCK•CEN and the Vrije Universiteit Brussel. The two-week programme consists of more than 50 hours of lectures and hands-on exercises, all on advanced, interesting and challenging computing topics. It covers three main themes: data technologies, base technologies and physics computing. In particular, we will explore: many-core performance optimisation; concurrent programming; key aspects of multi-threading; writing code for tomorrow’s hardware today; storage technologies, reliability and performance; cryptography, authentication, authorisation and accounting; data replication, caching, monitoring, alarms and quota; writing secure software; observing software with an attacker's eyes; software engineering for physics computing; statistical methods and pr...

  3. The 38th CERN School of Computing visits Greece: Apply now!

    CERN Multimedia

    Alberto Pace, CSC Director

    2015-01-01

    CERN is organising its Summer Computing School (see here) for the 38th time since 1970. CSC2015 will take place from 14 September to 25 September in Kavala, Greece. The CSCs aim at creating a common technical culture in scientific computing among scientists and engineers involved in particle physics or in sister disciplines. The two-week programme consists of 50 hours of lectures and hands-on exercises. It covers three main themes: data technologies, base technologies and physics computing, and in particular addresses: many-core performance optimization; concurrent programming; key aspects of multi-threading; writing code for tomorrow’s hardware, today; storage technologies, reliability and performance; cryptography, authentication, authorization and accounting; data replication, caching, monitoring, alarms and quota; writing secure software; observing software with an attacker's eyes; software engineering for physics computing; statistical methods and probability conce...

  4. The Blackboard Model of Computer Programming Applied to the Interpretation of Passive Sonar Data

    National Research Council Canada - National Science Library

    Liebing, David

    1997-01-01

    ... (location, course, speed, classification, etc.). At present the potential volume of data produced by modern sonar systems is so large that unless some form of computer assistance is provided with the interpretation of this data, information...

  5. The 37th CERN School of Computing visits Portugal: Apply now!

    CERN Multimedia

    Alberto Pace, CSC Director

    2014-01-01

    CERN is organising its summer School of Computing (see here) for the 37th time since 1970. CSC2014 (see here) will take place from 25 August to 6 September in Braga, Portugal. The CSCs aim at creating a common technical culture in scientific computing among scientists and engineers involved in particle physics or in sister disciplines. The two-week programme consists of 50 hours of lectures and hands-on exercises. It covers three main themes: data technologies, base technologies and physics computing, and it addresses in particular: many-core performance optimisation; concurrent programming; key aspects of multi-threading; writing code for tomorrow’s hardware today; storage technologies, reliability and performance; cryptography, authentication, authorisation and accounting; data replication, caching, monitoring, alarms and quota; writing secure software; observing software with an attacker's eyes; software engineering for physics computing; statistical methods and proba...

  6. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J [ORNL; Symons, Christopher T [ORNL

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  7. The employment of a spoken language computer applied to an air traffic control task.

    Science.gov (United States)

    Laveson, J. I.; Silver, C. A.

    1972-01-01

    Assessment of the merits of a limited spoken language (56 words) computer in a simulated air traffic control (ATC) task. An airport zone approximately 60 miles in diameter with a traffic flow simulation ranging from single-engine to commercial jet aircraft provided the workload for the controllers. This research determined that, under the circumstances of the experiments carried out, the use of a spoken-language computer would not improve the controller performance.

  8. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  9. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
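    Of the techniques the book names, finite differences are the simplest to illustrate. A minimal sketch of a central-difference derivative (the test function and step size are arbitrary choices for illustration):

```python
def central_difference(f, x, h=1e-5):
    """O(h**2) central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Derivative of x**2 at x = 3 is exactly 6; the approximation
# agrees to within roughly h**2.
approx = central_difference(lambda t: t * t, 3.0)
```

    Halving `h` reduces the truncation error by about a factor of four, until floating-point cancellation in the numerator starts to dominate.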

  10. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  11. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  12. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation used in the computational applications. The second part shows the most important computational techniques: finite element formulation and boundary element formulation, and presents the solution of viscoelastic problems with Abaqus.

  13. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  14. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  15. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot...... encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  16. Essays on environmental policy analysis: Computable general equilibrium approaches applied to Sweden

    International Nuclear Information System (INIS)

    Hill, M.

    2001-01-01

    This thesis consists of three essays within the field of applied environmental economics, with the common basic aim of analyzing effects of Swedish environmental policy. Starting out from Swedish environmental goals, the thesis assesses a range of policy-related questions. The objective is to quantify policy outcomes by constructing and applying numerical models especially designed for environmental policy analysis. Static and dynamic multi-sectoral computable general equilibrium models are developed in order to analyze the following issues: the costs and benefits of a domestic carbon dioxide (CO2) tax reform, with special attention to how these costs and benefits depend on the structure of the tax system and on policy-induced changes in 'secondary' pollutants; the effects of allowing for emission permit trading through time when the long-term domestic environmental goal is specified in CO2 stock terms; and the effects on long-term projected economic growth and welfare that are due to damages from emission flow and accumulation of 'local' pollutants (nitrogen oxides and sulfur dioxide), as well as the outcome of environmental policy when costs and benefits are considered in an integrated environmental-economic framework

  17. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  18. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  19. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  20. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  1. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  2. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  3. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single photon emission tomography, is made. (author)

  4. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of state-of-the-art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  5. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  6. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  7. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial plasma confinement and charged-particle beam propagation is described. Typical uses of computers for the simulation and control of laboratory and space plasma experiments, and for data acquisition in these experiments, are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer use in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the need for more powerful computers.

  8. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  9. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well-known class of problems, which almost certainly has no fast solution algorithms, is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to fall in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored.
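As a hedged illustration of the decomposition idea, the sketch below computes exact reliability for a series-parallel system by simple recursion; the tree structure and component probabilities are hypothetical, not taken from the paper, and general (non-series-parallel) networks are precisely the hard case the abstract describes.

```python
# Exact reliability of a series-parallel system by recursive decomposition.
# Node encoding: ('comp', name), ('series', [children]) or ('parallel', [children]).

def reliability(node, p):
    """Return the probability that the (sub)system works, given component
    reliabilities p[name]."""
    kind = node[0]
    if kind == 'comp':
        return p[node[1]]
    subs = [reliability(child, p) for child in node[1]]
    if kind == 'series':            # works only if every subsystem works
        r = 1.0
        for s in subs:
            r *= s
        return r
    # parallel: fails only if every subsystem fails
    q = 1.0
    for s in subs:
        q *= (1.0 - s)
    return 1.0 - q

# Illustrative three-component system: a in series with (b parallel c).
p = {'a': 0.9, 'b': 0.9, 'c': 0.8}
system = ('series', [('comp', 'a'),
                     ('parallel', [('comp', 'b'), ('comp', 'c')])])
print(reliability(system, p))   # 0.9 * (1 - 0.1*0.2) = 0.882
```

Each decomposition step replaces a series or parallel block by a single equivalent component, which is why the effective system size shrinks so quickly when such structure exists.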

  10. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by using a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.
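The distance-function step described above can be sketched in a minimal form. Assuming a circular immersed subdomain given by its mathematical equation (the simple case the abstract mentions), the snippet computes a signed distance on a background grid, flags the nodes that would receive the rigid rotation, and derives a VOF-like fill factor; the grid size and geometry are illustrative, not from the paper.

```python
import numpy as np

# Background computational grid over the "barrel" cross-section [-1,1]^2.
n = 64
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))

# Immersed circular subdomain (a stand-in for a rotor cross-section).
cx, cy, r = 0.0, 0.0, 0.5
dist = np.hypot(x - cx, y - cy) - r   # signed distance: negative inside

inside = dist < 0                     # nodes that get the rigid motion
fill_factor = inside.mean()           # VOF-like volume fraction of the domain
print(round(fill_factor, 3))          # ~ pi*0.25/4 ≈ 0.196 on a fine grid
```

For a CAD surface mesh instead of an analytic circle, the same structure holds: only the computation of `dist` changes (nearest distance to the triangulated boundary, with a sign from orientation).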

  11. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  12. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  13. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  14. The 36th CERN School of Computing visits Cyprus: Apply now!

    CERN Multimedia

    François Fluckiger, CSC Director

    2013-01-01

    CERN is organising its Summer Computing School for the 36th time since 1970. CSC2013 will take place from 19 to 30 August in Nicosia, Republic of Cyprus, which was admitted last autumn as an Associate Member in the pre-stage to Membership of CERN. The CSCs aim at creating a common technical culture in scientific computing among scientists and engineers involved in particle physics or in sister disciplines. The two-week programme consists of 50 hours of lectures and hands-on exercises. It covers three main themes: data technologies, base technologies and physics computing, and in particular addresses: many-core performance optimization; concurrent programming; key aspects of multi-threading; writing code for tomorrow's hardware, today; storage technologies, reliability and performance; cryptography, authentication, authorization and accounting; data replication, caching, monitoring, alarms and quota; writing secure software; observing s...

  15. Review: computer vision applied to the inspection and quality control of fruits and vegetables

    Directory of Open Access Journals (Sweden)

    Erick Saldaña

    2013-12-01

    Full Text Available This is a review of the current literature concerning the inspection of fruits and vegetables using computer vision, analyzing the techniques most used to estimate various quality-related properties. Typical applications of such systems include classification, quality estimation according to internal and external characteristics, supervision of fruit processes during storage, and the evaluation of experimental treatments. In general, computer vision systems not only replace manual inspection but can also improve on its capabilities. In conclusion, computer vision systems are powerful tools for the automatic inspection of fruits and vegetables, and the development of such systems adapted to the food industry is fundamental to achieving competitive advantages.

  16. The Geospatial Data Cloud: An Implementation of Applying Cloud Computing in Geosciences

    Directory of Open Access Journals (Sweden)

    Xuezhi Wang

    2014-11-01

    Full Text Available The rapid growth in the volume of remote sensing data and its increasing computational requirements bring huge challenges for researchers as traditional systems cannot adequately satisfy the huge demand for service. Cloud computing has the advantage of high scalability and reliability, which can provide firm technical support. This paper proposes a highly scalable geospatial cloud platform named the Geospatial Data Cloud, which is constructed based on cloud computing. The architecture of the platform is first introduced, and then two subsystems, the cloud-based data management platform and the cloud-based data processing platform, are described.  ––– This paper was presented at the First Scientific Data Conference on Scientific Research, Big Data, and Data Science, organized by CODATA-China and held in Beijing on 24-25 February, 2014.

  17. Quantifying the visual appearance of sunscreens applied to the skin using indirect computer image colorimetry.

    Science.gov (United States)

    Richer, Vincent; Kharazmi, Pegah; Lee, Tim K; Kalia, Sunil; Lui, Harvey

    2018-03-01

    There is no accepted method to objectively assess the visual appearance of sunscreens on the skin. We present a method for sunscreen application, digital photography, and computer analysis to quantify the appearance of the skin after sunscreen application. Four sunscreen lotions were applied randomly at densities of 0.5, 1.0, 1.5, and 2.0 mg/cm² to areas of the back of 29 subjects. Each application site had a matched contralateral control area. High-resolution standardized photographs including a color card were taken after sunscreen application. After color balance correction, CIE L*a*b* color values were extracted from paired sites. Differences in skin appearance attributed to sunscreen were represented by ΔE, which was calculated as the Euclidean distance in L*a*b* color space between the paired sites. Sunscreen visibility as measured by median ΔE varied across different products and application densities and ranged between 1.2 and 12.1. The visibility of sunscreens varied according to product SPF, composition (organic vs inorganic), presence of tint, and baseline b* of skin. Indirect computer image colorimetry represents a potential method to objectively quantify the visibility of sunscreen on the skin. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
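The ΔE computation described above reduces to a Euclidean distance in L*a*b* space (the CIE76 form, consistent with the abstract's description). A minimal sketch, with made-up color values standing in for the measured sites:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* triples."""
    return math.dist(lab1, lab2)   # sqrt(dL^2 + da^2 + db^2)

# Illustrative measurements (not from the study): sunscreen site vs. its
# matched contralateral control site.
site = (68.2, 10.5, 18.9)      # L*, a*, b* with sunscreen applied
control = (64.0, 12.1, 14.3)   # bare-skin control

print(round(delta_e(site, control), 2))
```

A larger ΔE means the sunscreen is more visible against the paired control; identical triples give ΔE = 0.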

  18. Personality Questionnaires as a Basis for Improvement of University Courses in Applied Computer Science and Informatics

    Directory of Open Access Journals (Sweden)

    Vladimir Ivančević

    2017-07-01

    Full Text Available In this paper, we lay the foundation for an adaptation of the teaching process to the personality traits and academic performance of the university students enrolled in applied computer science and informatics (ACSI). We discuss how such an adaptation could be supported by an analytical software solution and present the initial version of this solution. In the form of a case study, we discuss the scores from a personality questionnaire that was administered to a group of university students enrolled in an introductory programming course at the Faculty of Technical Sciences, University of Novi Sad, Serbia. During a non-mandatory workshop on programming, the participants completed the 48-item short-scale Eysenck Personality Questionnaire–Revised (EPQ–R). By using various exploratory and analytical techniques, we inspect the student EPQ–R scores and elaborate on the specificities of the participating student group. As part of our efforts to understand the broader relevance of different student personality traits in an academic environment, we also discuss how the EPQ–R scores of students could provide information valuable to the process of improving student learning and performance in university courses in ACSI.

  19. Rayleigh to Compton ratio scatter tomography applied to breast cancer diagnosis: A preliminary computational study

    International Nuclear Information System (INIS)

    Antoniassi, M.; Conceição, A.L.C.; Poletti, M.E.

    2014-01-01

    In the present work, a tomographic technique based on the Rayleigh to Compton scattering ratio (R/C) was studied using computational simulation in order to assess its application to breast cancer diagnosis. In this preliminary study, some parameters that affect image quality were evaluated, such as: (i) beam energy, (ii) size and glandularity of the breast, and (iii) statistical count noise. The results showed that the R/C contrast increases with increasing photon energy and decreases with increasing glandularity of the sample. The statistical noise proved to be a significant parameter, although the quality of the obtained images was acceptable over a considerable range of noise levels. The preliminary results suggest that the R/C tomographic technique has potential as a complementary tool in breast cancer diagnosis. - Highlights: ► A tomographic technique based on Rayleigh to Compton scattering ratio is proposed in order to study breast tissues. ► The Rayleigh to Compton scattering ratio technique is compared with the conventional transmission technique. ► The influence of experimental parameters (energy, sample, detection system) is studied

  20. Human-computer interfaces applied to numerical solution of the Plateau problem

    Science.gov (United States)

    Elias Fabris, Antonio; Soares Bandeira, Ivana; Ramos Batista, Valério

    2015-09-01

    In this work we present a Matlab code to solve the Plateau problem numerically, including a human-computer interface. The Plateau problem has applications in areas such as computer graphics. The solution method is the same as that of the Surface Evolver, but with a complete graphical interface with the user. This will enable us to implement other kinds of interfaces such as ocular mouse, voice, touch, etc. To date, Evolver does not include any graphical interface, which restricts its use by the scientific community. In particular, its use is practically impossible for most physically challenged people.

  1. Physical factors affecting single photon emission computed tomography (SPECT) applied in nuclear medicine

    International Nuclear Information System (INIS)

    Farag, H.I.; Khalil, W.A.; Hassan, R.A.

    2003-01-01

    Many physical factors degrade single photon emission computed tomography (SPECT) images both qualitatively and quantitatively. Physical properties important for assessing the potential of emission computed tomography implemented by collimated detector systems include sensitivity, statistical and angular sampling requirements, attenuation compensation, resolution, uniformity, and multisection design constraints. SPECT has highlighted the need to improve gamma camera performance. Flood field nonuniformity is translated into tomographic images as major artifacts because it distorts the data obtained at each projection. Also, poor energy resolution translates directly into degraded spatial resolution through a reduced ability to reject scattered photons on the basis of pulse height analysis. The aim of this work is to study the most important acquisition and processing parameters that affect the quality of SPECT images. The present study investigates the various parameters affecting SPECT images, and the experimental results demonstrate the following: daily uniformity checks and evaluation are essential to ensure that the SPECT system is working properly; the center of rotation (COR) used in the reconstruction process should be corrected to avoid data misalignment; 60 views gave better image quality than 20 or 30 views; a time per view (TPV) of 20 or 30 s gave good image quality; a high-resolution collimator is recommended in order to provide good spatial resolution. On the other hand, patient motion can cause serious reconstruction artifacts, and a cine display is recommended to identify movement artifacts. Regarding matrix size, a 128x128 matrix gives better resolution than a 64x64 matrix. An energy window width of 15%, compared with the standard 20%, improved the resolution. A Butterworth filter (cutoff 0.57 cyc/cm with order 6) gives the best resolution.
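As a hedged sketch of the Butterworth filtering step named in the conclusion, the snippet below builds the Butterworth low-pass window with the stated parameters (cutoff 0.57 cyc/cm, order 6) and applies it to an illustrative one-dimensional projection profile; the pixel size and the data are assumptions, not values from the study.

```python
import numpy as np

def butterworth(freqs, fc=0.57, order=6):
    """Butterworth low-pass gain 1 / (1 + (f/fc)^(2*order)), f in cycles/cm."""
    return 1.0 / (1.0 + (np.abs(freqs) / fc) ** (2 * order))

pixel_cm = 0.6                                   # illustrative pixel size (cm)
profile = np.random.default_rng(0).random(128)   # stand-in projection row
f = np.fft.fftfreq(profile.size, d=pixel_cm)     # frequency axis in cycles/cm
smoothed = np.fft.ifft(np.fft.fft(profile) * butterworth(f)).real

print(butterworth(np.array([0.0]))[0])           # unit gain at zero frequency
```

At the cutoff frequency the gain is exactly 0.5; raising the order steepens the roll-off, trading noise suppression against spatial resolution, which is the balance the abstract's parameter study is probing.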

  2. Mathematics, Physics and Computer Sciences The computation of ...

    African Journals Online (AJOL)

    Mathematics, Physics and Computer Sciences: the computation of system matrices for biquadratic square finite elements. Global Journal of Pure and Applied Sciences.

  3. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
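The EPR-Bell correlations just mentioned can be made concrete with a short numerical check: for the two-qubit singlet state, the correlation of spin measurements along angles a and b is E(a,b) = -cos(a-b), and the CHSH combination reaches 2√2, above the classical bound of 2. The angles below are the standard optimal choice; this is a textbook illustration, not an example from the review itself.

```python
import math

def E(a, b):
    """Singlet-state correlation of spin measurements along angles a and b."""
    return -math.cos(a - b)

# Standard CHSH settings: Alice measures along a or a2, Bob along b or b2.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # ≈ 2.828: quantum entanglement exceeds the classical bound of 2
```

Any local hidden-variable theory satisfies |S| ≤ 2, so this excess is exactly the "essential new ingredient" of entanglement as a resource.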

  4. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  5. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two... We show that, up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... ...here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority.
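To make the notion of ordinary secret sharing (SS) concrete, here is a minimal sketch of Shamir's (t, n) threshold scheme over a prime field. A real VSS protocol adds verification information on top of this so that players can check their shares against a cheating dealer; all parameters here are illustrative.

```python
import random

P = 2**61 - 1   # a Mersenne prime; field size for toy secrets

def share(secret, t, n):
    """Split secret into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
print(reconstruct(shares[:3]))   # 123456789
```

Fewer than t shares reveal nothing about the secret, which is the information-theoretic security property the unbounded-adversary model above relies on.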

  6. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  7. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    Science.gov (United States)

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

    As cloud computing and wireless body sensor network technologies mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information that reduces related processing time and cost. This study proposes a co-processing intermediary framework integrating cloud and wireless body sensor networks, applied mainly to fall detection and 3-D motion reconstruction. The main focuses include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing time of sensing data are reduced, enhancing overall performance for fall-event detection and 3-D motion reconstruction services.
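As a hedged sketch of the kind of lightweight fall-detection check a body-sensor node might run locally before offloading heavier 3-D reconstruction to the cloud, the snippet below flags an impact when the acceleration magnitude exceeds a threshold; the threshold value and the sample stream are illustrative, not from the paper.

```python
import math

G = 9.81                 # gravity, m/s^2
THRESHOLD = 2.5 * G      # illustrative impact spike, well above normal motion

def detect_fall(samples):
    """samples: list of (ax, ay, az) accelerometer readings in m/s^2.
    Returns the index of the first suspected impact, or None."""
    for i, (ax, ay, az) in enumerate(samples):
        if math.sqrt(ax * ax + ay * ay + az * az) > THRESHOLD:
            return i
    return None

walking = [(0.1, 0.2, 9.8)] * 10          # quiet signal near 1 g
fall = walking + [(3.0, 1.0, 30.0)]       # sudden large spike
print(detect_fall(walking), detect_fall(fall))   # None 10
```

Only windows around a flagged index would then need to be transmitted, which is one simple way the framework's goal of reducing transmission and computing time could be realized.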

  8. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    NARCIS (Netherlands)

    de Beurs, D.P.; de Vries, A.L.M.; de Groot, M.H.; de Keijser, J.; Kerkhof, A.J.F.M.

    2014-01-01

    Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce
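The core CAT step the abstract refers to (choosing the next item so the test stays short) can be sketched under a standard 2-parameter logistic IRT model: pick the item with maximum Fisher information at the current ability estimate theta. The item parameters below are illustrative, not from the study.

```python
import math

def information(theta, a, b):
    """Fisher information of a 2PL item (a = discrimination, b = difficulty)
    at ability theta: a^2 * p * (1 - p)."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

# Hypothetical item bank: (a, b) pairs.
items = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 2.0)]

def next_item(theta):
    """Index of the most informative unadministered item at theta."""
    return max(range(len(items)),
               key=lambda i: information(theta, *items[i]))

print(next_item(0.5))   # a discriminating item with difficulty near theta wins
```

After each response, theta is re-estimated and the selection repeats, so the test adapts toward the respondent's level with far fewer items than a fixed questionnaire.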

  9. Quantum computing applied to calculations of molecular energies: CH2 benchmark

    Czech Academy of Sciences Publication Activity Database

    Veis, L.; Pittner, Jiří

    2010-01-01

    Roč. 133, č. 19 (2010), s. 194106 ISSN 0021-9606 R&D Projects: GA ČR GA203/08/0626 Institutional research plan: CEZ:AV0Z40400503 Keywords : computation * algorithm * systems Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.920, year: 2010

  10. Applying computer modeling to eddy current signal analysis for steam generator and heat exchanger tube inspections

    International Nuclear Information System (INIS)

    Sullivan, S.P.; Cecco, V.S.; Carter, J.R.; Spanner, M.; McElvanney, M.; Krause, T.W.; Tkaczyk, R.

    2000-01-01

    Licensing requirements for eddy current inspections of nuclear steam generators and heat exchangers are becoming increasingly stringent. The traditional industry-standard method of comparing inspection signals with flaw signals from simple in-line calibration standards is proving to be inadequate. A more complete understanding of eddy current and magnetic field interactions with flaws and other anomalies is required for the industry to generate consistently reliable inspections. Computer modeling is a valuable tool for improving the reliability of eddy current signal analysis. Results from computer modeling are helping inspectors to properly discriminate between real flaw signals and false calls, and are improving reliability in flaw sizing. This presentation will discuss complementary eddy current computer modeling techniques such as the Finite Element Method (FEM), Volume Integral Method (VIM), Layer Approximation and other analytic methods. Each of these methods has advantages and limitations. An extension of the Layer Approximation to model eddy current probe responses to ferromagnetic materials will also be presented. Finally, examples will be discussed demonstrating how some significant eddy current signal analysis problems have been resolved using appropriate electromagnetic computer modeling tools

  11. The Variation Theorem Applied to H-2+: A Simple Quantum Chemistry Computer Project

    Science.gov (United States)

    Robiette, Alan G.

    1975-01-01

    Describes a student project which requires limited knowledge of Fortran and only minimal computing resources. The results illustrate such important principles of quantum mechanics as the variation theorem and the virial theorem. Presents sample calculations and the subprogram for energy calculations. (GS)
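The variation theorem such a project illustrates can be sketched in a few lines (in Python here, rather than the project's original Fortran). For a single-Gaussian trial function exp(-alpha*r^2) applied to the hydrogen atom, a standard textbook result gives the energy expectation in atomic units as E(alpha) = 3*alpha/2 - 2*sqrt(2*alpha/pi), which the theorem guarantees stays above the exact ground-state energy of -0.5 hartree. Note this is hydrogen, a simpler stand-in for the H2+ system of the project.

```python
import math

def energy(alpha):
    """Variational energy (hartree) of trial function exp(-alpha*r^2) for H."""
    return 1.5 * alpha - 2.0 * math.sqrt(2.0 * alpha / math.pi)

# Crude grid scan for the minimum, as a student project might do.
alphas = [0.01 * k for k in range(1, 200)]
best = min(alphas, key=energy)
print(round(best, 2), round(energy(best), 4))   # 0.28 -0.4244, above -0.5
```

The analytic optimum is alpha = 8/(9*pi) with E = -4/(3*pi) ≈ -0.4244 hartree; the gap to -0.5 measures how poorly a single Gaussian mimics the exponential cusp, which is exactly the lesson the variation theorem teaches.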

  12. Applying Computer-Assisted Musical Instruction to Music Appreciation Course: An Example with Chinese Musical Instruments

    Science.gov (United States)

    Lou, Shi-Jer; Guo, Yuan-Chang; Zhu, Yi-Zhen; Shih, Ru-Chu; Dzan, Wei-Yuan

    2011-01-01

    This study aims to explore the effectiveness of computer-assisted musical instruction (CAMI) in the Learning Chinese Musical Instruments (LCMI) course. The CAMI software for Chinese musical instruments was developed and administered to 228 students in a vocational high school. A pretest-posttest non-equivalent control group design with three…

  13. Artificial intelligence applied to natural products; computer study of pimarane diterpene

    International Nuclear Information System (INIS)

    Lopes, M.N.; Borges, J.H.G.; Furlan, M.; Gastmans, J.P.; Emerenciano, V. de

    1989-01-01

    This paper describes a study of the characteristic 13C NMR signals of naturally occurring pimaranes. The analysis is performed by computer, starting from a database which contains about 400 diterpenes, using the PICKUPS program. In this way it is possible to analyse substructures of one to five atoms as well as the effects of substituents on them. (author)

  14. Applying mobile and pervasive computer technology to enhance coordination of work in a surgical ward

    DEFF Research Database (Denmark)

    Hansen, Thomas Riisgaard; Bardram, Jakob Eyvind

    2007-01-01

    , and unnecessary stress. To accommodate this situation and to increase the quality of work in operating wards, we have designed a set of pervasive computer systems which supports what we call context-mediated communication and awareness. These systems use large interactive displays, video streaming from key...

  15. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and in doing so revisits and reconsiders the notion of 'computational artifact'.

  16. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  17. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  18. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  20. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  1. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for a particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping…

  2. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  3. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...
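    The record's point that computers can manipulate equations symbolically, not just numerically, can be illustrated with a toy sketch (this is not code from MACSYMA, MAPLE or REDUCE; it is a minimal illustration of non-numeric computation, with expressions encoded as nested tuples):

```python
def diff(expr, var):
    """Symbolic derivative of expressions built from numbers, variable names,
    and tuples ('+', a, b) / ('*', a, b)."""
    if isinstance(expr, (int, float)):
        return 0                                   # d/dx of a constant
    if isinstance(expr, str):
        return 1 if expr == var else 0             # d/dx of a variable
    op, a, b = expr
    if op == '+':
        return ('+', diff(a, var), diff(b, var))   # sum rule
    if op == '*':
        return ('+', ('*', diff(a, var), b),
                     ('*', a, diff(b, var)))       # product rule
    raise ValueError(f"unknown operator: {op}")
```

For example, differentiating x*x with respect to x yields the (unsimplified) tuple form of 1*x + x*1; a real computer algebra system would additionally simplify such results.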

  4. Cloud Computing in Support of Applied Learning: A Baseline Study of Infrastructure Design at Southern Polytechnic State University

    Science.gov (United States)

    Conn, Samuel S.; Reichgelt, Han

    2013-01-01

    Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…

  5. Computational physics and applied mathematics capability review June 8-10, 2010

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Stephen R [Los Alamos National Laboratory

    2010-01-01

    Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the Laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities are included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections), as follows. Theme 1: Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the Laboratory. Theme 2: Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution

  6. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
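    The truncated snippet above mentions methods of drawing a random number; one classical method such a statistical computing series would cover is the linear congruential generator. A hedged sketch (the constants below are the widely used Numerical-Recipes-style parameters, not ones stated in the record):

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m,
    yielding pseudo-random floats uniform in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m
```

These parameters give a full period of 2^32, so successive draws do not repeat until the state space is exhausted.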

  7. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  8. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  9. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation – Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article. Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821–835.

  10. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  11. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
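    The "critical event technique" the abstract mentions advances the simulation directly to the exact time of the next collision instead of stepping in fixed time increments. BOUNCE's source is not given in the record; a minimal one-ball, one-dimensional sketch of the idea:

```python
def next_wall_event(pos, vel, lo=0.0, hi=1.0):
    """Time until a 1-D ball moving at constant velocity hits the wall
    at lo or hi (the 'critical event')."""
    if vel > 0:
        return (hi - pos) / vel
    if vel < 0:
        return (lo - pos) / vel
    return float('inf')

def simulate(pos, vel, t_total):
    """Bounce a ball elastically between two walls for t_total seconds,
    jumping from event to event rather than integrating in fixed dt."""
    t = 0.0
    while t < t_total:
        dt = next_wall_event(pos, vel)
        if t + dt > t_total:
            pos += vel * (t_total - t)   # coast to the final time
            break
        pos += vel * dt                  # advance to the collision
        vel = -vel                       # elastic reflection
        t += dt
    return pos, vel
```

The full BOUNCE model would track many balls plus the weighted piston, selecting the earliest of all pending events at each step.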

  12. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  13. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  14. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  15. Stability Analysis of Finite Difference Approximations to Hyperbolic Systems, and Problems in Applied and Computational Matrix Theory

    Science.gov (United States)

    1988-07-08

    Marcus and C. Baczynski), Computer Science Press, Rockville, Maryland, 1986. An Introduction to Pascal and Precalculus, Computer Science Press, Rockville, Maryland, 1986.

  16. On-line computer system applied in a nuclear chemistry laboratory

    International Nuclear Information System (INIS)

    Banasik, Z.; Kierzek, J.; Parus, J.; Zoltowski, T.; Zalewski, J.

    1980-01-01

    A PDP-11/45 based computer system used in a radioanalytical chemistry laboratory is described. It is mainly concerned with spectrometry of ionizing radiation and remote measurement of physico-chemical properties. The objectives in mind when constructing the hardware interconnections and developing the software of the system were to minimize the work of the electronics and computer personnel and to provide maximum flexibility for the users. For the hardware interfacing, three categories of equipment are used: the LPS-11 Laboratory Peripheral System, a CAMAC system with a CA11F-P controller, and interfaces from instrument manufacturers. Flexible operation has been achieved by using a three-level programming structure: data transfer by assembly language programs, data formatting using bit operations in FORTRAN, and data evaluation by procedures written in FORTRAN. (Auth.)

  17. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

    Directory of Open Access Journals (Sweden)

    Lucianna Helene Santos

    2015-11-01

    Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV-1) life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors (NNRTIs), are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  18. Computational fluid dynamics applied to flows in an internal combustion engine

    Science.gov (United States)

    Griffin, M. D.; Diwakar, R.; Anderson, J. D., Jr.; Jones, E.

    1978-01-01

    The reported investigation is a continuation of studies conducted by Diwakar et al. (1976) and Griffin et al. (1976), who reported the first computational fluid dynamic results for the two-dimensional flowfield for all four strokes of a reciprocating internal combustion (IC) engine cycle. An analysis of rectangular and cylindrical three-dimensional engine models is performed. The working fluid is assumed to be inviscid air of constant specific heats. Calculations are carried out for a four-stroke IC engine flowfield wherein detailed finite-rate chemical combustion of a gasoline-air mixture is included. The calculations remain basically inviscid, except that in some instances thermal conduction is included to allow a more realistic model of the localized sparking of the mixture. All the results of the investigation are obtained by means of an explicit, time-dependent finite-difference technique, using a high-speed digital computer.

  19. Computer vision techniques applied to the quality control of ceramic plates

    OpenAIRE

    Silveira, Joaquim; Ferreira, Manuel João Oliveira; Santos, Cristina; Martins, Teresa

    2009-01-01

    This paper presents a system, based on computer vision techniques, that detects and quantifies different types of defects in ceramic plates. It was developed in collaboration with the industrial ceramic sector and consequently it was focused on the defects that are considered more quality depreciating by the Portuguese industry. They are of three main types: cracks; granules and relief surface. For each type the development was specific as far as image processing techn...
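    The paper's actual processing pipeline is defect-specific and only partially described in the record; as a hedged illustration of the first step such a system might take, the sketch below flags pixels that deviate from the nominal plate tone (cracks tend to read dark, granules and relief tend to read bright). The threshold values are illustrative assumptions, not the paper's.

```python
def defect_mask(image, dark_thresh=60, bright_thresh=200):
    """Mark candidate defect pixels in a grayscale plate image.
    `image` is a 2-D list of 0-255 gray levels; returns a 0/1 mask where
    1 flags pixels darker than dark_thresh or brighter than bright_thresh."""
    return [[1 if p < dark_thresh or p > bright_thresh else 0 for p in row]
            for row in image]
```

A production system would follow this with connected-component analysis to group flagged pixels into defects and quantify their size, as the abstract describes.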

  20. Think different: applying the old macintosh mantra to the computability of the SUSY auxiliary field problem

    Energy Technology Data Exchange (ETDEWEB)

    Calkins, Mathew; Gates, D.E.A.; Gates, S. James Jr. [Center for String and Particle Theory, Department of Physics, University of Maryland,College Park, MD 20742-4111 (United States); Golding, William M. [Sensors and Electron Devices Directorate, US Army Research Laboratory,Adelphi, Maryland 20783 (United States)

    2015-04-13

    Starting with valise supermultiplets obtained from 0-branes plus field redefinitions, valise adinkra networks, and the “Garden Algebra,” we discuss an architecture for algorithms that (starting from on-shell theories and, through a well-defined computation procedure), search for off-shell completions. We show in one dimension how to directly attack the notorious “off-shell auxiliary field” problem of supersymmetry with algorithms in the adinkra network-world formulation.

  1. A universal electronic adaptation of automatic biochemical analysers to a central processing computer by applying CAMAC signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC subsystem (BORER 3000) for adapting biochemical analysis instruments to a processing computer is described. The possibility of standardizing input interfaces for laboratory instruments with such circuits is discussed, and the advantages achieved by applying the CAMAC specifications are described.

  2. Applying Web-Based Co-Regulated Learning to Develop Students' Learning and Involvement in a Blended Computing Course

    Science.gov (United States)

    Tsai, Chia-Wen

    2015-01-01

    This research investigated, via quasi-experiments, the effects of web-based co-regulated learning (CRL) on developing students' computing skills. Two classes of 68 undergraduates in a one-semester course titled "Applied Information Technology: Data Processing" were chosen for this research. The first class (CRL group, n = 38) received…

  3. Quantum computing applied to calculations of molecular energies: CH2 benchmark.

    Science.gov (United States)

    Veis, Libor; Pittner, Jiří

    2010-11-21

    Quantum computers are appealing for their ability to solve some tasks much faster than their classical counterparts. It was shown in [Aspuru-Guzik et al., Science 309, 1704 (2005)] that they, if available, would be able to perform the full configuration interaction (FCI) energy calculations with a polynomial scaling. This is in contrast to conventional computers where FCI scales exponentially. We have developed a code for simulation of quantum computers and implemented our version of the quantum FCI algorithm. We provide a detailed description of this algorithm and the results of the assessment of its performance on the four lowest lying electronic states of the CH₂ molecule. This molecule was chosen as a benchmark, since its two lowest lying ¹A₁ states exhibit a multireference character at the equilibrium geometry. It has been shown that with a suitably chosen initial state of the quantum register, one is able to achieve the probability amplification regime of the iterative phase estimation algorithm even in this case.
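    The iterative phase estimation algorithm the abstract refers to reads out an eigenphase bit by bit, least significant first, applying a feedback rotation that subtracts the bits already measured. A classical simulation of that readout (a sketch under the simplifying assumption that the phase is exactly representable in m bits, so each single-qubit measurement is deterministic; the paper's actual quantum register simulation is far more involved):

```python
import math

def ipea(phi, m):
    """Iterative phase estimation of phi = 0.b1 b2 ... bm (binary fraction).
    At step k the ancilla acquires phase 2*pi*2^(k-1)*phi from the
    controlled-U^(2^(k-1)), plus a feedback angle cancelling known bits."""
    bits = [0] * (m + 1)              # bits[1..m]; measured least significant first
    for k in range(m, 0, -1):
        # feedback angle from the less-significant bits already measured
        omega = -2 * math.pi * sum(bits[j] / 2 ** (j - k + 1)
                                   for j in range(k + 1, m + 1))
        theta = 2 * math.pi * (phi * 2 ** (k - 1)) + omega
        p1 = math.sin(theta / 2) ** 2         # probability of measuring |1>
        bits[k] = 1 if p1 > 0.5 else 0        # deterministic for exact phi
    return sum(bits[k] / 2 ** k for k in range(1, m + 1))
```

For phases not exactly representable in m bits, p1 is strictly between 0 and 1, which is where the "probability amplification regime" discussed in the abstract matters.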

  4. Parallel Object-Oriented Computation Applied to a Finite Element Problem

    Directory of Open Access Journals (Sweden)

    Jon B. Weissman

    1993-01-01

    The conventional wisdom in the scientific computing community is that the best way to solve large-scale numerically intensive scientific problems on today's parallel MIMD computers is to use Fortran or C programmed in a data-parallel style using low-level message-passing primitives. This approach inevitably leads to nonportable codes and extensive development time, and restricts parallel programming to the domain of the expert programmer. We believe that these problems are not inherent to parallel computing but are the result of the programming tools used. We will show that comparable performance can be achieved with little effort if better tools that present higher level abstractions are used. The vehicle for our demonstration is a 2D electromagnetic finite element scattering code we have implemented in Mentat, an object-oriented parallel processing system. We briefly describe the application, Mentat, the implementation, and present performance results for both a Mentat and a hand-coded parallel Fortran version.

  5. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd-frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  6. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next; despite this, there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity and brief descriptions of the Maple, REDUCE, SHEEP and other applications are given. (author)

  7. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which now help us better understand these phenomena. However, compared to what the social sciences have discovered about such debates, the existing computati...

  8. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology.

  9. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  10. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  11. Review on Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    P. Sumithra

    2017-03-01

    Computational electromagnetics (CEM) is applied to model the interaction of electromagnetic fields with objects such as antennas, waveguides and aircraft, and with their environment, using Maxwell's equations. In this paper the strengths and weaknesses of various computational electromagnetic techniques are discussed. The performance of various techniques in terms of accuracy, memory and computational time for application-specific tasks, such as modelling RCS (radar cross section), space applications, thin wires and antenna arrays, is presented.
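    One of the standard techniques such a review compares is the finite-difference time-domain (FDTD) method, which discretizes Maxwell's curl equations on a staggered (Yee) grid and leapfrogs the E and H fields in time. A minimal one-dimensional sketch in normalized units (illustrative only; the grid size, Courant number 0.5, and Gaussian source are assumptions, not values from the paper):

```python
import math

def fdtd_1d(steps, n=200, courant=0.5):
    """1-D FDTD (Yee leapfrog) for Ez/Hy in normalized units.
    A Gaussian pulse driven at the left edge propagates rightward."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for i in range(n - 1):                 # update H from the curl of E
            hy[i] += courant * (ez[i + 1] - ez[i])
        for i in range(1, n):                  # update E from the curl of H
            ez[i] += courant * (hy[i] - hy[i - 1])
        ez[0] = math.exp(-((t - 30.0) / 10.0) ** 2)   # hard Gaussian source
    return ez
```

A Courant number of 0.5 keeps the explicit update stable; accuracy/memory/time trade-offs of exactly this kind are what the review quantifies across techniques.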

  12. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual frameworks (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize
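    The book's "new tabular method" is not reproduced in the record, but the classic tabular (Quine-McCluskey-style) first stage it builds on can be sketched: repeatedly merge implicants that differ in exactly one bit, marking that bit as a don't-care, until only prime implicants remain.

```python
from itertools import combinations

def combine(a, b):
    """Merge two implicants differing in exactly one defined bit;
    '-' marks a don't-care position. Returns None if they cannot merge."""
    diff = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
        d = diff[0]
        return a[:d] + '-' + a[d + 1:]
    return None

def prime_implicants(minterms, nbits):
    """Tabular first stage of Boolean minimization: returns the set of
    prime implicants of the function with the given minterms."""
    terms = {format(m, f'0{nbits}b') for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a, b in combinations(sorted(terms), 2):
            c = combine(a, b)
            if c:
                merged.add(c)
                used.update((a, b))
        primes |= terms - used     # unmergeable terms are prime
        terms = merged
    return primes
```

A complete minimizer would follow this with a prime implicant chart to select a minimal cover.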

  13. Teaching strategies applied to teaching computer networks in Engineering in Telecommunications and Electronics

    Directory of Open Access Journals (Sweden)

    Elio Manuel Castañeda-González

    2016-07-01

    Because of the large impact that computer networks have today, their study in related fields such as Telecommunications and Electronics Engineering holds great appeal for students. However, when the content lacks a strong practical component, this interest can decrease considerably. This paper proposes the use of teaching strategies, analogies, media and interactive applications that enhance the teaching of the networks discipline and encourage its study. It begins with an analysis of how the discipline is currently taught, after which each strategy is described along with its respective contribution to student learning.

  14. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Science.gov (United States)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
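    The Student-t step the abstract describes amounts to treating the perturbed-input CFD runs as a small sample and forming a confidence half-width around their mean. A hedged sketch (the paper's exact procedure is not given; the run values below are illustrative, and the t multipliers are the standard two-sided 95% table values):

```python
import math
import statistics

# two-sided 95% Student-t multipliers, indexed by degrees of freedom
T95 = {1: 12.706, 2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571,
       6: 2.447, 7: 2.365, 8: 2.306, 9: 2.262}

def cfd_uncertainty(results):
    """Estimate the mean and 95% expanded uncertainty of a CFD output
    (e.g. a heat transfer coefficient) from n repeated perturbed-input runs."""
    n = len(results)
    mean = statistics.mean(results)
    s = statistics.stdev(results)            # sample standard deviation
    u = T95[n - 1] * s / math.sqrt(n)        # 95% confidence half-width
    return mean, u
```

Ranking each input's contribution, as the paper does, would repeat this per perturbed input and sort by the resulting half-widths.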

  15. TRIO-EF a general thermal hydraulics computer code applied to the Avlis process

    International Nuclear Information System (INIS)

    Magnaud, J.P.; Claveau, M.; Coulon, N.; Yala, P.; Guilbaud, D.; Mejane, A.

    1993-01-01

    TRIO-EF is a general-purpose Fluid Mechanics 3D Finite Element Code. The system capabilities cover areas such as steady state or transient, laminar or turbulent, isothermal or temperature dependent fluid flows; it is applicable to the study of coupled thermo-fluid problems involving heat conduction and possibly radiative heat transfer. It has been used to study the thermal behaviour of the AVLIS process separation module. In this process, a linear electron beam impinges the free surface of a uranium ingot, generating a two-dimensional curtain emission of vapour from a water-cooled crucible. The energy transferred to the metal causes its partial melting, forming a pool where strong convective motion increases heat transfer towards the crucible. In the upper part of the separation module, the internal structures are devoted to two main functions: vapour containment and reflux, and irradiation and physical separation. They are subjected to very high temperature levels, and heat transfer occurs mainly by radiation. Moreover, special attention has to be paid to electron backscattering. These two major points have been simulated numerically with TRIO-EF, and the paper presents and comments on the results of such a computation for each of them. After a brief overview of the computer code, two examples of the TRIO-EF capabilities are given: a crucible thermal hydraulics model, and a thermal analysis of the internal structures.

  16. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  17. Applying a computer-aided scheme to detect a new radiographic image marker for prediction of chemotherapy outcome

    International Nuclear Information System (INIS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; Moore, Kathleen; Liu, Hong; Zheng, Bin

    2016-01-01

    To investigate the feasibility of automated segmentation of visceral and subcutaneous fat areas from computed tomography (CT) images of ovarian cancer patients and of applying the computed adiposity-related image features to predict chemotherapy outcome. A computerized image processing scheme was developed to segment visceral and subcutaneous fat areas and compute adiposity-related image features. Logistic regression models were then applied to analyze the association between the scheme-generated assessment scores and progression-free survival (PFS) of patients, using a leave-one-case-out cross-validation method and a dataset involving 32 patients. The correlation coefficients between automated and radiologists' manual segmentation of visceral and subcutaneous fat areas were 0.76 and 0.89, respectively. The scheme-generated prediction scores based on adiposity-related radiographic image features were significantly associated with patients' PFS (p < 0.01). Using a computerized scheme enables more efficient and robust segmentation of visceral and subcutaneous fat areas. The computed adiposity-related image features also have the potential to improve accuracy in predicting chemotherapy outcome
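The leave-one-case-out evaluation described above can be sketched generically: hold out one case, fit on the rest, predict the held-out case, and repeat. The classifier here is just a threshold on a single score, and the scores and labels are hypothetical, not the study's data:

```python
# Minimal leave-one-out cross-validation (LOOCV) sketch.
# Classifier: threshold at the midpoint of the two class means.
# Data are hypothetical stand-ins for "adiposity scores".

def loo_accuracy(scores, labels):
    """For each held-out case, fit a threshold on the rest and predict it."""
    correct = 0
    n = len(scores)
    for i in range(n):
        train = [(s, y) for j, (s, y) in enumerate(zip(scores, labels)) if j != i]
        pos = [s for s, y in train if y == 1]
        neg = [s for s, y in train if y == 0]
        thr = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
        pred = 1 if scores[i] >= thr else 0
        correct += (pred == labels[i])
    return correct / n

scores = [0.2, 0.3, 0.35, 0.4, 0.7, 0.75, 0.8, 0.9]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
acc = loo_accuracy(scores, labels)   # perfectly separated toy data
```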

  18. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
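The mathematical fundamentals of transmission CT mentioned above can be illustrated at toy scale: the algebraic reconstruction technique (ART, the Kaczmarz method) recovers pixel attenuation values from measured ray sums. The 2x2 "image" and the ray geometry below are invented for illustration, not taken from the article:

```python
# Algebraic reconstruction technique (ART/Kaczmarz) on a 2x2 toy image:
# repeatedly project the estimate onto each ray-sum constraint.

def art(rays, sums, n, iters=1000, lam=1.0):
    """rays[k] lists the pixel indices crossed by ray k; sums[k] is its measurement."""
    x = [0.0] * n
    for _ in range(iters):
        for idx, s in zip(rays, sums):
            resid = (s - sum(x[i] for i in idx)) / len(idx)
            for i in idx:
                x[i] += lam * resid     # Kaczmarz projection step
    return x

# True image: pixels [1, 2, 3, 4] laid out row-major as a 2x2 grid.
# Rays: two rows, two columns, and one diagonal to remove the ambiguity
# left by row/column sums alone.
rays = [(0, 1), (2, 3), (0, 2), (1, 3), (0, 3)]
true = [1.0, 2.0, 3.0, 4.0]
sums = [sum(true[i] for i in r) for r in rays]
rec = art(rays, sums, 4)
```

Because the system is consistent and has a unique solution, the cyclic projections converge to the true image.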

  19. Quantum annealing versus classical machine learning applied to a simplified computational biology problem

    Science.gov (United States)

    Li, Richard Y.; Di Felice, Rosa; Rohs, Remo; Lidar, Daniel A.

    2018-01-01

    Transcription factors regulate gene expression, but how these proteins recognize and specifically bind to their DNA targets is still debated. Machine learning models are effective means to reveal interaction mechanisms. Here we studied the ability of a quantum machine learning approach to predict binding specificity. Using simplified datasets of a small number of DNA sequences derived from actual binding affinity experiments, we trained a commercially available quantum annealer to classify and rank transcription factor binding. The results were compared to state-of-the-art classical approaches for the same simplified datasets, including simulated annealing, simulated quantum annealing, multiple linear regression, LASSO, and extreme gradient boosting. Despite technological limitations, we find a slight advantage in classification performance and nearly equal ranking performance using the quantum annealer for these fairly small training data sets. Thus, we propose that quantum annealing might be an effective method to implement machine learning for certain computational biology problems. PMID:29652405
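Among the classical baselines named above is simulated annealing, applied to the same quadratic binary (QUBO) formulation that quantum annealers accept. A minimal sketch follows; the 3-variable Q matrix is a made-up example, not the paper's transcription-factor model:

```python
# Simulated annealing on a tiny QUBO: minimize x^T Q x over binary x.
import math
import random

def simulated_annealing(Q, n, steps=5000, t0=2.0, t1=0.01, seed=1):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]

    def energy(v):
        return sum(Q[i][j] * v[i] * v[j] for i in range(n) for j in range(n))

    e = energy(x)
    best, best_e = x[:], e
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)    # geometric cooling schedule
        i = rng.randrange(n)
        x[i] ^= 1                             # propose a single bit flip
        e_new = energy(x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                         # reject: flip the bit back
    return best, best_e

# Hypothetical QUBO whose unique minimum is x = (1, 1, 0) with energy -3.
Q = [[-2.0, 1.0, 0.0],
     [0.0, -2.0, 0.0],
     [0.0, 0.0, 1.0]]
best, best_e = simulated_annealing(Q, 3)
```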

  20. Quantum annealing versus classical machine learning applied to a simplified computational biology problem

    Science.gov (United States)

    Li, Richard Y.; Di Felice, Rosa; Rohs, Remo; Lidar, Daniel A.

    2018-03-01

    Transcription factors regulate gene expression, but how these proteins recognize and specifically bind to their DNA targets is still debated. Machine learning models are effective means to reveal interaction mechanisms. Here we studied the ability of a quantum machine learning approach to classify and rank binding affinities. Using simplified data sets of a small number of DNA sequences derived from actual binding affinity experiments, we trained a commercially available quantum annealer to classify and rank transcription factor binding. The results were compared to state-of-the-art classical approaches for the same simplified data sets, including simulated annealing, simulated quantum annealing, multiple linear regression, LASSO, and extreme gradient boosting. Despite technological limitations, we find a slight advantage in classification performance and nearly equal ranking performance using the quantum annealer for these fairly small training data sets. Thus, we propose that quantum annealing might be an effective method to implement machine learning for certain computational biology problems.

  1. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    Science.gov (United States)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and estimate the trajectory and velocity of the foam that caused the accident.

  2. Diagnostic accuracy of coronary angiography with multislice computed tomography applied to 'the Real World'

    International Nuclear Information System (INIS)

    Rodriguez Granillo, Gaston A.; Rosales, Miguel A.; Llaurado, Claudio; Fernandez Pereira, Carlos; Garcia Carcia, Hector M.

    2006-01-01

    Objective: To assess the diagnostic accuracy of coronary angiography with multislice computed tomography (MSCT) for the detection of significant coronary artery stenoses. Material and methods: Patients studied had an indication for diagnostic coronary angiography and no history of contrast allergies, renal failure or arrhythmias. A multislice CT system (Brilliance 40, Philips, The Netherlands) with ECG gating was used for image acquisition. A total of 90-125 ml of iodinated contrast was administered by IV route. Obesity, diabetes, diffusely calcified segments with a diameter < 2.0 mm, and segments treated with stents were not considered exclusion criteria. Lesions were defined as significant when the decrease in lumen diameter was ≥ 50% by MSCT and quantitative coronary angiography (QCA). Results: A total of 38 patients were scanned before the intervention. Of them, one (3%) was excluded because of inadequate image quality. The remaining 37 patients (444 segments), with adequate image quality, were included in the study (81% men, mean age 62.43 ± 12.5 years, 13.5% diabetics). Mean scan time was 15.12 ± 2.6 seconds, and 444 segments were assessed with both techniques. The numbers of lesions deemed significant by QCA and MSCT were 88 (17%) and 93 (18%), respectively. Sensitivity, specificity, positive and negative predictive values of MSCT to detect significant stenoses were 82%, 93%, 72% and 96%, respectively. Conclusion: In non-selected patients, coronary angiography by multislice computed tomography exhibits a high negative predictive value for the detection of obstructive coronary disease. (author)
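The per-segment accuracy figures quoted above all derive from a 2x2 confusion table; this helper recomputes such metrics. The counts below are hypothetical, chosen only to illustrate the formulas, not reconstructed from the paper's table:

```python
# Diagnostic test metrics from confusion-table counts.
# tp/fp/fn/tn values here are hypothetical examples.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = diagnostic_metrics(tp=72, fp=21, fn=16, tn=335)
```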

  3. Computer-Aided Drug Design Applied to Marine Drug Discovery: Meridianins as Alzheimer's Disease Therapeutic Agents.

    Science.gov (United States)

    Llorach-Pares, Laura; Nonell-Canals, Alfons; Sanchez-Martinez, Melchor; Avila, Conxita

    2017-11-27

    Computer-aided drug discovery/design (CADD) techniques allow the identification of natural products that are capable of modulating protein functions in pathogenesis-related pathways, constituting one of the most promising lines followed in drug discovery. In this paper, we computationally evaluated and reported the inhibitory activity found in meridianins A-G, a group of marine indole alkaloids isolated from the marine tunicate Aplidium, against various protein kinases involved in Alzheimer's disease (AD), a neurodegenerative pathology characterized by the presence of neurofibrillary tangles (NFT). An imbalance between tau kinase and phosphatase activities causes tau hyperphosphorylation and, thereby, its aggregation and NFT formation. Inhibition of specific kinases involved in its phosphorylation pathway could be one of the key strategies to reverse tau hyperphosphorylation and would represent an approach to developing drugs to palliate AD symptoms. Meridianins bind to the adenosine triphosphate (ATP) binding site of certain protein kinases, acting as ATP-competitive inhibitors. These compounds provide very promising scaffolds to design new drugs against AD, which could act on the tau protein kinases glycogen synthase kinase-3 beta (GSK3β) and casein kinase 1 delta (CK1δ, CK1D or KC1D), and on dual-specificity kinases such as dual-specificity tyrosine phosphorylation-regulated kinase 1A (DYRK1A) and cdc2-like kinase 1 (CLK1). This work aims to highlight the role of CADD techniques in marine drug discovery and to provide precise information regarding the binding mode and strength of meridianins against several protein kinases that could help in the future development of anti-AD drugs.

  4. NETL Super Computer

    Data.gov (United States)

    Federal Laboratory Consortium — The NETL Super Computer was designed for performing engineering calculations that apply to fossil energy research. It is one of the world’s larger supercomputers,...

  5. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    fighters' ability to execute the mission." Computing Services: we run IT systems that provide medical care, pay the warfighters, and manage maintenance ... users; 1,400 applications; 18 facilities; 180 software vendors; 18,000+ copies of executive software products; virtually every type of mainframe and ...

  6. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  7. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality; the same holds when contrast media are used. One console controls the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools
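The region-of-interest quantitation described above reduces, for areas, to integrating the outlined contour; for a polygonal outline that is the shoelace formula. The contour and pixel spacing below are hypothetical:

```python
# Area of a region of interest outlined as a closed polygon of pixel
# vertices, converted to physical units via the pixel spacing.

def roi_area(contour, pixel_mm=1.0):
    """Area enclosed by a closed (x, y) polygon, via the shoelace formula, in mm^2."""
    n = len(contour)
    s = 0.0
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0 * pixel_mm**2

# A 10x10-pixel square outline with 0.5 mm pixels: 100 px^2 * 0.25 mm^2/px.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
area = roi_area(square, pixel_mm=0.5)
```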

  8. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute the computation across a great number of distributed computers, rather than local computers ...

  9. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    Science.gov (United States)

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

    Recently, semiquantitative analysis using the standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply a SUV-based semiquantitative analytic method to gallium-67 (67Ga)-citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (radioactivity ratio of sphere:background = 4:1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone 67Ga-citrate SPECT/CT were retrospectively enrolled in the study. A mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptakes and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of the scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with a diameter of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). The physiological uptake in 67Ga-citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with the corresponding blood test results.
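The standardized uptake value used above is the measured activity concentration normalized by injected activity per unit body mass (body-weight SUV). The numeric inputs in this sketch are hypothetical:

```python
# Body-weight-normalized SUV: dimensionless under the usual assumption
# that tissue density is 1 g/mL. Input values are hypothetical.

def suv(conc_kbq_per_ml, injected_mbq, weight_kg):
    """SUV = activity concentration / (injected activity / body mass)."""
    injected_kbq = injected_mbq * 1000.0
    weight_g = weight_kg * 1000.0
    return conc_kbq_per_ml / (injected_kbq / weight_g)

v = suv(conc_kbq_per_ml=5.0, injected_mbq=120.0, weight_kg=60.0)
```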

  10. Development of a computational system for radiotherapic planning with the IMRT technique applied to the MCNP computer code with 3D graphic interface for voxel models

    International Nuclear Information System (INIS)

    Fonseca, Telma Cristina Ferreira

    2009-01-01

    Intensity-modulated radiation therapy (IMRT) is an advanced treatment technique used worldwide in oncology. In this master's work, a software package for simulating the IMRT protocol, namely SOFT-RT, was developed within the research group 'Nucleo de Radiacoes Ionizantes' (NRI) at UFMG. The computational system SOFT-RT simulates the absorbed dose of the radiotherapic treatment on a three-dimensional voxel model of the patient. The SISCODES code, from the NRI research group, helps in producing the voxel model of the region of interest from a set of digitalized CT or MRI images. SOFT-RT also allows rotation and translation of the model about the coordinate-system axes for better visualization of the model and the beam. SOFT-RT collects and exports the necessary parameters to the MCNP code, which carries out the nuclear radiation transport towards the tumor and adjacent healthy tissues for each orientation and position of the planned beam. Through three-dimensional visualization of the patient's voxel model, it is possible to focus on a tumoral region while preserving the healthy tissues around it, taking into account exactly where the radiation beam passes, which tissues are affected, and how much dose is deposited in each. The Out-module of SOFT-RT imports the results and expresses the dose response by superimposing dose and voxel model in gray scale in a three-dimensional graphic representation. This master's thesis presents the new computational system for radiotherapic treatment, the SOFT-RT code, which has been developed in the robust, multi-platform C++ programming language with the OpenGL graphics packages. The Linux operating system was adopted with the goal of running it on an open-source, free-access platform. Preliminary simulation results for a cerebral tumor case are reported, as well as some dosimetric evaluations. (author)

  11. A computational code for resolution of general compartment models applied to internal dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Claro, Thiago R.; Todo, Alberto S., E-mail: claro@usp.br, E-mail: astodo@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The dose resulting from internal contamination can be estimated with the use of biokinetic models combined with experimental results obtained from bioanalysis and the knowledge of the incorporation time. The biokinetic models can be represented by a set of compartments expressing the transportation, retention and elimination of radionuclides from the body. The ICRP publications number 66, 78 and 100 present compartmental models for the respiratory tract, gastrointestinal tract and for systemic distribution for an array of radionuclides of interest for radiological protection. The objective of this work is to develop a computational code for designing, visualization and resolution of compartmental models of any nature. Four different techniques are available for the resolution of systems of differential equations, including semi-analytical and numerical methods. The software was developed in C# programming, using a Microsoft Access database and XML standards for file exchange with other applications. Compartmental models for uranium, thorium and iodine radionuclides were generated for the validation of the CBT software. The models were subsequently solved by the SSID software and the results compared with the values published in issue 78 of the ICRP. In all cases the system is in accordance with the values published by the ICRP. (author)
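The compartment models described above are linear first-order ODE systems, which is why semi-analytical solutions are possible. For the simplest serial chain (compartment 1 transfers to compartment 2, which excretes), the closed-form Bateman-type solution can be written directly; the rate constants below are hypothetical, not ICRP values:

```python
# Semi-analytical solution of a two-compartment serial chain:
#   dq1/dt = -k12*q1
#   dq2/dt =  k12*q1 - k2out*q2,   q1(0) = q0, q2(0) = 0.
# Rate constants are hypothetical illustration values (per day, say).
import math

def serial_chain(q0, k12, k2out, t):
    """Return (q1(t), q2(t)) for the chain above, assuming k12 != k2out."""
    q1 = q0 * math.exp(-k12 * t)
    q2 = q0 * k12 / (k2out - k12) * (math.exp(-k12 * t) - math.exp(-k2out * t))
    return q1, q2

q1, q2 = serial_chain(q0=1.0, k12=0.5, k2out=0.1, t=2.0)
```

Differentiating the expression for q2 confirms it satisfies the second ODE with q2(0) = 0; real biokinetic models chain many such compartments, hence the matrix (semi-analytical) solvers the record mentions.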

  12. Computer-Aided Diagnosis of Micro-Malignant Melanoma Lesions Applying Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Joanna Jaworek-Korjakowska

    2016-01-01

    Background. Malignant melanoma, the deadliest form of skin cancer, is one of the most frequently fatal disorders. The aim of modern dermatology is the early detection of skin cancer, which usually results in a reduced mortality rate and less extensive treatment. This paper presents a study on classification of melanoma in the early stage of development using SVMs as a useful technique for data classification. Method. In this paper an automatic algorithm for the classification of melanomas in their early stage, with a diameter under 5 mm, is presented. The system contains the following steps: image enhancement, lesion segmentation, feature calculation and selection, and a classification stage using SVMs. Results. The algorithm has been tested on 200 images including 70 melanomas and 130 benign lesions. The SVM classifier achieved a sensitivity of 90% and a specificity of 96%. The results indicate that the proposed approach captured most of the malignant cases and could provide reliable information for effective skin mole examination. Conclusions. Micro-melanomas, owing to their small size and early stage of development, are enormously difficult to diagnose, even for experts. The use of advanced equipment and sophisticated computer systems can help in the early diagnosis of skin lesions.
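The SVM classification stage named above can be sketched with a minimal linear SVM trained by the Pegasos sub-gradient method (hinge loss plus L2 regularization). This is a generic sketch, not the paper's pipeline; the two-feature points and labels are toy stand-ins for dermoscopic features:

```python
# Minimal linear SVM via the Pegasos sub-gradient method.
# X, y below are toy, linearly separable data (labels in {-1, +1}).
import random

def pegasos(X, y, lam=0.1, epochs=300, seed=0):
    """Return weights w and (unregularized) bias b of a linear SVM."""
    rng = random.Random(seed)
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):   # random pass over data
            t += 1
            eta = 1.0 / (lam * t)                     # decaying step size
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            w = [wj * (1 - eta * lam) for wj in w]    # regularization shrink
            if margin < 1:                            # hinge-loss sub-gradient
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

X = [[1.0, 2.0], [2.0, 3.0], [3.0, 1.0], [6.0, 5.0], [7.0, 7.0], [8.0, 6.0]]
y = [-1, -1, -1, 1, 1, 1]
w, b = pegasos(X, y)
pred = [1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1 for x in X]
```

On separable data with a comfortable margin, the trained hyperplane classifies the training set correctly; a real system would of course evaluate on held-out images, as the paper does.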

  13. A computational code for resolution of general compartment models applied to internal dosimetry

    International Nuclear Information System (INIS)

    Claro, Thiago R.; Todo, Alberto S.

    2011-01-01

    The dose resulting from internal contamination can be estimated with the use of biokinetic models combined with experimental results obtained from bio analysis and the knowledge of the incorporation time. The biokinetics models can be represented by a set of compartments expressing the transportation, retention and elimination of radionuclides from the body. The ICRP publications, number 66, 78 and 100, present compartmental models for the respiratory tract, gastrointestinal tract and for systemic distribution for an array of radionuclides of interest for the radiological protection. The objective of this work is to develop a computational code for designing, visualization and resolution of compartmental models of any nature. There are available four different techniques for the resolution of system of differential equations, including semi-analytical and numerical methods. The software was developed in C≠ programming, using a Microsoft Access database and XML standards for file exchange with other applications. Compartmental models for uranium, thorium and iodine radionuclides were generated for the validation of the CBT software. The models were subsequently solved by SSID software and the results compared with the values published in the issue 78 of ICRP. In all cases the system is in accordance with the values published by ICRP. (author)

  14. Initial experience with computed tomographic colonography applied for noncolorectal cancerous conditions

    International Nuclear Information System (INIS)

    Ichikawa, Tamaki; Kawada, Shuichi; Hirata, Satoru; Ikeda, Shu; Sato, Yuuki; Imai, Yutaka

    2011-01-01

    The aim of this study was to assess retrospectively the performance of computed tomographic colonography (CTC) for noncolorectal cancerous conditions. A total of 44 patients with noncolorectal cancerous conditions underwent CTC. We researched the indications for CTC or the present illness and evaluated the CTC imaging findings. We assessed whether diagnosis by CTC reduced conventional colonoscopic examinations. A total of 47 examinations were performed in 44 patients. The indications for CTC or a present illness were as follows: 15 patients with impossible or incomplete colonoscopy, 7 with diverticular disease, 6 with malignancy (noncolorectal cancer), 6 with Crohn's disease, 4 suspected of having a submucosal tumor on colonoscopy, 2 with ischemic colitis, and 4 with various other diseases. Colonic findings were diagnosed on CTC in 36 examinations, and extracolonic findings were identified in 35 of 44 patients. In all, 17 patients had undergone colonoscopy previously, 9 (52.9%) of whom did not require further colonoscopy after CTC. Five patients underwent colonoscopy after CTC. The indications for CTC were varied in patients with noncolorectal cancerous conditions. CTC examinations could be performed safely. Unlike colonoscopy or CT without preparation, CTC revealed colonic and extracolonic findings and may reduce the need for colonoscopy in patients with noncolorectal cancerous conditions. (author)

  15. Applying computational geometry techniques for advanced feature analysis in atom probe data

    International Nuclear Information System (INIS)

    Felfer, Peter; Ceguerra, Anna; Ringer, Simon; Cairney, Julie

    2013-01-01

    In this paper we present new methods for feature analysis in atom probe tomography data that have useful applications in materials characterisation. The analysis works on the principle of Voronoi subvolumes and piecewise linear approximations, and feature delineation based on the distance to the centre of mass of a subvolume (DCOM). Based on the coordinate systems defined by these approximations, two examples are shown of the new types of analyses that can be performed. The first is the analysis of line-like objects (i.e. dislocations) using both proxigrams and line-excess plots. The second is interfacial excess mapping of an InGaAs quantum dot. - Highlights: • Computational geometry is used to detect and analyse features within atom probe data. • Limitations of conventional feature detection are overcome by using atomic density gradients. • 0D, 1D, 2D and 3D features can be analysed by using Voronoi tessellation for spatial binning. • New, robust analysis methods are demonstrated, including line and interfacial excess mapping
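The Voronoi-subvolume and DCOM ideas above can be sketched in their simplest form: assign each atom to its nearest seed point (a Voronoi-cell partition) and measure each atom's distance to the centre of mass of its subvolume. This is only the geometric kernel of the method, not the paper's full pipeline, and the coordinates are made up:

```python
# Nearest-seed (Voronoi-cell) partition plus distance-to-centre-of-mass
# (DCOM) per subvolume. Atom and seed coordinates are hypothetical.
import math

def dcom_partition(atoms, seeds):
    """Return, per seed index, the distances of its atoms to the cell's COM."""
    cells = {i: [] for i in range(len(seeds))}
    for a in atoms:
        i = min(range(len(seeds)), key=lambda k: math.dist(a, seeds[k]))
        cells[i].append(a)                      # Voronoi-cell assignment
    out = {}
    for i, pts in cells.items():
        if not pts:
            out[i] = []
            continue
        com = tuple(sum(c) / len(pts) for c in zip(*pts))  # centre of mass
        out[i] = [math.dist(p, com) for p in pts]
    return out

atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (10.0, 0.0, 0.0), (11.0, 0.0, 0.0)]
seeds = [(0.5, 0.0, 0.0), (10.5, 0.0, 0.0)]
d = dcom_partition(atoms, seeds)
```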

  16. Applied and numerical partial differential equations scientific computing in simulation, optimization and control in a multidisciplinary context

    CERN Document Server

    Glowinski, R; Kuznetsov, Y A; Periaux, Jacques; Neittaanmaki, Pekka; Pironneau, Olivier

    2010-01-01

    Standing at the intersection of mathematics and scientific computing, this collection of state-of-the-art papers in nonlinear PDEs examines their applications to subjects as diverse as dynamical systems, computational mechanics, and the mathematics of finance.

  17. Debunking the Computer Science Digital Library: Lessons Learned in Collection Development at Seneca College of Applied Arts & Technology

    Science.gov (United States)

    Buczynski, James Andrew

    2005-01-01

    Developing a library collection to support the curriculum of Canada's largest computer studies school has debunked many myths about collecting computer science and technology information resources. Computer science students are among the heaviest print book and e-book users in the library. Circulation statistics indicate that the demand for print…

  18. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of the connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the I A, and to use a new hardware platform in order to ensure successful operation over the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures
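The triple modular redundant architecture mentioned above rests on a 2-out-of-3 voter: three independent channels compute the same result and the majority wins, so any single channel failure is masked. The sketch below is generic, not Triconex's actual implementation, and the fail-to-trip convention is an assumption typical of shutdown systems:

```python
# Generic 2-out-of-3 (2oo3) majority voter for triple modular redundancy.
# The "TRIP" fallback (fail to the safe state when no majority exists)
# is an assumed convention, not a vendor specification.

def vote_2oo3(a, b, c):
    """Return the majority value of three redundant channels."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    return "TRIP"   # all three disagree: fail safe

result = vote_2oo3(0, 0, 1)   # one faulty channel is outvoted
```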

  19. Computer-aided detection system applied to full-field digital mammograms

    International Nuclear Information System (INIS)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella; Munoz Cacho, Pedro; Hoffmeister, Jeffrey W.

    2010-01-01

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity

  20. Computer-aided detection system applied to full-field digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella (Dept. of Radiology, Univ. Marques of Valdecilla Hospital, Santander (Spain)), e-mail: avegab@telefonica.net; Munoz Cacho, Pedro (Dept. of Statistics, Univ. Marques of Valdecilla Hospital, Santander (Spain)); Hoffmeister, Jeffrey W. (iCAD, Inc., Nashua, NH (United States))

    2010-12-15

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity
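The subgroup sensitivities quoted above are simple proportions; a minimal Python check of the reported arithmetic (all counts are taken directly from the abstract; the total mark count is inferred from the 2.5 marks/case figure, not stated in the text):

```python
# Counts from the abstract: (cancers detected by CAD, total cancers) per subgroup.
subgroups = {
    "overall": (141, 151),
    "fatty breasts": (28, 29),
    "calcifications": (54, 55),
    "invasive lobular": (8, 9),
}

def sensitivity(detected, total):
    """Fraction of true cancers marked by CAD, as a percentage."""
    return 100.0 * detected / total

for name, (hit, n) in subgroups.items():
    print(f"{name}: {sensitivity(hit, n):.0f}% ({hit}/{n})")

# The false-positive rate is reported per case: 2.5 marks/case over 200
# screening cases implies ~500 marks in total (inferred, not given).
fp_marks, screening_cases = 500, 200
print(f"false-positive rate: {fp_marks / screening_cases:.1f} marks per case")
```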

  1. Study of dosimetric quantities applied to patient undergoing routine chest examinations by computed tomography

    International Nuclear Information System (INIS)

    Gonzaga, Natalia Barbosa

    2012-01-01

    The radiological protection system has established standards to protect people against the harmful effects of ionizing radiation, based on the principles of justification, optimization and dose limitation. The increasing use of radiation in medicine and the related risks have intensified the discussion on patient radiation protection. Computed tomography (CT) is the diagnostic radiology technique that contributes most to patient doses, and it requires optimization efforts. Diagnostic reference levels (DRLs) have been established in many countries in terms of CT dosimetric quantities; in Brazil, DRLs are still under investigation, since the culture of patient protection is not yet well established. The objective of this work was to investigate the dosimetric and protection quantities related to patients undergoing routine chest CT examinations. The ImPACT CT, CT Expo and ImpactDose software packages were used to calculate the weighted and volumetric air-kerma indices (CW and CVOL), the air kerma-length product (PK,L), organ equivalent doses (HT) and the effective dose (E) for routine chest CT protocols on 19 tomographs in the city of Belo Horizonte. CT Expo was selected to be validated against experimental measurements in three hospitals, using thermoluminescent dosimeters and a CT pencil ionization chamber in anthropomorphic and standard CT body phantoms. Experimental and calculated results showed differences of up to 97% for HT and E, and acceptable agreement for CW, CVOL and PK,L. The data from all 19 tomographs showed that local DRLs for routine chest CT examinations may be set lower than the DRLs adopted in other countries; this would strengthen the radiological protection of patients. (author)
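The dosimetric quantities named in this abstract are linked by standard CT dosimetry relations: the volumetric index is the weighted index corrected for pitch, the air kerma-length product is the volumetric index times the scan length, and effective dose is estimated as a region-specific coefficient times that product. A minimal sketch with hypothetical protocol values; the chest coefficient k of about 0.014 mSv/(mGy*cm) is a commonly tabulated adult value, not a number from this thesis:

```python
def c_vol(c_w_mGy, pitch):
    # Volumetric air-kerma index: weighted index corrected for helical pitch.
    return c_w_mGy / pitch

def p_kl(c_vol_mGy, scan_length_cm):
    # Air kerma-length product (analogous to DLP), in mGy*cm.
    return c_vol_mGy * scan_length_cm

def effective_dose(p_kl_mGycm, k_chest=0.014):
    # E = k * P_KL; k is a published region-specific conversion coefficient
    # (~0.014 mSv/(mGy*cm) for an adult chest scan -- illustrative value).
    return k_chest * p_kl_mGycm

cw, pitch, length = 10.0, 1.0, 30.0   # hypothetical chest protocol
cv = c_vol(cw, pitch)
pk = p_kl(cv, length)
print(f"C_VOL = {cv:.1f} mGy, P_KL = {pk:.0f} mGy*cm, E = {effective_dose(pk):.1f} mSv")
```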

  2. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  3. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; an atlas of computed tomography of the normal adult; clinical applications of computed tomography; and radiotherapy planning and computed tomography.

  4. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas, contributing to the solution of complex problems from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section, further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation, from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  5. Computer tomography in otolaryngology

    International Nuclear Information System (INIS)

    Gradzki, J.

    1981-01-01

    The principles of design and operation of computer tomography, which has also been applied to the diagnosis of nose, ear and throat diseases, are discussed. Computer tomography makes possible the visualization of the structures of the nose, nasal sinuses and facial skeleton in transverse and coronal planes. The method enables an accurate evaluation of the position and size of neoplasms in these regions and the differentiation of inflammatory exudates from malignant masses. In otology, computer tomography is used particularly in the diagnosis of pontocerebellar angle tumours and otogenic brain abscesses. Computer tomography of the larynx and pharynx provides new diagnostic data owing to the possibility of obtaining transverse sections and visualizing cartilage. Computer tomograms of some cases are presented. (author)

  6. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  7. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  8. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  9. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  10. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  11. Computational biology for ageing

    Science.gov (United States)

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  12. Computer architecture technology trends

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends which are taking place in the architecture of computing systems today. Due to the sheer number of different applications to which computers are being applied, there seems no end to the different adoptions which proliferate. There are, however, some underlying trends which appear. Decision makers should be aware of these trends when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight in

  13. Computational Science and Innovation

    International Nuclear Information System (INIS)

    Dean, David Jarvis

    2011-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  14. Cloud computing strategies

    CERN Document Server

    Chorafas, Dimitris N

    2011-01-01

    A guide to managing cloud projects, Cloud Computing Strategies provides the understanding required to evaluate the technology and determine how it can be best applied to improve business and enhance your overall corporate strategy. Based on extensive research, it examines the opportunities and challenges that loom in the cloud. It explains exactly what cloud computing is, what it has to offer, and calls attention to the important issues management needs to consider before passing the point of no return regarding financial commitments.

  15. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  16. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost high in terms of dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  17. Mathematics and Computer Science | Argonne National Laboratory

    Science.gov (United States)

    Research areas: Extreme Computing; Data-Intensive Science; Applied Mathematics; Science & Engineering Applications; Software.

  18. User’s Emotions and Usability Study of a Brain-Computer Interface Applied to People with Cerebral Palsy

    Directory of Open Access Journals (Sweden)

    Alejandro Rafael García Ramírez

    2018-02-01

    Full Text Available People with motor and communication disorders face serious challenges in interacting with computers. To enhance this functionality, new human-computer interfaces are being studied. In this work, a brain-computer interface based on the Emotiv Epoc is used to analyze human-computer interactions in cases of cerebral palsy. The Phrase-Composer software was developed to interact with the brain-computer interface. A system usability evaluation was carried out with the participation of three specialists from the Fundação Catarinense de Educação Especial (FCEE) and four volunteers with cerebral palsy. Even though the System Usability Scale (SUS) score was acceptable, several challenges remain. Raw electroencephalography (EEG) data were also analyzed in order to assess the users' emotions during their interaction with the communication device. This study brings new evidence about human-computer interaction for individuals with cerebral palsy.

  19. Computing with synthetic protocells.

    Science.gov (United States)

    Courbet, Alexis; Molina, Franck; Amar, Patrick

    2015-09-01

    In this article we present a new kind of computing device that uses biochemical reaction networks as building blocks to implement logic gates. The architecture of a computing machine relies on these generic and composable building blocks, computation units, which can be used in multiple instances to perform complex boolean functions. Standard logical operations are implemented by biochemical networks, encapsulated and insulated within synthetic vesicles called protocells. These protocells are capable of exchanging energy and information with each other through transmembrane electron transfer. In the paradigm of computation we propose, protoputing, a machine can solve only one problem and therefore has to be built specifically. Thus, the programming phase of the standard computing paradigm is represented in our approach by the set of assembly instructions (specific attachments) that directs the wiring of the protocells that constitute the machine itself. To demonstrate the computing power of protocellular machines, we apply the approach to an NP-complete problem, known to be very demanding in computing power, the 3-SAT problem. We show how to program the assembly of a machine that can verify the satisfiability of a given boolean formula. Then we show how to use the massive parallelism of these machines to verify in less than 20 min all the valuations of the input variables and output a fluorescent signal when the formula is satisfiable, or no signal at all otherwise.
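The exhaustive check that the protocellular machine performs chemically, evaluating a CNF formula over every valuation of its inputs, can be sketched in a few lines of Python. The formula and its integer encoding below are illustrative, not taken from the paper:

```python
from itertools import product

def satisfiable(clauses, n_vars):
    """Check every valuation of n_vars booleans, mimicking the exhaustive
    parallel evaluation the protocellular machine performs chemically.
    Each clause is a tuple of literals: +i means x_i, -i means NOT x_i."""
    for valuation in product([False, True], repeat=n_vars):
        if all(any(valuation[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True   # the machine would emit a fluorescent signal
    return False          # no signal: the formula is unsatisfiable

# (x1 or x2 or not x3) and (not x1 or x3 or x2) and (not x2 or not x3 or x1)
clauses = [(1, 2, -3), (-1, 3, 2), (-2, -3, 1)]
print(satisfiable(clauses, 3))
```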

  20. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  1. A Case Study: Applying Critical Thinking Skills to Computer Science and Technology

    Science.gov (United States)

    Shannon, Li-Jen; Bennett, Judith

    2012-01-01

    A majority of incoming college freshmen and sophomores have not applied their critical thinking skills as part of their learning process. This paper investigates how students acquire their critical thinking skills while facing the copyright, fair use, and internet security challenges in this contemporary digital society. The findings show that 90…

  2. Applying a Global Sensitivity Analysis Workflow to Improve the Computational Efficiencies in Physiologically-Based Pharmacokinetic Modeling

    Directory of Open Access Journals (Sweden)

    Nan-Hung Hsieh

    2018-06-01

    Full Text Available Traditionally, the solution to reduce parameter dimensionality in a physiologically-based pharmacokinetic (PBPK) model is through expert judgment. However, this approach may lead to bias in parameter estimates and model predictions if important parameters are fixed at uncertain or inappropriate values. The purpose of this study was to explore the application of global sensitivity analysis (GSA) to ascertain which parameters in the PBPK model are non-influential, and therefore can be assigned fixed values in Bayesian parameter estimation with minimal bias. We compared the elementary effect-based Morris method and three variance-based Sobol indices in their ability to distinguish “influential” parameters to be estimated from “non-influential” parameters to be fixed. We illustrated this approach using a published human PBPK model for acetaminophen (APAP) and its two primary metabolites, APAP-glucuronide and APAP-sulfate. We first applied GSA to the original published model, comparing Bayesian model calibration results using all 21 originally calibrated model parameters (OMP), determined by the “expert judgment”-based approach, vs. the subset of original influential parameters (OIP), determined by GSA from the OMP. We then applied GSA to all the PBPK parameters, including those fixed in the published model, comparing the model calibration results using this full set of 58 model parameters (FMP) vs. the full set of influential parameters (FIP), determined by GSA from the FMP. We also examined the impact of different cut-off points used to distinguish the influential and non-influential parameters. We found that Sobol indices calculated by eFAST provided the best combination of reliability (consistency with other variance-based methods) and efficiency (lowest computational cost to achieve convergence) in identifying influential parameters. We identified several originally calibrated parameters that were not influential, and could be fixed to improve computational
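For readers unfamiliar with variance-based GSA: a first-order Sobol index S_i is the fraction of output variance attributable to input i alone, and parameters with S_i near zero are candidates for fixing. A self-contained Monte Carlo sketch on a toy linear model, using the pick-and-freeze estimator (not the eFAST implementation used in the paper; the model and sample size are illustrative):

```python
import random

def sobol_first_order(model, n_inputs, n=20000, seed=1):
    """Monte Carlo estimate of first-order Sobol indices S_i via the
    pick-and-freeze (Saltelli-style) scheme. Inputs are sampled ~ U(0,1)."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_inputs)] for _ in range(n)]
    B = [[rng.random() for _ in range(n_inputs)] for _ in range(n)]
    yA = [model(x) for x in A]
    yB = [model(x) for x in B]
    mean = sum(yA + yB) / (2 * n)
    var = sum((y - mean) ** 2 for y in yA + yB) / (2 * n)
    indices = []
    for i in range(n_inputs):
        # AB_i: rows of A with column i taken from B.
        yABi = [model(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        s_i = sum(yb * (yab - ya) for yb, yab, ya in zip(yB, yABi, yA)) / n / var
        indices.append(s_i)
    return indices

# Toy model: x1 dominates, x3 is nearly non-influential (could be fixed).
model = lambda x: 4 * x[0] + 2 * x[1] + 0.1 * x[2]
s = sobol_first_order(model, 3)
print([round(v, 2) for v in s])
```

For this additive model the exact indices are S1 = 16/20.01, S2 = 4/20.01, S3 = 0.01/20.01, so the estimate should rank x1 far above x3.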

  3. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

    DEFF Research Database (Denmark)

    Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

    2011-01-01

    We review the experience obtained in using integrative Bayesian models in interdisciplinary analysis focusing on sustainable use of marine resources and environmental management tasks. We have applied Bayesian models to both fisheries and environmental risk analysis problems. Bayesian belief...... be time consuming and research projects can be difficult to manage due to unpredictable technical problems related to parameter estimation. Biology, sociology and environmental economics have their own scientific traditions. Bayesian models are becoming traditional tools in fisheries biology, where...

  4. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  5. AGH University of Science and Technology, Faculty of Physics and Applied Computer Science, Annual Report 2008

    International Nuclear Information System (INIS)

    2009-01-01

    The most important research activities of the Faculty are condensed matter physics and the physics of elementary particles. Advanced fundamental as well as applied studies are also carried out in the fields of nuclear physics and technology, electronics, environmental physics and medical physics. The report presents short descriptions of the results obtained in 2009. It also contains a list of 198 papers published in national and international scientific journals and of 6 book chapters published in 2009, as well as a full list of national and international grants realized in 2009. [pl

  6. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  7. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  8. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  9. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From the CAD-generated results, four detection features, including the total number of (1) initial detection seeds and (2) final detected false-positive regions, and the (3) average and (4) sum of detection scores, were computed from each image. Then, by combining the features computed from the two bilateral images of the left and right breasts from either the craniocaudal or mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores (p  breast cancer risk.
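The AUC used to evaluate the risk model equals the Mann-Whitney probability that a randomly chosen positive case outscores a randomly chosen negative one, and it can be computed without any library. The scores below are hypothetical, purely to illustrate the computation (the study's reported AUC was 0.65):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores higher
    than a randomly chosen negative one (ties count as half a win)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical risk-prediction scores, for illustration only.
positives = [0.9, 0.8, 0.65, 0.6, 0.4]   # cancer at next screening
negatives = [0.7, 0.5, 0.45, 0.3, 0.2]   # remained negative
print(f"AUC = {auc(positives, negatives):.2f}")
```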

  10. Computer application in scientific investigations

    International Nuclear Information System (INIS)

    Govorun, N.N.

    1981-01-01

    A short review of computer development, applications and software at JINR over the last 15 years is presented. The main trends in studies of computer applications to experimental and theoretical investigations are enumerated: software for computers and computer systems, software for data-processing systems, the design of automatic and automated systems for measuring track-detector images, the development of techniques for carrying out experiments on-line with computers, and packages of applied computer codes and specialized systems. On-line techniques are successfully used in investigations of nuclear processes at relativistic energies. A new trend is the development of television methods of data output and computer recording. [ru

  11. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics, from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume presents the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  12. Applied data-centric social sciences concepts, data, computation, and theory

    CERN Document Server

    Sato, Aki-Hiro

    2014-01-01

    Applied data-centric social sciences aim to develop both methodology and practical applications of various fields of social sciences and businesses with rich data. Specifically, in the social sciences, a vast amount of data on human activities may be useful for understanding collective human nature. In this book, the author introduces several mathematical techniques for handling a huge volume of data and analysing collective human behaviour. The book is constructed from data-oriented investigation, with mathematical methods and expressions used for dealing with data for several specific problems. The fundamental philosophy underlying the book is that both mathematical and physical concepts are determined by the purposes of data analysis. This philosophy is shown throughout exemplar studies of several fields in socio-economic systems. From a data-centric point of view, the author proposes a concept that may change people’s minds and cause them to start thinking from the basis of data. Several goals underlie ...

  13. Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.

    Science.gov (United States)

    Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai

    2005-10-01

    A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.
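A toy version of the hybrid strategy described, namely a genetic algorithm whose mutations are filtered by a simulated-annealing (Metropolis) acceptance test, applied here to fitting a 1-D height profile. All details (population size, cooling schedule, target signal) are illustrative, not the authors' parameters:

```python
import math, random

rng = random.Random(0)
TARGET = [math.sin(i / 3.0) for i in range(20)]   # stand-in "true" line profile

def fitness(profile):
    # Negative squared error between candidate and target signal.
    return -sum((a - b) ** 2 for a, b in zip(profile, TARGET))

def crossover(p, q):
    # Single-point crossover of two parent profiles.
    cut = rng.randrange(1, len(p))
    return p[:cut] + q[cut:]

def mutate(p, temp):
    # SA-flavoured mutation: perturb one height, keep the change only if it
    # improves fitness or passes the Metropolis test at temperature `temp`.
    child = list(p)
    i = rng.randrange(len(child))
    child[i] += rng.gauss(0, 0.3)
    d = fitness(child) - fitness(p)
    if d > 0 or rng.random() < math.exp(d / temp):
        return child
    return list(p)

pop = [[rng.uniform(-1, 1) for _ in TARGET] for _ in range(30)]
temp = 1.0
for gen in range(300):
    pop.sort(key=fitness, reverse=True)     # elitism: keep the 10 best
    elite = pop[:10]
    pop = elite + [mutate(crossover(rng.choice(elite), rng.choice(elite)), temp)
                   for _ in range(20)]
    temp = max(0.01, temp * 0.98)           # geometric cooling schedule

best = max(pop, key=fitness)
print(f"residual error: {-fitness(best):.3f}")
```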

  14. GEM-E3: A computable general equilibrium model applied for Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Bahn, O. [Paul Scherrer Inst., CH-5232 Villigen PSI (Switzerland); Frei, C. [Ecole Polytechnique Federale de Lausanne (EPFL) and Paul Scherrer Inst. (Switzerland)

    2000-01-01

    The objectives of the European Research Project GEM-E3-ELITE, funded by the European Commission and coordinated by the Centre for European Economic Research (Germany), were to further develop the general equilibrium model GEM-E3 (Capros et al., 1995, 1997) and to conduct policy analysis through case studies. GEM-E3 is an applied general equilibrium model that analyses the macro-economy and its interaction with the energy system and the environment through the balancing of energy supply and demand, atmospheric emissions and pollution control, together with the fulfillment of overall equilibrium conditions. PSI's research objectives within GEM-E3-ELITE were to implement and apply GEM-E3 for Switzerland. The first objective required in particular the development of a Swiss database for each of the GEM-E3 modules (economic module and environmental module). For the second objective, strategies to reduce CO{sub 2} emissions were evaluated for Switzerland. In order to develop the economic module, PSI collaborated with the Laboratory of Applied Economics (LEA) of the University of Geneva and the Laboratory of Energy Systems (LASEN) of the Federal Institute of Technology in Lausanne (EPFL). The Swiss Federal Statistical Office (SFSO) and the Institute for Business Cycle Research (KOF) of the Swiss Federal Institute of Technology (ETH Zurich) also contributed data. The Swiss environmental database consists mainly of an Energy Balance Table and an Emission Coefficients Table. Both were designed using national and international official statistics. The Emission Coefficients Table is furthermore based on know-how from the PSI GaBE Project. Using GEM-E3 Switzerland, two strategies to reduce Swiss CO{sub 2} emissions were evaluated: a carbon tax ('tax only' strategy), and the combination of a carbon tax with the buying of CO{sub 2} emission permits ('permits and tax' strategy). In the first strategy, Switzerland would impose the necessary carbon tax to achieve

  15. GEM-E3: A computable general equilibrium model applied for Switzerland

    International Nuclear Information System (INIS)

    Bahn, O.; Frei, C.

    2000-01-01

    The objectives of the European Research Project GEM-E3-ELITE, funded by the European Commission and coordinated by the Centre for European Economic Research (Germany), were to further develop the general equilibrium model GEM-E3 (Capros et al., 1995, 1997) and to conduct policy analysis through case studies. GEM-E3 is an applied general equilibrium model that analyses the macro-economy and its interaction with the energy system and the environment through the balancing of energy supply and demand, atmospheric emissions and pollution control, together with the fulfillment of overall equilibrium conditions. PSI's research objectives within GEM-E3-ELITE were to implement and apply GEM-E3 for Switzerland. The first objective required in particular the development of a Swiss database for each of the GEM-E3 modules (economic module and environmental module). For the second objective, strategies to reduce CO2 emissions were evaluated for Switzerland. In order to develop the economic database, PSI collaborated with the Laboratory of Applied Economics (LEA) of the University of Geneva and the Laboratory of Energy Systems (LASEN) of the Federal Institute of Technology in Lausanne (EPFL). The Swiss Federal Statistical Office (SFSO) and the Institute for Business Cycle Research (KOF) of the Swiss Federal Institute of Technology (ETH Zurich) also contributed data. The Swiss environmental database consists mainly of an Energy Balance Table and an Emission Coefficients Table. Both were designed using national and international official statistics. The Emission Coefficients Table is furthermore based on the know-how of the PSI GaBE Project. Using GEM-E3 Switzerland, two strategies to reduce Swiss CO2 emissions were evaluated: a carbon tax ('tax only' strategy), and the combination of a carbon tax with the buying of CO2 emission permits ('permits and tax' strategy). In the first strategy, Switzerland would impose the necessary carbon tax to achieve the reduction target, and use the tax

  16. Applications of interval computations

    CERN Document Server

    Kreinovich, Vladik

    1996-01-01

    Primary Audience for the Book • Specialists in numerical computations who are interested in algorithms with automatic result verification. • Engineers, scientists, and practitioners who desire results with automatic verification and who would therefore benefit from the experience of successful applications. • Students in applied mathematics and computer science who want to learn these methods. Goal of the Book This book contains surveys of applications of interval computations, i.e., applications of numerical methods with automatic result verification, that were presented at an international workshop on the subject in El Paso, Texas, February 23-25, 1995. The purpose of this book is to disseminate detailed and surveyed information about existing and potential applications of this new, growing field. Brief Description of the Papers At the most fundamental level, interval arithmetic operations work with sets: the result of a single arithmetic operation is the set of all possible results as the o...
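
    The set-valued idea described above can be sketched in a few lines. The following is an illustrative Python toy, not code from the book: each operation returns the interval containing all possible results, so input uncertainty propagates automatically (the outward rounding that real interval libraries add is omitted here).

```python
# Minimal interval arithmetic sketch: operations act on sets (closed
# intervals), so the result encloses every value the exact computation
# could produce for inputs drawn from the operand intervals.
class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of two intervals: endpoints add directly.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Subtracting flips the roles of the other interval's endpoints.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # Product: the extremes lie among the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
y = Interval(-1.0, 3.0)
print(x + y)  # [0.0, 5.0]
print(x * y)  # [-2.0, 6.0]
```

    For example, with x = [1, 2] and y = [-1, 3] the product enumerates all endpoint combinations and keeps the extremes, yielding [-2, 6].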

  17. Computational problems in engineering

    CERN Document Server

    Mladenov, Valeri

    2014-01-01

    This book provides readers with modern computational techniques for solving a variety of problems from electrical, mechanical, civil and chemical engineering. Mathematical methods are presented in a unified manner, so they can be applied consistently to problems in applied electromagnetics, strength of materials, fluid mechanics, heat and mass transfer, environmental engineering, biomedical engineering, signal processing, automatic control and more. • Features contributions from distinguished researchers on significant aspects of current numerical methods and computational mathematics; • Presents actual results and innovative methods that provide numerical solutions, while minimizing computing times; • Includes new and advanced methods and modern variations of known techniques that can solve difficult scientific problems efficiently.

  18. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference, ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Subsequent years brought new areas of interest in technical informatics related to soft computing, as well as more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  19. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  20. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  1. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  2. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  3. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  4. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    ... cell walls and are the source of biofuels and biomaterials; our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers. Computational Modeling: NREL uses computational modeling to increase the ...

  5. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  6. Computational complementarity

    International Nuclear Information System (INIS)

    Finkelstein, D.; Finkelstein, S.R.

    1983-01-01

    Interactivity generates paradox in that the interactive control by one system C of predicates about another system-under-study S may falsify these predicates. An ''interactive logic'' is formulated to resolve this paradox of interactivity. The construction generalizes one, the Galois connection, used by Von Neumann for the similar quantum paradox. The construction is applied to a transition system, a concept that includes general systems, automata, and quantum systems. In some (classical) automata S, the interactive predicates about S show quantumlike complementarity arising from interactivity. The interactive paradox generates the quantum paradox. Some classical S's have noncommutative algebras of interactively observable coordinates similar to the Heisenberg algebra of a quantum system. Such S's are ''hidden variable'' models of quantum theory not covered by the hidden variable studies of Von Neumann, Bohm, Bell, or Kochen and Specker. It is conceivable that some quantum effects in Nature arise from interactivity. (author)

  7. Computational complementarity

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, D; Finkelstein, S R

    1983-08-01

    Interactivity generates paradox in that the interactive control by one system C of predicates about another system-under-study S may falsify these predicates. An ''interactive logic'' is formulated to resolve this paradox of interactivity. The construction generalizes one, the Galois connection, used by Von Neumann for the similar quantum paradox. The construction is applied to a transition system, a concept that includes general systems, automata, and quantum systems. In some (classical) automata S, the interactive predicates about S show quantumlike complementarity arising from interactivity. The interactive paradox generates the quantum paradox. Some classical S's have noncommutative algebras of interactively observable coordinates similar to the Heisenberg algebra of a quantum system. Such S's are ''hidden variable'' models of quantum theory not covered by the hidden variable studies of Von Neumann, Bohm, Bell, or Kochen and Specker. It is conceivable that some quantum effects in Nature arise from interactivity.

  8. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra.

    Science.gov (United States)

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W; Popp, Jürgen

    2017-07-27

    Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumentational methods, shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC.

  9. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of Personal Computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages, but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production

  10. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that last to this day. Along with the development of computer systems technology, computer viruses find new ways to spread themselves through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  11. (Some) Computer Futures: Mainframes.

    Science.gov (United States)

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  12. Performance analysis of the FDTD method applied to holographic volume gratings: Multi-core CPU versus GPU computing

    Science.gov (United States)

    Francés, J.; Bleda, S.; Neipp, C.; Márquez, A.; Pascual, I.; Beléndez, A.

    2013-03-01

    The finite-difference time-domain method (FDTD) allows electromagnetic field distribution analysis as a function of time and space. The method is applied to analyze holographic volume gratings (HVGs) for the near-field distribution at optical wavelengths. Usually, this application requires the simulation of wide areas, which implies more memory and time processing. In this work, we propose a specific implementation of the FDTD method including several add-ons for a precise simulation of optical diffractive elements. Values in the near-field region are computed considering the illumination of the grating by means of a plane wave for different angles of incidence and including absorbing boundaries as well. We compare the results obtained by FDTD with those obtained using a matrix method (MM) applied to diffraction gratings. In addition, we have developed two optimized versions of the algorithm, for both CPU and GPU, in order to analyze the improvement of using the new NVIDIA Fermi GPU architecture versus a highly tuned multi-core CPU as a function of simulation size. In particular, the optimized CPU implementation takes advantage of the arithmetic and data transfer streaming SIMD (single instruction multiple data) extensions (SSE) included explicitly in the code and also of multi-threading by means of OpenMP directives. A good agreement between the results obtained using both FDTD and MM methods is obtained, thus validating our methodology. Moreover, the performance of the GPU is compared to the SSE+OpenMP CPU implementation, and it is quantitatively determined that a highly optimized CPU program can be competitive for a wider range of simulation sizes, whereas GPU computing becomes more powerful for large-scale simulations.
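
    To make the structure of the computation concrete, here is a hedged one-dimensional FDTD sketch in Python/NumPy. It is not the authors' 2-D HVG code: it shows only the leapfrog E/H update whose inner loops are what SSE/OpenMP (on CPU) or per-cell threads (on GPU) parallelize. The grid size, Courant number, and Gaussian source are illustrative assumptions.

```python
# 1-D FDTD sketch: E and H fields leapfrog in time on a staggered (Yee)
# grid. Each time step sweeps the whole grid, which is why the method
# benefits from SIMD/multi-core/GPU parallelism at larger sizes.
import numpy as np

nx, nt = 200, 300
E = np.zeros(nx)        # electric field samples
H = np.zeros(nx - 1)    # magnetic field samples, staggered between E nodes
c = 0.5                 # Courant number (c0*dt/dx); must be <= 1 for stability

for t in range(nt):
    H += c * (E[1:] - E[:-1])            # update H from the spatial curl of E
    E[1:-1] += c * (H[1:] - H[:-1])      # update E from the spatial curl of H
    E[nx // 2] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source pulse

print(float(np.max(np.abs(E))))          # peak field after nt steps
```

    A 2-D or 3-D version replaces these slice operations with nested loops over the grid, which is exactly the region of the code the paper vectorizes and offloads.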

  13. Parallel computation

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Maurel, G.; Silva, J.; Wolff-Bacha, F.

    1997-01-01

    The work in the field of parallel processing has developed through research activities using several numerical Monte Carlo simulations related to current basic or applied problems of nuclear and particle physics. For the applications utilizing the GEANT code, development and improvement work was done on parts simulating low energy physical phenomena like radiation, transport and interaction. The problem of actinide burning by means of accelerators was approached using a simulation with the GEANT code. A program of neutron tracking in the range of low energies up to the thermal region has been developed. It is coupled to the GEANT code and permits in a single pass the simulation of a hybrid reactor core receiving a proton burst. Other works in this field refer to simulations for nuclear medicine applications like, for instance, development of biological probes, evaluation and characterization of gamma cameras (collimators, crystal thickness) as well as methods for dosimetric calculations. Particularly, these calculations are suited for a geometrical parallelization approach especially adapted to parallel machines of the TN310 type. Other works mentioned in the same field refer to simulation of electron channelling in crystals and simulation of the beam-beam interaction effect in colliders. The GEANT code was also used to simulate the operation of germanium detectors designed for natural and artificial radioactivity monitoring of environment

  14. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  15. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will continue to be, a new way of providing Internet services and computing. This approach to computation builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services at a more acceptable price and infrastructure. It is exactly the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  16. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
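
    The routing idea in this record can be sketched as follows. This is an illustrative Python model, not the implementation from the record: two independent link tables connect the same compute nodes, traffic avoids links marked defective, and when no path remains in the primary network it falls back to the secondary one. The topologies and node numbering are invented for the example.

```python
# Fault administration sketch: find a path that avoids defective links,
# falling back to an independent second network when the first has no route.
from collections import deque

def route(network, src, dst, bad_links=frozenset()):
    """BFS path from src to dst in `network`, skipping defective links."""
    prev = {src: None}
    q = deque([src])
    while q:
        node = q.popleft()
        if node == dst:
            path = []
            while node is not None:       # walk predecessors back to src
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nbr in network[node]:
            link = frozenset((node, nbr))
            if nbr not in prev and link not in bad_links:
                prev[nbr] = node
                q.append(nbr)
    return None                            # no fault-free path in this network

primary   = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a line of 4 nodes
secondary = {0: [2], 2: [0, 1, 3], 1: [2], 3: [2]}   # independently wired

bad = {frozenset((1, 2))}                  # defective link in the primary net
path = route(primary, 0, 3, bad) or route(secondary, 0, 3)
print(path)  # falls back to the secondary network: [0, 2, 3]
```

    In the patent's setting the two networks are physical data communications fabrics of a parallel machine; here plain dictionaries stand in for them.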

  17. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  18. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  19. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and observation of images in computers; the hardware and software used; and personal computers, networks, and workstations. The use of special filters determines image quality.

  20. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  1. Computers and conversation

    CERN Document Server

    Luff, Paul; Gilbert, Nigel G

    1986-01-01

    In the past few years a branch of sociology, conversation analysis, has begun to have a significant impact on the design of human-computer interaction (HCI). The investigation of human-human dialogue has emerged as a fruitful foundation for interactive system design. This book includes eleven original chapters by leading researchers who are applying conversation analysis to HCI. The fundamentals of conversation analysis are outlined, a number of systems are described, and a critical view of their value for HCI is offered. Computers and Conversation will be of interest to all concerne

  2. Computer-controlled attenuator.

    Science.gov (United States)

    Mitov, D; Grozev, Z

    1991-01-01

    Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator, in two channels of which the output signal changes by a linear step, and in the other two channels by a logarithmic step. This, together with the additional programmable timers, makes it possible to automate a wide range of studies in different spheres of physiology and psychophysics, including vision and hearing.
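
    The two channel types can be sketched numerically. This is a hedged illustration with all voltages and step sizes invented for the example (the paper does not give them): a linear-step channel changes the output by a fixed increment, a logarithmic-step channel by a fixed number of decibels per step.

```python
# Output levels of the two attenuator channel types (illustrative values).
def linear_levels(v_max, step, n):
    """Levels of a linear-step channel: fixed increment, clipped at v_max."""
    return [min(i * step, v_max) for i in range(n)]

def log_levels(v_max, db_step, n):
    """Levels of a logarithmic-step channel: each step attenuates the
    full-scale amplitude by db_step decibels (20*log10 amplitude ratio)."""
    return [v_max * 10 ** (-i * db_step / 20.0) for i in range(n)]

lin_ch = linear_levels(5.0, 0.5, 6)
log_ch = log_levels(5.0, 6.0, 4)   # a 6 dB step roughly halves the amplitude
print(lin_ch)  # [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
print(log_ch)
```

    Logarithmic steps suit psychophysics because perceived loudness and brightness scale roughly with the logarithm of stimulus intensity.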

  3. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  4. Performance of computer-aided detection applied to full-field digital mammography in detection of breast cancers

    International Nuclear Information System (INIS)

    Sadaf, Arifa; Crystal, Pavel; Scaranelo, Anabel; Helbich, Thomas

    2011-01-01

    Objective: The aim of this retrospective study was to evaluate the performance of computer-aided detection (CAD) with full-field digital mammography (FFDM) in the detection of breast cancers. Materials and Methods: CAD was retrospectively applied to standard mammographic views of 127 cases with biopsy-proven breast cancers detected with FFDM (Senographe 2000, GE Medical Systems). CAD sensitivity was assessed in the total group of 127 cases and for subgroups based on breast density, mammographic lesion type, mammographic lesion size, histopathology and mode of presentation. Results: Overall CAD sensitivity was 91% (115 of 127 cases). There were no statistical differences (p > 0.1) in CAD detection of cancers in dense breasts, 90% (53/59), versus non-dense breasts, 91% (62/68). There was a statistical difference (p 20 mm 97% (22/23). Conclusion: CAD applied to FFDM showed 100% sensitivity in identifying cancers manifesting as microcalcifications only and high sensitivity, 86% (71/83), for other mammographic appearances of cancer. Sensitivity is influenced by lesion size. CAD in FFDM is an adjunct that helps the radiologist in the early detection of breast cancers.

  5. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  6. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  7. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that are increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, in both model development and the availability of larger scale or 'big data' models. Future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
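
    The comparison protocol described above rests on evaluating both learners on the same cross-validation splits. The snippet below is an illustrative Python sketch of k-fold index generation only; the Bayesian and SVM classifiers themselves would come from a machine-learning library, and the sample count and fold count are assumptions for the example.

```python
# k-fold cross-validation index generation: every sample appears in exactly
# one test fold, and both models are scored on identical train/test splits,
# which is what makes a Bayesian-vs-SVM comparison fair.
def k_fold_indices(n_samples, k):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    # Distribute any remainder one extra sample per leading fold.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_samples))
        yield train, test
        start += size

folds = list(k_fold_indices(10, 5))
print(len(folds))    # 5 folds
print(folds[0][1])   # first test fold: [0, 1]
```

    Each classifier would be fitted on every `train` index set and scored on the matching `test` set, with the per-fold scores then averaged and compared.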

  8. Get set for computer science

    CERN Document Server

    Edwards, Alistair

    2006-01-01

    This book is aimed at students who are thinking of studying Computer Science or a related topic at university. Part One is a brief introduction to the topics that make up Computer Science, some of which you would expect to find as course modules in a Computer Science programme. These descriptions should help you to tell the difference between Computer Science as taught in different departments and so help you to choose a course that best suits you. Part Two builds on what you have learned about the nature of Computer Science by giving you guidance in choosing universities and making your appli

  9. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  10. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
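
    Two of the chapter topics can be illustrated directly. These are assumed examples in Python, not exercises taken from the book: base conversion between number systems (Chapters 1-2) and the round-off error that arises because 0.1 has no exact binary representation (Chapter 3).

```python
# Number systems: convert a non-negative integer to any base up to 16.
def to_base(n, base, digits="0123456789ABCDEF"):
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)     # peel off the least-significant digit
        out.append(digits[r])
    return "".join(reversed(out))

print(to_base(42, 2))    # '101010'
print(to_base(255, 16))  # 'FF'

# Round-off error: 0.1 is not exactly representable in binary floating
# point, so repeated addition drifts away from the exact decimal result.
total = sum([0.1] * 10)
print(total == 1.0)      # False
print(abs(total - 1.0))  # tiny but nonzero error
```

    The same drift accumulates in any loop that repeatedly adds a non-representable decimal, which is why financial code uses decimal or integer arithmetic instead.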

  11. Proceedings of the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering - M and C 2013

    International Nuclear Information System (INIS)

    2013-01-01

    The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). This proceedings contains over 250 full papers with topics including reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.

  12. Proceedings of the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering - M and C 2013

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-07-01

    The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). This proceedings contains over 250 full papers with topics including reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.

  13. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  14. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  15. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  16. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  17. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally, we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random search, and genetic algorithms, to design autonomous systems that can adapt and respond to environmental conditions.
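
    The "rich library of patterns" that chaos computing draws on comes from sensitive dependence on initial conditions: tiny, deliberate changes to the state select qualitatively different trajectories, which can then be thresholded into different logic outputs. A minimal sketch of ours (a generic logistic-map demonstration, not code from the paper):

```python
# Illustrative only (not from the paper): the logistic map, a textbook
# chaotic system, shows sensitive dependence on initial conditions --
# the property chaos computing exploits to select different behaviors
# from one and the same nonlinear system.

def logistic(x, r=4.0):
    """One iterate of the logistic map at full chaos (r = 4)."""
    return r * x * (1.0 - x)

x, y = 0.400000, 0.400001   # two nearly identical initial states
max_sep = 0.0
for _ in range(50):
    x, y = logistic(x), logistic(y)
    max_sep = max(max_sep, abs(x - y))

print(max_sep)   # the 1e-6 perturbation has grown by many orders of magnitude
```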

  18. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  19. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  20. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Know Your Personal Computer – Introduction to Computers. Siddhartha Kumar Ghoshal. Series Article, Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  1. Computational synthetic geometry

    CERN Document Server

    Bokowski, Jürgen

    1989-01-01

    Computational synthetic geometry deals with methods for realizing abstract geometric objects in concrete vector spaces. This research monograph considers a large class of problems from convexity and discrete geometry including constructing convex polytopes from simplicial complexes, vector geometries from incidence structures and hyperplane arrangements from oriented matroids. It turns out that algorithms for these constructions exist if and only if arbitrary polynomial equations are decidable with respect to the underlying field. Besides such complexity theorems a variety of symbolic algorithms are discussed, and the methods are applied to obtain new mathematical results on convex polytopes, projective configurations and the combinatorics of Grassmann varieties. Finally algebraic varieties characterizing matroids and oriented matroids are introduced providing a new basis for applying computer algebra methods in this field. The necessary background knowledge is reviewed briefly. The text is accessible to stud...

  2. Computers, Nanotechnology and Mind

    Science.gov (United States)

    Ekdahl, Bertil

    2008-10-01

    In 1958, two years after the Dartmouth conference, where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They further prophesied that the machines' capacity would increase until it was on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s, but virtually no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology, two things are supposed to happen. Firstly, due to the small scale, it will be possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; secondly, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible: the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn were an answer to Hilbert's idea that proofs should contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence would place content in a machine that is not only designed to be free of content but also cannot contain it. In this paper I argue that the hope for artificial intelligence is in vain.

  3. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  4. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  5. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  6. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  7. Computed Tomography. Chapter 11

    Energy Technology Data Exchange (ETDEWEB)

    Geleijns, J. [Leiden University Medical Centre, Leiden (Netherlands)

    2014-09-15

    After its clinical introduction in 1971, computed tomography (CT) developed from an X ray modality that was limited to axial imaging of the brain in neuroradiology into a versatile 3-D whole body imaging modality for a wide range of applications, including oncology, vascular radiology, cardiology, traumatology and interventional radiology. CT is applied for diagnosis and follow-up studies of patients, for planning of radiotherapy, and even for screening of healthy subpopulations with specific risk factors.

  8. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal," in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics
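
    The superposition of computational states that Deutsch's machine exploits can be simulated classically on a single qubit. A toy state-vector sketch of ours (not from the review; NumPy assumed):

```python
import numpy as np

# Toy state-vector simulation (ours, not from the review): a Hadamard gate
# puts the basis state |0> into an equal superposition of |0> and |1> --
# the ingredient a quantum computer exploits for parallel processing.

ket0 = np.array([1.0, 0.0])                      # computational basis state |0>
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

psi = H @ ket0                                   # |psi> = (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2                         # Born-rule outcome probabilities
print(probs)                                     # [0.5 0.5]

# Applying H again interferes the two branches back into |0>: H is its own inverse.
print(np.allclose(H @ psi, ket0))                # True
```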

  9. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  10. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on its screen containing many words naming new concepts. Those words come from the terminology used by specialists. A common vocabulary shared by computer terminology and the lexis of everyday language is coming into existence. The article deals with the part of computer terminology that passes into everyday usage, and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  11. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  12. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  13. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  14. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.
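
    The estimation idea above can be sketched as a one-parameter fit. The numbers below are made up for illustration, not the paper's measured data:

```python
# Sketch of the paper's idea (with hypothetical numbers, not its data):
# fit ray-trace speed against LINPACK MFLOPS, then use the fit to
# estimate the speed of a machine from its LINPACK rating alone.

# (linpack_mflops, rays_per_second) for a few hypothetical machines
data = [(0.5, 900.0), (1.2, 2300.0), (4.0, 7600.0), (12.0, 23500.0)]

# Least-squares fit through the origin: rays/s ~= k * MFLOPS
num = sum(m * r for m, r in data)
den = sum(m * m for m, r in data)
k = num / den

estimate = k * 6.0   # predicted rays/s for a hypothetical 6-MFLOPS machine
print(round(k), round(estimate))
```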

  15. Security and policy driven computing

    CERN Document Server

    Liu, Lei

    2010-01-01

    Security and Policy Driven Computing covers recent advances in security, storage, parallelization, and computing as well as applications. The author incorporates a wealth of analysis, including studies on intrusion detection and key management, computer storage policy, and transactional management.The book first describes multiple variables and index structure derivation for high dimensional data distribution and applies numeric methods to proposed search methods. It also focuses on discovering relations, logic, and knowledge for policy management. To manage performance, the text discusses con

  16. PREFACE: First International Congress of the International Association of Inverse Problems (IPIA): Applied Inverse Problems 2007: Theoretical and Computational Aspects

    Science.gov (United States)

    Uhlmann, Gunther

    2008-07-01

    This volume represents the proceedings of the fourth Applied Inverse Problems (AIP) international conference and the first congress of the Inverse Problems International Association (IPIA), which was held in Vancouver, Canada, June 25-29, 2007. The organizing committee was formed by Uri Ascher, University of British Columbia, Richard Froese, University of British Columbia, Gary Margrave, University of Calgary, and Gunther Uhlmann, University of Washington, chair. The conference was part of the activities of the Pacific Institute of Mathematical Sciences (PIMS) Collaborative Research Group on inverse problems (http://www.pims.math.ca/scientific/collaborative-research-groups/past-crgs). This event was also supported by grants from NSF and MITACS. Inverse Problems (IP) are problems where causes for a desired or an observed effect are to be determined. They lie at the heart of scientific inquiry and technological development. The enormous increase in computing power and the development of powerful algorithms have made it possible to apply the techniques of IP to real-world problems of growing complexity. Applications include a number of medical as well as other imaging techniques, location of oil and mineral deposits in the earth's substructure, creation of astrophysical images from telescope data, finding cracks and interfaces within materials, shape optimization, model identification in growth processes and, more recently, modelling in the life sciences. The series of Applied Inverse Problems (AIP) Conferences aims to provide a primary international forum for academic and industrial researchers working on all aspects of inverse problems, such as mathematical modelling, functional analytic methods, computational approaches, numerical algorithms etc.
The steering committee of the AIP conferences consists of Heinz Engl (Johannes Kepler Universität, Austria), Joyce McLaughlin (RPI, USA), William Rundell (Texas A&M, USA), Erkki Somersalo (Helsinki University of Technology

  17. An Evaluation into the Views of Candidate Mathematics Teachers over "Tablet Computers" to be Applied in Secondary Schools

    Science.gov (United States)

    Aksu, Hasan Hüseyin

    2014-01-01

    This study aims to investigate, in terms of different variables, the views of prospective Mathematics teachers on tablet computers to be used in schools as an outcome of the Fatih Project, which was initiated by the Ministry of National Education. In the study, scanning model, one of the quantitative research methods, was used. In the population…

  18. Extending and Applying Spartan to Perform Temporal Sensitivity Analyses for Predicting Changes in Influential Biological Pathways in Computational Models.

    Science.gov (United States)

    Alden, Kieran; Timmis, Jon; Andrews, Paul S; Veiga-Fernandes, Henrique; Coles, Mark

    2017-01-01

    Through integrating real time imaging, computational modelling, and statistical analysis approaches, previous work has suggested that the induction of and response to cell adhesion factors is the key initiating pathway in early lymphoid tissue development, in contrast to the previously accepted view that the process is triggered by chemokine mediated cell recruitment. These model derived hypotheses were developed using spartan, an open-source sensitivity analysis toolkit designed to establish and understand the relationship between a computational model and the biological system that the model captures. Here, we extend the functionality available in spartan to permit the production of statistical analyses that contrast the behavior exhibited by a computational model at various simulated time-points, enabling a temporal analysis that could suggest whether the influence of biological mechanisms changes over time. We exemplify this extended functionality by using the computational model of lymphoid tissue development as a time-lapse tool. By generating results at twelve-hour intervals, we show how the extensions to spartan have been used to suggest that lymphoid tissue development could be biphasic, and predict the time-point when a switch in the influence of biological mechanisms might occur.
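
    Comparing model behavior across simulated time-points rests on effect-size statistics; to our understanding, spartan builds on the Vargha-Delaney A-test. A minimal sketch of that statistic (ours, not spartan's R implementation; the sample values are hypothetical):

```python
# Illustrative sketch (ours, not spartan itself): the Vargha-Delaney A
# statistic, an effect-size measure of the kind such temporal comparisons
# are built on. A ~= 0.5 means two samples are indistinguishable; values
# near 0 or 1 mean a mechanism's influence differs strongly between
# conditions, e.g. between two simulated time-points.

def vargha_delaney_a(xs, ys):
    """Estimate P(X > Y) + 0.5 * P(X == Y) from two samples."""
    wins = sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in xs for y in ys)
    return wins / (len(xs) * len(ys))

hour_12 = [10, 12, 11, 13]   # hypothetical model outputs at two time-points
hour_24 = [20, 22, 19, 21]

print(vargha_delaney_a(hour_12, hour_12))  # 0.5: no effect against itself
print(vargha_delaney_a(hour_24, hour_12))  # 1.0: complete separation
```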

  19. Applying activity theory to computer-supported collaborative learning and work-based activities in corporate settings

    NARCIS (Netherlands)

    Collis, Betty; Margaryan, A.

    2004-01-01

    Business needs in many corporations call for learning outcomes that involve problem solutions, and creating and sharing new knowledge within workplace situations that may involve collaboration among members of a team. We argue that work-based activities (WBA) and computer-supported collaborative

  20. The Language Factor in Elementary Mathematics Assessments: Computational Skills and Applied Problem Solving in a Multidimensional IRT Framework

    Science.gov (United States)

    Hickendorff, Marian

    2013-01-01

    The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…

  1. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Science.gov (United States)

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE USING RECORDS AND DONATED... for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  2. Methodology and computer program for applying improved, inelastic ERR for the design of mine layouts on planar reefs.

    CSIR Research Space (South Africa)

    Spottiswoode, SM

    2002-08-01

    Full Text Available and the visco-plastic models of Napier and Malan (1997) and Malan (2002). Methodologies and a computer program (MINF) were developed during this project to write synthetic catalogues of seismic events to simulate the rock response to mining...

  3. Using Physical and Computer Simulations of Collective Behaviour as an Introduction to Modelling Concepts for Applied Biologists

    Science.gov (United States)

    Rands, Sean A.

    2012-01-01

    Models are an important tool in science: not only do they act as a convenient device for describing a system or problem, but they also act as a conceptual tool for framing and exploring hypotheses. Models, and in particular computer simulations, are also an important education tool for training scientists, but it is difficult to teach students the…

  4. Ten iterative steps for model development and evaluation applied to Computational Fluid Dynamics for Environmental Fluid Mechanic

    NARCIS (Netherlands)

    Blocken, B.J.E.; Gualtieri, C.

    2012-01-01

    Computational Fluid Dynamics (CFD) is increasingly used to study a wide variety of complex Environmental Fluid Mechanics (EFM) processes, such as water flow and turbulent mixing of contaminants in rivers and estuaries and wind flow and air pollution dispersion in urban areas. However, the accuracy

  5. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    Science.gov (United States)

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  6. An approach to computing discrete adjoints for MPI-parallelized models applied to Ice Sheet System Model 4.11

    Directory of Open Access Journals (Sweden)

    E. Larour

    2016-11-01

    Full Text Available Within the framework of sea-level rise projections, there is a strong need for hindcast validation of the evolution of polar ice sheets in a way that tightly matches observational records (from radar, gravity, and altimetry observations mainly). However, the computational requirements for making hindcast reconstructions possible are severe and rely mainly on the evaluation of the adjoint state of transient ice-flow models. Here, we look at the computation of adjoints in the context of the NASA/JPL/UCI Ice Sheet System Model (ISSM), written in C++ and designed for parallel execution with MPI. We present the adaptations required in the way the software is designed and written, but also generic adaptations in the tools facilitating the adjoint computations. We concentrate on the use of operator overloading coupled with the AdjoinableMPI library to achieve the adjoint computation of the ISSM. We present a comprehensive approach to (1) carry out type changing through the ISSM, hence facilitating operator overloading, (2) bind to external solvers such as MUMPS and GSL-LU, and (3) handle MPI-based parallelism to scale the capability. We demonstrate the success of the approach by computing sensitivities of hindcast metrics such as the misfit to observed records of surface altimetry on the northeastern Greenland Ice Stream, or the misfit to observed records of surface velocities on Upernavik Glacier, central West Greenland. We also provide metrics for the scalability of the approach, and the expected performance. This approach has the potential to enable a new generation of hindcast-validated projections that make full use of the wealth of datasets currently being collected, or already collected, in Greenland and Antarctica.
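
    The operator-overloading mechanism the authors rely on can be shown in miniature: every arithmetic operation records how to propagate sensitivities backwards. A toy reverse-mode sketch of ours (in Python rather than ISSM's C++, with none of the MPI or solver machinery):

```python
# Toy reverse-mode automatic differentiation via operator overloading
# (ours, not ISSM/AdjoinableMPI): each overloaded operation stores its
# inputs and local derivatives, so sensitivities can be swept backwards --
# the mechanism adjoint computations of this kind rely on.

class Var:
    def __init__(self, value, grad_fns=()):
        self.value = value
        self.grad = 0.0
        self.grad_fns = grad_fns   # (parent, local derivative) pairs

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        """Accumulate d(output)/d(self) into .grad, recursing to inputs."""
        self.grad += seed
        for parent, local in self.grad_fns:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x * x          # z = x*y + x^2
z.backward()
print(z.value, x.grad, y.grad)   # 21.0 10.0 3.0  (dz/dx = y + 2x, dz/dy = x)
```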

  7. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  8. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
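The reformulation this record relies on, Nash equilibria as global minima of a nonnegative "regret" function, can be sketched for matching pennies. The crude grid search below is a stand-in for the paper's actual optimizers (CMA-ES, particle swarm optimization, differential evolution); function names are illustrative.

```python
def u1(p, q):
    # Matching pennies payoff for player 1 with mixed strategies p, q
    # (probability of playing Heads): u1 = (2p-1)(2q-1), and u2 = -u1.
    return (2 * p - 1) * (2 * q - 1)

def nash_objective(p, q):
    """Nonnegative function that is zero exactly at a Nash equilibrium:
    sum of squared gains from deviating to each pure strategy."""
    r1 = max(u1(1, q) - u1(p, q), 0.0)   # player 1 deviates to Heads
    r2 = max(u1(0, q) - u1(p, q), 0.0)   # player 1 deviates to Tails
    r3 = max(u1(p, q) - u1(p, 1), 0.0)   # player 2 (payoff -u1) to Heads
    r4 = max(u1(p, q) - u1(p, 0), 0.0)   # player 2 deviates to Tails
    return r1 * r1 + r2 * r2 + r3 * r3 + r4 * r4

# Crude global minimization over a grid (the paper uses PSO/DE/CMA-ES).
best = min((nash_objective(i / 100, j / 100), i / 100, j / 100)
           for i in range(101) for j in range(101))
# The unique equilibrium of matching pennies is p = q = 1/2.
```

Any global minimizer can be plugged into `nash_objective`; multistart and deflection, as used in the paper, let the same objective reveal multiple equilibria in games that have them.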

  9. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique. Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  10. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  11. Searching with Quantum Computers

    OpenAIRE

    Grover, Lov K.

    2000-01-01

    This article introduces quantum computation by analogy with probabilistic computation. A basic description of the quantum search algorithm is given by representing the algorithm as a C program in a novel way.
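Grover's search algorithm can also be simulated classically on a small state vector. The sketch below is not the article's C program; it is a minimal Python rendering of the two core steps, the oracle phase flip and the inversion about the mean.

```python
import math

def grover_search(n_qubits, marked):
    """Simulate Grover's algorithm on a classical state vector."""
    n = 2 ** n_qubits
    amp = [1 / math.sqrt(n)] * n              # uniform superposition
    iterations = round(math.pi / 4 * math.sqrt(n))
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle: flip marked amplitude
        mean = sum(amp) / n                   # diffusion: inversion about mean
        amp = [2 * mean - a for a in amp]
    return [a * a for a in amp]               # measurement probabilities

probs = grover_search(3, marked=5)
# After round(pi/4 * sqrt(8)) = 2 iterations, the marked item dominates.
```

With 3 qubits (8 items), two Grover iterations push the marked item's measurement probability above 94%, versus 1/8 for a single random guess.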

  12. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5......Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5...

  13. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  14. Know Your Personal Computer

    Indian Academy of Sciences (India)

    computer with IBM PC .... read by a human and not translated by a compiler are called .... by different stages of education becomes a computer scientist. ... ancestors knew and carried out the semantic actions without question or comment.

  15. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  16. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discusses this process.

  17. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  18. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  19. Quantum Computer Science

    Science.gov (United States)

    Mermin, N. David

    2007-08-01

    Preface; 1. Cbits and Qbits; 2. General features and some simple examples; 3. Breaking RSA encryption with a quantum computer; 4. Searching with a quantum computer; 5. Quantum error correction; 6. Protocols that use just a few Qbits; Appendices; Index.

  20. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  1. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. top of page What are some common uses of the procedure? CT ...

  2. Computer Technology Directory.

    Science.gov (United States)

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  3. My Computer Is Learning.

    Science.gov (United States)

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  4. What is Computed Tomography?

    Science.gov (United States)

    ... Imaging Medical X-ray Imaging What is Computed Tomography? ... Chest X ray Image ... Computed Tomography (CT) Although also based on the variable absorption ...

  5. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance ComputingThe ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  6. Computing for Belle

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    2s-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, struggles to manage large amount of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  7. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  8. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  9. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  10. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  11. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Stroke Brain Tumors Computer Tomography (CT) Safety During Pregnancy Head and Neck Cancer X-ray, Interventional Radiology and Nuclear Medicine Radiation Safety Images related to Computed Tomography (CT) - ...

  12. Intimacy and Computer Communication.

    Science.gov (United States)

    Robson, Dave; Robson, Maggie

    1998-01-01

    Addresses the relationship between intimacy and communication that is based on computer technology. Discusses definitions of intimacy and the nature of intimate conversations that use computers as a communications medium. Explores implications for counseling. (MKA)

  13. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ... ray beam follows a spiral path. A special computer program processes this large volume of data to ...

  14. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  15. Computational physics and applied mathematics capability review June 8-10, 2010 (Advance materials to committee members)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Stephen R [Los Alamos National Laboratory

    2010-01-01

    Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities are included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections): (1) Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the laboratory; (2) Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution of partial

  16. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    open access article The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, offloading the cloud data centers and reducing service latency for end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  17. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  18. Nanoelectronics: Metrology and Computation

    International Nuclear Information System (INIS)

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example

  19. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    paradigms: few sensors/complex computations and many sensors/simple computation. Challenges with nano-enabled neuromorphic chips: a wide variety of... Foundations of Neuromorphic Computing, final technical report, May 2013 (reporting period Sep 2009 to Sep 2012); approved for public release.

  20. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  1. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author explains how to set up and run a computed tomography department, including advice on how the room should be designed.

  2. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
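How probability enters the process can be made concrete with a Born-rule measurement sampler. This is a toy Monte Carlo, not taken from the paper; the `measure` function and its fixed seed are illustrative.

```python
import math
import random

def measure(alpha, beta, shots, seed=1):
    """Sample projective measurements of the qubit state alpha|0> + beta|1>.
    By the Born rule, outcome 0 occurs with probability |alpha|^2."""
    p0 = abs(alpha) ** 2
    rng = random.Random(seed)            # fixed seed for reproducibility
    ones = sum(rng.random() >= p0 for _ in range(shots))
    return ones / shots                  # empirical frequency of outcome 1

# Equal superposition (|0> + |1>)/sqrt(2): outcome 1 with probability 1/2.
freq = measure(1 / math.sqrt(2), 1 / math.sqrt(2), shots=10000)
```

The deterministic amplitudes evolve unitarily, but every readout is a sample from this distribution, which is exactly where uncertainty, and hence decoherence and error-correction concerns, enters quantum computation.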

  3. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  4. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  5. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to research on the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include an educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  6. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.) [de

  7. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'
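The 'Fast Fourier Transformation' technique mentioned in this record's appendix outline can be sketched as a recursive radix-2 Cooley-Tukey FFT. This is a textbook rendering, not Vesely's code, and it assumes the input length is a power of two.

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])                 # transform of even-indexed samples
    odd = fft(x[1::2])                  # transform of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# The DFT of a unit impulse is flat; a delayed impulse picks up phases.
flat = fft([1, 0, 0, 0])       # -> [1, 1, 1, 1]
shifted = fft([0, 1, 0, 0])    # -> [1, -1j, -1, 1j]
```

Splitting each length-n transform into two length-n/2 transforms gives the O(n log n) cost that makes the FFT the workhorse of computational physics.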

  8. Observer Evaluation of a Metal Artifact Reduction Algorithm Applied to Head and Neck Cone Beam Computed Tomographic Images

    Energy Technology Data Exchange (ETDEWEB)

    Korpics, Mark; Surucu, Murat; Mescioglu, Ibrahim; Alite, Fiori; Block, Alec M.; Choi, Mehee; Emami, Bahman; Harkenrider, Matthew M.; Solanki, Abhishek A.; Roeske, John C., E-mail: jroeske@lumc.edu

    2016-11-15

    Purpose and Objectives: To quantify, through an observer study, the reduction in metal artifacts on cone beam computed tomographic (CBCT) images using a projection-interpolation algorithm, on images containing metal artifacts from dental fillings and implants in patients treated for head and neck (H&N) cancer. Methods and Materials: An interpolation-substitution algorithm was applied to H&N CBCT images containing metal artifacts from dental fillings and implants. Image quality with respect to metal artifacts was evaluated subjectively and objectively. First, 6 independent radiation oncologists were asked to rank randomly sorted blinded images (before and after metal artifact reduction) using a 5-point rating scale (1 = severe artifacts; 5 = no artifacts). Second, the standard deviation of different regions of interest (ROI) within each image was calculated and compared with the mean rating scores. Results: The interpolation-substitution technique successfully reduced metal artifacts in 70% of the cases. From a total of 60 images from 15 H&N cancer patients undergoing image guided radiation therapy, the mean rating score on the uncorrected images was 2.3 ± 1.1, versus 3.3 ± 1.0 for the corrected images. The mean difference in ranking score between uncorrected and corrected images was 1.0 (95% confidence interval: 0.9-1.2, P<.05). The standard deviation of each ROI significantly decreased after artifact reduction (P<.01). Moreover, a negative correlation between the mean rating score for each image and the standard deviation of the oral cavity and bilateral cheeks was observed. Conclusion: The interpolation-substitution algorithm is efficient and effective for reducing metal artifacts caused by dental fillings and implants on CBCT images, as demonstrated by the statistically significant increase in observer image quality ranking and by the decrease in ROI standard deviation between uncorrected and corrected images.
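The interpolation-substitution idea evaluated in this study can be sketched in one dimension: detector samples corrupted by metal are replaced by linear interpolation from the nearest clean neighbours. The function name and data below are illustrative; the actual algorithm operates on full CBCT projection images before reconstruction.

```python
def interpolate_corrupted(projection, corrupted):
    """Replace metal-corrupted detector samples by linear interpolation
    from the nearest clean neighbour on each side. Assumes every corrupted
    index has clean samples somewhere on both sides."""
    clean = [i for i in range(len(projection)) if i not in corrupted]
    out = list(projection)
    for i in sorted(corrupted):
        left = max(j for j in clean if j < i)
        right = min(j for j in clean if j > i)
        w = (i - left) / (right - left)
        out[i] = (1 - w) * projection[left] + w * projection[right]
    return out

row = [10.0, 11.0, 50.0, 52.0, 12.0, 13.0]   # samples 2-3 hit by metal
fixed = interpolate_corrupted(row, {2, 3})
# fixed[2] and fixed[3] become ~11.33 and ~11.67, bridging the clean values.
```

Bridging the corrupted sinogram region with smooth estimates before reconstruction is what suppresses the streak artifacts the observers rated in this study.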

  9. IMACS 󈨟: Proceedings of the IMACS World Congress on Computation and Applied Mathematics (13th) Held in Dublin, Ireland on July 22-26, 1991. Volume 2. Computational Fluid Dynamics and Wave Propagation, Parallel Computing, Concurrent and Supercomputing, Computational Physics/Computational Chemistry and Evolutionary Systems

    Science.gov (United States)

    1991-01-01

    ...Tecnología Fotónica, ETSI Telecomunicación, Ciudad Universitaria, 28040 Madrid, Spain. Abstract: modelling of ferroelectric liquid crystal... The optical...

  10. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
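The logged-history, search, and undo behaviour described above can be sketched as follows. The `Logbook` class is hypothetical, not the patented implementation; each event carries an undo callable that reverses its effect.

```python
class Logbook:
    """Minimal sketch of an environment logbook: log, search, undo."""
    def __init__(self):
        self.history = []

    def log(self, description, undo=None):
        # Record an event; 'undo' is a callable that reverses its effect.
        self.history.append({"description": description,
                             "undo": undo, "undone": False})

    def search(self, term):
        # Search the history of past events by description substring.
        return [e for e in self.history if term in e["description"]]

    def undo(self, event):
        # Undo a selected past event exactly once.
        if not event["undone"] and event["undo"] is not None:
            event["undo"]()
            event["undone"] = True

# Toy environment: a dict of settings whose changes are logged.
settings = {"theme": "light"}
book = Logbook()
old = settings["theme"]
settings["theme"] = "dark"
book.log("set theme=dark", undo=lambda: settings.update(theme=old))
book.undo(book.search("theme")[0])   # theme reverts to "light"
```

Storing an inverse operation alongside each logged event is one simple way to make arbitrary past events individually undoable, as the abstract describes.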

  11. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  12. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  13. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  14. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer
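The ballistic spreading that makes quantum walks useful for computation can be demonstrated with a discrete-time Hadamard walk on a line. This is a standard textbook construction, not tied to the paper; the symmetric initial coin state is an arbitrary illustrative choice.

```python
import math

def hadamard_walk(steps):
    """Discrete-time quantum walk on a line with a Hadamard coin.
    amps[x] holds the (coin-up, coin-down) amplitudes at position x."""
    size = 2 * steps + 1
    mid = steps
    amps = [(0j, 0j)] * size
    amps[mid] = (1 / math.sqrt(2), 1j / math.sqrt(2))  # symmetric start
    h = 1 / math.sqrt(2)
    for _ in range(steps):
        new = [(0j, 0j)] * size
        for x, (up, down) in enumerate(amps):
            cu, cd = h * (up + down), h * (up - down)   # Hadamard coin
            if cu:
                new[x - 1] = (new[x - 1][0] + cu, new[x - 1][1])  # up: step left
            if cd:
                new[x + 1] = (new[x + 1][0], new[x + 1][1] + cd)  # down: step right
        amps = new
    return [abs(u) ** 2 + abs(d) ** 2 for u, d in amps]

steps = 30
probs = hadamard_walk(steps)
positions = range(-steps, steps + 1)
var = sum(p * x * x for p, x in zip(probs, positions))
# The quantum variance grows like steps^2, far beyond the classical
# random walk's variance of `steps`.
```

This quadratic spread in position, versus the classical diffusive spread, is the kernel of the speedups obtained by quantum-walk-based algorithms.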

  15. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  16. Visitor's Computer Guidelines | CTIO

    Science.gov (United States)


  17. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  18. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  19. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)
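The computer techniques used to build cross-sectional images from projections can be illustrated with a toy unfiltered back-projection from two orthogonal views. This is a drastic simplification of SPECT/PET reconstruction (real systems use many angles and filtered or iterative methods); the grid and names are illustrative.

```python
def backproject(row_sums, col_sums):
    """Unfiltered back-projection from two orthogonal projections:
    smear each projection back across the grid and sum the smears."""
    n = len(row_sums)
    return [[row_sums[r] + col_sums[c] for c in range(n)] for r in range(n)]

# Point source of activity at (2, 3) on a 5x5 grid.
image = [[0.0] * 5 for _ in range(5)]
image[2][3] = 1.0

rows = [sum(r) for r in image]                        # projection at 0 degrees
cols = [sum(r[c] for r in image) for c in range(5)]   # projection at 90 degrees
recon = backproject(rows, cols)
# The reconstruction peaks at the true source location (2, 3).
```

Even with two views the peak lands on the source; adding more angles and a reconstruction filter is what turns this smeared estimate into a usable tomographic image.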

  20. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Sinuses Computed tomography (CT) of the sinuses uses special x-ray equipment ... story here Images × Image Gallery Patient undergoing computed tomography (CT) scan. View full size with caption Pediatric Content ...

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray equipment ... story here Images × Image Gallery Patient undergoing computed tomography (CT) scan. View full size with caption Pediatric Content ...

  2. Beyond the Computer Literacy.

    Science.gov (United States)

    Streibel, Michael J.; Garhart, Casey

    1985-01-01

    Describes the approach taken in an education computing course for pre- and in-service teachers. Outlines the basic operational, analytical, and evaluation skills that are emphasized in the course, suggesting that these skills go beyond the attainment of computer literacy and can assist in the effective use of computers. (ML)

  3. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  4. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  5. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  6. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle.

  7. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  8. Computer narratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    Relations between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can help us understand basic concepts of art communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  9. The neutron computer tomography

    International Nuclear Information System (INIS)

    Matsumoto, G.; Krata, S.

    1983-01-01

    The method of computer tomography (CT) was applied to neutrons instead of X-rays. The neutron radiography image of samples was scanned by a microphotometer to obtain the transmission data. This process was so time-consuming that the number of incident angles to the samples could not be increased. The transmission data were processed by a FACOM computer and a CT image was obtained. In the experiment at the Japan Research Reactor No. 4 at Tokai-mura with 18 projection angles, the resolution of paraffin in an aluminum block was less than 0.8 mm. In the experiment at the Van de Graaff accelerator of Nagoya University, the resolution was 1.2 mm because of the angular distribution of the neutron beam. This is a preliminary experiment; a facility utilizing neutron television and a video recorder will be necessary for the next stage. (Auth.)
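    The reconstruction step this record describes — recovering a cross-section from transmission data taken at a handful of projection angles — can be sketched with a minimal unfiltered back-projection. The grid size, test object, and angle sampling below are illustrative choices, not values from the paper (which processed its data on a FACOM computer):

    ```python
    import math

    N = 21                                            # reconstruction grid is N x N pixels
    angles = [k * math.pi / 18 for k in range(18)]    # 18 projection angles, as in the JRR-4 experiment

    # Hypothetical test object: a single absorbing point
    obj = [[0.0] * N for _ in range(N)]
    obj[8][12] = 1.0

    def project(image, theta):
        """Line integrals of `image` along direction theta (nearest-bin parallel projection)."""
        c, s = math.cos(theta), math.sin(theta)
        sino = [0.0] * (2 * N)                        # detector bins, offset so indices stay positive
        for y in range(N):
            for x in range(N):
                t = (x - N // 2) * c + (y - N // 2) * s   # signed detector coordinate
                sino[int(round(t)) + N] += image[y][x]
        return sino

    # Unfiltered back-projection: smear each projection back across the grid and sum
    recon = [[0.0] * N for _ in range(N)]
    for theta in angles:
        sino = project(obj, theta)
        c, s = math.cos(theta), math.sin(theta)
        for y in range(N):
            for x in range(N):
                t = (x - N // 2) * c + (y - N // 2) * s
                recon[y][x] += sino[int(round(t)) + N]

    # The brightest reconstructed pixel should coincide with the point object
    peak = max((recon[y][x], (y, x)) for y in range(N) for x in range(N))[1]
    print(peak)   # (8, 12)
    ```

    Plain back-projection blurs the image with a 1/r halo; practical CT systems apply a ramp filter to each projection first (filtered back-projection) to sharpen the result.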

  10. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  11. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial examines such problems. Classical single-objective optimization is examined first. The authors then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com.

  12. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applies many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer-animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It also takes an in-depth look at the original hardware and the limitations of existing PC graphics adapters that constrain system performance, especially with graphics-intensive application programs and user interfaces.

  13. Computational Methods and Function Theory

    CERN Document Server

    Saff, Edward; Salinas, Luis; Varga, Richard

    1990-01-01

    The volume is devoted to the interaction of modern scientific computation and classical function theory. Many problems in pure and more applied function theory can be tackled using modern computing facilities: numerically as well as in the sense of computer algebra. On the other hand, computer algorithms are often based on complex function theory, and dedicated research on their theoretical foundations can lead to great enhancements in performance. The contributions - original research articles, a survey and a collection of problems - cover a broad range of such problems.

  14. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  15. RPM-WEBBSYS: A web-based computer system to apply the rational polynomial method for estimating static formation temperatures of petroleum and geothermal wells

    Science.gov (United States)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.

    2015-12-01

    A Web-Based Computer System (RPM-WEBBSYS) has been developed for the application of the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery process that occurs during well completion. RPM-WEBBSYS has been programmed using advances in information technology to perform SFT computations more efficiently. RPM-WEBBSYS can be easily and rapidly executed from any computing device (e.g., personal computers and portable devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where good agreement between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical problems of under- and over-estimation of the SFT (exhibited by most of the existing analytical methods) were effectively corrected.
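    The core idea of a rational-polynomial SFT estimate can be sketched as follows: fit a rational function to the shut-in temperature build-up and take its long-time asymptote as the static formation temperature. The specific form T(t) = (a + b·t)/(1 + c·t), the coefficients, and the shut-in times below are illustrative assumptions, not the authors' exact formulation:

    ```python
    # Synthetic shut-in temperature data generated from T(t) = (a + b t)/(1 + c t);
    # with these (hypothetical) coefficients the true SFT is the asymptote b/c = 120 °C.
    a_true, b_true, c_true = 60.0, 6.0, 0.05
    times = [2.0, 4.0, 6.0, 12.0, 18.0, 24.0, 36.0]   # hours after circulation stops
    temps = [(a_true + b_true * t) / (1 + c_true * t) for t in times]

    # Linearised least squares: T = a + b t - c (t T), unknowns (a, b, c)
    rows = [(1.0, t, -t * T) for t, T in zip(times, temps)]

    # Normal equations A^T A x = A^T y, solved by Gaussian elimination with pivoting
    ATA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    ATy = [sum(r[i] * T for r, T in zip(rows, temps)) for i in range(3)]
    M = [ATA[i] + [ATy[i]] for i in range(3)]
    for i in range(3):                       # forward elimination
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    x = [0.0] * 3
    for i in (2, 1, 0):                      # back substitution
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]

    a, b, c = x
    sft = b / c                              # t -> infinity asymptote = static formation temperature
    print(round(sft, 2))                     # 120.0
    ```

    With noise-free synthetic data the fit recovers the asymptote exactly; on real BHT logs the same fit is applied to noisy measurements, which is where under- and over-estimation problems of simpler analytical methods show up.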

  16. Local approach of cleavage fracture applied to a vessel with subclad flaw. A benchmark on computational simulation

    International Nuclear Information System (INIS)

    Moinereau, D.; Brochard, J.; Guichard, D.; Bhandari, S.; Sherry, A.; France, C.

    1996-10-01

    A benchmark on the computational simulation of a cladded vessel with a 6.2 mm sub-clad flaw submitted to a thermal transient has been conducted. Two-dimensional elastic and elastic-plastic finite element computations of the vessel have been performed by the different partners with respective finite element codes ASTER (EDF), CASTEM 2000 (CEA), SYSTUS (Framatome) and ABAQUS (AEA Technology). Main results have been compared: temperature field in the vessel, crack opening, opening stress at crack tips, stress intensity factor in cladding and base metal, Weibull stress σw and probability of failure in base metal, void growth rate R/R0 in cladding. This comparison shows an excellent agreement on main results, in particular on results obtained with local approach. (K.A.)
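    The local-approach quantities compared in such benchmarks follow a Beremin-type formulation: the Weibull stress is a volume-weighted m-norm of the maximum principal stress over the fracture process zone, σw = (Σᵢ σ₁,ᵢᵐ ΔVᵢ / V0)^(1/m), and the cleavage probability is P_f = 1 − exp(−(σw/σu)ᵐ). A minimal sketch with invented element stresses and parameters (none taken from the benchmark itself):

    ```python
    import math

    # Hypothetical maximum principal stresses (MPa) and volumes (mm^3) of the
    # finite elements in the fracture process zone; all parameters illustrative.
    sigma1 = [450.0, 520.0, 610.0, 580.0, 490.0]
    dV     = [0.8, 1.2, 0.6, 0.9, 1.1]
    m, V0, sigma_u = 22.0, 1.0, 2600.0       # assumed Beremin-type parameters

    # Weibull stress: volume-weighted m-norm of the principal stress field
    sigma_w = (sum(s ** m * v for s, v in zip(sigma1, dV)) / V0) ** (1.0 / m)

    # Cleavage failure probability from the two-parameter Weibull law
    P_f = 1.0 - math.exp(-((sigma_w / sigma_u) ** m))
    print(round(sigma_w, 1), P_f)
    ```

    Because the exponent m is large, σw is dominated by the most highly stressed elements; here it lands just below the peak stress of 610 MPa, and the resulting P_f is tiny since σw is far below σu.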

  17. Image quality analysis and comparative study between conventional and computed radiography applied to the inspection of alloys

    International Nuclear Information System (INIS)

    Machado, Alessandra S.; Oliveira, Davi F.; Silva, Aline S.S.; Nascimento, Joseilson R.; Lopes, Ricardo T.

    2011-01-01

    Piping system design takes into account relevant factors such as internal coating, dimensioning, vibration, adequate supports and, principally, piping material. Cost is a decisive factor in the phase of material selection. The non-destructive testing method most commonly employed in industry to analyze the structure of an object is radiographic testing. Computed radiography (CR) is a quicker and much more efficient alternative to conventional radiography but, although CR presents numerous advantages, testing procedures are still largely based on trial and error, due to the lack of an established methodology for choosing parameters such as exists for conventional radiography. Accordingly, this paper presents a study that uses the technique of computed radiography to analyze metal alloys. These metal alloys are used as internal pipe coatings to protect against corrosion and cracking. This study seeks to evaluate parameters such as basic spatial resolution, normalized signal-to-noise ratio (SNRN), contrast, and intensity, and to compare conventional radiography with CR. (author)

  18. Evaluation of Current Computer Models Applied in the DOE Complex for SAR Analysis of Radiological Dispersion & Consequences

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K. R. [Savannah River Site (SRS), Aiken, SC (United States); East, J. M. [Savannah River Site (SRS), Aiken, SC (United States); Weber, A. H. [Savannah River Site (SRS), Aiken, SC (United States); Savino, A. V. [Savannah River Site (SRS), Aiken, SC (United States); Mazzola, C. A. [Savannah River Site (SRS), Aiken, SC (United States)

    2003-01-01

    The evaluation of atmospheric dispersion/radiological dose analysis codes included fifteen models identified in authorization basis safety analysis at DOE facilities, or from regulatory and research agencies where past or current work warranted inclusion of a computer model. All computer codes examined were reviewed using general and specific evaluation criteria developed by the Working Group. The criteria were based on DOE Orders and other regulatory standards and guidance for performing bounding and conservative dose calculations. Included were three categories of criteria: (1) Software Quality/User Interface; (2) Technical Model Adequacy; and (3) Application/Source Term Environment. A consensus-based, limited quantitative ranking process was used to establish an order of model preference, both as an overall conclusion and under specific conditions.

  19. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.

  20. Computing in high energy physics

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Hoogland, W.

    1986-01-01

    This book deals with advanced computing applications in physics, in particular in high energy physics environments. The main subjects covered are networking, vector and parallel processing, and embedded systems. Also examined are topics such as operating systems, future computer architectures, and commercial computer products. The book presents solutions that are foreseen as coping, in the future, with computing problems in experimental and theoretical high energy physics. In the experimental environment, the large amounts of data to be processed pose special problems both on-line and off-line. For on-line data reduction, embedded special-purpose computers, which are often used for trigger applications, are applied. For off-line processing, parallel computers such as emulator farms and the cosmic cube may be employed. The analysis of these topics is therefore a main feature of this volume.

  1. Computer applications in nuclear medicine

    International Nuclear Information System (INIS)

    Lancaster, J.L.; Lasher, J.C.; Blumhardt, R.

    1987-01-01

    Digital computers were introduced to nuclear medicine research as an imaging modality in the mid-1960s. Widespread use of imaging computers (scintigraphic computers) was not seen in nuclear medicine clinics until the mid-1970s. For the user, the ability to acquire scintigraphic images into the computer for quantitative purposes, with accurate selection of regions of interest (ROIs), promised almost endless computational capabilities. Investigators quickly developed many new methods for quantitating the distribution patterns of radiopharmaceuticals within the body, both spatially and temporally. The computer was used to acquire data on practically every organ that could be imaged by means of gamma cameras or rectilinear scanners. Methods of image processing borrowed from other disciplines were applied to scintigraphic computer images in an attempt to improve image quality. Image processing in nuclear medicine has evolved into a relatively extensive set of tasks that can be called on by the user to provide additional clinical information rather than to improve image quality. Digital computers are utilized in nuclear medicine departments for nonimaging applications as well: patient scheduling, archiving, radiopharmaceutical inventory, radioimmunoassay (RIA), and health physics are just a few of the areas in which the digital computer has proven helpful. The computer is useful in any area in which a large quantity of data needs to be accurately managed, especially over a long period of time.

  2. Development of a computational model applied to a unitary 144 cm2 proton exchange membrane fuel cell

    International Nuclear Information System (INIS)

    Robalinho, Eric

    2009-01-01

    This work presents the development of a numerical computer model and a methodology to study and design polymer exchange membrane (PEM) fuel cells. For the validation of experimental results, a sequence of routines, appropriate for fitting the data obtained in the laboratory, is described. In the computational implementation, a new strategy of coupling two 3-dimensional models was created to satisfy the requirements of the comprehensive model of the fuel cell, including its various geometries and materials, as well as the various physical and chemical processes simulated. To effectively assess the analogy between the real cell and the numerical model, numerical studies were carried out. Comparisons with values obtained in the literature, characterization of variables through laboratory experiments, and estimates from models already tested in the literature were also performed. Regarding the experimental part, a prototype fuel cell unit of 144 cm2 of geometric area was designed, produced, and operated in the laboratory with the purpose of validating the proposed numerical computer model, with positive results. The results of the simulations for the proposed 2D and 3D geometries are presented in the form of polarization curves, highlighting the catalytic layer model based on the geometry of agglomerates. Parametric and sensitivity studies are presented to illustrate the change in performance of the fuel cell studied. The final model is robust and useful as a tool for the design and optimization of PEM-type fuel cells over a wide range of operating conditions. (author)

  3. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer that can operate according to quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review we highlight recent developments which point the way to quantum computing on the basis of solid-state nanostructures, after some general considerations concerning quantum information science and after introducing a set of basic requirements for any quantum computer proposal. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address a semiconductor approach based on spin-orbit coupling in semiconductor nanostructures. (authors)

  4. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational phenomena.

  5. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapters ...

  6. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address ...

  7. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  8. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue is ...

  9. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  10. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
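    As a toy illustration of the capabilities listed above — manipulating operators and their actions on mathematical objects — the canonical commutation relation [d/dx, x] = 1 can be verified on polynomials represented as coefficient lists. This is a deliberately minimal model, not any particular computer algebra system's API:

    ```python
    # Polynomials are coefficient lists [c0, c1, ...] meaning c0 + c1*x + c2*x^2 + ...
    def sub(p, q):
        """Coefficient-wise difference of two polynomials of possibly different length."""
        n = max(len(p), len(q))
        p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
        return [a - b for a, b in zip(p, q)]

    def D(p):                      # differentiation operator d/dx
        return [i * c for i, c in enumerate(p)][1:] or [0]

    def X(p):                      # multiplication-by-x operator
        return [0] + p

    def commutator(A, B):
        """[A, B] acting on a polynomial: A(B(p)) - B(A(p))."""
        return lambda p: sub(A(B(p)), B(A(p)))

    p = [3, 0, 5, 1]               # 3 + 5x^2 + x^3
    print(commutator(D, X)(p))     # [D, X] = identity, so this prints [3, 0, 5, 1]
    ```

    Real systems generalize this idea: operators become symbolic objects with normal forms and simplification rules, rather than concrete functions on a single representation.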

  11. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularity ...

  12. Mobile learning and computational thinking

    Directory of Open Access Journals (Sweden)

    José Manuel Freixo Nunes

    2017-11-01

    Full Text Available Computational thinking can be thought of as an approach to problem solving which has been applied to different areas of learning and which has become an important field of investigation in the area of educational research. ...

  13. Mobile learning and computational thinking

    OpenAIRE

    José Manuel Freixo Nunes; Teresa Margarida Loureiro Cardoso

    2017-01-01

    Computational thinking can be thought of as an approach to problem solving which has been applied to different areas of learning and which has become an important field of investigation in the area of educational research. ...

  14. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is the design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...
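    One standard family of efficient designs for expensive computer models is the Latin hypercube, in which each input dimension is stratified so that every stratum is sampled exactly once. This is a general technique, not necessarily the one used in the thesis; a minimal pure-Python sketch:

    ```python
    import random

    def latin_hypercube(n, k, seed=0):
        """n design points in k dimensions on [0, 1)^k: each dimension is split
        into n equal strata and every stratum is hit exactly once."""
        rng = random.Random(seed)
        cols = []
        for _ in range(k):
            strata = list(range(n))
            rng.shuffle(strata)                          # random stratum order per dimension
            cols.append([(s + rng.random()) / n for s in strata])
        return list(zip(*cols))

    design = latin_hypercube(8, 2)
    # Stratification check: each dimension has exactly one point per interval [i/8, (i+1)/8)
    for d in range(2):
        assert sorted(int(p[d] * 8) for p in design) == list(range(8))
    print(len(design))  # 8
    ```

    Compared with plain random sampling, this guarantees one-dimensional coverage of each input, which matters when only a handful of expensive simulation runs can be afforded.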

  15. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfill to achieve effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions are answered that were asked by computer-interested radiologists during the system presentation. On the one hand there still exists a prejudice against standard-text programmes, and on the other side undefined fears that handling a computer is too difficult and that one first has to learn a computer language to be able to work with computers. Finally, it is pointed out which real competitive advantages can be obtained through computer usage. (orig.) [de

  16. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code, and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  17. Computer assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.; Jaffe, C.C.; Felix, R.

    1993-01-01

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four Technical Sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and databases. Session (2) deals with computer vision, computer graphics, design and application, and man-computer interaction. Session (3) goes into the details of diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, such as computer-assisted radiological diagnosis, knowledge-based systems, computer-assisted radiation therapy, and computer-assisted surgical planning. (UWA). 266 figs [de

  18. Computing for Finance

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. R...

  19. Computational fluid dynamic applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Lottes, S. A.; Zhou, C. Q.

    2000-04-03

    The rapid advancement of computational capability, including speed and memory size, has prompted the wide use of computational fluid dynamics (CFD) codes to simulate complex flow systems. CFD simulations are used to study the operating problems encountered in a system, to evaluate the impacts of operation/design parameters on the performance of a system, and to investigate novel design concepts. CFD codes are generally developed based on the conservation laws of mass, momentum, and energy that govern the characteristics of a flow. The governing equations are simplified and discretized for a selected computational grid system. Numerical methods are selected to simplify and calculate approximate flow properties. For turbulent, reacting, and multiphase flow systems, the complex processes relating to these aspects of the flow, i.e., turbulent diffusion, combustion kinetics, interfacial drag and heat and mass transfer, etc., are described in mathematical models, based on a combination of fundamental physics and empirical data, that are incorporated into the code. CFD simulation has been applied to a large variety of practical and industrial-scale flow systems.
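    The discretisation step described here — simplifying a conservation law onto a computational grid — can be illustrated with a minimal explicit finite-difference solver for the 1-D diffusion equation du/dt = α d²u/dx². Grid size, diffusivity, and time step are arbitrary choices for the sketch, far simpler than any production CFD code:

    ```python
    # Explicit finite differences for du/dt = alpha * d2u/dx2 with fixed
    # (Dirichlet) boundaries held at zero; dt <= dx^2 / (2*alpha) keeps it stable.
    n, alpha, dx, dt = 21, 1.0, 1.0, 0.2
    u = [0.0] * n
    u[n // 2] = 100.0                              # initial hot spot in the middle

    for _ in range(200):
        # discrete Laplacian at the interior points
        lap = [u[i - 1] - 2 * u[i] + u[i + 1] for i in range(1, n - 1)]
        # advance interior points one time step; boundaries stay fixed
        u = [u[0]] + [ui + alpha * dt / dx**2 * l
                      for ui, l in zip(u[1:-1], lap)] + [u[-1]]

    # Heat spreads out symmetrically and leaks through the cold boundaries,
    # so the total decreases from 100 while the peak remains at the centre.
    print(round(max(u), 3), round(sum(u), 3))
    ```

    Production CFD codes do the analogous thing in 2-D or 3-D for coupled mass, momentum, and energy equations, usually with implicit schemes so the stability restriction on the time step can be relaxed.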

  20. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, among which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of computing environments from the viewpoint of high energy physics (HEP) environment establishment. The direction of developing the IHEP computing environment toward distributed computing, based on current trends in distributed computing, is presented.