WorldWideScience

Sample records for performance computing technology

  1. Teaching Machines, Programming, Computers, and Instructional Technology: The Roots of Performance Technology.

    Science.gov (United States)

    Deutsch, William

    1992-01-01

Reviews the history of the development of the field of performance technology. Highlights include early teaching machines, instructional technology, learning theory, programmed instruction, the systems approach, needs assessment, branching versus linear program formats, programming languages, and computer-assisted instruction. (LRW)

  2. High performance computing and communications: Advancing the frontiers of information technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  3. Earth Science Technology Office's Computational Technologies Project

    Science.gov (United States)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that we could predict the applicability of those technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain the Project's anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  4. FY 1995 Blue Book: High Performance Computing and Communications: Technology for the National Information Infrastructure

    Data.gov (United States)

Networking and Information Technology Research and Development, Executive Office of the President — The Federal High Performance Computing and Communications (HPCC) Program was created to accelerate the development of future generations of high performance computers...

  5. On the impact of quantum computing technology on future developments in high-performance scientific computing

    OpenAIRE

    Möller, Matthias; Vuik, Cornelis

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to researchers and vendors of future computing technologies, national authorities are showing strong interest in maturing this technology due to its known potential to break many of today’s encryption technique...

  6. The Effect of Using Computer Skills on Teachers' Perceived Self-Efficacy Beliefs Towards Technology Integration, Attitudes and Performance

    Directory of Open Access Journals (Sweden)

    Badrie Mohammad Nour EL-Daou

    2016-07-01

The current study analyzes the relationship between teachers' perceived self-efficacy and attitudes towards integrating technology into classroom teaching, self-evaluation reports, and computer performance results. Pre-post measurement of the Computer Technology Integration Survey (CTIS) (Wang et al., 2004) was used to determine the confidence level of 60 science teachers and 12 mixed-major teachers enrolled at the Lebanese University, Faculty of Education, in the academic year 2011-2012. Pre-post measurement of teachers' attitudes towards using technology was examined using an open and a closed questionnaire. Teachers' performance was measured by means of the results of their ActivInspire projects using active boards after their third training practice in computer skills and the ActivInspire program. To accumulate data on teachers' self-reports, this study used Robert Reasoner's five components: feeling of security, feeling of belonging, feeling of identity, feeling of purpose, and self-actualization, which teachers used to rate themselves (Reasoner, 1983). The study acknowledged probable impacts of computer skills training on teachers' self-evaluation reports, effectiveness of computer technology skills, and evaluations of self-efficacy attitudes toward technology integration. Pearson correlation revealed a strong relationship (r = 0.99) between perceived self-efficacy towards technology integration and teachers' self-evaluation reports. The findings also revealed that 82.7% of teachers earned high computer technology scores on their ActivInspire projects and 33.3% received excellent grades on the computer performance test. Recommendations and directions for potential research are discussed.
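
    The study's headline statistic is a Pearson correlation. As a quick illustration of the computation (with invented placeholder scores, not data from the study), r can be obtained with NumPy:

```python
# Illustrative only: Pearson's r between two invented rating vectors.
import numpy as np

self_efficacy = np.array([3.1, 4.2, 2.8, 4.9, 3.7])    # placeholder ratings
self_evaluation = np.array([3.0, 4.4, 2.9, 5.0, 3.5])  # placeholder ratings

# np.corrcoef returns the 2x2 correlation matrix; r is an off-diagonal entry.
r = np.corrcoef(self_efficacy, self_evaluation)[0, 1]
print(f"r = {r:.2f}")
```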

  8. International Conference on Modern Mathematical Methods and High Performance Computing in Science and Technology

    CERN Document Server

    Srivastava, HM; Venturino, Ezio; Resch, Michael; Gupta, Vijay

    2016-01-01

    The book discusses important results in modern mathematical models and high performance computing, such as applied operations research, simulation of operations, statistical modeling and applications, invisibility regions and regular meta-materials, unmanned vehicles, modern radar techniques/SAR imaging, satellite remote sensing, coding, and robotic systems. Furthermore, it is valuable as a reference work and as a basis for further study and research. All contributing authors are respected academicians, scientists and researchers from around the globe. All the papers were presented at the international conference on Modern Mathematical Methods and High Performance Computing in Science & Technology (M3HPCST 2015), held at Raj Kumar Goel Institute of Technology, Ghaziabad, India, from 27–29 December 2015, and peer-reviewed by international experts. The conference provided an exceptional platform for leading researchers, academicians, developers, engineers and technocrats from a broad range of disciplines ...

  9. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    Science.gov (United States)

    Goodwin, Bruce

    2015-03-01

This presentation discusses the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the iterative engineering design and prototype cycle, thereby dramatically reducing the cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  10. FY 1997 Blue Book: High Performance Computing and Communications: Advancing the Frontiers of Information Technology

    Data.gov (United States)

Networking and Information Technology Research and Development, Executive Office of the President — The Federal High Performance Computing and Communications (HPCC) Program will celebrate its fifth anniversary in October 1996 with an impressive array of...

  11. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data.
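
    Of the three calculation types, the cost-index computation is the simplest to sketch. The snippet below uses an invented weighted-ratio formula with made-up figures; it is not INDEXER's actual method, which the abstract does not specify:

```python
# Hypothetical cost-index sketch (NOT the INDEXER algorithm): a weighted
# ratio of current component costs to base-year costs, scaled so that the
# base year equals 100.
base    = {"labor": 120.0, "materials": 80.0, "equipment": 200.0}  # base-year $
current = {"labor": 150.0, "materials": 95.0, "equipment": 230.0}  # current $
weights = {"labor": 0.40, "materials": 0.25, "equipment": 0.35}    # sum to 1

index = 100 * sum(weights[k] * current[k] / base[k] for k in base)
print(f"cost index = {index:.1f} (base year = 100)")
```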

  12. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950s, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies.

  13. Trusted Computing Technologies, Intel Trusted Execution Technology.

    Energy Technology Data Exchange (ETDEWEB)

    Guise, Max Joseph; Wendt, Jeremy Daniel

    2011-01-01

We describe the current state-of-the-art in Trusted Computing technologies, focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high-importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: unauthorized users should not access the sensitive input and output data, nor be able to alter them; the computation contains intermediate data with the same requirements and executes algorithms that unauthorized users should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.

  14. CUDA/GPU Technology : Parallel Programming For High Performance Scientific Computing

    OpenAIRE

    YUHENDRA; KUZE, Hiroaki; JOSAPHAT, Tetuko Sri Sumantyo

    2009-01-01

Graphics processing units (GPUs), originally designed for computer video cards, have emerged as the most powerful chip in a high-performance workstation. In high-performance computation, GPUs deliver far more powerful performance than conventional CPUs by means of parallel processing. In 2007, the birth of the Compute Unified Device Architecture (CUDA) and CUDA-enabled GPUs by NVIDIA Corporation brought a revolution in the general-purpose GPU a...
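
    As a minimal sketch of the CUDA programming model the abstract describes (assuming the numba package and a CUDA-capable GPU; the kernel and all names are illustrative, not code from the paper):

```python
# A trivial SAXPY kernel written in Python via Numba's CUDA support.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)          # absolute thread index across the whole grid
    if i < out.size:
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads = 256
blocks = (n + threads - 1) // threads     # enough blocks to cover n elements
saxpy[blocks, threads](2.0, x, y, out)    # Numba copies host arrays for us
print(np.allclose(out, 2.0 * x + y))      # True
```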

  15. On the impact of quantum computing technology on future developments in high-performance scientific computing

    NARCIS (Netherlands)

    Möller, M.; Vuik, C.

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to

  16. Information Technology Service Management with Cloud Computing Approach to Improve Administration System and Online Learning Performance

    Directory of Open Access Journals (Sweden)

    Wilianto Wilianto

    2015-10-01

This work discusses the development of information technology service management using a cloud computing approach to improve the performance of the administration system and online learning at STMIK IBBI Medan, Indonesia. The network topology is modeled and simulated for system administration and online learning. The same network topology is developed in cloud computing using the Amazon AWS architecture. The model is designed and modeled using Riverbed Academic Edition Modeler to obtain values of the parameters: delay, load, CPU utilization, and throughput. The simulation results are as follows. For network topology 1 without cloud computing, the average delay is 54 ms, load 110,000 bits/s, CPU utilization 1.1%, and throughput 440 bits/s. With cloud computing, the average delay is 45 ms, load 2,800 bits/s, CPU utilization 0.03%, and throughput 540 bits/s. For network topology 2 without cloud computing, the average delay is 39 ms, load 3,500 bits/s, CPU utilization 0.02%, and database server throughput 1,400 bits/s. With cloud computing, the average delay is 26 ms, load 5,400 bits/s, CPU utilization 0.0001% (email server), 0.001% (FTP server), and 0.0002% (HTTP server), and throughput 85 bits/s (email server), 100 bits/s (FTP server), and 95 bits/s (HTTP server). Thus the delay, the load, and the CPU utilization decrease while the throughput increases. Information technology service management with a cloud computing approach has better performance.
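
    To read the topology 1 comparison at a glance, the relative changes follow directly from the figures quoted above (the numbers are the abstract's; the script is merely a convenience):

```python
# Relative changes for topology 1, computed from the abstract's figures.
before = {"delay_ms": 54, "load_bps": 110_000, "cpu_pct": 1.10, "throughput_bps": 440}
after  = {"delay_ms": 45, "load_bps": 2_800,   "cpu_pct": 0.03, "throughput_bps": 540}

for key in before:
    change = 100 * (after[key] - before[key]) / before[key]
    print(f"{key}: {change:+.1f}%")   # delay -16.7%, throughput +22.7%, ...
```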

  17. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  18. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  19. Cooperation, Technology, and Performance: A Case Study.

    Science.gov (United States)

    Cavanagh, Thomas; Dickenson, Sabrina; Brandt, Suzanne

    1999-01-01

Describes the CTP (Cooperation, Technology, and Performance) model and explains how it is used by the Department of Veterans Affairs' Veterans Benefits Administration (VBA) for training. Discusses task analysis; computer-based training; cooperative-based learning environments; technology-based learning; performance-assessment methods; courseware…

  20. Center for Advanced Computational Technology

    Science.gov (United States)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  1. Tutorial on Computing: Technological Advances, Social Implications, Ethical and Legal Issues

    OpenAIRE

    Debnath, Narayan

    2012-01-01

Computing and information technology have made significant advances. The use of computing and technology is a major aspect of our lives, and this use will only continue to increase in our lifetime. Electronic digital computers and high-performance communication networks are central to contemporary information technology. The computing applications in a wide range of areas including business, communications, medical research, transportation, entertainment, and education are transforming lo...

  2. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
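
    The hybrid model pairs message passing between distributed-memory processes with shared-memory threading inside each process. A compact Python analogue of the hybrid matrix multiply (processes standing in for MPI ranks, threads splitting each rank's rows; an illustration, not the authors' code) is:

```python
# Hybrid "message-passing"/multi-threading matrix multiply, emulated with
# processes (one per rank) and a thread pool inside each rank.
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from multiprocessing import Pool

def rank_multiply(args):
    """One rank computes its block of C = A @ B, splitting rows over threads."""
    a_block, b, n_threads = args
    out = np.empty((a_block.shape[0], b.shape[1]))
    row_chunks = np.array_split(np.arange(a_block.shape[0]), n_threads)
    def worker(rows):
        out[rows] = a_block[rows] @ b   # threads share the rank's memory
    with ThreadPoolExecutor(max_workers=n_threads) as ex:
        list(ex.map(worker, row_chunks))
    return out

if __name__ == "__main__":
    a, b = np.random.rand(400, 300), np.random.rand(300, 200)
    blocks = np.array_split(a, 4)       # distribute A's rows over 4 "ranks"
    with Pool(processes=4) as pool:
        parts = pool.map(rank_multiply, [(blk, b, 2) for blk in blocks])
    assert np.allclose(np.vstack(parts), a @ b)
```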

  3. Optical interconnection networks for high-performance computing systems

    International Nuclear Information System (INIS)

    Biberman, Aleksandr; Bergman, Keren

    2012-01-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers. (review article)

  4. Summary of researches being performed in the Institute of Mathematics and Computer Science on computer science and information technologies

    Directory of Open Access Journals (Sweden)

    Artiom Alhazov

    2008-07-01

The evolution of the notion of informatization (which assumes the automation of the majority of human activities by applying computers, computer networks, and information technologies) towards the notion of the Global Information Society (GIS) challenges society to determine new paradigms: automation and intellectualization of production, a new level of education and teaching, the formation of new styles of work, active participation in decision making, etc. Assuring the transition to the GIS for any society, including that of the Republic of Moldova, requires both special training and the broad application of progressive technologies and information systems. Methodological aspects concerning the impact of GIS creation on citizens, economic units, and the national economy as a whole demand profound study; without a systematic approach to these aspects, GIS creation would confront great difficulties. The collective of researchers of the Institute of Mathematics and Computer Science (IMCS) of the Academy of Sciences of Moldova working in the field of computer science constitutes a center of advanced research and is active in those directions of computer science research that facilitate the technologies and applications without which the development of the GIS cannot be assured.

  5. New data processing technologies at LHC: From Grid to Cloud Computing and beyond

    International Nuclear Information System (INIS)

    De Salvo, A.

    2011-01-01

For several years now, the LHC experiments at CERN have been successfully using Grid computing technologies for their distributed data processing activities on a global scale. Recently, the experience gained with the current systems has allowed the design of the future computing models, involving new technologies like cloud computing, virtualization, and high-performance distributed database access. In this paper we describe the new computational technologies of the LHC experiments at CERN, comparing them with the current models in terms of features and performance.

  6. A Longitudinal Examination of the Effects of Computer Self-efficacy Growth on Performance during Technology Training

    Directory of Open Access Journals (Sweden)

    James P. Downey

    2015-02-01

Technology training in the classroom is critical in preparing students for upper-level classes as well as professional careers, especially in fields such as technology. One of the key enablers of this process is computer self-efficacy (CSE), which has an extensive stream of empirical research. Despite this, one of the missing pieces is how CSE actually changes during training, and how such change is related to antecedents and performance outcomes. Measuring change requires repeated data gathering and the use of latent growth modeling, a relatively new statistical technique. This study examines CSE (specifically general CSE, or GCSE) growth over time during training, and how this growth is influenced by anxiety and gender and influences performance, using a semester-long lab course covering three applications. The use of GCSE growth more accurately models how students actually learn in a technology classroom. It provides novel clarity in the interaction of gender, anxiety, GCSE, specific CSEs, and performance during training. The study finds that the relationship between anxiety and self-efficacy decreases over time during training, becoming non-significant; it clarifies the significant role gender plays in influencing GCSE at the start of and during training. It finds that GCSE influences application performance only through specific CSEs.

  7. High-performance computing in seismology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  8. Effect of information and communication technology on nursing performance.

    Science.gov (United States)

    Fujino, Yuriko; Kawamoto, Rieko

    2013-05-01

    The aim of this study was to investigate the influence of information and communication technology use and skills on nursing performance. Questionnaires were prepared relating to using the technology, practical skills in utilizing information, the Six-Dimension Scale of Nursing Performance, and demographics. In all, 556 nurses took part (response rate, 72.6%). A two-way analysis of variance was used to determine the influence of years of nursing experience on the relationship between nursing performance and information and communication technology use. The results showed that the group possessing high technological skills had greater nursing ability than the group with low skills; the level of nursing performance improved with years of experience in the former group, but not in the latter group. Regarding information and communication technology use, the results showed that nursing performance improved among participants who used computers for sending and receiving e-mails, but it decreased for those who used cell phones for e-mail. The results suggest that nursing performance may be negatively affected if information and communication technology are inappropriately used. Informatics education should therefore be provided for all nurses, and it should include information use relating to cell phones and computers.

  9. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  10. Benchmarking high performance computing architectures with CMS’ skeleton framework

    Science.gov (United States)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-10-01

In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high-throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel's Threading Building Blocks library, based on the measured memory and CPU overheads of the different technologies benchmarked. In 2016 CMS will get access to high-performance computing resources that use new many-core architectures: machines such as Cori Phase 1 and 2, Theta, and Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk discusses the results of this exercise.
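
    In the same spirit as that skeleton suite, a toy harness (plain Python with invented names; not the CMSSW benchmark) can time one emulated workload under different concurrency backends:

```python
# Toy benchmark harness: run the same tasks serially and on a process pool.
import time
from multiprocessing import Pool

def task(n: int) -> int:
    return sum(i * i for i in range(n))      # stand-in for a framework module

def bench(label, fn):
    t0 = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - t0:.2f} s")

if __name__ == "__main__":
    work = [200_000] * 32
    bench("serial", lambda: [task(n) for n in work])
    with Pool(processes=4) as pool:
        bench("4 processes", lambda: pool.map(task, work))
```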

  11. Future Computing Technology (2/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software development that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. On the second day of the Future Computing Technology series, we will talk about ubiquitous computing. From smart watches through mobile devices to virtual reality, computing devices surround us, and innovative new technologies are introduced every day. We will briefly explore how this propagation might continue, how computers can take ove...

  12. Computational technologies a first course

    CERN Document Server

Borisov, Victor S; Grigoriev, Aleksander V; Kolesov, Alexandr E; Popov, Petr A; Sirditov, Ivan K; Vabishchevich, Petr N; Vasilieva, Maria V; Zakharov, Petr E

    2015-01-01

    In this book we describe the basic elements of present computational technologies that use the algorithmic languages C/C++. The emphasis is on GNU compilers and libraries, FOSS for the solution of computational mathematics problems and visualization of the obtained data. Many examples illustrate the basic features of computational technologies.

  13. Computer based training: Technology and trends

    International Nuclear Information System (INIS)

    O'Neal, A.F.

    1986-01-01

    Computer Based Training (CBT) offers great potential for revolutionizing the training environment. Tremendous advances in computer cost performance, instructional design science, and authoring systems have combined to put CBT within the reach of all. The ability of today's CBT systems to implement powerful training strategies, simulate complex processes and systems, and individualize and control the training process make it certain that CBT will now, at long last, live up to its potential. This paper reviews the major technologies and trends involved and offers some suggestions for getting started in CBT

  14. Computer Technology for Industry

    Science.gov (United States)

    1979-01-01

In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  15. Future Computing Technology (3/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software development that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. On the third day of the Future Computing Technology series, we will touch on societal aspects of the future of computing. Our perception of computers may at time seem passive, but in reality we are a vital chain of the feedback loop. Human-computer interaction, innovative forms of computers, privacy, process automation, threats and medica...

  16. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications network. (The remainder of this record is personnel and contact information from the AHPCRC brochure; see www.ahpcrc.org.)

  17. MUSICAL-COMPUTER TECHNOLOGY: THE LABORATORY

    Directory of Open Access Journals (Sweden)

    Gorbunova Irina B.

    2012-12-01

The article deals with music computer technologies in the educational system, using the example of the Educational and Methodical Laboratory "Music & Computer Technologies" at the Herzen State Pedagogical University of Russia, St. Petersburg. This interdisciplinary field of professional activity relates to the creation and application of specialized music software and hardware tools and to knowledge in both music and informatics. The concept of music computer education in the preparation of music teachers is realized through basic educational programs of vocational training, supplementary education, and professional development of teachers, with methodical support via the Internet. In addition, the laboratory is engaged in scientific activity: above all, specialized research in the field of pedagogy, and international conferences.

  18. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. For Computational ElectroMagnetics (CEM) software developers, however, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation of these challenges. Prior to multicore CPUs, advanced computer technologies were compared on performance using benchmark software, with the metric being FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as the CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the
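
    As a reminder of what the FLOPS metric measures: a dense n x n matrix multiply performs about 2*n^3 floating-point operations, so a rough throughput estimate (a generic sketch, unrelated to HOBBIES) is:

```python
# Rough FLOP/s estimate: an n x n matrix multiply costs ~2*n**3 operations.
import time
import numpy as np

n = 1024
a, b = np.random.rand(n, n), np.random.rand(n, n)

t0 = time.perf_counter()
c = a @ b
dt = time.perf_counter() - t0

print(f"checksum {c.sum():.3e}, ~{2 * n**3 / dt / 1e9:.2f} GFLOP/s")
```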

  19. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  20. Understanding computer and information technology

    International Nuclear Information System (INIS)

    Choi, Yun Cheol; Han, Tack Don; Im, Sun Beom

    2009-01-01

This book consists of four parts. The first part covers IT technology and the information society: understanding computer systems, the constitution of software and information systems, and applications of software. The second part is about computer networks, information and communication, applications, and Internet services. The third part covers applications of multimedia, mobile computing applications, ubiquitous computing and environments, and computers in digital life. The last part explains information security and ethics in the information-oriented society, the information industry and IT ventures, digital contents technology and industry, and the future development of the information-oriented society.

  1. Computer technology forecasting at the National Laboratories

    International Nuclear Information System (INIS)

    Peskin, A.M.

    1980-01-01

    The DOE Office of ADP Management organized a group of scientists and computer professionals, mostly from their own national laboratories, to prepare an annually updated technology forecast to accompany the Department's five-year ADP Plan. The activities of the task force were originally reported in an informal presentation made at the ACM Conference in 1978. This presentation represents an update of that report. It also deals with the process of applying the results obtained at a particular computing center, Brookhaven National Laboratory. Computer technology forecasting is a difficult and hazardous endeavor, but it can reap considerable advantage. The forecast performed on an industry-wide basis can be applied to the particular needs of a given installation, and thus give installation managers considerable guidance in planning. A beneficial side effect of this process is that it forces installation managers, who might otherwise tend to preoccupy themselves with immediate problems, to focus on longer term goals and means to their ends

  2. Philosophy of computing and information technology

    OpenAIRE

    Brey, Philip A.E.; Soraker, Johnny; Meijers, A.

    2009-01-01

    Philosophy has been described as having taken a “computational turn,” referring to the ways in which computers and information technology throw new light upon traditional philosophical issues, provide new tools and concepts for philosophical reasoning, and pose theoretical and practical questions that cannot readily be approached within traditional philosophical frameworks. As such, computer technology is arguably the technology that has had the most profound impact on philosophy. Philosopher...

  3. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

Hules, J. [ed.]

    1996-11-01

National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  4. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    Science.gov (United States)

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.

  5. 5th International Conference on High Performance Scientific Computing

    CERN Document Server

    Hoang, Xuan; Rannacher, Rolf; Schlöder, Johannes

    2014-01-01

    This proceedings volume gathers a selection of papers presented at the Fifth International Conference on High Performance Scientific Computing, which took place in Hanoi on March 5-9, 2012. The conference was organized by the Institute of Mathematics of the Vietnam Academy of Science and Technology (VAST), the Interdisciplinary Center for Scientific Computing (IWR) of Heidelberg University, Ho Chi Minh City University of Technology, and the Vietnam Institute for Advanced Study in Mathematics. The contributions cover the broad interdisciplinary spectrum of scientific computing and present recent advances in theory, development of methods, and practical applications. Subjects covered include mathematical modeling; numerical simulation; methods for optimization and control; parallel computing; software development; and applications of scientific computing in physics, mechanics and biomechanics, material science, hydrology, chemistry, biology, biotechnology, medicine, sports, psychology, transport, logistics, com...

  6. 3rd International Conference on High Performance Scientific Computing

    CERN Document Server

    Kostina, Ekaterina; Phu, Hoang; Rannacher, Rolf

    2008-01-01

This proceedings volume contains a selection of papers presented at the Third International Conference on High Performance Scientific Computing held at the Hanoi Institute of Mathematics, Vietnamese Academy of Science and Technology (VAST), March 6-10, 2006. The conference was organized by the Hanoi Institute of Mathematics, the Interdisciplinary Center for Scientific Computing (IWR), Heidelberg, and its International PhD Program "Complex Processes: Modeling, Simulation and Optimization", and Ho Chi Minh City University of Technology. The contributions cover the broad interdisciplinary spectrum of scientific computing and present recent advances in theory, development of methods, and applications in practice. Subjects covered are mathematical modelling, numerical simulation, methods for optimization and control, parallel computing, software development, applications of scientific computing in physics, chemistry, biology and mechanics, environmental and hydrology problems, transport, logistics and site loca...

  7. FY 1992 Blue Book: Grand Challenges: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  8. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project try to address all these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduced HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  9. International Conference on Emerging Technologies for Information Systems, Computing, and Management

    CERN Document Server

Ma, Tinghuai

    2013-01-01

This book aims to examine innovation in the fields of information technology, software engineering, industrial engineering, and management engineering. Topics covered in this publication include: information system security, privacy, quality assurance, high-performance computing, and information system management and integration. The book presents papers from the Second International Conference on Emerging Technologies for Information Systems, Computing, and Management (ICM2012), which was held on December 1-2, 2012 in Hangzhou, China.

  10. Inclusive vision for high performance computing at the CSIR

    CSIR Research Space (South Africa)

    Gazendam, A

    2006-02-01

... and computationally intensive applications. A number of different technologies and standards were identified as core to the open and distributed high-performance infrastructure envisaged...

  11. Guide to cloud computing for business and technology managers from distributed computing to cloudware applications

    CERN Document Server

    Kale, Vivek

    2014-01-01

Guide to Cloud Computing for Business and Technology Managers: From Distributed Computing to Cloudware Applications unravels the mystery of cloud computing and explains how it can transform the operating contexts of business enterprises. It provides a clear understanding of what cloud computing really means, what it can do, and when it is practical to use. Addressing the primary management and operation concerns of cloudware, including performance, measurement, monitoring, and security, this pragmatic book: Introduces the enterprise applications integration (EAI) solutions that were a first ste

  12. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  13. Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    International Nuclear Information System (INIS)

    Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-01-01

    This article introduces the current paradigm shift towards concurrency in software. Most known methods for the determination of the structure of macromolecular complexes are limited, or at least restricted at some point, by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented, and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.
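
    The data-parallel pattern the article describes, independent per-particle operations mapping directly onto parallel hardware, can be sketched in a few lines of Python. The example below is an illustrative stand-in rather than the authors' code: process_particle is a placeholder for real electron-microscopy steps such as alignment or classification, and GPUs exploit the same independence at finer granularity:

        # Illustrative data-parallel sketch, not the authors' code: single-particle
        # images are independent, so a per-image operation can be farmed out across
        # CPU cores; GPU kernels exploit the same independence at finer granularity.
        from multiprocessing import Pool

        import numpy as np

        def process_particle(image):
            # Stand-in for real EM steps (alignment, filtering, classification):
            # normalize each particle image independently of all the others.
            return (image - image.mean()) / (image.std() + 1e-9)

        if __name__ == "__main__":
            stack = [np.random.rand(128, 128) for _ in range(1000)]  # fake data
            with Pool() as pool:             # one worker per available CPU core
                processed = pool.map(process_particle, stack)
            print("processed", len(processed), "particle images in parallel")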

  14. Computer Education and Instructional Technology Teacher Trainees' Opinions about Cloud Computing Technology

    Science.gov (United States)

    Karamete, Aysen

    2015-01-01

    This study aims to show the present conditions about the usage of cloud computing in the department of Computer Education and Instructional Technology (CEIT) amongst teacher trainees in School of Necatibey Education, Balikesir University, Turkey. In this study, a questionnaire with open-ended questions was used. 17 CEIT teacher trainees…

  15. Computer technology: its potential for industrial energy conservation. A technology applications manual

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-01-01

    Today, computer technology is within the reach of practically any industrial corporation regardless of product size. This manual highlights a few of the many applications of computers in the process industry and provides the technical reader with a basic understanding of computer technology, terminology, and the interactions among the various elements of a process computer system. The manual has been organized to separate process applications and economics from computer technology. Chapter 1 introduces the present status of process computer technology and describes the four major applications - monitoring, analysis, control, and optimization. The basic components of a process computer system also are defined. Energy-saving applications in the four major categories defined in Chapter 1 are discussed in Chapter 2. The economics of process computer systems is the topic of Chapter 3, where the historical trend of process computer system costs is presented. Evaluating a process for the possible implementation of a computer system requires a basic understanding of computer technology as well as familiarity with the potential applications; Chapter 4 provides enough technical information for an evaluation. Computer and associated peripheral costs and the logical sequence of steps in the development of a microprocessor-based process control system are covered in Chapter 5.

  16. Advances in Computing and Information Technology : Proceedings of the Second International Conference on Advances in Computing and Information Technology

    CERN Document Server

    Nagamalai, Dhinaharan; Chaki, Nabendu

    2013-01-01

    The International Conference on Advances in Computing and Information Technology (ACITY 2012) provides an excellent international forum for both academics and professionals to share knowledge and results on the theory, methodology and applications of Computer Science and Information Technology. The Second International Conference on Advances in Computing and Information Technology (ACITY 2012), held in Chennai, India, during July 13-15, 2012, covered a number of topics in all major fields of Computer Science and Information Technology, including: networking and communications, network security and applications, web and internet computing, ubiquitous computing, algorithms, bioinformatics, digital image processing and pattern recognition, artificial intelligence, soft computing and applications. Following a strict review process, a number of high-quality papers, presenting not only innovative ideas but also well-founded evaluations and strong argumentation, were selected and collected in the present proceedings, ...

  17. An esthetics rehabilitation with computer-aided design/ computer-aided manufacturing technology.

    Science.gov (United States)

    Mazaro, Josá Vitor Quinelli; de Mello, Caroline Cantieri; Zavanelli, Adriana Cristina; Santiago, Joel Ferreira; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza

    2014-07-01

    This paper describes a rehabilitation case involving the Computer-Aided Design/Computer-Aided Manufacturing (CAD-CAM) system in implant-supported and tooth-supported prostheses using zirconia as the framework. CAD-CAM technology has developed considerably over the last few years, becoming a reality in dental practice. Among the widely used systems are those based on zirconia, which demonstrate important physical and mechanical properties, including high strength, adequate fracture toughness, biocompatibility and esthetics, and are indicated for unitary prosthetic restorations and posterior and anterior frameworks. All the modeling was performed using the CAD-CAM system, and the prostheses were cemented using the resin cement best suited for each situation. The rehabilitation of the maxillary arch using a zirconia framework demonstrated satisfactory esthetic and functional results after a 12-month follow-up and revealed no biological or technical complications. This article shows the importance of using CAD/CAM technology in the manufacture of dental and implant-supported prostheses.

  18. Application of software technology to a future spacecraft computer design

    Science.gov (United States)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright-Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit-slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  19. Future Computing Technology (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software development that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP and Go...

  20. PRIMARY SCHOOL PRINCIPALS’ ATTITUDES TOWARDS COMPUTER TECHNOLOGY IN THE USE OF COMPUTER TECHNOLOGY IN SCHOOL ADMINISTRATION

    OpenAIRE

    GÜNBAYI, İlhan; CANTÜRK, Gökhan

    2011-01-01

    The aim of the study is to determine the usage of computer technology in school administration, primary school administrators’ attitudes towards computer technology, and administrators’ and teachers’ computer literacy levels. The study was modeled as survey research. The population of the study consists of principals and assistant principals in public primary schools in the center of Antalya. The data were collected from 161 (51%) administrator questionnaires in 68 of 129 public primary s...

  1. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    Full Text Available During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures correctly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems has resulted in new modalities for the instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on it to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their application in dentistry, and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  2. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D on grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), has been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD), to realize an environment for STA; 2) Computer-Aided Engineering (CAEN), to establish Multi Experimental Tools (MEXT); and 3) Computer-Aided Science (CASC), to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements in R and D of grid computing technology obtained so far. (T. Tanaka)

  3. FY 1993 Blue Book: Grand Challenges 1993: High Performance Computing and Communications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — High performance computing and computer communications networks are becoming increasingly important to scientific advancement, economic competition, and national...

  4. International Conference on Computers and Advanced Technology in Education

    CERN Document Server

    Advanced Information Technology in Education

    2012-01-01

    The volume includes a set of selected papers, extended and revised, from the 2011 International Conference on Computers and Advanced Technology in Education. With the development of computers and advanced technology, human social activities are changing fundamentally. Education, especially education reform in different countries, has benefited greatly from computers and advanced technology. Generally speaking, education is a field which needs ever more information, while computers, advanced technology and the internet are good information providers. Also, with the aid of computers and advanced technology, educators can combine these resources effectively. Therefore, computers and advanced technology should be regarded as an important medium in modern education. The volume Advanced Information Technology in Education is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of computers and advanced technology in education to d...

  5. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  6. Recent technology on steam turbine performance improvement

    International Nuclear Information System (INIS)

    Hirada, M.; Watanabe, E.; Tashiro, H.

    1991-01-01

    Continuous efforts have been made to improve turbine efficiency by applying the latest aerodynamic technologies to meet energy-saving requirements. In recent years, there has been considerable improvement in the field of computational fluid dynamics, and these new technologies have been applied to new blade designs for HP, IP and LP turbines. Experimental verification of the new blades in turbine tests has established the overall turbine performance improvement and the excellent correspondence of the flow pattern to predicted values. This paper introduces the latest design technologies for the newly developed high-efficiency blades and the verification test results.

  7. Cloud Computing for Maintenance Performance Improvement

    OpenAIRE

    Kour, Ravdeep; Karim, Ramin; Parida, Aditya

    2013-01-01

    Cloud Computing is an emerging research area. It can be utilised to acquire effective and efficient information logistics. This paper uses cloud-based technology to establish information logistics for a railway system, which requires information based on data from different data sources (e.g. railway maintenance, railway operation, and railway business data). In order to improve the performance of the maintenance process, relevant data from various sources need to be acquired, f...

  8. Computing, Information and Communications Technology (CICT) Website

    Science.gov (United States)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology-focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  9. Use of Soft Computing Technologies For Rocket Engine Control

    Science.gov (United States)

    Trevino, Luis C.; Olcmen, Semih; Polites, Michael

    2003-01-01

    This paper explores how Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this is presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies and sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly FASTRAC) engine, which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that

  10. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing: the PRIMA Project

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D. [Department of Computer and Information Science, University of Oregon; Wolf, Felix G. [Juelich Supercomputing Centre, Forschungszentrum Juelich

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish

  11. Study of application technology of ultra-high speed computer to the elucidation of complex phenomena

    International Nuclear Information System (INIS)

    Sekiguchi, Tomotsugu

    1996-01-01

    As a first step in developing application technology that applies ultra-high-speed computers to the elucidation of complex phenomena, the basic design of a numerical information library for decentralized computer networks is explained. Establishing the system makes it possible to construct an efficient application environment for ultra-high-speed computer systems that is scalable across different computing systems. We named the system Ninf (Network Information Library for High Performance Computing). The library's application technology is summarized as follows: application technology for libraries in a distributed environment, numeric constants, retrieval of values, libraries of special functions, computing libraries, the Ninf library interface, Ninf remote libraries and registration. With this system, users can run programs that concentrate numerical-analysis expertise, with high precision, reliability and speed. (S.Y.)

  12. Computer architecture technology trends

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends which are taking place in the architecture of computing systems today. Due to the sheer number of different applications to which computers are being applied, there seems no end to the different adoptions which proliferate. There are, however, some underlying trends which appear. Decision makers should be aware of these trends when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight in

  13. Human Computer Music Performance

    OpenAIRE

    Dannenberg, Roger B.

    2012-01-01

    Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

  14. HIGH PERFORMANCE PHOTOGRAMMETRIC PROCESSING ON COMPUTER CLUSTERS

    Directory of Open Access Journals (Sweden)

    V. N. Adrov

    2012-07-01

    Full Text Available Most CPU-consuming tasks in photogrammetric processing can be done in parallel. The algorithms take independent bits as input and produce independent bits as output. The independence of the bits comes from the nature of such algorithms, since images, stereopairs or small image blocks can be processed independently. Many photogrammetric algorithms are fully automatic and do not require human interference. Photogrammetric workstations can perform tie-point measurements, DTM calculations, orthophoto construction, mosaicking and many other service operations in parallel using distributed calculations. Distributed calculations save time, reducing several days of calculation to several hours. Modern trends in computer technology show an increase of CPU cores in workstations and speed increases in local networks, and, as a result, a drop in the price of supercomputers or computer clusters that can contain hundreds or even thousands of computing nodes. Common distributed processing in a DPW is usually targeted at interactive work with a limited number of CPU cores and is not optimized for centralized administration. The bottleneck of common distributed computing in photogrammetry can be the limited LAN throughput and storage performance, since the processing of huge amounts of large raster images is needed.
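
    The days-to-hours claim above follows directly from how much of the workload parallelizes. A back-of-the-envelope Amdahl's-law estimate, using purely illustrative numbers rather than figures from the record, shows the effect:

        # Back-of-the-envelope Amdahl's-law estimate of the distributed speedup
        # described above. The parallel fraction and node counts are illustrative
        # assumptions, not figures from the record.

        def amdahl_speedup(parallel_fraction, workers):
            # Maximum speedup when `parallel_fraction` of the runtime parallelizes.
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / workers)

        p = 0.99              # most steps (tie points, DTM, orthophoto) are independent
        serial_hours = 72.0   # a nominal three-day serial job
        for n in (8, 64, 512):
            s = amdahl_speedup(p, n)
            print(f"{n:4d} nodes: speedup {s:6.1f}x -> {serial_hours / s:5.1f} h")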

  15. Computing, Information, and Communications Technology (CICT) Program Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies.

  17. A Longitudinal Examination of the Effects of Computer Self-Efficacy Growth on Performance during Technology Training

    Science.gov (United States)

    Downey, James P.; Kher, Hemant V.

    2015-01-01

    Technology training in the classroom is critical in preparing students for upper level classes as well as professional careers, especially in fields such as technology. One of the key enablers to this process is computer self-efficacy (CSE), which has an extensive stream of empirical research. Despite this, one of the missing pieces is how CSE…

  18. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  19. Edge computing technologies for Internet of Things: a primer

    Directory of Open Access Journals (Sweden)

    Yuan Ai

    2018-04-01

    Full Text Available With the rapid development of the mobile internet and Internet of Things applications, conventional centralized cloud computing is encountering severe challenges, such as high latency, low Spectral Efficiency (SE), and non-adaptive machine-type communication. Motivated by these challenges, a new technology is driving a trend that shifts the function of centralized cloud computing to the edge devices of networks. Several edge computing technologies, originating from different backgrounds, have been emerging to decrease latency, improve SE, and support the massive machine type of communication. This paper comprehensively presents a tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing. In particular, the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared. From the viewpoint of the radio access network, the differences between mobile edge computing and fog computing are highlighted, and the characteristics of fog computing-based radio access networks are discussed. Finally, open issues and future research directions are identified as well. Keywords: Internet of Things (IoT), Mobile edge computing, Cloudlets, Fog computing

  20. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

    Detector collaborations of the HERA experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include distributed mass storage management systems in a distributed and heterogeneous computing environment. At the very center of a distributed system, including tens of CPUs and network-attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology, with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA experiments at DESY, including Symmetric Multiprocessor Machines (84 processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focussing on the high-performance interconnect technology, details will be provided about the HIPPI-based 'backplane' configured around a 20 Gigabit/s Multi Media Router and the performance and efficiency of the related computer interfaces

  1. Ubiquitous Computing Technologies in Education

    Science.gov (United States)

    Hwang, Gwo-Jen; Wu, Ting-Ting; Chen, Yen-Jung

    2007-01-01

    The prosperous development of wireless communication and sensor technologies has attracted the attention of researchers from both computer and education fields. Various investigations have been made for applying the new technologies to education purposes, such that more active and adaptive learning activities can be conducted in the real world.…

  2. Weightbearing Computed Tomography of the Foot and Ankle: Emerging Technology Topical Review.

    Science.gov (United States)

    Barg, Alexej; Bailey, Travis; Richter, Martinus; de Cesar Netto, Cesar; Lintz, François; Burssens, Arne; Phisitkul, Phinit; Hanrahan, Christopher J; Saltzman, Charles L

    2018-03-01

    In the last decade, cone-beam computed tomography technology with improved designs allowing flexible gantry movements has allowed both supine and standing weight-bearing imaging of the lower extremity. There is an increasing amount of literature describing the use of weightbearing computed tomography in patients with foot and ankle disorders. To date, there is no review article summarizing this imaging modality in the foot and ankle. Therefore, we performed a systematic literature review of relevant clinical studies targeting the use of weightbearing computed tomography in diagnosis of patients with foot and ankle disorders. Furthermore, this review aims to offer insight to those with interest in considering possible future research opportunities with use of this technology. Level V, expert opinion.

  3. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Kozacik, Stephen [EM Photonics, Inc., Newark, DE (United States)

    2017-05-15

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  4. The Impact of Cloud Computing Technologies in E-learning

    Directory of Open Access Journals (Sweden)

    Hosam Farouk El-Sofany

    2013-01-01

    Full Text Available Cloud computing is a new computing model based on grid computing, distributed computing, parallel computing and virtualization technologies, which together define the shape of a new technology. It is the core technology of the next generation of network computing platforms and, especially in the field of education, cloud computing is the basic environment and platform for future e-learning. It provides secure data storage, convenient internet services and strong computing power. This article mainly focuses on research into the application of cloud computing in e-learning environments. The research study shows that the cloud platform is valued by both students and instructors for achieving course objectives. The paper presents the nature, benefits and services of cloud computing as a platform for e-learning environments.

  5. Basic research and 12 years of clinical experience in computer-assisted navigation technology: a review.

    Science.gov (United States)

    Ewers, R; Schicho, K; Undt, G; Wanschitz, F; Truppe, M; Seemann, R; Wagner, A

    2005-01-01

    Computer-aided surgical navigation technology is commonly used in craniomaxillofacial surgery. It offers substantial improvement regarding esthetic and functional aspects in a range of surgical procedures. Based on augmented reality principles, where the real operative site is merged with computer-generated graphic information, computer-aided navigation systems were employed, among other procedures, in dental implantology, arthroscopy of the temporomandibular joint, osteotomies, distraction osteogenesis, image-guided biopsies and removals of foreign bodies. The decision to perform a procedure with or without computer-aided intraoperative navigation depends on the expected benefit to the procedure as well as on the technical expenditure necessary to achieve that goal. This paper comprises the experience gained in 12 years of research, development and routine clinical application. One hundred and fifty-eight operations with successful application of surgical navigation technology--divided into five groups--are evaluated regarding the criteria "medical benefit" and "technical expenditure" necessary to perform these procedures. Our results indicate that the medical benefit is likely to outweigh the technical expenditure, with few exceptions (calvaria transplant, resection of the temporal bone, reconstruction of the orbital floor). Especially in dental implantology, specialized software reduces the time and additional costs necessary to plan and perform procedures with computer-aided surgical navigation.

  6. Computer-Related Task Performance

    DEFF Research Database (Denmark)

    Longstreet, Phil; Xiao, Xiao; Sarker, Saonee

    2016-01-01

    The existing information system (IS) literature has acknowledged computer self-efficacy (CSE) as an important factor contributing to enhancements in computer-related task performance. However, the empirical results of CSE on performance have not always been consistent, and increasing an individual's CSE is often a cumbersome process. Thus, we introduce the theoretical concept of self-prophecy (SP) and examine how this social influence strategy can be used to improve computer-related task performance. Two experiments are conducted to examine the influence of SP on task performance. Results show that SP and CSE interact to influence performance. Implications are then discussed in terms of organizations’ ability to increase performance....

  7. Multimedia Image Technology and Computer Aided Manufacturing Engineering Analysis

    Science.gov (United States)

    Nan, Song

    2018-03-01

    Since the reform and opening up, with the continuous development of science and technology in China, more and more advanced technologies have emerged under the trend of diversification. Multimedia image technology, for example, has had a significant and positive impact on computer-aided manufacturing engineering in China. From the perspective of scientific and technological advancement, multimedia image technology has a very positive influence on the application and development of computer-aided manufacturing engineering, in both its functions and their practical use. Therefore, this paper starts from the concept of multimedia image technology and analyzes its application in computer-aided manufacturing engineering.

  8. Infinite possibilities: Computational structures technology

    Science.gov (United States)

    Beam, Sherilee F.

    1994-12-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. As these environments and systems evolve, computational structures technology will

  9. Information technology and computational physics

    CERN Document Server

    Kóczy, László; Mesiar, Radko; Kacprzyk, Janusz

    2017-01-01

    A broad spectrum of modern Information Technology (IT) tools, techniques, main developments and still open challenges is presented. Emphasis is on new research directions in various fields of science and technology that are related to data analysis, data mining, knowledge discovery, information retrieval, clustering and classification, decision making and decision support, control, computational mathematics and physics, to name a few. Applications in many relevant fields are presented, notably in telecommunication, social networks, recommender systems, fault detection, robotics, image analysis and recognition, electronics, etc. The methods used by the authors range from high level formal mathematical tools and techniques, through algorithmic and computational tools, to modern metaheuristics.

  10. RISC Processors and High Performance Computing

    Science.gov (United States)

    Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.

  11. (CICT) Computing, Information, and Communications Technology Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  12. Optical Computers and Space Technology

    Science.gov (United States)

    Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela

    1995-01-01

    The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates to overcome the extreme limitations that conventional electronic logic circuits impose on the growth of the speed and complexity of today's computations. The new optical technology has increased the demand for high-quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high-quality optical materials is processible in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We will discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties which are quite useful for optical computer technology.

  13. High performance computing and communications: FY 1996 implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-16

    The High Performance Computing and Communications (HPCC) Program was formally authorized by passage of the High Performance Computing Act of 1991, signed on December 9, 1991. Twelve federal agencies, in collaboration with scientists and managers from US industry, universities, and research laboratories, have developed the Program to meet the challenges of advancing computing and associated communications technologies and practices. This plan provides a detailed description of the agencies' HPCC implementation plans for FY 1995 and FY 1996. This Implementation Plan contains three additional sections. Section 3 provides an overview of the HPCC Program definition and organization. Section 4 contains a breakdown of the five major components of the HPCC Program, with an emphasis on the overall directions and milestones planned for each one. Section 5 provides a detailed look at HPCC Program activities within each agency.

  14. Evolution of Cloud Computing and Enabling Technologies

    OpenAIRE

    Rabi Prasad Padhy; Manas Ranjan Patra

    2012-01-01

    We present an overview of the history of forecasting software over the past 25 years, concentrating especially on the interaction between computing and technologies, from mainframe computing to cloud computing; cloud computing is the latest of these. To deliver the vision of these various computing models, this paper briefly explains the architecture, characteristics, advantages, applications and issues of computing models such as PC computing and internet computing, and related technologie...

  15. 6th International Conference on High Performance Scientific Computing

    CERN Document Server

    Phu, Hoang; Rannacher, Rolf; Schlöder, Johannes

    2017-01-01

    This proceedings volume highlights a selection of papers presented at the Sixth International Conference on High Performance Scientific Computing, which took place in Hanoi, Vietnam on March 16-20, 2015. The conference was jointly organized by the Heidelberg Institute of Theoretical Studies (HITS), the Institute of Mathematics of the Vietnam Academy of Science and Technology (VAST), the Interdisciplinary Center for Scientific Computing (IWR) at Heidelberg University, and the Vietnam Institute for Advanced Study in Mathematics, Ministry of Education. The contributions cover a broad, interdisciplinary spectrum of scientific computing and showcase recent advances in theory, methods, and practical applications. Subjects covered include numerical simulation, methods for optimization and control, parallel computing, and software development, as well as the applications of scientific computing in physics, mechanics, biomechanics and robotics, material science, hydrology, biotechnology, medicine, transport, scheduling, and in...

  16. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    Science.gov (United States)

    Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.

  17. Survey of computer vision technology for UAV navigation

    Science.gov (United States)

    Xie, Bo; Fan, Xiang; Li, Sijian

    2017-11-01

    carried out at high speed. The system is applied to rapid response systems. (2) The visual system of a distributed network. There are several discrete image data acquisition sensors in different locations, which transmit image data to the node processor to increase the sampling rate. (3) The visual system combined with observers. The system combines image sensors with external observers to make up for the lack of visual equipment. To some degree, these systems overcome the shortcomings of early visual systems, including low frequency, low processing efficiency and strong noise. In the end, the difficulties of navigation based on computer vision technology in practical applications are briefly discussed. (1) Due to the huge workload of image operations, the real-time performance of the system is poor. (2) Due to the large environmental impact, the anti-interference ability of the system is poor. (3) Due to its ability to work only in particular environments, the system has poor adaptability.

  18. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  19. Review of Enabling Technologies to Facilitate Secure Compute Customization

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological University; Caldwell, Blake A [ORNL; Hicks, Susan Elaine [ORNL; Koch, Scott M [ORNL; Naughton, III, Thomas J [ORNL; Pelfrey, Daniel S [ORNL; Pogge, James R [Tennessee Technological University; Scott, Stephen L [Tennessee Technological University; Shipman, Galen M [ORNL; Sorrillo, Lawrence [ORNL

    2014-12-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data for a variety of users, often requiring strong separation between job allocations. There are many challenges to establishing these secure enclaves within the shared infrastructure of high-performance computing (HPC) environments. The isolation mechanisms in the system software are the basic building blocks for enabling secure compute enclaves. There are a variety of approaches and the focus of this report is to review the different virtualization technologies that facilitate the creation of secure compute enclaves. The report reviews current operating system (OS) protection mechanisms and modern virtualization technologies to better understand the performance/isolation properties. We also examine the feasibility of running "virtualized" computing resources as non-privileged users, and providing controlled administrative permissions for standard users running within a virtualized context. Our examination includes technologies such as Linux containers (LXC [32], Docker [15]) and full virtualization (KVM [26], Xen [5]). We categorize these different approaches to virtualization into two broad groups: OS-level virtualization and system-level virtualization. The OS-level virtualization uses containers to allow a single OS kernel to be partitioned to create Virtual Environments (VE), e.g., LXC. The resources within the host's kernel are only virtualized in the sense of separate namespaces. In contrast, system-level virtualization uses hypervisors to manage multiple OS kernels and virtualize the physical resources (hardware) to create Virtual Machines (VM), e.g., Xen, KVM. This terminology of VE and VM, detailed in Section 2, is used throughout the report to distinguish between the two different approaches to providing virtualized execution
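
    The VE approach can be made concrete with a small experiment. The sketch below is a minimal illustration, not material from the report: it uses a Linux UTS namespace (the kernel facility that containers such as LXC and Docker build on) to give a child process a private hostname while the parent's view stays unchanged. It assumes Linux, Python 3.12 or later for os.unshare, and root privileges:

        # Minimal illustration of OS-level virtualization: a child process gets
        # a private UTS (hostname) namespace, the same kernel mechanism that
        # containers such as LXC and Docker build on. Assumes Linux, root
        # privileges, and Python 3.12+ (for os.unshare).
        import os
        import socket

        if os.fork() == 0:                  # child process
            os.unshare(os.CLONE_NEWUTS)     # detach into a new UTS namespace
            socket.sethostname("enclave")   # visible only inside the namespace
            print("inside VE :", socket.gethostname())
            os._exit(0)

        os.wait()
        print("on host   :", socket.gethostname())  # host name is unchanged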

  20. 3rd International Conference on Computer & Communication Technologies

    CERN Document Server

    Bhateja, Vikrant; Raju, K; Janakiramaiah, B

    2017-01-01

    The book is a compilation of high-quality scientific papers presented at the 3rd International Conference on Computer & Communication Technologies (IC3T 2016). The individual papers address cutting-edge technologies and applications of soft computing, artificial intelligence and communication. In addition, a variety of further topics are discussed, which include data mining, machine intelligence, fuzzy computing, sensor networks, signal and image processing, human-computer interaction, web intelligence, etc. As such, it offers readers a valuable and unique resource.

  1. Philosophy of computing and information technology

    NARCIS (Netherlands)

    Brey, Philip A.E.; Soraker, Johnny; Meijers, A.

    2009-01-01

    Philosophy has been described as having taken a “computational turn,” referring to the ways in which computers and information technology throw new light upon traditional philosophical issues, provide new tools and concepts for philosophical reasoning, and pose theoretical and practical questions

  2. Computer Science and Technology Publications. NBS Publications List 84.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  3. FY 2000 Blue Book: High Performance Computing and Communications: Information Technology Frontiers for a New Millennium

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — As we near the dawn of a new millennium, advances made possible by computing, information, and communications research and development (R and D) -- once barely...

  4. High-performance computing using FPGAs

    CERN Document Server

    Benkrid, Khaled

    2013-01-01

    This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relatively low power of reconfigurable hardware, in the form of Field Programmable Gate Arrays (FPGAs), in High Performance Computing (HPC) applications. It presents the latest developments in this field from the applications, architecture, and tools and methodologies points of view. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community. The book includes: Thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation (e.g. computational fluid dynamics and seismic modeling), cryptanalysis, astronomical N-body simulation, and circuit simulation. Seven architecture chapters which...

  5. High-performance computing on GPUs for resistivity logging of oil and gas wells

    Science.gov (United States)

    Glinskikh, V.; Dudaev, A.; Nechaev, O.; Surodina, I.

    2017-10-01

    We developed and implemented into software an algorithm for high-performance simulation of electrical logs from oil and gas wells using high-performance heterogeneous computing. The numerical solution of the 2D forward problem is based on the finite-element method and the Cholesky decomposition for solving a system of linear algebraic equations (SLAE). Software implementations of the algorithm were made using NVIDIA CUDA technology and computing libraries, allowing us to perform the decomposition of the SLAE and find its solution on the central processing unit (CPU) and the graphics processing unit (GPU). The calculation time is analyzed as a function of the matrix size and the number of its non-zero elements. We estimated the computing speed on CPU and GPU, including high-performance heterogeneous CPU-GPU computing. Using the developed algorithm, we simulated resistivity data in realistic models.
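
    A minimal CPU-side sketch of the solver step described above, using SciPy's Cholesky routines on an illustrative symmetric positive-definite system (the FEM assembly itself is omitted; a GPU variant could substitute CUDA or CuPy equivalents):

        # Solve the SLAE A x = b by Cholesky decomposition, as in the CPU path.
        import numpy as np
        from scipy.linalg import cho_factor, cho_solve

        n = 1000
        m = np.random.rand(n, n)
        a = m @ m.T + n * np.eye(n)   # illustrative symmetric positive-definite matrix
        b = np.random.rand(n)

        c, low = cho_factor(a)        # factorization A = L L^T
        x = cho_solve((c, low), b)    # forward/back substitution
        print("residual norm:", np.linalg.norm(a @ x - b))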

  6. [Computer technologies in teaching pathological anatomy].

    Science.gov (United States)

    Ponomarev, A B; Fedorov, D N

    2015-01-01

    The paper describes more than 20 years of experience with personal computers at the Academician A.L. Strukov Department of Pathological Anatomy. It shows the objective necessity of introducing computer technologies at all stages of acquiring skills in anatomical pathology, including lectures, students' independent work, knowledge testing, etc.

  7. Air Force Science & Technology Issues & Opportunities Regarding High Performance Embedded Computing

    Science.gov (United States)

    2009-09-23

    [Briefing-slide extraction residue; recoverable topics: large-scale simulations of neuromorphic computing models; GOTCHA radar video SAR for wide-area persistent surveillance; a neuromorphic example on robust recognition of occluded text; and a notional architecture of 16 cores per chip, 50 chips per stack, and 10 x 10 stacks per board using embedded DRAM and FPGA elements.]

  8. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology for Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues, but profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, and network usage, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear, suggesting particular types of performance bottlenecks. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual approach is effective in identifying trends and anomalies of the systems.
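
    To make the sampling-and-similarity idea concrete, here is a small hypothetical sketch (not the authors' code): per-node metric series are compared against the fleet-average "behavioral line", and nodes whose distance is an outlier are flagged.

        # Synthetic profile data: regularly sampled CPU load for 50 nodes.
        import numpy as np

        rng = np.random.default_rng(0)
        nodes, samples = 50, 200
        cpu = rng.normal(0.5, 0.05, size=(nodes, samples))
        cpu[7] += np.linspace(0.0, 0.4, samples)        # one node slowly drifting

        mean_line = cpu.mean(axis=0)                    # typical behavior over time
        dist = np.linalg.norm(cpu - mean_line, axis=1)  # Euclidean similarity measure
        outliers = np.flatnonzero(dist > dist.mean() + 3 * dist.std())
        print("anomalous nodes:", outliers)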

  9. Women and Computer Based Technologies: A Feminist Perspective.

    Science.gov (United States)

    Morritt, Hope

    The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…

  10. Is Computer Science Compatible with Technological Literacy?

    Science.gov (United States)

    Buckler, Chris; Koperski, Kevin; Loveland, Thomas R.

    2018-01-01

    Although technology education evolved over time, and pressure increased to infuse more engineering principles and increase links to STEM (science, technology, engineering, and mathematics) initiatives, there has never been an official alignment between technology and engineering education and computer science. There is movement at the federal level…

  11. Technologies and tools for high-performance distributed computing. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Karonis, Nicholas T.

    2000-05-01

    In this project we studied the practical use of the MPI message-passing interface in advanced distributed computing environments. We built on the existing software infrastructure provided by the Globus Toolkit(TM), the MPICH portable implementation of MPI, and the MPICH-G integration of MPICH with Globus. As a result of this project we have replaced MPICH-G with its successor MPICH-G2, which is also an integration of MPICH with Globus. MPICH-G2 delivers significant improvements in message-passing performance compared to its predecessor MPICH-G, and was based on superior software design principles, resulting in a software base in which it was much easier to make the functional extensions and improvements we did. Using Globus services, we replaced the default implementation of MPI's collective operations in MPICH-G2 with more efficient multilevel topology-aware collective operations which, in turn, led to the development of a new timing methodology for broadcasts [8]. MPICH-G2 was extended to include client/server functionality from the MPI-2 standard [23] to facilitate remote visualization applications and, through the use of MPI idioms, MPICH-G2 provided application-level control of quality-of-service parameters as well as application-level discovery of underlying Grid-topology information. Finally, MPICH-G2 was successfully used in a number of applications, including an award-winning, record-setting computation in numerical relativity. In the sections that follow we describe in detail the accomplishments of this project, present experimental results quantifying the performance improvements, and conclude with a discussion of our applications experiences. This project resulted in a significant increase in the utility of MPICH-G2.
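
    For readers unfamiliar with MPI collectives of the kind MPICH-G2 optimizes, a minimal broadcast looks like the following (an illustrative mpi4py sketch, not project code; run with, e.g., mpiexec -n 4 python demo.py):

        # Broadcast a Python object from rank 0 to all ranks of the communicator.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        data = {"step": 42} if rank == 0 else None
        data = comm.bcast(data, root=0)  # topology-aware versions route this to
                                         # minimize traffic over slow WAN links
        print(f"rank {rank} received {data}")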

  12. High-performance computing for structural mechanics and earthquake/tsunami engineering

    CERN Document Server

    Hori, Muneo; Ohsaki, Makoto

    2016-01-01

    Huge earthquakes and tsunamis have caused serious damage to important structures such as civil infrastructure elements, buildings and power plants around the globe. To quantitatively evaluate such damage processes and to design effective prevention and mitigation measures, the latest high-performance computational mechanics technologies, which include terascale to petascale computers, can offer powerful tools. The phenomena covered in this book include seismic wave propagation in the crust and soil, seismic response of infrastructure elements such as tunnels considering soil-structure interactions, seismic response of high-rise buildings, seismic response of nuclear power plants, tsunami run-up over coastal towns and tsunami inundation considering fluid-structure interactions. The book provides all necessary information for addressing these phenomena, ranging from the fundamentals of high-performance computing for finite element methods, key algorithms of accurate dynamic structural analysis, fluid flows ...

  13. Merging Technology and Emotions: Introduction to Affective Computing.

    Science.gov (United States)

    Brigham, Tara J

    2017-01-01

    Affective computing technologies are designed to sense and respond based on human emotions. This technology allows a computer system to process information gathered from various sensors to assess the emotional state of an individual. The system then offers a distinct response based on what it "felt." While this is completely unlike how most people interact with electronics today, this technology is likely to trickle into everyday life in the future. This column will explain what affective computing is, some of its benefits, and concerns with its adoption. It will also provide an overview of its implications in the library setting and offer selected examples of how and where it is currently being used.

  14. Design Anthropology, Emerging Technologies and Alternative Computational Futures

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte

    Emerging technologies are providing a new field for design anthropological inquiry that unites experiences, imaginaries and materialities in complex ways and demands new approaches to developing sustainable computational futures.

  15. Performing quantum computing experiments in the cloud

    Science.gov (United States)

    Devitt, Simon J.

    2016-09-01

    Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector, combined with extraordinary theoretical and experimental progress, has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
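
    For context, the kind of circuit submitted to such cloud-hosted hardware can be assembled in a few lines; the Bell-state sketch below uses Qiskit locally (illustrative only; actual cloud submission depends on the provider's current SDK):

        # Build a two-qubit Bell-state circuit of the sort run on the IBM chip.
        from qiskit import QuantumCircuit

        qc = QuantumCircuit(2, 2)
        qc.h(0)                      # Hadamard: superposition on qubit 0
        qc.cx(0, 1)                  # CNOT: entangle qubits 0 and 1
        qc.measure([0, 1], [0, 1])   # ideal outcomes: '00' and '11', 50/50
        print(qc.draw())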

  16. Education & Technology: Reflections on Computing in Classrooms.

    Science.gov (United States)

    Fisher, Charles, Ed.; Dwyer, David C., Ed.; Yocam, Keith, Ed.

    This volume examines learning in the age of technology, describes changing practices in technology-rich classrooms, and proposes new ways to support teachers as they incorporate technology into their work. It commemorates the eleventh anniversary of the Apple Classrooms of Tomorrow (ACOT) Project, when Apple Computer, Inc., in partnership with a…

  17. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  18. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  19. The implementation of AI technologies in computer wargames

    Science.gov (United States)

    Tiller, John A.

    2004-08-01

    Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much more deeply about military situations before taking action. A good AI opponent would also incorporate national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

  20. The state of ergonomics for mobile computing technology.

    Science.gov (United States)

    Dennerlein, Jack T

    2015-01-01

    Because mobile computing technologies, such as notebook computers, smart mobile phones, and tablet computers, afford users many different configurations through their intended mobility, there is concern about their effects on musculoskeletal pain and a need for usage recommendations. Therefore, the main goal of this paper is to determine which best practices surrounding the use of mobile computing devices can be gleaned from current field and laboratory studies of these devices. An expert review was completed. Field studies have documented various user configurations, which often include non-neutral postures, that users adopt when using mobile technology, along with some evidence suggesting that longer durations of use are associated with more discomfort. It is therefore prudent for users to take advantage of their mobility and not get stuck in any given posture for too long. The use of accessories such as appropriate cases or riser stands, as well as external keyboards and pointing devices, can also improve postures and comfort. Overall, the state of ergonomics for mobile technology is a work in progress and there are more research questions to be addressed.

  1. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a ~10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
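
    A back-of-the-envelope check of the quoted 10^12 figure, using the numbers above (1500 m^3 = 1.5 x 10^9 cm^3):

        \[
        \underbrace{\frac{10^{16}}{20 \times 1200}}_{\text{brain}} \approx 4\times10^{11}
        \qquad
        \underbrace{\frac{10^{15}}{(3\times10^{6})(1.5\times10^{9})}}_{\text{supercomputer}} \approx 2\times10^{-1}
        \qquad
        \frac{4\times10^{11}}{2\times10^{-1}} \approx 2\times10^{12}
        \]

    in units of operations/s/W/cm^3, consistent with the stated advantage.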

  2. Information Technology in project-organized electronic and computer technology engineering education

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    1999-01-01

    This paper describes the integration of IT in the education of electronic and computer technology engineers at the Institute of Electronic Systems, Aalborg University, Denmark. At the Institute, Information Technology is an important tool in all aspects of the education as well as for communication

  3. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  4. Performance Measurements in a High Throughput Computing Environment

    CERN Document Server

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need for computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where the utilisation efficiency needs improvement. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics by executing benchmark applications on computing resources. Instead, the model-based approach implies the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...
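
    A toy example of the measurement-based approach (illustrative only, not the thesis code): time a representative compute-bound kernel and report a throughput metric.

        # Measurement-based profiling: benchmark a kernel, keep the best time.
        import time
        import numpy as np

        n, repeats = 2000, 5
        a, b = np.random.rand(n, n), np.random.rand(n, n)
        best = float("inf")
        for _ in range(repeats):
            t0 = time.perf_counter()
            a @ b                            # compute-bound matrix product
            best = min(best, time.perf_counter() - t0)

        flops = 2 * n**3                     # multiply-adds in an n x n product
        print(f"best {best:.3f} s, ~{flops / best / 1e9:.1f} GFLOP/s")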

  5. A High Performance VLSI Computer Architecture For Computer Graphics

    Science.gov (United States)

    Chin, Chi-Yuan; Lin, Wen-Tai

    1988-10-01

    A VLSI computer architecture, consisting of multiple processors, is presented in this paper to satisfy the demands of modern computer graphics, e.g., high resolution, realistic animation, and real-time display. All processors share a global memory which is partitioned into multiple banks. Through a crossbar network, data from one memory bank can be broadcast to many processors. Processors are physically interconnected through a hyper-crossbar network (a crossbar-like network). By programming the network, the topology of communication links among processors can be reconfigured to satisfy the specific dataflows of different applications. Each processor consists of a controller, arithmetic operators, local memory, a local crossbar network, and I/O ports to communicate with other processors, memory banks, and a system controller. Operations in each processor are characterized into two modes, i.e., object domain and space domain, to fully utilize the data-independency characteristics of graphics processing. Special graphics features such as 3D-to-2D conversion, shadow generation, texturing, and reflection can be easily handled. With the current high density interconnection (MI) technology, it is feasible to implement a 64-processor system to achieve 2.5 billion operations per second, a performance needed in most advanced graphics applications.
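
    As a quick sanity check of the stated aggregate rate, the implied per-processor throughput is

        \[
        \frac{2.5\times10^{9}\ \text{ops/s}}{64\ \text{processors}} \approx 3.9\times10^{7}\ \text{ops/s per processor.}
        \]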

  6. Emerging Computation and Information teChnologies for Education : Proceeding of 2012 International Conference on Emerging Computation and Information teChnologies for Education

    CERN Document Server

    Xu, Linli; Tian, Wenya

    2012-01-01

    The 2012 International Conference on Emerging Computation and Information teChnologies for Education (ECICE 2012) was held on Jan. 15-16, 2012, in Hangzhou, China. The main results of the conference are presented in this proceedings book of carefully reviewed and accepted papers addressing the hottest issues in emerging computation and information technologies used for education. The volume covers a wide range of topics in the area, including Computer-Assisted Education, Educational Information Systems, Web-based Learning, etc.

  7. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has been widely used; in particular, the success of object-oriented and multimedia technologies has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, with increasingly extensive applications. In recent years, with the rapid development of information technology, the traditional way of managing communication resources can no longer effectively meet practical needs: management still relies on the original tools and methods for equipment management and maintenance, which has caused many problems. It is very difficult for non-professionals to understand the equipment and its status in communication resource management, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. Aimed at the above problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  8. Mathematics for engineering, technology and computing science

    CERN Document Server

    Martin, Hedley G

    1970-01-01

    Mathematics for Engineering, Technology and Computing Science is a text on mathematics for courses in engineering, technology, and computing science. It covers linear algebra, ordinary differential equations, and vector analysis, together with line and multiple integrals. This book consists of eight chapters and begins with a discussion on determinants and linear equations, with emphasis on how the value of a determinant is defined and how it may be obtained. Solution of linear equations and the dependence between linear equations are also considered. The next chapter introduces the reader to

  9. High performance computing and communications: FY 1995 implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-04-01

    The High Performance Computing and Communications (HPCC) Program was formally established following passage of the High Performance Computing Act of 1991 signed on December 9, 1991. Ten federal agencies in collaboration with scientists and managers from US industry, universities, and laboratories have developed the HPCC Program to meet the challenges of advancing computing and associated communications technologies and practices. This plan provides a detailed description of the agencies' HPCC implementation plans for FY 1994 and FY 1995. This Implementation Plan contains three additional sections. Section 3 provides an overview of the HPCC Program definition and organization. Section 4 contains a breakdown of the five major components of the HPCC Program, with an emphasis on the overall directions and milestones planned for each one. Section 5 provides a detailed look at HPCC Program activities within each agency. Although the Department of Education is an official HPCC agency, its current funding and reporting of crosscut activities goes through the Committee on Education and Health Resources, not the HPCC Program. For this reason the Implementation Plan covers nine HPCC agencies.

  10. Advances in Computing and Information Technology : Proceedings of the Second International

    CERN Document Server

    Nagamalai, Dhinaharan; Chaki, Nabendu

    2012-01-01

    The international conference on Advances in Computing and Information Technology (ACITY 2012) provides an excellent international forum for both academics and professionals to share knowledge and results in the theory, methodology and applications of Computer Science and Information Technology. The Second International Conference on Advances in Computing and Information Technology (ACITY 2012), held in Chennai, India, during July 13-15, 2012, covered a number of topics in all major fields of Computer Science and Information Technology including: networking and communications, network security and applications, web and internet computing, ubiquitous computing, algorithms, bioinformatics, digital image processing and pattern recognition, artificial intelligence, soft computing and applications. Following a rigorous review process, a number of high-quality papers, presenting not only innovative ideas but also well-founded evaluations and strong argumentation, were selected and collected in the present proceedings, ...

  11. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    International Nuclear Information System (INIS)

    Bouchard, Kristofer E.

    2016-01-01

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  12. 10th International Conference on Computing and Information Technology

    CERN Document Server

    Unger, Herwig; Meesad, Phayung

    2014-01-01

    Computer and Information Technology (CIT) are now involved in governmental, industrial, and business domains more than ever. Thus, it is important for CIT personnel to continue academic research to improve technology and its adoption in modern applications. Up-to-date research and technologies must be disseminated to researchers and the CIT community continuously to aid future development. The 10th International Conference on Computing and Information Technology (IC2IT 2014), organized by King Mongkut's University of Technology North Bangkok (KMUTNB) and partners, provides an exchange on the state of the art and future developments in the two key areas of this process: Computer Networking and Data Mining. Against the background of the foundation of ASEAN, it becomes clear that efficient languages, business principles and communication methods need to be adapted, unified and especially optimized to gain maximum benefit for the users and customers of future IT systems.

  13. Applications of Computer Technology in Complex Craniofacial Reconstruction

    Directory of Open Access Journals (Sweden)

    Kristopher M. Day, MD

    2018-03-01

    Conclusion: Modern 3D technology allows the surgeon to better analyze complex craniofacial deformities, precisely plan surgical correction with computer simulation of results, customize osteotomies, plan distractions, and print 3DPCI, as needed. Advanced 3D computer technology can be applied safely and can potentially improve aesthetic and functional outcomes after complex craniofacial reconstruction. These techniques warrant further study and may be reproducible in various centers of care.

  14. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  15. Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry

    Science.gov (United States)

    Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul

    2003-01-01

    Single-photon-counting laser altimetry is a new measurement technique offering significant advantages in vertical resolution, reducing instrument size, mass, and power, and reducing laser complexity as compared to analog or threshold detection laser altimetry techniques. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real-time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.

  16. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  17. Optimal Selection Method of Process Patents for Technology Transfer Using Fuzzy Linguistic Computing

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2014-01-01

    Under the open innovation paradigm, technology transfer of process patents is one of the most important mechanisms for manufacturing companies to implement process innovation and enhance their competitive edge. To achieve promising technology transfers, we need to evaluate the feasibility of process patents and optimally select the most appropriate patent according to the actual manufacturing situation. Hence, this paper proposes an optimal selection method for process patents using multiple criteria decision-making and 2-tuple fuzzy linguistic computing to avoid information loss during the processes of evaluation integration. An evaluation index system for the technology transfer feasibility of process patents is designed initially. Then, a fuzzy linguistic computing approach is applied to aggregate the evaluations of criteria weights for each criterion and the corresponding subcriteria. Furthermore, performance ratings for subcriteria and fuzzy aggregated ratings of criteria are calculated. Thus, we obtain the overall technology transfer feasibility of patent alternatives. Finally, a case study of aero-engine turbine manufacturing is presented to demonstrate the applicability of the proposed method.
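
    The 2-tuple linguistic model referred to above represents an assessment as a pair (s_i, alpha): a linguistic label plus a symbolic translation alpha in [-0.5, 0.5), so aggregation loses no information. A minimal sketch follows (illustrative label set and weights, not the paper's data):

        # 2-tuple fuzzy linguistic aggregation (Herrera-Martinez-style sketch).
        import math

        S = ["very low", "low", "medium", "high", "very high"]

        def to_two_tuple(beta):
            """Delta operator: beta in [0, len(S)-1] -> (label index, alpha)."""
            i = int(math.floor(beta + 0.5))   # nearest label; keeps alpha in [-0.5, 0.5)
            return i, beta - i

        def aggregate(ratings, weights):
            """Weighted average of label indices, returned as a 2-tuple."""
            beta = sum(r * w for r, w in zip(ratings, weights)) / sum(weights)
            return to_two_tuple(beta)

        # Three experts rate one patent's transfer feasibility (indices into S).
        idx, alpha = aggregate(ratings=[2, 3, 3], weights=[0.4, 0.4, 0.2])
        print(f"aggregated rating: ({S[idx]}, {alpha:+.2f})")   # -> (high, -0.40)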

  18. Computer simulation of steady-state performance of air-to-air heat pumps

    Energy Technology Data Exchange (ETDEWEB)

    Ellison, R D; Creswick, F A

    1978-03-01

    A computer model by which the performance of air-to-air heat pumps can be simulated is described. The intended use of the model is to evaluate analytically the improvements in performance that can be effected by various component improvements. The model is based on a trio of independent simulation programs originated at the Massachusetts Institute of Technology Heat Transfer Laboratory. The three programs have been combined so that user intervention and decision making between major steps of the simulation are unnecessary. The program was further modified by substituting a new compressor model and adding a capillary tube model, both of which are described. Performance predicted by the computer model is shown to be in reasonable agreement with performance data observed in our laboratory. Planned modifications by which the utility of the computer model can be enhanced in the future are described. User instructions and a FORTRAN listing of the program are included.

  19. Implications of Computer Technology. Harvard University Program on Technology and Society.

    Science.gov (United States)

    Taviss, Irene; Burbank, Judith

    Lengthy abstracts of a small number of selected books and articles on the implications of computer technology are presented, preceded by a brief state-of-the-art survey which traces the impact of computers on the structure of economic and political organizations and socio-cultural patterns. A summary statement introduces each of the three abstract…

  20. The Uses and Impacts of Mobile Computing Technology in Hot Spots Policing.

    Science.gov (United States)

    Koper, Christopher S; Lum, Cynthia; Hibdon, Julie

    2015-12-01

    Recent technological advances have much potential for improving police performance, but there has been little research testing whether they have made police more effective in reducing crime. To study the uses and crime control impacts of mobile computing technology in the context of geographically focused "hot spots" patrols. An experiment was conducted using 18 crime hot spots in a suburban jurisdiction. Nine of these locations were randomly selected to receive additional patrols over 11 weeks. Researchers studied officers' use of mobile information technology (IT) during the patrols using activity logs and interviews. Nonrandomized subgroup and multivariate analyses were employed to determine if and how the effects of the patrols varied based on these patterns. Officers used mobile computing technology primarily for surveillance and enforcement (e.g., checking automobile license plates and running checks on people during traffic stops and field interviews), and they noted both advantages and disadvantages to its use. Officers did not often use technology for strategic problem-solving and crime prevention. Given sufficient (but modest) dosages, the extra patrols reduced crime at the hot spots, but this effect was smaller in places where officers made greater use of technology. Basic applications of mobile computing may have little if any direct, measurable impact on officers' ability to reduce crime in the field. Greater training and emphasis on strategic uses of IT for problem-solving and crime prevention, and greater attention to its behavioral effects on officers, might enhance its application for crime reduction. © The Author(s) 2016.

  1. CLOUD COMPUTING TECHNOLOGY TRENDS

    Directory of Open Access Journals (Sweden)

    Cristian IVANUS

    2014-05-01

    Cloud computing has been a tremendous innovation, through which applications became available online, accessible through an Internet connection from any computing device (computer, smartphone or tablet). According to one of the most recent studies, conducted in 2012 by Everest Group and Cloud Connect, 57% of companies said they already use SaaS (Software as a Service) applications, and 38% reported using standard PaaS (Platform as a Service) tools. However, in most cases, the users of these solutions highlighted the fact that one of the main obstacles in the development of this technology is that, in the cloud, the application is not available without an Internet connection. The new challenge for cloud systems is now offline use, specifically accessing SaaS applications without being connected to the Internet. This topic is directly related to user productivity within companies, as productivity growth is one of the key promises of the transformation brought by cloud computing applications. The aim of this paper is the presentation of some important aspects related to offline cloud systems and regulatory trends in the European Union (EU).

  2. Beyond computer literacy: supporting youth's positive development through technology.

    Science.gov (United States)

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.

  3. Technology in Note Taking and Assessment: The Effects of Congruence on Student Performance

    Directory of Open Access Journals (Sweden)

    Matthew E. Barrett

    2014-01-01

    This study examined the encoding specificity principle in relation to traditional and computer-based note taking and assessment formats in higher education. Students (N = 79) took lecture notes either by hand (n = 40) or by computer (n = 39) and then completed either a computer-based or a paper-based assessment. When note taking and assessment formats were congruent, students scored significantly higher on the assessment than students whose note taking and assessment formats were incongruent. These findings highlight the importance of research on how in-class technology may affect student performance, and suggest that faculty and administrators seek to coordinate and standardize the use of assessment and note taking technologies where possible.

  4. Many-core technologies: The move to energy-efficient, high-throughput x86 computing (TFLOPS on a chip)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    With Moore's Law alive and well, more and more parallelism is introduced into all computing platforms at all levels of integration and programming to achieve higher performance and energy efficiency. Especially in the area of High-Performance Computing (HPC), users can entertain a combination of different hardware and software parallel architectures and programming environments. Those technologies range from vectorization and SIMD computation through shared-memory multi-threading (e.g. OpenMP) to distributed-memory message passing (e.g. MPI) on cluster systems. We will discuss HPC industry trends and Intel's approach to them, from processor/system architectures and research activities to hardware and software tools technologies. This includes the recently announced new Intel(r) Many Integrated Core (MIC) architecture for highly-parallel workloads and general purpose, energy efficient TFLOPS performance, some of its architectural features and its programming environment. At the end we will have a br...
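
    The first rung of that ladder, vectorization, can be illustrated in a few lines (a NumPy sketch standing in for compiler auto-vectorization; timings are machine-dependent):

        # Scalar loop vs. vectorized array expression over the same data.
        import time
        import numpy as np

        x = np.random.rand(10_000_000)

        t0 = time.perf_counter()
        y_loop = [3.0 * v + 1.0 for v in x]   # one element at a time
        t1 = time.perf_counter()
        y_vec = 3.0 * x + 1.0                 # dispatched to vectorized kernels
        t2 = time.perf_counter()

        print(f"loop {t1 - t0:.2f} s, vectorized {t2 - t1:.3f} s")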

  5. Development of superconductor electronics technology for high-end computing

    Energy Technology Data Exchange (ETDEWEB)

    Silver, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kleinsasser, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kerber, G [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Herr, Q [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Dorojevets, M [Department of Electrical and Computer Engineering, SUNY-Stony Brook, NY 11794-2350 (United States); Bunyk, P [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Abelson, L [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2003-12-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm^-2, 1.25 μm junction process, incorporated new CAD tools into our methodology, and demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s^-1, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density.

  6. Development of superconductor electronics technology for high-end computing

    International Nuclear Information System (INIS)

    Silver, A; Kleinsasser, A; Kerber, G; Herr, Q; Dorojevets, M; Bunyk, P; Abelson, L

    2003-01-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm^-2, 1.25 μm junction process, incorporated new CAD tools into our methodology, demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s^-1, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density

  7. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  8. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure

  9. 9th International Conference on Advanced Computing & Communication Technologies

    CERN Document Server

    Mandal, Jyotsna; Auluck, Nitin; Nagarajaram, H

    2016-01-01

    This book highlights a collection of high-quality peer-reviewed research papers presented at the Ninth International Conference on Advanced Computing & Communication Technologies (ICACCT-2015) held at Asia Pacific Institute of Information Technology, Panipat, India during 27–29 November 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Advanced Computing and Communication Technology.

  10. Teachers' Organization of Participation Structures for Teaching Science with Computer Technology

    Science.gov (United States)

    Subramaniam, Karthigeyan

    2016-08-01

    This paper describes a qualitative study that investigated the nature of participation structures and how they were organized by four science teachers when they constructed and communicated science content in their classrooms with computer technology. Participation structures focus on the activity structures and processes in social settings like classrooms, thereby providing glimpses into the complex dynamics of teacher-student interactions, configurations, and conventions during collective meaning making and knowledge creation. Data included observations, interviews, and focus group interviews. Analysis revealed that the dominant participation structure evident within participants' instruction with computer technology was the (Teacher) initiation-(Student and Teacher) response sequences-(Teacher) evaluate participation structure. Three key events characterized how participants organized this participation structure in their classrooms: setting the stage for interactive instruction, the joint activity, and maintaining accountability. Implications include the following: (1) teacher educators need to tap into the knowledge base that underscores science teachers' learning-to-teach philosophies when computer technology is used in instruction; (2) teacher educators need to emphasize the essential idea that learning and cognition are situated not within the computer technology but within the pedagogical practices, specifically the participation structures; (3) the pedagogical practices developed with the integration or use of computer technology, underscored by the teachers' own knowledge of classroom contexts and curriculum, need to be the focus of how students learn science content with computer technology, instead of focusing solely on how computer technology supports students' learning of science content.

  11. APA Summit on Medical Student Education Task Force on Informatics and Technology: learning about computers and applying computer technology to education and practice.

    Science.gov (United States)

    Hilty, Donald M; Hales, Deborah J; Briscoe, Greg; Benjamin, Sheldon; Boland, Robert J; Luo, John S; Chan, Carlyle H; Kennedy, Robert S; Karlinsky, Harry; Gordon, Daniel B; Yager, Joel; Yellowlees, Peter M

    2006-01-01

    This article provides a brief overview of important issues for educators regarding medical education and technology. The literature describes key concepts, prototypical technology tools, and model programs. A work group of psychiatric educators was convened three times by phone conference to discuss the literature. Findings were presented to and input was received from the 2005 Summit on Medical Student Education by APA and the American Directors of Medical Student Education in Psychiatry. Knowledge of, skills in, and attitudes toward medical informatics are important to life-long learning and modern medical practice. A needs assessment is a starting place, since student, faculty, institution, and societal factors bear consideration. Technology needs to "fit" into a curriculum in order to facilitate learning and teaching. Learning about computers and applying computer technology to education and clinical care are key steps in computer literacy for physicians.

  12. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN) is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE) that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain a thorough understanding of service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus it is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.
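
    As a generic illustration of composite service performance analysis (not the paper's actual model), one can treat the network path and the compute service as two queueing stages in tandem; with M/M/1 assumptions, the mean end-to-end response time is the sum of the per-stage delays:

        # Tandem M/M/1 stages for a composite network-compute service (toy model).
        def mm1_response(arrival_rate, service_rate):
            assert arrival_rate < service_rate, "stage must be stable"
            return 1.0 / (service_rate - arrival_rate)   # mean response time (s)

        lam = 80.0                                       # requests/s offered
        t_net = mm1_response(lam, service_rate=200.0)    # network stage
        t_cpu = mm1_response(lam, service_rate=120.0)    # compute stage
        print(f"mean end-to-end response: {(t_net + t_cpu) * 1e3:.1f} ms")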

  13. Cloud computing technologies applied in the virtual education of civil servants

    Directory of Open Access Journals (Sweden)

    Teodora GHERMAN

    2016-03-01

    From the perspective of education, e-learning through the use of Cloud Computing technologies represents one of the most important directions of educational software development, because Cloud Computing is developing rapidly and applies to all areas of the Information Society, including education. Virtual education systems on web platforms (e-learning) require numerous hardware and software resources. The convenience of Internet learning and the creation of web-based learning environments have become strengths of virtual education research, including the application of Cloud Computing technologies to the virtual education of civil servants. The article presents Cloud Computing technologies as a platform for virtual education on web platforms, along with their advantages and disadvantages compared with other technologies.

  14. The Acceptance of Computer Technology by Teachers in Early Childhood Education

    Science.gov (United States)

    Jeong, Hye In; Kim, Yeolib

    2017-01-01

    This study investigated kindergarten teachers' decision-making process regarding the acceptance of computer technology. We incorporated the Technology Acceptance Model framework, in addition to computer self-efficacy, subjective norm, and personal innovativeness in education technology as external variables. The data were obtained from 160…

  15. Computer science research and technology volume 3

    CERN Document Server

    Bauer, Janice P

    2011-01-01

    This book presents leading-edge research from across the globe in the field of computer science research, technology and applications. Each contribution has been carefully selected for inclusion based on the significance of the research to this fast-moving and diverse field. Some topics included are: network topology; agile programming; virtualization; and reconfigurable computing.

  16. Cloud manufacturing distributed computing technologies for global and sustainable manufacturing

    CERN Document Server

    Mehnen, Jörn

    2013-01-01

    Global networks, which are the primary pillars of the modern manufacturing industry and supply chains, can only cope with the new challenges, requirements and demands when supported by new computing and Internet-based technologies. Cloud Manufacturing: Distributed Computing Technologies for Global and Sustainable Manufacturing introduces a new paradigm for scalable, service-oriented, sustainable and globally distributed manufacturing systems. The eleven chapters in this book provide an updated overview of the latest technological developments and applications in relevant research areas. Following an introduction to the essential features of Cloud Computing, chapters cover a range of methods and applications, such as the factors that actually affect adoption of Cloud Computing technology in manufacturing companies and a new geometrical simplification method to stream 3-dimensional design and manufacturing data via the Internet. This is further supported by case studies and real-life data for Waste Electrical ...

  17. A Financial Technology Entrepreneurship Program for Computer Science Students

    Science.gov (United States)

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  18. A Revolution in Information Technology - Cloud Computing

    OpenAIRE

    Divya BHATT

    2012-01-01

    What is the Internet? It is a collection of “interconnected networks” represented as a Cloud in network diagrams, and Cloud Computing is a metaphor for certain parts of the Internet. IT enterprises and individuals are searching for a way to reduce the cost of computation, storage and communication. Cloud Computing is an Internet-based technology providing “On-Demand” solutions for addressing these scenarios, solutions that should be flexible enough for adaptation and responsive to requirements. The hug...

  19. The 9th international conference on computing and information technology

    CERN Document Server

    Unger, Herwig; Boonkrong, Sirapat; IC2IT2013

    2013-01-01

    This volume contains the papers of the 9th International Conference on Computing and Information Technology (IC2IT 2013), held at King Mongkut's University of Technology North Bangkok (KMUTNB), Bangkok, Thailand, on May 9th-10th, 2013. Traditionally, the conference is organized in conjunction with the National Conference on Computing and Information Technology, one of the leading Thai national events in the area of Computer Science and Engineering. The conference, as well as this volume, is structured into three main tracks: Data Networks/Communication, Data Mining/Machine Learning, and Human Interfaces/Image Processing.

  20. Discussion on the Technology and Method of Computer Network Security Management

    Science.gov (United States)

    Zhou, Jianlei

    2017-09-01

    With the rapid development of information technology, the application of computer network technology has penetrated all aspects of society, changed people's ways of life and work to a certain extent, and brought great convenience to people. But computer network technology is not a panacea: it can promote social development, but it can also cause damage to the community and the country. Due to the openness and ease of sharing of computer networks, among other characteristics, these technologies can have a very negative impact on computer network security; in particular, loopholes in the technical aspects can cause damage to network information. On this basis, this paper gives a brief analysis of computer network security management problems and security measures.

  1. A Heterogeneous High-Performance System for Computational and Computer Science

    Science.gov (United States)

    2016-11-15

    expand the research infrastructure at the institution but also to enhance the high-performance computing training provided to both undergraduate and... cloud computing, supercomputing, and the availability of cheap memory and storage led to enormous amounts of data to be sifted through in forensic... High-Performance Computing (HPC) tools that can be integrated with existing curricula and support our research to modernize and dramatically advance

  2. First steps in teaching computational thinking through mobile technology and robotics

    Directory of Open Access Journals (Sweden)

    Titipan Phetsrikran

    2017-07-01

    Programming, or computational thinking, is becoming recognized as a skill that should be taught in primary and secondary schools. One technique for teaching programming is to use robotics, but usually this requires students to program via a PC. The purpose of this study is to investigate the potential for using an iPad application and a robot that enable children to learn programming skills. This paper describes an application containing puzzles that involve creating a program to guide a physical robot from a start point to a goal. The application sends commands and controls the robot via Bluetooth and runs on the iPad with iOS. An initial experiment performed in a high school in Thailand explores how mobile technology and educational robotics can be applied to computational thinking in schools. The findings showed that the use of mobile technology opens up alternative styles of interaction in the classroom, with potential for highly collaborative activities and a greater focus on the learning domain.

  3. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  4. Missile signal processing common computer architecture for rapid technology upgrade

    Science.gov (United States)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain is comprised of two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to respond rapidly to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application

  5. INFLUENCE OF DEVELOPMENT OF COMPUTER TECHNOLOGIES ON TEACHING

    Directory of Open Access Journals (Sweden)

    Olgica Bešić

    2012-09-01

    Our times are characterized by strong changes in technology that have become reality in many areas of society. Compared to production, transport, services, and other sectors, education, as a rule, opens slowly to new technologies. However, children at home and outside school live in a technologically rich environment, and they expect education to change in accordance with the imperatives of education for the twenty-first century. In this sense, systems for automated data processing, multimedia systems, distance learning, virtual schools and other technologies are being introduced into education. They lead to an increase in students' activity, quality evaluation of their knowledge and, finally, their progress, all in accordance with individual abilities and knowledge. Mathematics and computers often appear together in the teaching process. In the teaching of mathematics, computers and software packages play a significant role. The programming requirements are not dominant; the emphasis is on mathematical content and the method of presentation. Computers are especially used in solving various mathematical tasks and in self-learning of mathematics. Still, many problems that require solutions appear in the process: how to organise lectures, practice, textbooks, collections of mathematical problems and written exams, and how to assign and check homework. The answers to these questions are not simple, and they will probably be sought continuously as the use of computers in the teaching process increases. In this paper I have tried to answer some of the questions above.

  6. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    This report describes computer performance evaluations for the FACOM 230-75 computers at JAERI. The evaluations cover the following items: (1) cost/benefit analysis of timesharing terminals, (2) analysis of the response time of timesharing terminals, (3) analysis of throughput time for batch job processing, (4) estimation of current potential demand for computer time, (5) determination of the appropriate number of card readers and line printers. These evaluations are made mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those people who are concerned with the management of computing installations. (author)

  7. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  8. DEISA2: supporting and developing a European high-performance computing ecosystem

    International Nuclear Information System (INIS)

    Lederer, H

    2008-01-01

    The DEISA Consortium has deployed and operated the Distributed European Infrastructure for Supercomputing Applications. Through the EU FP7 DEISA2 project (funded for three years as of May 2008), the consortium is continuing to support and enhance the distributed high-performance computing infrastructure and its activities and services relevant for applications enabling, operation, and technologies, as these are indispensable for the effective support of computational sciences for high-performance computing (HPC). The service-provisioning model will be extended from one that supports single projects to one supporting virtual European communities. Collaborative activities will also be carried out with new European and other international initiatives. Of strategic importance is cooperation with the PRACE project, which is preparing for the installation of a limited number of leadership-class Tier-0 supercomputers in Europe. The key role and aim of DEISA will be to deliver a turnkey operational solution for a persistent European HPC ecosystem that will integrate national Tier-1 centers and the new Tier-0 centers

  9. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In the report, the role and purposes of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of 'simulators' for operating personnel training. Among the listed applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)

  10. Technology Performance Level Assessment Methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Jesse D.; Bull, Diana L; Malins, Robert Joseph; Costello, Ronan Patrick; Aurelien Babarit; Kim Nielsen; Claudio Bittencourt Ferreira; Ben Kennedy; Kathryn Dykes; Jochem Weber

    2017-04-01

    The technology performance level (TPL) assessments can be applied at all technology development stages and associated technology readiness levels (TRLs). Even, and particularly, at low TRLs the TPL assessment is very effective, as it holistically considers a wide range of WEC attributes that determine the techno-economic performance potential of the WEC farm when fully developed for commercial operation. The TPL assessment also highlights potential showstoppers at the earliest possible stage of WEC technology development. Hence, the TPL assessment identifies the technology-independent “performance requirements.” In order to achieve a successful solution, the entirety of the performance requirements within the TPL must be considered because, in the end, all stakeholder needs must be met. The basis for performing a TPL assessment is the information provided in a dedicated format, the Technical Submission Form (TSF). The TSF requests the information from the WEC developer that is required to answer the questions posed in the TPL assessment document.

  11. InfoMall: An Innovative Strategy for High-Performance Computing and Communications Applications Development.

    Science.gov (United States)

    Mills, Kim; Fox, Geoffrey

    1994-01-01

    Describes the InfoMall, a program led by the Northeast Parallel Architectures Center (NPAC) at Syracuse University (New York). The InfoMall features a partnership of approximately 24 organizations offering linked programs in High Performance Computing and Communications (HPCC) technology integration, software development, marketing, education and…

  12. Using Computer Technology To Aid the Disabled Reader.

    Science.gov (United States)

    Balajthy, Ernest

    When matched for achievement level and educational objectives, computer technology can be particularly effective with at-risk students. Computer-assisted instructional software is the most widely available type of software. An exciting development pertinent to literacy education is the development of the "electronic book" (also called…

  13. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  14. The path toward HEP High Performance Computing

    International Nuclear Information System (INIS)

    Apostolakis, John; Brun, René; Gheata, Andrei; Wenzel, Sandro; Carminati, Federico

    2014-01-01

    High Energy Physics code has been known for making poor use of high-performance computing architectures. Efforts to optimise HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves between 10% and 50% of peak performance. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a 'High Performance' implementation. With the LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at the event level, or with a much larger effort at the track level. Apart from the shareable data structures, this typically implies a multiplication factor in memory consumption compared to the single-threaded version, together with sub-optimal handling of event processing tails. Besides this, the low-level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine-grained parallel approach. The talk will review the current optimisation activities within the SFT group with a particular emphasis on the development perspectives towards a simulation framework able to profit

  15. Development of a body motion interactive system with a weight voting mechanism and computer vision technology

    Science.gov (United States)

    Lin, Chern-Sheng; Chen, Chia-Tse; Shei, Hung-Jung; Lay, Yun-Long; Chiu, Chuang-Chien

    2012-09-01

    This study develops a body motion interactive system with computer vision technology. The application combines interactive games, performing arts, and an exercise training system. Multiple image processing and computer vision technologies are used in this study. The system can calculate the characteristics of an object's color and then perform color segmentation. When an action judgment is wrong, the system avoids the error with a weight voting mechanism, which sets a condition score and weight value for each action judgment and chooses the best action judgment from the weighted vote. Finally, this study estimated the reliability of the system in order to make improvements. The results showed that this method achieves good accuracy and stability during operation of the human-machine interface of the sports training system.
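
    A rough sketch of the weight-voting idea described above: each candidate action judgment carries a condition score and a per-source weight, and the weighted tally picks the winner. The action names, scores, and weights below are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical weight-voting sketch: each detector votes for an action
# judgment with a condition score; votes are scaled by a per-source weight
# and the highest weighted tally wins.
from collections import defaultdict

def weighted_vote(votes):
    """votes: iterable of (action, score, weight) triples."""
    tally = defaultdict(float)
    for action, score, weight in votes:
        tally[action] += score * weight
    return max(tally, key=tally.get)

votes = [("raise_left_arm", 0.8, 1.0),   # color-segmentation cue
         ("raise_left_arm", 0.6, 0.5),   # motion cue
         ("jump", 0.9, 0.4)]             # shape cue
print(weighted_vote(votes))              # -> raise_left_arm
```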

  16. [The automatic iris map overlap technology in computer-aided iridiagnosis].

    Science.gov (United States)

    He, Jia-feng; Ye, Hu-nian; Ye, Miao-yuan

    2002-11-01

    In the paper, iridology and computer-aided iridiagnosis technologies are briefly introduced, and the extraction method for the collarette contour is then investigated. The iris map can be overlaid on the original iris image based on collarette contour extraction. Research on collarette contour extraction and iris map overlay is of great importance to computer-aided iridiagnosis technologies.

  17. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PCs. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) to provide 3-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays, it is common to use graphics devices for intensive computational problems. There are several ways to use this extreme processing performance, but never before has it been so easy to program these devices. The CUDA (Compute Unified Device Architecture) introduced by NVIDIA Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS, ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds it up so that the code runs 10 times faster in the critical calculation segment. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is roughly 100 times slower. It is possible to implement the entire algorithm on the GPU, so we do not need to upload and download the data in every iteration. For maximal throughput, every thread runs with the same instructions
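
    As a flavor of the per-particle update such a code parallelizes (every particle follows the same instruction stream, matching the one-thread-per-particle GPU model described above), here is a minimal, illustrative sketch in Python/NumPy. It is not the authors' C++/CUDA code, and the harmonic lattice force is an assumption made for the toy example.

```python
# Illustrative data-parallel MD step (velocity Verlet); not the authors' code.
# Every particle is updated with identical instructions, the same pattern
# that maps onto one GPU thread per particle.
import numpy as np

def verlet_step(pos, vel, force, mass, dt, force_fn):
    vel_half = vel + 0.5 * dt * force / mass      # half-step velocity update
    new_pos = pos + dt * vel_half                 # position update
    new_force = force_fn(new_pos)                 # recompute forces
    new_vel = vel_half + 0.5 * dt * new_force / mass
    return new_pos, new_vel, new_force

# Toy system: 1024 "guest atoms" bound harmonically to lattice sites.
rng = np.random.default_rng(0)
sites = rng.uniform(0.0, 10.0, size=(1024, 3))
force_fn = lambda p: -(p - sites)                 # assumed harmonic force
pos = sites + 0.01 * rng.standard_normal(sites.shape)
vel = np.zeros_like(pos)
pos, vel, f = verlet_step(pos, vel, force_fn(pos), 1.0, 0.005, force_fn)
```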

  18. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, VV.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

    In the report, the role and purpose of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as: (a) nuclear safety research; (b) optimization of technical and economic parameters of operating nuclear plants; (c) planning and support of reactor experiments; (d) research and design of new devices and technologies; (e) design and development of 'simulators' for operating personnel training. Among the listed applications, the following aspects of computer simulation are discussed in the report: (f) neutron-physical, thermal and hydrodynamic models; (g) simulation of isotope composition change and damage dose accumulation in materials under irradiation; (h) simulation of reactor control structures. (authors)

  19. High performance fuel technology development

    Energy Technology Data Exchange (ETDEWEB)

    Koon, Yang Hyun; Kim, Keon Sik; Park, Jeong Yong; Yang, Yong Sik; In, Wang Kee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    o Development of High Plasticity and Annular Pellet - Development of strong candidates of ultra-high-burn-up fuel pellets for a PCI remedy - Development of fabrication technology for annular fuel pellets. o Development of High Performance Cladding Materials - Irradiation testing of HANA claddings in the Halden research reactor and evaluation of their in-pile performance - Development of the final candidates for the next-generation cladding materials - Development of the manufacturing technology for the dual-cooled fuel cladding tubes. o Irradiated Fuel Performance Evaluation Technology Development - Development of a performance analysis code system for the dual-cooled fuel - Development of fuel performance-proving technology. o Feasibility Studies on a Dual-Cooled Annular Fuel Core - Analysis of the properties of a reactor core with dual-cooled fuel - Feasibility evaluation of the dual-cooled fuel core. o Development of Design Technology for the Dual-Cooled Fuel Structure - Definition of technical issues and invention of concepts for the dual-cooled fuel structure - Basic design and development of the main structural components for dual-cooled fuel - Basic design of a dual-cooled fuel rod.

  20. FY 1996 Blue Book: High Performance Computing and Communications: Foundations for America`s Information Future

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — The Federal High Performance Computing and Communications (HPCC) Program will celebrate its fifth anniversary in October 1996 with an impressive array of...

  1. Guide to improving the performance of a manipulator system for nuclear fuel handling through computer controls. Final report

    International Nuclear Information System (INIS)

    Evans, J.M. Jr.; Albus, J.S.; Barbera, A.J.; Rosenthal, R.; Truitt, W.B.

    1975-11-01

    The Office of Developmental Automation and Control Technology of the Institute for Computer Sciences and Technology of the National Bureau of Standards provides advising services, standards and guidelines on interface and computer control systems, and performance specifications for the procurement and use of computer controlled manipulators and other computer based automation systems. These outputs help other agencies and industry apply this technology to increase productivity and improve work quality by removing men from hazardous environments. In FY 74 personnel from the Oak Ridge National Laboratory visited NBS to discuss the feasibility of using computer control techniques to improve the operation of remote control manipulators in nuclear fuel reprocessing. Subsequent discussions led to an agreement for NBS to develop a conceptual design for such a computer control system for the PaR Model 3000 manipulator in the Thorium Uranium Recycle Facility (TURF) at ORNL. This report provides the required analysis and conceptual design. Complete computer programs are included for testing of computer interfaces and for actual robot control in both point-to-point and continuous path modes

  2. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    Science.gov (United States)

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  3. Restricted access processor - An application of computer security technology

    Science.gov (United States)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  4. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    Science.gov (United States)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to support engineering design effectively. It is argued that, when examined in terms of these attributes, the presently available environment can be shown to be inadequate; a radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by the interaction of a large number of very simple models may be an inspiration for such algorithms; cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
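
    As a toy instance of the cellular-automaton idea invoked above (complex global behavior from many identical simple local rules, and hence an intrinsically parallel structure), here is a minimal sketch of one synchronous Game-of-Life update; the grid size and random seeding are arbitrary illustrations.

```python
# Minimal cellular automaton (Conway's Game of Life): every cell applies the
# same simple local rule, so the update parallelizes naturally over cells.
import numpy as np

def life_step(grid):
    # Count the eight neighbors of every cell (toroidal wrap-around).
    neighbors = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    # Alive next step with exactly 3 neighbors, or 2 if already alive.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(np.uint8)

grid = (np.random.default_rng(1).random((64, 64)) < 0.3).astype(np.uint8)
for _ in range(10):
    grid = life_step(grid)
print(grid.sum(), "cells alive after 10 steps")
```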

  5. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  6. High-performance computing — an overview

    Science.gov (United States)

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.
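
    To make the message-passing model mentioned above concrete, a minimal point-to-point exchange with mpi4py (chosen here purely for illustration; the overview itself is library-agnostic) might look like the following, run with e.g. `mpiexec -n 2 python demo.py`.

```python
# Minimal message-passing sketch using mpi4py: rank 0 sends a Python object,
# rank 1 receives it. Each rank runs this same script.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    data = {"payload": list(range(5))}
    comm.send(data, dest=1, tag=11)        # explicit send ...
elif rank == 1:
    data = comm.recv(source=0, tag=11)     # ... matched by an explicit receive
    print("rank 1 received", data)
```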

  7. Identification of risk factors of computer information technologies in education

    Directory of Open Access Journals (Sweden)

    Hrebniak M.P.

    2014-03-01

    The basic direction of development of secondary schools and vocational training is computer training of schoolchildren and students, including distance forms of education and the widespread use of world information systems. The purpose of this work is to determine risk factors for schoolchildren and students when using modern information and computer technologies. The results of the research made it possible to establish the dynamics of skill formation in using computer information technologies in education, and the characteristics of mental ability among schoolchildren and students during training in high school. Common risk factors when operating computer information technologies are: intensification and formalization of intellectual activity, adverse ergonomic parameters, unfavorable working posture, and exceedance of hygiene standards for chemical and physical characteristics. The priority preventive directions in applying computer information technology in education are: improvement of the visual parameters of activity, rationalization of ergonomic parameters, minimization of the adverse effects of chemical and physical conditions, and rationalization of work and rest schedules.

  8. High performance computing in Windows Azure cloud

    OpenAIRE

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower maintenance costs have contributed essentially to the growing popularity of cloud computing in all spheres of life, especially in business. In fact, cloud computing offers even more than this. With the use of virtual computing clusters, a runtime environment for high-performance computing can be efficiently implemented in a cloud as well. There are many advantages but also some disadvantages of cloud computing, some ...

  9. A brain-computer interface as input channel for a standard assistive technology software.

    Science.gov (United States)

    Zickler, Claudia; Riccio, Angela; Leotta, Francesco; Hillian-Tress, Sandra; Halder, Sebastian; Holz, Elisa; Staiger-Sälzer, Pit; Hoogerwerf, Evert-Jan; Desideri, Lorenzo; Mattia, Donatella; Kübler, Andrea

    2011-10-01

    Recently brain-computer interface (BCI) control was integrated into the commercial assistive technology product QualiWORLD (QualiLife Inc., Paradiso-Lugano, CH). Usability of the first prototype was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate and subjective workload/NASA Task Load Index) and user satisfaction (Quebec User Evaluation of Satisfaction with assistive Technology, QUEST 2.0) by four end-users with severe disabilities. Three assistive technology experts evaluated the device from a third person perspective. The results revealed high performance levels in communication and internet tasks. Users and assistive technology experts were quite satisfied with the device. However, none could imagine using the device in daily life without improvements. Main obstacles were the EEG-cap and low speed.
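
    For readers unfamiliar with the efficiency metric above, the information transfer rate of a BCI is commonly computed with the standard Wolpaw formula; the sketch below uses made-up example numbers, not the study's data.

```python
# Wolpaw information transfer rate (ITR): bits per selection for an N-class
# BCI at a given accuracy, scaled by the selection rate. Example values are
# illustrative only.
from math import log2

def itr_bits_per_selection(n: int, p: float) -> float:
    if p <= 1.0 / n:
        return 0.0                  # at or below chance: no information
    if p == 1.0:
        return log2(n)              # perfect accuracy
    return log2(n) + p * log2(p) + (1 - p) * log2((1 - p) / (n - 1))

def itr_bits_per_minute(n: int, p: float, selections_per_minute: float) -> float:
    return itr_bits_per_selection(n, p) * selections_per_minute

# e.g. a hypothetical 28-target speller at 90% accuracy, 4 selections/minute:
print(round(itr_bits_per_minute(28, 0.9, 4.0), 2), "bits/min")
```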

  10. The Adoption of Cloud Computing Technology for Library Services ...

    African Journals Online (AJOL)

    The study investigated the rationales for the adoption of cloud computing technology for library services in the NOUN Library. Issues examined include the existing computer networks available in the NOUN library, such as the LAN and WAN, and rationales for the adoption of cloud computing in the NOUN library, such as the need to disclose their collections ...

  11. Advanced fuel technology and performance

    International Nuclear Information System (INIS)

    1985-10-01

    The purpose of the Advisory Group Meeting on Advanced Fuel Technology and Performance was to review the experience with advanced fuel fabrication technology and its performance, and the peculiarities of the back end of the nuclear fuel cycle with regard to all types of reactors, and to outline future trends. As a result of the meeting, recommendations were made for the future conduct of work on advanced fuel technology and performance. A separate abstract was prepared for each of the 20 papers in this issue.

  12. Technological Capability and Firm Performance

    Directory of Open Access Journals (Sweden)

    Fernanda Maciel Reichert

    2014-08-01

    This research aims to investigate the relationship between investments in technological capability and economic performance in Brazilian firms. Based on economic development theory and on the history of developed countries, it is assumed that this relationship is positive. Through key indicators, 133 Brazilian firms have been analyzed. Given the economic circumstances of an emerging economy, in which the majority of businesses are primarily based on low- and medium-low-technology industries, it is not possible to affirm the existence of a positive relationship between technological capability and firm performance; other elements allow firms to achieve such results. Firms in industries of lower technological intensity performed above average on the economic performance indicators; conversely, they invested below average in technological capability. These findings do not diminish the merit of the firms' and the country's success. They in fact confirm a historical tradition of a country that concentrates its efforts on basic industries.

  13. Technological Metaphors and Moral Education: The Hacker Ethic and the Computational Experience

    Science.gov (United States)

    Warnick, Bryan R.

    2004-01-01

    This essay is an attempt to understand how technological metaphors, particularly computer metaphors, are relevant to moral education. After discussing various types of technological metaphors, it is argued that technological metaphors enter moral thought through their "functional descriptions." The computer metaphor is then explored by turning to…

  14. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  15. A Study of Thermal Performance of Contemporary Technology-Rich Educational Spaces

    Directory of Open Access Journals (Sweden)

    Sarah Elmasry

    2013-08-01

    One of the most dominant features of a classroom space is its high occupancy, which results in high internal heat gain (approximately 5 kW). Furthermore, the installation of educational technologies, such as smart boards, projectors and computers, increases the potential internal heat gain. Previous studies on office buildings indicate that, with the introduction of IT equipment during the last decade, cooling load demands are increasing, with an associated increase in summer electrical demand. Because educational technologies correspond to pedagogical practices within the space, many variations occur due to occupancy patterns. Thermal loads caused by educational technologies are also expected to depend on spatial configuration, for example, position with respect to the external walls, lighting equipment, and the mobility of devices. This study explores the thermal impact of educational technologies in two typical educational spaces in a higher education facility: the classroom and the computer lab. The results indicate that a heat gain ranging between 0.06 and 0.095 kWh/m2 is generated in the rooms when educational technologies are in use. The second phase of this study is ongoing and investigates thermal zones within the rooms due to the distribution of educational technologies. Through simulation of the thermal performance of the rooms, alternative room configurations are recommended in response to the observed thermal zones.

  16. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  17. Technology in Paralympic sport: performance enhancement or essential for performance?

    Science.gov (United States)

    Burkett, Brendan

    2010-02-01

    People with disabilities often depend on assistive devices to enable activities of daily living as well as to compete in sport. Technological developments in sport can be controversial. To review, identify and describe current technological developments in assistive devices used in the summer Paralympic Games; and to prepare for the London 2012 Games, the future challenges and the role of technology are debated. A systematic review of the peer-reviewed literature and personal observations of technological developments at the Athens (2004) and Beijing (2008) Paralympic Games was conducted. Standard assistive devices can inhibit the Paralympians' abilities to perform the strenuous activities of their sports. Although many Paralympic sports only require technology similar to their Olympic counterparts, several unique technological modifications have been made in prosthetic and wheelchair devices. Technology is essential for the Paralympic athlete, and the potential technological advantage for a Paralympian, when competing against an Olympian, is unclear. Technology must match the individual requirements of the athlete with the sport in order for Paralympians to safely maximise their performance. Within the 'performance enhancement or essential for performance?' debate, any potential increase in mechanical performance from an assistive device must be considered holistically with the compensatory consequences the disability creates. To avoid potential technology controversies at the 2012 London Olympic and Paralympic Games, the role of technology in sport must be clarified.

  18. Computer technique for evaluating collimator performance

    International Nuclear Information System (INIS)

    Rollo, F.D.

    1975-01-01

    A computer program has been developed to theoretically evaluate the overall performance of collimators used with radioisotope scanners and γ cameras. The first step of the program determines the line spread function (LSF) and geometrical efficiency from the fundamental parameters of the collimator being evaluated. The working equations can be applied to any plane of interest. The resulting LSF is passed to subroutines which compute the corresponding modulation transfer function and contrast efficiency function. The latter function is then combined with appropriate geometrical efficiency data to determine the performance index function. The overall computer program allows one to predict, from the physical parameters of the collimator alone, how well the collimator will reproduce spherical voids of activity of various sizes in the image plane. The collimator performance program can be used to compare the performance of various collimator types, to study the effects of source depth on collimator performance, and to assist in the design of collimators. The theory of the collimator performance equation is discussed, a comparison between the experimental and theoretical LSF values is made, and examples of the application of the technique are presented.
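
    As a small illustration of the LSF-to-MTF step described above, the modulation transfer function can be obtained as the normalized magnitude of the Fourier transform of the line spread function. The Gaussian LSF and sample spacing below are assumed stand-ins, not the program's geometrically derived values.

```python
# Sketch: derive an MTF from an LSF via the Fourier transform. The Gaussian
# LSF is an assumed stand-in for the collimator-derived LSF in the paper.
import numpy as np

dx = 0.5                                  # sample spacing in mm (assumed)
x = np.arange(-50.0, 50.0, dx)
lsf = np.exp(-0.5 * (x / 4.0) ** 2)       # assumed Gaussian line spread function

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                             # normalize so MTF(0) = 1
freqs = np.fft.rfftfreq(x.size, d=dx)     # spatial frequency in cycles/mm

for f, m in list(zip(freqs, mtf))[:5]:
    print(f"{f:.3f} cycles/mm -> MTF {m:.3f}")
```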

  19. Providing Learning Computing Labs using Hosting and Virtualization Technologies

    Directory of Open Access Journals (Sweden)

    Armide González

    2011-05-01

    This paper presents a computing hosting system that provides virtual computing laboratories for learning activities. The system is based on hosting and virtualization technologies, and all the components used in its development are free software tools. The computing lab model provided by the system is a more sustainable and scalable alternative to the traditional academic computing lab, and it requires lower installation and operation costs.

  20. DICOM standard in computer-aided medical technologies

    International Nuclear Information System (INIS)

    Plotnikov, A.V.; Prilutskij, D.A.; Selishchev, S.V.

    1997-01-01

    The paper outlines one of the most promising standards for transmitting images in medicine, particularly in radiology. The essence of the DICOM standard is described, along with the prospects for its introduction into computer-aided medical technologies.

  1. Use of Computer and Mobile Technologies in the Treatment of Depression.

    Science.gov (United States)

    Callan, Judith A; Wright, Jesse; Siegle, Greg J; Howland, Robert H; Kepler, Britney B

    2017-06-01

    Major depression (MDD) is a common and disabling disorder. Research has shown that most people with MDD receive either no treatment or inadequate treatment. Computer and mobile technologies may offer solutions for the delivery of therapies to untreated or inadequately treated individuals with MDD. The authors review currently available technologies and research aimed at relieving symptoms of MDD. These technologies include computer-assisted cognitive-behavior therapy (CCBT), web-based self-help, Internet self-help support groups, mobile psychotherapeutic interventions (i.e., mobile applications or apps), technology-enhanced exercise, and biosensing technology.

  2. International Conference on Computer Science and Information Technology

    CERN Document Server

    Li, Xiaolong

    2014-01-01

    The main objective of CSAIT 2013 is to provide a forum for researchers, educators, engineers and government officials involved in the general areas of computational sciences and information technology to disseminate their latest research results and exchange views on the future research directions of these fields. A medium like this provides an opportunity for academics and industry professionals to exchange ideas, integrate the practice of computer science with the application of academic ideas, and improve academic depth. The in-depth discussions on the subject provide an international communication platform for educational technology and scientific research for the world's universities, engineering experts, professionals and business executives.

  3. Research on Key Technologies of Cloud Computing

    Science.gov (United States)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new type of computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so application systems can obtain computing power, storage space and software services according to their demand. It can concentrate all the computing resources and manage them automatically through software without human intervention. This frees application providers from tedious details and lets them concentrate on their business; it is advantageous to innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services and applications as a public facility, so that people can use computer resources just as they use water, electricity, gas and the telephone. Currently, the understanding of cloud computing is developing and changing constantly, and cloud computing still has no unanimous definition. This paper describes the three main service forms of cloud computing - SaaS, PaaS and IaaS - compares the definitions of cloud computing given by Google, Amazon, IBM and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization and programming models.

  4. Human factors with nonhumans - Factors that affect computer-task performance

    Science.gov (United States)

    Washburn, David A.

    1992-01-01

    There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

  5. What Physicists Should Know About High Performance Computing - Circa 2002

    Science.gov (United States)

    Frederick, Donald

    2002-08-01

    High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others primarily from the various disciplines that have been major users of HPC resources - physics, chemistry, engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as by technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-cpu optimization, compilers, timing, numerical libraries, debugging and profiling tools and the emergence of Computational Grids.

  6. Mechanical Design Technology--Modified. (Computer Assisted Drafting, Computer Aided Design). Curriculum Grant 84/85.

    Science.gov (United States)

    Schoolcraft Coll., Livonia, MI.

    This document is a curriculum guide for a program in mechanical design technology (computer-assisted drafting and design) developed at Schoolcraft College, Livonia, Michigan. The program helps students to acquire the skills of drafters and to interact with electronic equipment, with the option of becoming efficient in the computer-aided…

  7. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  8. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation, ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows the reader to compare the performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook for assessing the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  9. FUNCTIONING FEATURES OF COMPUTER TECHNOLOGY WHILE FORMING PRIMARY SCHOOLCHILDREN’S COMMUNICATIVE COMPETENCE

    Directory of Open Access Journals (Sweden)

    Olena Beskorsa

    2017-04-01

    Full Text Available The article addresses the functioning features of computer technology in forming primary schoolchildren's communicative competence, a problem whose relevance stems from the increasing role of a foreign language as a means of communication and the modernization of foreign language education. A great many publications have been devoted to the issue of foreign language learning at primary school, by N. Biriukevych, O. Kolominova, O. Metolkina, O. Petrenko, V. Redko, and S. Roman. Implementing innovative technologies, computer technology among them, is meant to intensify the language learning process and to improve young learners' communicative skills. The aim of the article is to identify the functioning features of computer technology in forming primary schoolchildren's communicative competence. In this study we follow a definition of computer technology as an information technology whose implementation may be accompanied by a computer as one of its tools, excluding the use of audio and video equipment, projectors, and other technical tools. The use of computer technologies is realized through a number of tools, which fall into two main groups: electronic learning materials and computer testing software. An analysis of current textbooks and learning and methodological complexes shows that teachers prefer authentic electronic materials to national ones. The most readily available English learning materials are on the Internet, free of charge. The author discloses several on-line English learning tools and depicts the opportunities to use them in forming primary schoolchildren's communicative competence. Special attention is also paid to multimedia technology, its functioning features, and the structure of a multimedia lesson. Computer testing software provides tools for ongoing and final assessment of mastering language material and communicative skills, and for self-assessment in an interactive way. For making tests for assessing English skill

  10. VASCOMP 2. The V/STOL aircraft sizing and performance computer program. Volume 6: User's manual, revision 3

    Science.gov (United States)

    Schoen, A. H.; Rosenstein, H.; Stanzione, K.; Wisniewski, J. S.

    1980-01-01

    This report describes the use of the V/STOL Aircraft Sizing and Performance Computer Program (VASCOMP II). The program is useful in performing aircraft parametric studies in a quick and cost efficient manner. Problem formulation and data development were performed by the Boeing Vertol Company and reflects the present preliminary design technology. The computer program, written in FORTRAN IV, has a broad range of input parameters, to enable investigation of a wide variety of aircraft. User oriented features of the program include minimized input requirements, diagnostic capabilities, and various options for program flexibility.

  11. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    Science.gov (United States)

    Faraj, Ahmad [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
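
    As an illustrative aside, the two-phase scheme described above can be simulated in a few lines of Python. The node and core counts, the summation operator, and the naive shift-based ring are assumptions made for the sketch, not details taken from the patent.

      # Simulation of the two-phase allreduce described above (illustrative only;
      # node/core counts and the summation operator are assumptions).
      NODES, CORES = 4, 2

      # contribution[node][core]: each processing core's input value
      contribution = [[10 * n + c for c in range(CORES)] for n in range(NODES)]

      # Phase 1: one logical ring per core index, containing one core from each
      # node; after NODES - 1 ring shifts every member holds the ring-wide sum.
      ring_total = [[0] * CORES for _ in range(NODES)]
      for c in range(CORES):
          acc = [contribution[n][c] for n in range(NODES)]  # running sums
          msg = list(acc)                                   # values in flight
          for _ in range(NODES - 1):
              msg = [msg[(n - 1) % NODES] for n in range(NODES)]  # ring shift
              acc = [acc[n] + msg[n] for n in range(NODES)]
          for n in range(NODES):
              ring_total[n][c] = acc[n]

      # Phase 2: each node locally reduces the per-ring results of its cores,
      # so every core ends up with the allreduce result over all contributions.
      grand_total = [sum(ring_total[n]) for n in range(NODES)]
      print(grand_total)  # the same value on every node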

  12. International Conference on Emerging Research in Electronics, Computer Science and Technology

    CERN Document Server

    Sheshadri, Holalu; Padma, M

    2014-01-01

    PES College of Engineering is organizing an International Conference on Emerging Research in Electronics, Computer Science and Technology (ICERECT-12) in Mandya, merging the event with the Golden Jubilee of the Institute. The proceedings of the conference present high-quality, peer-reviewed articles from the fields of electronics, computer science, and technology. The book is a compilation of research papers on cutting-edge technologies, targeted towards the scientific community actively involved in research activities.

  13. Spin-transfer torque magnetoresistive random-access memory technologies for normally off computing (invited)

    International Nuclear Information System (INIS)

    Ando, K.; Yuasa, S.; Fujita, S.; Ito, J.; Yoda, H.; Suzuki, Y.; Nakatani, Y.; Miyazaki, T.

    2014-01-01

    Most parts of present computer systems are made of volatile devices, and the power supplied to them to avoid information loss causes huge energy losses. We can eliminate this meaningless energy loss by utilizing the non-volatile function of advanced spin-transfer torque magnetoresistive random-access memory (STT-MRAM) technology to create a new type of computer: the normally-off computer. The critical tasks for achieving normally-off computers are implementations of STT-MRAM technologies in the main memory and in low-level cache memories. STT-MRAM technology for main-memory applications has been successfully developed using perpendicular STT-MRAMs, and faster STT-MRAM technologies for cache-memory applications are now being developed. The present status of STT-MRAMs and the challenges that remain for normally-off computers are discussed.

  14. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  15. Research on the application in disaster reduction for using cloud computing technology

    Science.gov (United States)

    Tao, Liang; Fan, Yida; Wang, Xingling

    Cloud computing technology has recently been rapidly applied in different domains, promoting the informatization of those domains. Based on an analysis of the application requirements in disaster reduction, and combining the characteristics of cloud computing technology, we present research on the application of cloud computing technology in disaster reduction. First, we give the architecture of a disaster reduction cloud, which consists of disaster reduction infrastructure as a service (IaaS), a disaster reduction cloud application platform as a service (PaaS), and disaster reduction software as a service (SaaS). Second, we discuss the standard system of disaster reduction in five aspects. Third, we describe the security system of the disaster reduction cloud. Finally, we conclude that the use of cloud computing technology will help solve the problems of disaster reduction and promote its development.

  16. US QCD computational performance studies with PERI

    International Nuclear Information System (INIS)

    Zhang, Y; Fowler, R; Huck, K; Malony, A; Porterfield, A; Reed, D; Shende, S; Taylor, V; Wu, X

    2007-01-01

    We report on some of the interactions between two SciDAC projects: the National Computational Infrastructure for Lattice Gauge Theory (USQCD) and the Performance Engineering Research Institute (PERI). Many modern scientific programs consistently report the need for faster computational resources to maintain global competitiveness. However, as the size and complexity of emerging high-end computing (HEC) systems continue to rise, achieving good performance on such systems is becoming ever more challenging. In order to take full advantage of the resources, it is crucial to understand the characteristics of relevant scientific applications and the systems these applications are running on. Using tools developed under PERI and by other performance measurement researchers, we studied the performance of two applications, MILC and Chroma, on several high performance computing systems at DOE laboratories. In the case of Chroma, we discuss how the use of C++ and modern software engineering and programming methods are driving the evolution of performance tools.

  17. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  18. How Computer Technology Expands Educational Options: A Rationale, Recommendations, and a Pamphlet for Administrators.

    Science.gov (United States)

    Kelch, Panette Evers; Karr-Kidwell, PJ

    The purpose of this paper is to provide a historical rationale on how computer technology, particularly the Internet, expands educational options for administrators and teachers. A review of the literature includes a brief history of computer technology and its growing use, and a discussion of computer technology for distance learning, for…

  19. Ubiquitous computing technology for just-in-time motivation of behavior change.

    Science.gov (United States)

    Intille, Stephen S

    2004-01-01

    This paper describes a vision of health care where "just-in-time" user interfaces are used to transform people from passive to active consumers of health care. Systems that use computational pattern recognition to detect points of decision, behavior, or consequences automatically can present motivational messages to encourage healthy behavior at just the right time. Further, new ubiquitous computing and mobile computing devices permit information to be conveyed to users at just the right place. In combination, computer systems that present messages at the right time and place can be developed to motivate physical activity and healthy eating. Computational sensing technologies can also be used to measure the impact of the motivational technology on behavior.

  20. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    Science.gov (United States)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  1. Computed Tomography Technology: Development and Applications for Defence

    International Nuclear Information System (INIS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-01-01

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT and E). Tomography for industrial applications warrants the design and development of customized solutions catering to specific visualization requirements. The present paper highlights tomography technology solutions implemented at the Defence Laboratory, Jodhpur (DLJ). Details on the technological developments carried out and their utilization for various Defence applications are covered.

  2. Contemporary high performance computing from petascale toward exascale

    CERN Document Server

    Vetter, Jeffrey S

    2013-01-01

    Contemporary High Performance Computing: From Petascale toward Exascale focuses on the ecosystems surrounding the world's leading centers for high performance computing (HPC). It covers many of the important factors involved in each ecosystem: computer architectures, software, applications, facilities, and sponsors. The first part of the book examines significant trends in HPC systems, including computer architectures, applications, performance, and software. It discusses the growth from terascale to petascale computing and the influence of the TOP500 and Green500 lists. The second part of the

  3. Leveraging mobile computing and communication technologies in education

    DEFF Research Database (Denmark)

    Annan, Nana Kofi

    education and technology have evolved in tandem over the past years, this dissertation recognises the lapse that there is, in not being able to effectively leverage technology to improve education delivery by most educators. The study appreciates the enormousness of mobile computing and communication...... technologies in contributing to the development of tertiary education delivery, and has taken keen interest to investigate how the capacities of these technologies can be leveraged and incorporated effectively into the pedagogic framework of tertiary education. The purpose is to research into how...... of the results conducted after rigorous theoretical and empirical research unveiled the following: Mobile technologies can be incorporated into tertiary education if it has a strong theoretical underpinning, which links technology and pedagogy; the technology would not work if the user’s concerns in relation...

  4. Technology for computer-stabilized peak of NaI(Tl) gamma spectrum

    International Nuclear Information System (INIS)

    Chen Jianzhen; Guo Lanying; Ling Qiu; Qu Guopu; Zhao Lihong; Hu Chuangye

    2005-01-01

    An improved technology for spectrum stabilization of NaI(Tl) gamma spectra is introduced. The technology is based on a system using a reference peak, the equivalent gamma peak of a 241Am source. The computer detects the deviation of the peak's position, computes an adjustment value for a programmable amplifier, and controls the amplifier through a digital PID algorithm to stabilize the spectrum. This 'hardware + reference peak + software' approach to spectrum stabilization offers high stability and fast stabilization speed. (author)
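
    To make the control loop concrete, here is a minimal sketch of the 'reference peak + digital PID' idea; the channel numbers, PID constants, and peak finder are invented for illustration and are not taken from the paper.

      # Sketch of digital PID spectrum stabilization (illustrative only).
      class SpectrumStabilizer:
          """One control cycle: measure reference-peak drift, adjust amplifier gain."""

          def __init__(self, target_channel=512, kp=0.02, ki=0.005, kd=0.01):
              self.target = target_channel        # nominal 241Am peak position (assumed)
              self.kp, self.ki, self.kd = kp, ki, kd
              self.integral = 0.0
              self.prev_error = 0.0
              self.gain = 1.0                     # programmable-amplifier gain

          @staticmethod
          def find_reference_peak(spectrum):
              # locate the reference-peak channel (here: a simple argmax)
              return max(range(len(spectrum)), key=lambda ch: spectrum[ch])

          def step(self, spectrum):
              # PID update: drive the peak-position error toward zero
              error = self.target - self.find_reference_peak(spectrum)
              self.integral += error
              derivative = error - self.prev_error
              self.prev_error = error
              self.gain += self.kp * error + self.ki * self.integral + self.kd * derivative
              return self.gain

      stab = SpectrumStabilizer()
      spectrum = [0.0] * 1024
      spectrum[500] = 100.0          # reference peak drifted below channel 512
      print(stab.step(spectrum))     # gain nudged up to pull the peak back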

  5. Use of new computer technologies in elementary particle physics

    International Nuclear Information System (INIS)

    Gaines, I.; Nash, T.

    1987-01-01

    Elementary particle physics and computers have progressed together for as long as anyone can remember. The symbiosis is surprising considering the dissimilar objectives of these fields, but physics understanding cannot be had simply by detecting the passage of particles. It requires a selection of interesting events and their analysis in comparison with quantitative theoretical predictions. The extraordinary reach made by experimentalists into realms always further removed from everyday observation frequently encountered technology constraints. Pushing away such barriers has been an essential activity of the physicist since long before Rossi developed the first practical electronic AND gates as coincidence circuits in 1930. This article describes the latest episode of this history, the development of new computer technologies to meet the various and increasing appetite for computing of experimental (and theoretical) high energy physics

  6. Computers Put a Journalism School on Technology's Leading Edge.

    Science.gov (United States)

    Blum, Debra E.

    1992-01-01

    Since 1985, the University of Missouri at Columbia's School of Journalism has been developing a high-technology environment for student work, including word processing, electronic imaging, networked personal computers, and telecommunications. Some faculty worry that the emphasis on technology may overshadow the concepts, principles, and substance…

  7. COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways

    Science.gov (United States)

    Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.

    2015-01-01

    The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…

  8. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    Science.gov (United States)

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  9. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.

  10. The Computer Industry. High Technology Industries: Profiles and Outlooks.

    Science.gov (United States)

    International Trade Administration (DOC), Washington, DC.

    A series of meetings was held to assess future problems in United States high technology, particularly in the fields of robotics, computers, semiconductors, and telecommunications. This report, which focuses on the computer industry, includes a profile of this industry and the papers presented by industry speakers during the meetings. The profile…

  11. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.
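
    For context, the computational core that such parallelizations target is an all-pairs kernel summation. A minimal serial sketch of the SPH density estimate follows; the one-dimensional Gaussian kernel and uniform particle masses are assumptions for illustration, not the paper's setup.

      # Minimal serial SPH density summation (illustrative assumptions: 1-D
      # Gaussian kernel, uniform masses; this is the O(N^2) loop that the
      # parallel implementations accelerate).
      import numpy as np

      def sph_density(x, m, h):
          """rho_i = sum_j m_j * W(|x_i - x_j|, h) with a 1-D Gaussian kernel."""
          r = np.abs(x[:, None] - x[None, :])               # pairwise distances
          w = np.exp(-(r / h) ** 2) / (h * np.sqrt(np.pi))  # kernel weights
          return w @ m

      x = np.linspace(0.0, 1.0, 100)         # particle positions
      m = np.full_like(x, 1.0 / x.size)      # uniform masses
      print(sph_density(x, m, h=0.05)[:5])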

  12. The Diffusion of Computer-Based Technology in K-12 Schools: Teachers' Perspectives

    Science.gov (United States)

    Colandrea, John Louis

    2012-01-01

    Because computer technology represents a major financial outlay for school districts and is an efficient method of preparing and delivering lessons, studying the process of teacher adoption of computer use is beneficial and adds to the current body of knowledge. Because the teacher is the ultimate user of computer technology for lesson preparation…

  13. Proceedings of the workshop on molten salts technology and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, Hirokazu; Minato, Kazuo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    Applications of molten salts technology to the separation and synthesis of materials have been studied eagerly, and they are expected to open new fields of materials science. The Research Group for Actinides Science, Department of Materials Science, Japan Atomic Energy Research Institute (JAERI), together with the Reprocessing and Recycle Technology Division, Atomic Energy Society of Japan, organized the Workshop on Molten Salts Technology and Computer Simulation at the Tokai Research Establishment, JAERI, on July 18, 2001. In the workshop eleven lectures were given, and there were lively discussions on the fundamentals and applications of molten salts technology, covering the structure and basic properties of molten salts, pyrochemical reprocessing technology, and the relevant computer simulation. Ten of the presented papers are indexed individually. (J.P.N.)

  14. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    Science.gov (United States)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data, and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress so far in harmonising the underlying data collections for future interdisciplinary research across these large-volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system, and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially

  15. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  16. Parental Perceptions and Recommendations of Computing Majors: A Technology Acceptance Model Approach

    Science.gov (United States)

    Powell, Loreen; Wimmer, Hayden

    2017-01-01

    Currently, there are more technology related jobs then there are graduates in supply. The need to understand user acceptance of computing degrees is the first step in increasing enrollment in computing fields. Additionally, valid measurement scales for predicting user acceptance of Information Technology degree programs are required. The majority…

  17. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  18. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    Science.gov (United States)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google, and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, along with issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN) platform, which uses BOINC, over infrastructure from two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments was running WACCM over different VMs on the Google Compute Engine (GCE) and making a comparison with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly that the SC performance is better beyond approximately 100 cores (related to network speed and latency differences). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on

  19. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  20. Evaluating Technology Resistance and Technology Satisfaction on Students' Performance

    Science.gov (United States)

    Norzaidi, Mohd Daud; Salwani, Mohamed Intan

    2009-01-01

    Purpose: Using the extended task-technology fit (TTF) model, this paper aims to examine technology resistance, technology satisfaction and internet usage on students' performance. Design/methodology/approach: The study was conducted at Universiti Teknologi MARA, Johor, Malaysia and questionnaires were distributed to 354 undergraduate students.…

  1. Computational Fluid Dynamics (CFD) Technology Programme 1995- 1999

    Energy Technology Data Exchange (ETDEWEB)

    Haekkinen, R.J.; Hirsch, C.; Krause, E.; Kytoemaa, H.K. [eds.

    1997-12-31

    The report is a mid-term evaluation of the Computational Fluid Dynamics (CFD) Technology Programme started by the Technology Development Centre Finland (TEKES) in 1995 as a five-year initiative to be concluded in 1999. The main goals of the programme are to increase the know-how and application of CFD in Finnish industry, to coordinate and thus provide a better basis for co-operation between national CFD activities, and to encourage research laboratories and industry to establish co-operation with the international CFD community. The projects of the programme focus on the following areas: (1) studies of modeling the physics and dynamics of the behaviour of fluid material, (2) expressing the physical models in a numerical mode and developing computer codes, (3) evaluating and testing current physical models and developing new ones, (4) developing new numerical algorithms, solvers, and pre- and post-processing software, and (5) applying the new computational tools to problems relevant to their ultimate industrial use. The report consists of two sections. The first considers issues concerning the whole programme and the second reviews each project.

  2. High Performance Computing in Science and Engineering '14

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2015-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS). The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and   engineers. The book comes with a wealth of color illustrations and tables of results.  

  3. Analysis of parallel computing performance of the code MCNP

    International Nuclear Information System (INIS)

    Wang Lei; Wang Kan; Yu Ganglin

    2006-01-01

    Parallel computing can reduce the running time of the code MCNP effectively. With the MPI message-passing software, MCNP5 can perform parallel computing on a PC cluster with the Windows operating system. The parallel computing performance of MCNP is influenced by factors such as the type, the complexity level, and the parameter configuration of the computing problem. This paper analyzes the parallel computing performance of MCNP with regard to these factors and proposes measures to improve the MCNP parallel computing performance. (authors)
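
    As a back-of-the-envelope illustration of how a problem's serial fraction bounds such speedups, Amdahl's law can be evaluated as follows; the 5% serial fraction is an assumed figure, not a measurement from the paper.

      # Amdahl's-law speedup estimate (illustrative; the serial fraction is assumed).
      def amdahl_speedup(serial_fraction: float, n_procs: int) -> float:
          """Upper bound on parallel speedup with n_procs processors."""
          return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

      for n in (2, 8, 32, 128):
          print(n, round(amdahl_speedup(0.05, n), 2))  # 5% serial work assumed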

  4. A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology

    Science.gov (United States)

    Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra

    2008-01-01

    Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…

  5. The Impact Of Cloud Computing Technology On The Audit Process And The Audit Profession

    Directory of Open Access Journals (Sweden)

    Yati Nurhajati

    2015-08-01

    Full Text Available In the future, cloud computing audits will become increasingly common. The use of this technology has influenced the audit process and poses a new challenge for both external and internal auditors: to understand IT, to learn how to use cloud computing and the cloud services hired from a cloud service provider (CSP), to consider the risks of cloud computing, and to audit cloud computing through a risk-based audit approach. The wide range of unique risks, which depend on the type and model of the cloud solution, the uniqueness of the client environment, and the specifics of the data or application, makes this a complicated subject. The internal audit function is well positioned, through its role as an assurance function of the organization, to assist management and the board committee in identifying and considering the risks of using cloud computing technology; internal audit can help determine whether those risks have been managed appropriately in a cloud computing environment. The paper assesses the current impact of cloud computing technology on the audit process and discusses the implications of future cloud computing technological trends for the auditing profession. More specifically, it provides a summary of how this information technology has impacted the audit framework.

  6. How does technological regime affect performance of technology development projects?

    NARCIS (Netherlands)

    Song, Michael; Hooshangi, Soheil; Zhao, Y. Lisa; Halman, Johannes I.M.

    2014-01-01

    In this study, we examine how technological regime affects the performance of technology development projects (i.e., project quality, sales, and profit). Technological regime is defined as the set of attributes of a technological environment where the innovative activities of firms take place.

  7. High-performance computing on the Intel Xeon Phi how to fully exploit MIC architectures

    CERN Document Server

    Wang, Endong; Shen, Bo; Zhang, Guangyong; Lu, Xiaowei; Wu, Qing; Wang, Yajuan

    2014-01-01

    The aim of this book is to explain to high-performance computing (HPC) developers how to utilize the Intel® Xeon Phi™ series products efficiently. To that end, it introduces some computing grammar, programming technology and optimization methods for using many-integrated-core (MIC) platforms and also offers tips and tricks for actual use, based on the authors' first-hand optimization experience.The material is organized in three sections. The first section, "Basics of MIC", introduces the fundamentals of MIC architecture and programming, including the specific Intel MIC programming environment

  8. The correlation between a passion for computer games and the school performance of younger schoolchildren.

    Directory of Open Access Journals (Sweden)

    Maliy D.V.

    2015-07-01

    Full Text Available Today computer games occupy a significant place in children's lives and fundamentally affect the process of the formation and development of their personalities. A number of present-day researchers assert that computer games have a developmental effect on players. Others share the point of view that computer games have negative effects on the cognitive and emotional spheres of a child, and claim that children with low self-esteem who neglect their schoolwork and have difficulties in communication are particularly passionate about computer games. This article reviews theoretical and experimental pedagogical and psychological studies of the nature of the correlation between a passion for computer games and the school performance of younger schoolchildren. Our analysis of foreign and Russian psychology studies regarding the problem of playing activities mediated by information and computer technologies allowed us to single out the main criteria for children's passion for computer games and school performance. This article presents the results of a pilot study of the nature of the correlation between a passion for computer games and the school performance of younger schoolchildren. The research involved 32 pupils (12 girls and 20 boys) aged 10-11 years in the 4th grade. The general hypothesis was that there are divergent correlations between the passion of younger schoolchildren for computer games and their school performance. A questionnaire survey administered to the pupils allowed us to obtain information about the amount of time they devoted to computer games, their preferences for computer-game genres, and the extent of their passion for games. To determine the level of school performance we analyzed class registers. To establish the correlation between a passion for computer games and the school performance of younger schoolchildren, as well as to determine the effect of a passion for computer games on the personal qualities of the children

  9. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    Science.gov (United States)

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  10. Fast magnetic field computation in fusion technology using GPU technology

    Energy Technology Data Exchange (ETDEWEB)

    Chiariello, Andrea Gaetano [Ass. EURATOM/ENEA/CREATE, Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, Via Roma 29, Aversa (CE) (Italy); Formisano, Alessandro, E-mail: Alessandro.Formisano@unina2.it [Ass. EURATOM/ENEA/CREATE, Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, Via Roma 29, Aversa (CE) (Italy); Martone, Raffaele [Ass. EURATOM/ENEA/CREATE, Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, Via Roma 29, Aversa (CE) (Italy)

    2013-10-15

    Highlights: ► The paper deals with high-accuracy numerical simulations of high-field magnets. ► Porting existing codes to High Performance Computing architectures yielded a relevant speedup without reducing computational accuracy. ► Some examples of applications, referring to ITER-like magnets, are reported. -- Abstract: One of the main issues in the simulation of Tokamak operation is the reliable and accurate computation of actual field maps in the plasma chamber. In this paper a tool is presented that accurately computes the magnetic field maps produced by active coils of any 3D shape, wound with a high number of conductors. Under the linearity assumption, the coil winding is modeled by means of "sticks" following each conductor's shape, and the contribution of each stick is computed using high-speed graphics processing units (GPUs). Relevant speed enhancements with respect to a standard parallel computing environment are achieved in this way.
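
    As an illustrative sketch of the stick model (not the authors' code), the Biot-Savart contributions of short straight segments can be summed as below; the midpoint approximation, the circular test coil, and the NumPy implementation are assumptions made for the example.

      # Stick-model field summation (illustrative): a coil is split into short
      # straight "sticks" whose Biot-Savart contributions are summed.
      import numpy as np

      MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

      def field_from_sticks(starts, ends, current, point):
          """Sum midpoint Biot-Savart contributions of straight segments."""
          dl = ends - starts                    # stick vectors
          mid = 0.5 * (starts + ends)           # stick midpoints
          r = point - mid                       # midpoint -> field point
          r_norm = np.linalg.norm(r, axis=1, keepdims=True)
          db = MU0 * current / (4 * np.pi) * np.cross(dl, r) / r_norm**3
          return db.sum(axis=0)

      # test case: circular coil, radius 1 m, 1 kA, approximated by 360 sticks
      theta = np.linspace(0.0, 2.0 * np.pi, 361)
      pts = np.column_stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)])
      B = field_from_sticks(pts[:-1], pts[1:], 1e3, np.zeros(3))
      print(B)  # approx [0, 0, mu0*I/(2R)] = [0, 0, ~6.28e-4] tesla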

  11. Application of local computer networks in nuclear-physical experiments and technology

    International Nuclear Information System (INIS)

    Foteev, V.A.

    1986-01-01

    The bases of construction, comparative performance, and potentialities of local computer networks are considered with respect to their application in physical experiments. The principle of operation of local networks is shown on the basis of the Ethernet network, and the results of an analysis of their operating performance are given. Examples of operating local networks in the area of nuclear physics research and nuclear technology are presented, as follows: the networks of the Japan Atomic Energy Research Institute, California University, and Los Alamos National Laboratory; network implementations according to the DECnet and Fastbus programs; and local network configurations of the USSR Academy of Sciences and the JINR Neutron Physics Laboratory, etc. It is shown that local networks allow a significant rise in productivity in the sphere of data processing.

  12. Performance Evaluation Methods for Assistive Robotic Technology

    Science.gov (United States)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  13. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and Ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, the focus being on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  14. [APPLICATION OF COMPUTER-ASSISTED TECHNOLOGY IN ANALYSIS OF REVISION REASON OF UNICOMPARTMENTAL KNEE ARTHROPLASTY].

    Science.gov (United States)

    Jia, Di; Li, Yanlin; Wang, Guoliang; Gao, Huanyu; Yu, Yang

    2016-01-01

    To summarize the reasons for revision of unicompartmental knee arthroplasty (UKA) identified using computer-assisted technology, so as to provide a reference for reducing the incidence of revision and improving the level of surgical technique and rehabilitation. The relevant literature of recent years on analyzing the reasons for UKA revision using computer-assisted technology was extensively reviewed. The revision reasons identified by computer-assisted technology are fracture of the medial tibial plateau, progressive osteoarthritis of the preserved compartment, dislocation of the mobile bearing, prosthesis loosening, polyethylene wear, and unexplained persistent pain. Computer-assisted technology can be used to analyze the reasons for UKA revision and to guide the choice of the best operative method and rehabilitation scheme by simulating the operative process and knee joint activities.

  15. The Adoption of Grid Computing Technology by Organizations: A Quantitative Study Using Technology Acceptance Model

    Science.gov (United States)

    Udoh, Emmanuel E.

    2010-01-01

    Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…

  16. The relationship between perceived stress and computer technology attitude: an application on health sciences students.

    Science.gov (United States)

    Ozyurek, Pakize; Oztasan, Nuray; Kilic, Ibrahim

    2015-02-01

    The aim of this study is to define the attitudes of health sciences students towards perceived personal stress and computer technologies, and to present the relationship between stress and computer technology attitudes. The study has a descriptive nature: a questionnaire was administered to 764 students of the Afyon Kocatepe University Health Sciences High School, Turkey, for data gathering. Descriptive statistics, independent-samples t tests, one-way ANOVA, and regression analysis were used for data analysis. In the study, female students (mean = 3.78) were found to have a more positive attitude towards computer technology than male students (mean = 3.62). According to the results of the regression analysis, the regression model between computer technology attitude (CTA) and perceived stress (PS) was found to be significant (F = 16.291). A negative relationship was found between computer technology attitude and perceived stress (as computer technology attitude increases, perceived stress decreases): an increase of one unit in computer attitude results in a 0.275 decrease in perceived stress. It can be concluded that the correct and proper use of computer technologies can be accepted as a component of stress-coping methods.

  17. Development and validation of the computer technology literacy self-assessment scale for Taiwanese elementary school students.

    Science.gov (United States)

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): technology operation skills, computer usage concepts, attitudes toward computer technology, learning with technology, and Internet operation skills. Participants were 1,539 elementary school students in Taiwan. Data analysis indicated that the instrument developed in the study had satisfactory validity and reliability. Correlation analysis supported the legitimacy of using multiple dimensions in representing students' computer technology literacy. Significant differences were found between male and female students, and among grades, on some CTLS dimensions. Suggestions are made for use of the instrument to examine the complicated interplays between students' computer behaviors and their computer technology literacy.

  18. Proceedings of the international conference on advances in computer and communication technology

    International Nuclear Information System (INIS)

    Bakal, J.W.; Kunte, A.S.; Walinjkar, P.B.; Karnani, N.K.

    2012-02-01

    A nation's development is coupled with the advancement and adoption of new technologies. During the past decade, advancements in computer and communication technologies have grown manifold. For the growth of any country it is necessary to keep pace with the latest innovations in technology. The International Conference on Advances in Computer and Communication Technology, organised by the Institution of Electronics and Telecommunication Engineers, Mumbai Centre, is an attempt to provide a platform for scientists, engineering students, educators, and experts to share their knowledge and discuss their efforts in the field of R and D. The papers relevant to INIS are indexed separately

  19. Performing stencil computations

    Energy Technology Data Exchange (ETDEWEB)

    Donofrio, David

    2018-01-16

    A method and apparatus for performing stencil computations efficiently are disclosed. In one embodiment, a processor receives an offset, and in response, retrieves a value from a memory via a single instruction, where the retrieving comprises: identifying, based on the offset, one of a plurality of registers of the processor; loading an address stored in the identified register; and retrieving from the memory the value at the address.
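
    The claim describes a hardware instruction (an offset selects a register, the register holds an address, and the value at that address is loaded). As a loose, language-level sketch of the memory-access pattern such an instruction accelerates, rather than of the patented hardware itself, a three-point stencil sweep might look like this (all names are illustrative):

    ```python
    import numpy as np

    def jacobi_step(u):
        """One 3-point stencil sweep: each interior cell becomes the average
        of its neighbours at offsets -1 and +1 (Jacobi relaxation)."""
        v = u.copy()
        # The offsets -1 and +1 play the role of the patent's 'offset': each
        # selects a neighbouring value relative to the current cell.
        v[1:-1] = 0.5 * (u[:-2] + u[2:])
        return v

    u = np.zeros(10)
    u[0], u[-1] = 1.0, 1.0          # fixed boundary values
    for _ in range(100):
        u = jacobi_step(u)
    print(u.round(3))               # interior relaxes toward the boundaries
    ```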

  20. High performance computing in linear control

    International Nuclear Information System (INIS)

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both the theory and the applications of all important areas of control. The theory is rich and very sophisticated, and some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory; control theory is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high performance scientific computing. Many powerful computers with vector and parallel processing have been built and become available in recent years. These supercomputers offer very high speed in computations. Highly efficient software, based on powerful algorithms, has been developed for use on these advanced computers and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of them.

  1. Towards the development of run times leveraging virtualization for high performance computing

    International Nuclear Information System (INIS)

    Diakhate, F.

    2010-12-01

    In recent years, there has been a growing interest in using virtualization to improve the efficiency of data centers. This success is rooted in virtualization's excellent fault tolerance and isolation properties, in the overall flexibility it brings, and in its ability to exploit multi-core architectures efficiently. These characteristics also make virtualization an ideal candidate to tackle issues found in new compute cluster architectures. However, in spite of recent improvements in virtualization technology, overheads in the execution of parallel applications remain, which prevent its use in the field of high performance computing. In this thesis, we propose a virtual device dedicated to message passing between virtual machines, so as to improve the performance of parallel applications executed in a cluster of virtual machines. We also introduce a set of techniques facilitating the deployment of virtualized parallel applications. These functionalities have been implemented as part of a runtime system that makes it possible to benefit from virtualization's properties as transparently as possible for the user while minimizing performance overheads. (author)

  2. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  3. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  4. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for enhancing effective performance, and on related software and architectures that enhance performance in coordination with compilers. As for parallel compiler technology, research on scalable automated parallel compiler technology, parallel tuning tools, and an operating system that uses multiprocessor resources effectively is pointed out as an important concrete technical development issue. In addition, by extending these research results to the architecture technology of single-chip multiprocessors, the possibility of developing and expanding the PC, WS and HPC (high-performance computer) markets, and of creating new industries, is pointed out. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields for such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  5. Training to use a commercial brain-computer interface as access technology: a case study.

    Science.gov (United States)

    Taherian, Sarvnaz; Selitskiy, Dmitry; Pau, James; Davies, T Claire; Owens, R Glynn

    2016-01-01

    This case study describes how an individual with spastic quadriplegic cerebral palsy was trained over a period of four weeks to use a commercial electroencephalography (EEG)-based brain-computer interface (BCI). The participant spent three sessions exploring the system and seven sessions playing a game focused on EEG feedback training of left- and right-arm motor imagery; a customised training game paradigm was employed. The participant showed improvement in the production of two distinct EEG patterns. The participant's performance was influenced by motivation, fatigue and concentration. Six weeks post-training the participant could still control the BCI and used this to type a sentence using an augmentative and alternative communication application on a wirelessly linked device. The results from this case study highlight the importance of creating a dynamic, relevant and engaging training environment for BCIs. Implications for Rehabilitation: Customising a training paradigm to suit the user's interests can influence adherence to assistive technology training. Mood, fatigue, physical illness and motivation influence the usability of a brain-computer interface. Commercial brain-computer interfaces, which require little set up time, may be used as access technology for individuals with severe disabilities.

  6. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  7. High Performance Computing Multicast

    Science.gov (United States)

    2012-02-01

    Only fragments of the report survive in this record: a citation to "A History of the Virtual Synchrony Replication Model," in Replication: Theory and Practice, Charron-Bost, B., Pedone, F., and Schiper, A. (Eds.), and acronym-glossary entries: IP/IPv4 (Internet Protocol, version 4.0), IPMC (Internet Protocol Multicast), LAN (Local Area Network), MCMD (Dr. Multicast), MPI…

  8. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

    The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years, the phenomenon known as the end of Dennard's scaling has slowed energy efficiency improvements, but improving computer energy efficiency is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography (the ability to produce smaller transistors) and computer architecture (the ability to apply those transistors efficiently). Since the end of scaling, we have seen… In this work, we improve computational performance by exploiting modern hardware features, such as dynamic voltage-frequency scaling and transactional memory. Adapting software is an iterative process, requiring that we continually revisit it to meet new requirements or realities; a time-consuming process…

  9. Computer technology applications in industrial and organizational psychology.

    Science.gov (United States)

    Crespin, Timothy R; Austin, James T

    2002-08-01

    This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research: computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.

  10. TEACHERS’ COMPUTER SELF-EFFICACY AND THEIR USE OF EDUCATIONAL TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vehbi TUREL

    2014-10-01

    Full Text Available This study examined the use of educational technology by primary and subject teachers (i.e. secondary and high school teachers) in a small town in the eastern part of Turkey in the spring of 2012. The study examined the primary, secondary and high school teachers' (a) personal and computer-related (demographic) characteristics, (b) computer self-efficacy perceptions, (c) computer-use level in certain software, (d) frequency of computer use for teaching, administrative and communication objectives, and (e) educational technology preferences for preparation and teaching purposes. All primary, secondary and high school teachers in the small town were given the questionnaires to complete; 158 teachers (n=158) completed and returned them. The study was mostly quantitative and partly qualitative. The quantitative results were analysed with SPSS (mean, std. deviation, frequency, percentage, ANOVA). The qualitative data were analysed by examining the participants' responses to the open-ended questions and focusing on the themes shared among the responses. The results reveal that the teachers consider their computer self-efficacy perceptions to be good, rate their level in certain programs as good, and often use computers for a wide range of purposes. There are also statistical differences between their computer self-efficacy perceptions, frequency of computer use for certain purposes, and computer level in certain programs in terms of different independent variables.

  11. The Impact of Information Technology on Library Anxiety: The Role of Computer Attitudes

    Directory of Open Access Journals (Sweden)

    Qun G. Jiao

    2017-09-01

    Full Text Available Over the past two decades, computer-based technologies have become dominant forces that shape and reshape the products and services the academic library has to offer. The application of library technologies has had a profound impact on the way library resources are being used. Although many students continue to experience high levels of library anxiety, it is likely that the new technologies in the library have led them to experience other forms of negative affective states that may be, in part, a function of their attitude towards computers. This study investigates whether students' computer attitudes predict levels of library anxiety.

  12. High-performance scientific computing in the cloud

    Science.gov (United States)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  13. National Survey of Computer Aided Manufacturing in Industrial Technology Programs.

    Science.gov (United States)

    Heidari, Farzin

    The current status of computer-aided manufacturing in the 4-year industrial technology programs in the United States was studied. All industrial technology department chairs were mailed a questionnaire divided into program information, equipment information, and general comments sections. The questionnaire was designed to determine the subjects…

  14. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  15. Current Capabilities at SNL for the Integration of Small Modular Reactors onto Smart Microgrids Using Sandia's Smart Microgrid Technology High Performance Computing and Advanced Manufacturing.

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, Salvador B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    Smart grids are a crucial component for enabling the nation’s future energy needs, as part of a modernization effort led by the Department of Energy. Smart grids and smart microgrids are being considered in niche applications, and as part of a comprehensive energy strategy to help manage the nation’s growing energy demands, for critical infrastructures, military installations, small rural communities, and large populations with limited water supplies. As part of a far-reaching strategic initiative, Sandia National Laboratories (SNL) presents herein a unique, three-pronged approach to integrate small modular reactors (SMRs) into microgrids, with the goal of providing economically-competitive, reliable, and secure energy to meet the nation’s needs. SNL’s triad methodology involves an innovative blend of smart microgrid technology, high performance computing (HPC), and advanced manufacturing (AM). In this report, Sandia’s current capabilities in those areas are summarized, as well as paths forward that will enable DOE to achieve its energy goals. In the area of smart grid/microgrid technology, Sandia’s current computational capabilities can model the entire grid, including temporal aspects and cyber security issues. Our tools include system development, integration, testing and evaluation, monitoring, and sustainment.

  16. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Full Text Available Abstract Background: This study concerns the development of a high-performance workflow that, using grid technology, correlates different kinds of bioinformatics data, starting from the base pairs of the nucleotide sequence up to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, a network of several computational resources and storage facilities distributed across different grid sites. Methods: Workflows are very common in bioinformatics because they allow large quantities of data to be processed by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results: Grid technology allows efficient database management, a crucial problem for obtaining good results in bioinformatics applications. The proposed workflow is implemented to integrate huge amounts of data, and the results themselves must be stored in a relational database, which represents the added value to the global knowledge. Conclusion: A web interface has been developed to make this technology accessible to grid users. Once the workflow has started by means of the simplified interface, it is possible to follow all the different steps throughout the data processing. Eventually, when the workflow has terminated, the different features of the protein, like the amino acids exposed on the protein surface, can be compared with the data present in the output database.
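
    As a small-scale analogy to the pattern described here (fanning an expensive step out into many small jobs and gathering the results into a relational database), the following sketch uses a thread pool and SQLite; the per-sequence "analysis" and all names are invented for illustration and do not reproduce the Grid.it workflow:

    ```python
    import sqlite3
    from concurrent.futures import ThreadPoolExecutor

    def count_exposed(seq):
        # Stand-in for an expensive per-protein computation; this residue
        # set is purely illustrative, not a real surface-exposure predictor.
        return seq, sum(1 for aa in seq if aa in "DEKNQRST")

    sequences = ["MKTAYIAKQR", "GAVLIPFMW", "STNQCYDE"]

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE results (sequence TEXT, exposed_count INTEGER)")

    # Fan the expensive step out into small independent jobs, as a grid
    # scheduler would, then gather everything into a relational table.
    with ThreadPoolExecutor() as pool:
        for seq, count in pool.map(count_exposed, sequences):
            db.execute("INSERT INTO results VALUES (?, ?)", (seq, count))

    for row in db.execute("SELECT * FROM results ORDER BY sequence"):
        print(row)
    ```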

  17. Department of Energy research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-08-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models, the execution of which is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex, and consequently it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  18. Computer Technologies in Education. Proceedings of the International Conference on Computer Technologies in Education (Kiev, Ukraine, September 14-17, 1993).

    Science.gov (United States)

    Petrushin, V., Ed.; Dovgiallo, A., Ed.

    The conference reported in this document provides a meeting place for researchers from around the world, where the emphasis is on new ideas connected to computer technologies in education. This volume contains 140 extended abstracts selected by the program committee and organized into the following categories: (1) educational informational…

  19. Computing, information, and communications: Technologies for the 21. Century

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-11-01

    To meet the challenges of a radically new and technologically demanding century, the Federal Computing, Information, and Communications (CIC) programs are investing in long-term research and development (R and D) to advance computing, information, and communications in the United States. CIC R and D programs help Federal departments and agencies to fulfill their evolving missions, assure the long-term national security, better understand and manage the physical environment, improve health care, help improve the teaching of children, provide tools for lifelong training and distance learning to the workforce, and sustain critical US economic competitiveness. One of the nine committees of the National Science and Technology Council (NSTC), the Committee on Computing, Information, and Communications (CCIC)--through its CIC R and D Subcommittee--coordinates R and D programs conducted by twelve Federal departments and agencies in cooperation with US academia and industry. These R and D programs are organized into five Program Component Areas: (1) HECC--High End Computing and Computation; (2) LSN--Large Scale Networking, including the Next Generation Internet Initiative; (3) HCS--High Confidence Systems; (4) HuCS--Human Centered Systems; and (5) ETHR--Education, Training, and Human Resources. A brief synopsis of FY 1997 accomplishments and FY 1998 goals by PCA is presented. This report, which supplements the President`s Fiscal Year 1998 Budget, describes the interagency CIC programs.

  20. Mobile Computing: The Emerging Technology, Sensing, Challenges and Applications

    International Nuclear Information System (INIS)

    Bezboruah, T.

    2010-12-01

    Mobile computing is a computing system in which a computer and all necessary accessories, like files and software, are taken out into the field. It is a form of computing in which a computing device can be used even while the user is mobile and changing location; portability is one of its important aspects. Mobile phones are being used to gather scientific data from remote and isolated places that could not be reached by other means. Scientists are beginning to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we discuss in brief the mobile computing technology, its sensing, challenges and applications. (author)

  1. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
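
    A minimal sketch of the disclosed idea, grouping threads by the address of their current calling instruction so that outliers stand out (the snapshot data and names are invented for illustration):

    ```python
    from collections import defaultdict

    # Hypothetical snapshot: thread id -> address of its current calling
    # instruction, e.g. sampled from stack traces across compute nodes.
    threads = {0: 0x4006F0, 1: 0x4006F0, 2: 0x4006F0, 3: 0x400A10}

    groups = defaultdict(list)
    for tid, addr in threads.items():
        groups[addr].append(tid)

    # Display groups from largest to smallest; threads stuck at an address
    # different from the majority are candidate defective threads.
    for addr, tids in sorted(groups.items(), key=lambda kv: -len(kv[1])):
        print(f"address {addr:#x}: threads {tids}")
    ```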

  2. Evaluating Educational Technologies: Interactive White Boards and Tablet Computers in the EFL Classroom

    OpenAIRE

    NFOR, Samuel

    2018-01-01

    One of the objectives outlined in "Trends and Development in Education, Science and Technology Policies": MEXT 2011 by the Ministry of Education, Culture, Sports, Science and Technology of Japan is for all elementary and junior high students to use electronic versions of printed textbooks in the coming years. Students will use digital textbooks on tablet personal computers in classrooms with interactive whiteboards (IWB). This paper considers IWB and tablet computers (tablets) technologies fo...

  3. Closing the Technological Gender Gap: Feminist Pedagogy in the Computer-Assisted Classroom.

    Science.gov (United States)

    Hesse-Biber, Sharlene; Gilbert, Melissa Kesler

    1994-01-01

    Asserts that, although computers are playing an increasingly important role in the classroom, a technological gender gap serves as a barrier to the effective use of computers by women instructors in higher education. Encourages women to seize computer tools for their own educational purposes and argues for enhancing women's computer learning. (CFR)

  4. New tools and technology for the study of human performance in simulator experiments

    International Nuclear Information System (INIS)

    Droeivoldsmo, Asgeir

    2004-04-01

    The Halden Virtual Reality Centre has for the last four years reported a number of experiments in the area of real world application of virtual and augmented reality technology. The insights from these studies have been reviewed and reported as part of a PhD-thesis submitted at the Norwegian University of Science and Technology. This report is based on the thesis and contains a theoretical discussion of how the virtual and augmented reality technology could be used to extend human operator performance in control rooms to include co-operation with plant floor personnel and interaction with not already built equipment. This thesis suggests that new tools and technology can be used for production of relevant data and insights from the study of human performance in simulator and field experiments. It examines some of the theoretical perspectives behind data collection and human performance assessment, and argues for a high resemblance of the real world and use of subject matter expertise in simulator studies. A model is proposed, suggesting that human performance measurement should be tightly coupled to the topic of study and have a close connection to the time line. This coupling requires new techniques for continuous data collection, and eye movement tracking has been identified as a promising basis for this type of measures. One way of improving realism is to create virtual environments allowing for controlling more of the environment surrounding the test subjects. New application areas for virtual environments are discussed for use in control room and field studies. The combination of wearable computing, virtual and augmented (the use of computers to overlay virtual information onto the real world) reality provides many new possibilities to present information to operators. In two experiments, virtual and augmented reality techniques were used to visualise radiation fields for operators in a contaminated nuclear environment. This way the operators could train for

  5. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    Energy Technology Data Exchange (ETDEWEB)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V. [Institute of Informatics Problems, Russian Academy of Sciences (Russian Federation); Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S. [Telecommunication Systems Department, Peoples’ Friendship University of Russia (Russian Federation)

    2015-03-10

    Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.

  6. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    International Nuclear Information System (INIS)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-01-01

    Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.
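
    A toy discrete-time sketch of the hysteresis idea (thresholds, service rates and setup delay below are made-up parameters, not taken from the paper's model): a server is switched on only after the queue exceeds a high threshold, with a non-instantaneous setup delay, and switched off only after the queue falls below a lower threshold, so that momentary spikes and dips are ignored:

    ```python
    import random

    HIGH, LOW = 20, 5       # hysteresis thresholds (illustrative values)
    SETUP_TICKS = 3         # non-instantaneous server activation delay
    servers, pending, queue = 1, 0, 0

    random.seed(1)
    for _ in range(200):
        queue += random.randint(0, 4)          # job arrivals this tick
        queue = max(0, queue - 2 * servers)    # each server clears 2 jobs/tick
        if queue > HIGH and pending == 0:
            pending = SETUP_TICKS              # start warming up one server
        if pending > 0:
            pending -= 1
            if pending == 0:
                servers += 1                   # activation completes
        if queue < LOW and servers > 1:
            servers -= 1                       # power one server down
    print(f"final state: {servers} server(s), {queue} job(s) queued")
    ```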

  7. Saudi high school students' attitudes and barriers toward the use of computer technologies in learning English.

    Science.gov (United States)

    Sabti, Ahmed Abdulateef; Chaichan, Rasha Sami

    2014-01-01

    This study examines the attitudes of Saudi Arabian high school students toward the use of computer technologies in learning English, and discusses the possible barriers that affect and limit actual computer usage. A quantitative approach was applied, involving 30 Saudi students at a high school in Kuala Lumpur, Malaysia. The respondents comprised 15 males and 15 females aged between 16 and 18 years. Two instruments, namely the Scale of Attitude toward Computer Technologies (SACT) and Barriers affecting Students' Attitudes and Use (BSAU), were used to collect data, and the Technology Acceptance Model (TAM) of Davis (1989) was utilized. The analysis revealed gender differences in attitudes toward the use of computer technologies in learning English: female students showed higher and more positive attitudes towards the use of computer technologies in learning English than males. Both male and female participants demonstrated a high and positive perception of the Usefulness and perceived Ease of Use of computer technologies in learning English. Three barriers that affected and limited the use of computer technologies in learning English were identified by the participants: skill, equipment, and motivation. Among these barriers, skill had the highest effect, whereas motivation showed the least effect.

  8. Primary Trainee Teachers' Attitudes to and Use of Computer and Technology in Mathematics: The Case of Turkey

    Science.gov (United States)

    Dogan, Mustafa

    2010-01-01

    This study explores Turkish primary mathematics trainee teachers' attitudes to computers and technology. A survey was conducted with a self-constructed questionnaire. Piloting, factor and reliability (α = 0.94) analyses were performed. The final version of the questionnaire has three parts with a total of 48 questions including a Likert type…

  9. The status of training and education in information and computer technology of Australian nurses: a national survey.

    Science.gov (United States)

    Eley, Robert; Fallon, Tony; Soar, Jeffrey; Buikstra, Elizabeth; Hegney, Desley

    2008-10-01

    A study was undertaken of the current knowledge and future training requirements of nurses in information and computer technology to inform policy to meet national goals for health. The role of the modern clinical nurse is intertwined with information and computer technology, and adoption of such technology forms an important component of national strategies in health. The majority of nurses are expected to use information and computer technology during their work; however, the full extent of their knowledge and experience is unclear. Self-administered postal survey. A 78-item questionnaire was distributed to 10,000 Australian Nursing Federation members to identify the nurses' use of information and computer technology. Eighteen items related to nurses' training and education in information and computer technology. The response rate was 44%. Computers were used by 86.3% of respondents as part of their work-related activities. Between 4% and 17% of nurses had received training in each of 11 generic computer skills and software applications during their preregistration/pre-enrolment education, and between 12% and 30% as continuing professional education. Nurses who had received training believed that it was adequate to meet the needs of their job and was given at an appropriate time. Almost half of the respondents indicated that they required more training to better meet the information and computer technology requirements of their jobs, and a quarter believed that their level of computer literacy was restricting their career development. Nurses considered that the vast majority of employers did not encourage information and computer technology training and, for those for whom training was available, workload was the major barrier to uptake. Nurses favoured the introduction of a national competency standard in information and computer technology. For the considerable benefits of information and computer technology to be incorporated fully into the health system, employers must pay more attention…

  10. TRANSFORMING RURAL SECONDARY SCHOOLS IN ZIMBABWE THROUGH TECHNOLOGY: LIVED EXPERIENCES OF STUDENT COMPUTER USERS

    Directory of Open Access Journals (Sweden)

    Gomba Clifford

    2016-04-01

    Full Text Available A technological divide exists in Zimbabwe between urban and rural schools that puts rural-based students at a disadvantage. In Zimbabwe, the government, through the president, donated computers to most rural schools in a bid to bridge the digital divide between rural and urban schools. The purpose of this phenomenological study was to understand the experiences of Advanced Level students using computers at two rural boarding Catholic high schools in Zimbabwe. The study was guided by two research questions: (1) How do Advanced Level students in the rural areas use computers at their school? and (2) What is the experience of using computers for Advanced Level students in the rural areas of Zimbabwe? By performing this study, it was possible to understand from the students' experiences whether computer usage was for educational learning or not. The results of the phenomenological study showed that students' experiences can be broadly classified into five themes, namely worthwhile (interesting) experience, accessibility issues, teachers' monopoly, research and social use, and Internet availability. The participants proposed that teachers use computers but not monopolize computer usage. The computer shortage may be solved by having donors and the government help in acquiring more computers.

  11. The impact of changing computing technology on EPRI [Electric Power Research Institute] nuclear analysis codes

    International Nuclear Information System (INIS)

    Breen, R.J.

    1988-01-01

    The Nuclear Reload Management Program of the Nuclear Power Division (NPD) of the Electric Power Research Institute (EPRI) has the responsibility for initiating and managing applied research in selected nuclear engineering analysis functions for nuclear utilities. The computer systems that result from the research projects consist of large FORTRAN programs containing elaborate computational algorithms used to address such areas as core physics, fuel performance, thermal hydraulics, and transient analysis. This paper summarizes a study of computing technology trends sponsored by the NPD. The approach taken was to interview hardware and software vendors, industry observers, and utility personnel, focusing on expected changes in the computing industry over the next 3 to 5 yr. Particular emphasis was placed on how these changes will impact engineering/scientific computer code development, maintenance, and use. In addition to the interviews, a workshop was held with attendees from EPRI, Power Computing Company, industry, and utilities. The workshop provided a forum for discussing issues and providing input into EPRI's long-term computer code planning process.

  12. Spacecraft computer technology at Southwest Research Institute

    Science.gov (United States)

    Shirley, D. J.

    1993-01-01

    Southwest Research Institute (SwRI) has developed and delivered spacecraft computers for a number of different near-Earth-orbit spacecraft including shuttle experiments and SDIO free-flyer experiments. We describe the evolution of the basic SwRI spacecraft computer design from those weighing in at 20 to 25 lb and using 20 to 30 W to newer models weighing less than 5 lb and using only about 5 W, yet delivering twice the processing throughput. Because of their reduced size, weight, and power, these newer designs are especially applicable to planetary instrument requirements. The basis of our design evolution has been the availability of more powerful processor chip sets and the development of higher density packaging technology, coupled with more aggressive design strategies in incorporating high-density FPGA technology and use of high-density memory chips. In addition to reductions in size, weight, and power, the newer designs also address the necessity of survival in the harsh radiation environment of space. Spurred by participation in such programs as MSTI, LACE, RME, Delta 181, Delta Star, and RADARSAT, our designs have evolved in response to program demands to be small, low-powered units, radiation tolerant enough to be suitable for both Earth-orbit microsats and for planetary instruments. Present designs already include MIL-STD-1750 and Multi-Chip Module (MCM) technology with near-term plans to include RISC processors and higher-density MCM's. Long term plans include development of whole-core processors on one or two MCM's.

  13. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements for the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently, a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of the KEDR detector experiment being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  14. The potential impact of computer-aided assessment technology in ...

    African Journals Online (AJOL)

    The potential impact of computer-aided assessment technology in higher education. ... Further more 'Increased number of students in Higher Education and the ... benefits, limitations, impacts on student learning and strategies for developing ...

  15. Pervasive Computing and Communication Technologies for U-Learning

    Science.gov (United States)

    Park, Young C.

    2014-01-01

    The development of digital information transfer, storage and communication methods influences a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next generation learners. In the evolutionary view the 5G (or…

  16. Very High-Performance Embedded Computing Will Allow Ambitious Space Science Investigation

    National Research Council Canada - National Science Library

    Pignol, Michel

    2005-01-01

    … developed on radiation-tolerant technologies. Unfortunately, the microprocessors available today on such technologies offer only the computing throughput that was available about 10 years ago on the commercial market…

  17. VOCATIONAL TRAINING OF COMPETITIVE ENGINEERS THROUGH THE USE OF COMPUTER TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Yu. G. Loboda

    2017-11-01

    Full Text Available The article is concerned with the increasing importance of computer technologies and the need to educate an engineer at the level of modern advances in science and technology. New methods and technologies of learning based on the training of competitive engineers are considered. The teaching and methodological support of the didactic process of professional training for engineering students is presented and characterized: motivation tasks for laboratory training and for calculation-and-graphic assignments, the individual card for organizing the student's work, and the methodological guidance. Examples of motivational tasks used in laboratory training are provided: solving search tasks of linear programming, and simulation modelling of optimal tax rates. The most effective ways to organize the learning process using computer technologies for the vocational training of engineering students are indicated.

  18. Integrated Geo Hazard Management System in Cloud Computing Technology

    Science.gov (United States)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can reduce environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing the geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operation to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system will provide an easily managed, flexible measuring system whose data management operates autonomously and can be controlled by remote commands, collecting data by means of "cloud" computing. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud" system. This system will later be used as part of the development activities, with the aim of minimizing the frequency of geo-hazards and the risk in the research area.

  19. Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

    Directory of Open Access Journals (Sweden)

    Ilyena Hirskyj-Douglas

    2018-06-01

    Full Text Available As technologies diversify and become embedded in everyday lives, the technologies we expose to animals, and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI), are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human Computer Interaction and Computer Science, this thematic literature review looks at the technologies developed for (non-human) animals. Technologies analysed include tangible and physical, haptic and wearable, olfactory, screen technology and tracking systems. The discussion explores what exactly ACI is, while questioning what it means to be animal by considering the impact and loop between machine and animal interactivity. The findings of this review are expected to form the first grounding foundation of ACI technologies, informing future research in animal computing as well as suggesting future areas for exploration.

  20. Hacking the brain: Brain-computer interfacing technology and the ethics of neurosecurity

    NARCIS (Netherlands)

    Ienca, M.; Haselager, W.F.G.

    2016-01-01

    Brain-computer interfacing technologies are used as assistive technologies for patients as well as healthy subjects to control devices solely by brain activity. Yet the risks associated with the misuse of these technologies remain largely unexplored. Recent findings have shown that BCIs are

  1. RESEARCH OF ENGINEERING TRAFFIC IN COMPUTER UZ NETWORK USING MPLS TE TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    V. M. Pakhomovа

    2014-12-01

    Full Text Available Purpose. Railway transport in Ukraine requires the use of computer networks of different technologies: Ethernet, Token Bus, Token Ring, FDDI and others. In combined computer networks on railway transport it is necessary to use the packet-switching technology of multiprotocol label switching (MPLS) networks more effectively; it is based on the use of labels. A packet network must transmit different types of traffic with a given quality of service. The purpose of the research is to develop a methodology for determining the sequence of flow assignments for the considered fragment of the UZ computer network. Methodology. In optimizing traffic management in MPLS networks, traffic engineering (TE) technology plays an important role. The main TE mechanism in MPLS is the use of unidirectional tunnels (MPLS TE tunnels) to specify the path of given traffic. A mathematical model of the traffic engineering problem in the UZ computer network with MPLS TE technology was constructed. The UZ computer network is represented by a directed graph whose vertices are the routers of the computer network and whose arcs model the links between nodes. The optimization criterion is the minimum value of the maximum utilization of the TE-tunnels. Findings. Six flow-assignment options were determined; a rational sequence of flows was found at which the maximum utilization of the TE-tunnels of the considered simplified fragment of the UZ computer network does not exceed 0.5. Originality. A method for solving the traffic engineering problem in the multiprotocol UZ network with MPLS TE technology was proposed; for each class its own path is laid, depending on the bandwidth and channel load. Practical value. The developed software model «TraffEng» makes it possible to determine the maximum utilization of TE-tunnels in UZ computer networks. The input parameters of the model: number of routers, channel capacity, the…
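
    As a simplified illustration of the stated optimization criterion, minimizing the maximum TE-tunnel utilization (the capacities and flow demands below are invented, not taken from the paper), one can enumerate candidate assignments of flows to tunnels and keep the assignment with the smallest peak utilization:

    ```python
    from itertools import product

    demands = [30, 20, 10]      # flow demands, Mbit/s (illustrative)
    capacity = [100, 100]       # two parallel TE-tunnels (illustrative)

    best = None
    for assign in product(range(len(capacity)), repeat=len(demands)):
        load = [0] * len(capacity)
        for flow, tunnel in zip(demands, assign):
            load[tunnel] += flow
        peak = max(l / c for l, c in zip(load, capacity))
        if best is None or peak < best[0]:
            best = (peak, assign)

    print(f"max utilization {best[0]:.2f} with assignment {best[1]}")
    # With these toy numbers the peak utilization stays below 0.5, the kind
    # of bound the abstract reports for the considered network fragment.
    ```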

  2. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  3. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  4. Misleading Performance Claims in Parallel Computations

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.

    2009-05-29

    In a previous humorous note entitled 'Twelve Ways to Fool the Masses,' I outlined twelve common ways in which performance figures for technical computer systems can be distorted. In this paper and accompanying conference talk, I give a reprise of these twelve 'methods' and give some actual examples that have appeared in peer-reviewed literature in years past. I then propose guidelines for reporting performance, the adoption of which would raise the level of professionalism and reduce the level of confusion, not only in the world of device simulation but also in the larger arena of technical computing.

  5. Proceedings of the Sixth Seminar on Computation in Nuclear Science and Technology

    International Nuclear Information System (INIS)

    1997-01-01

    The National Atomic Energy Agency (BATAN) held the Sixth Seminar on Computation in Nuclear Science and Technology on January 16-17, 1996. The seminar is an event for information exchange among those interested in computation, modeling, and simulation. As in previous years' seminars, there were also presenters and participants from outside BATAN and from universities whose interests lie in the field of science and technology. Examination of the papers presented at this seminar shows that, besides digging into the so-called classical computation methods, some papers brought up relatively new topics such as the determination and influence of chaos, neural network methods, and expert systems. Judging from the variety of topics, one can conclude that interest in computation and its applications is growing stronger in Indonesia.

  6. First International Symposium on Applied Computing and Information Technology (ACIT 2013)

    CERN Document Server

    Applied Computing and Information Technology

    2014-01-01

    This book presents the selected results of the 1st International Symposium on Applied Computers and Information Technology (ACIT 2013) held on August 31 – September 4, 2013 in Matsue City, Japan, which brought together researchers, scientists, engineers, industry practitioners, and students to discuss all aspects of  Applied Computers & Information Technology, and its practical challenges. This book includes the best 12 papers presented at the conference, which were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review.  

  7. Effect of Physical Education Teachers' Computer Literacy on Technology Use in Physical Education

    Science.gov (United States)

    Kretschmann, Rolf

    2015-01-01

    Teachers' computer literacy has been identified as a factor that determines their technology use in class. The aim of this study was to investigate the relationship between physical education (PE) teachers' computer literacy and their technology use in PE. The study group consisted of 57 high school level in-service PE teachers. A survey was used…

  8. Performance variation in motor imagery brain-computer interface: a brief review.

    Science.gov (United States)

    Ahn, Minkyu; Jun, Sung Chan

    2015-03-30

    Brain-computer interface (BCI) technology has attracted significant attention over recent decades, and has made remarkable progress. However, BCI still faces a critical hurdle, in that performance varies greatly across and even within subjects, an obstacle that degrades the reliability of BCI systems. Understanding the causes of these problems is important if we are to create more stable systems. In this short review, we report the most recent studies and findings on performance variation, especially in motor imagery-based BCI; this work has found that low-performance groups have a less-developed brain network that is incapable of motor imagery. Further, psychological and physiological states influence performance variation within subjects. We propose a possible strategic approach to deal with this variation, which may contribute to improving the reliability of BCI. In addition, the limitations of current work and opportunities for future studies are discussed.

  9. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria

    2016-01-01

    AGIS is the information system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing (ADC) applications and services. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing: flexible utilization of opportunistic cloud and HPC resources, integration of ObjectStore services for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage-protocol declarations required for PanDA pilot site movers, and others.

  10. 11th International Conference on Computing and Information Technology

    CERN Document Server

    Meesad, Phayung; Boonkrong, Sirapat

    2015-01-01

    This book presents recent research work and results in the area of communication and information technologies. The book includes the main results of the 11th International Conference on Computing and Information Technology (IC2IT) held during July 2nd-3rd, 2015 in Bangkok, Thailand. The book is divided into two main parts: Data Mining and Machine Learning, and Data Network and Communications. New algorithms and methods of data mining are discussed, as well as innovative applications and state-of-the-art technologies in data mining, machine learning, and data networking.

  11. High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away

    Science.gov (United States)

    Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.

    2012-09-01

    By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) Calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) Calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons learned for continuing development. Applicability of Cloud Computing: Commercial Cloud providers generally charge for all operations, including processing, transfer of input and output data, and storage of data.
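
    The contrast the authors draw between the I/O-bound Montage engine and the CPU-bound periodogram code lends itself to a small illustration. The sketch below is not the NASA Star and Exoplanet Database code; it is a minimal Python stand-in (synthetic light curves, invented frequency grid and worker count) showing how a CPU-bound periodogram batch parallelizes naturally, one task per light curve:

        import numpy as np
        from multiprocessing import Pool
        from scipy.signal import lombscargle

        FREQS = np.linspace(0.01, 10.0, 2000)  # trial angular frequencies

        def periodogram(light_curve):
            """Lomb-Scargle periodogram of one (times, fluxes) pair."""
            t, y = light_curve
            return lombscargle(t, y - y.mean(), FREQS)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            # Synthetic stand-ins for Kepler light curves: irregular sampling.
            curves = [(np.sort(rng.uniform(0, 90, 500)),
                       rng.normal(size=500)) for _ in range(64)]
            with Pool(processes=8) as pool:   # one task per light curve
                powers = pool.map(periodogram, curves)
            print(len(powers), powers[0].shape)

    Because each light curve is independent, the batch scales with core count until scheduling and data-staging overheads dominate, which is exactly where the per-operation pricing of commercial clouds becomes relevant.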

  12. Research on application of computer technologies in jewelry process

    Directory of Open Access Journals (Sweden)

    Junbo Xia

    2017-06-01

    Full Text Available Jewelry production is a process that works with precious raw materials and demands low losses in processing. The traditional manual mode is unable to meet the real needs of enterprises, while the involvement of computer technology can solve this practical problem. At present, the main problem restricting the application of computers in jewelry production is the failure to find a production model that can serve the whole industry chain with the computer as the core of production. This paper designs a “synchronous and diversified” production model with computer-aided design technology and rapid prototyping technology at its core, tests it on actual production cases, and achieves certain results, which are forward-looking and advanced.

  13. An integrated impact assessment and weighting methodology: evaluation of the environmental consequences of computer display technology substitution.

    Science.gov (United States)

    Zhou, Xiaoying; Schoenung, Julie M

    2007-04-01

    Computer display technology is currently in a state of transition, as the traditional technology of cathode ray tubes is being replaced by liquid crystal display flat-panel technology. Technology substitution and process innovation require the evaluation of trade-offs among environmental impact, cost, and engineering performance attributes. General impact assessment methodologies, decision analysis and management tools, and optimization methods commonly used in engineering cannot efficiently address the issues needed for such evaluation. The conventional Life Cycle Assessment (LCA) process often generates results that can be subject to multiple interpretations, although the advantages of the LCA concept and framework are widely recognized. In the present work, the LCA concept is integrated with Quality Function Deployment (QFD), a popular industrial quality management tool, which is used as the framework for the development of our integrated model. The problem of weighting is addressed by using pairwise comparison of stakeholder preferences. Thus, this paper presents a new integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), to assess the environmental behavior of alternative technologies in correlation with their performance and economic characteristics. Computer display technology is used as the case study to further develop our methodology through the modification and integration of various quality management tools (e.g., process mapping, prioritization matrix) and statistical methods (e.g., multi-attribute analysis, cluster analysis). Life cycle thinking provides the foundation for our methodology, as we utilize a published LCA report, which stopped at the characterization step, as our starting point. Further, we evaluate the validity and feasibility of our methodology by considering uncertainty and conducting sensitivity analysis.
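
    The pairwise-comparison weighting step can be made concrete with a small numeric example. The sketch below is only an illustration of the geometric-mean technique commonly used in AHP-style analyses, not the paper's own procedure; the 3x3 matrix of stakeholder preferences is invented:

        import numpy as np

        # Hypothetical pairwise comparisons of three criteria
        # (environment, cost, performance): A[i, j] = importance of i over j.
        A = np.array([[1.0, 3.0, 2.0],
                      [1/3, 1.0, 0.5],
                      [0.5, 2.0, 1.0]])

        # Geometric-mean (logarithmic least squares) weight estimate.
        g = A.prod(axis=1) ** (1.0 / A.shape[0])
        weights = g / g.sum()

        # Consistency check via the principal eigenvalue.
        lam_max = np.linalg.eigvals(A).real.max()
        ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
        print("weights:", weights.round(3), "CI:", round(ci, 3))

    A consistency index near zero indicates that the stakeholder judgments are close to internally coherent; large values suggest the comparisons should be revisited before the weights are trusted.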

  14. 2nd International Conference on Computer and Communication Technologies

    CERN Document Server

    Raju, K; Mandal, Jyotsna; Bhateja, Vikrant

    2016-01-01

    The book covers all aspects of computing, communication, general sciences, and educational research presented at the Second International Conference on Computer & Communication Technologies, held during 24-26 July 2015 at Hyderabad and hosted by CMR Technical Campus in association with Division – V (Education & Research), CSI, India. After rigorous review, only quality papers were selected and included in this book. The book is divided into three volumes, which cover a variety of topics including medical imaging, networks, data mining, intelligent computing, software design, image processing, mobile computing, digital signals and speech processing, video surveillance and processing, web mining, wireless sensor networks, circuit analysis, fuzzy systems, antenna and communication systems, biomedical signal processing and applications, cloud computing, embedded systems applications, and cyber security and digital forensics. The readers of these volumes will be highly benefited from the te…

  15. High performance graphics processor based computed tomography reconstruction algorithms for nuclear and other large scale applications.

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Edward S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Orr, Laurel J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Kyle R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    The goal of this work is to develop a fast computed tomography (CT) reconstruction algorithm based on graphics processing units (GPUs) that achieves significant improvement over traditional central processing unit (CPU) based implementations. The main challenge in developing a CT algorithm that is capable of handling very large datasets is parallelizing the algorithm in such a way that data transfer does not hinder performance of the reconstruction algorithm. General-purpose computing on graphics processing units (GPGPU) is a new technology that the Science and Technology (S&T) community is starting to adopt in many fields where CPU-based computing is the norm. GPGPU programming requires a new approach to algorithm development that utilizes massively multi-threaded environments. Multi-threaded algorithms in general are difficult to optimize, since performance bottlenecks occur that are non-existent in single-threaded algorithms, such as memory latencies. If an efficient GPU-based CT reconstruction algorithm can be developed, computational times could be improved by a factor of 20. Additionally, cost benefits will be realized as commodity graphics hardware could potentially replace expensive supercomputers and high-end workstations. This project will take advantage of the CUDA programming environment and attempt to parallelize the task in such a way that multiple slices of the reconstruction volume are computed simultaneously. This work will also take advantage of the GPU memory by utilizing asynchronous memory transfers, GPU texture memory, and (when possible) pinned host memory so that the memory transfer bottleneck inherent to GPGPU is amortized. Additionally, this work will take advantage of GPU-specific hardware (i.e. fast texture memory, pixel pipelines, hardware interpolators, and varying memory hierarchy) that will allow for additional performance improvements.
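
    The Sandia work targets CUDA, but the slice-level parallelism it exploits is easy to see in a scalar sketch. The NumPy filtered backprojection below is a minimal CPU illustration (synthetic parallel-beam geometry, simple ramp filter), not the project's GPU kernel; because each slice reconstructs independently, many such calls can run concurrently across GPU threads or devices:

        import numpy as np

        def fbp_slice(sinogram, angles_deg, size):
            """Minimal filtered backprojection of one CT slice.
            sinogram: (n_angles, n_detectors) parallel-beam projections."""
            n_ang, n_det = sinogram.shape
            # Ramp filter applied in frequency space, view by view.
            freqs = np.fft.fftfreq(n_det)
            filtered = np.fft.ifft(np.fft.fft(sinogram, axis=1)
                                   * np.abs(freqs), axis=1).real
            # Backproject each filtered view onto the image grid.
            image = np.zeros((size, size))
            xs = np.arange(size) - size / 2
            X, Y = np.meshgrid(xs, xs)
            for view, theta in zip(filtered, np.deg2rad(angles_deg)):
                # Detector coordinate of every pixel for this view angle.
                s = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2
                image += np.interp(s.ravel(), np.arange(n_det),
                                   view).reshape(size, size)
            return image * np.pi / n_ang

    The interpolation step in the inner loop is precisely what GPU texture memory and hardware interpolators accelerate, which is why the abstract singles them out.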

  16. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems, as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  17. Assessment of the fit of removable partial denture fabricated by computer-aided designing/computer aided manufacturing technology.

    Science.gov (United States)

    Arafa, Khalid A O

    2018-01-01

    To assess the level of evidence that supports the quality of fit of removable partial dentures (RPDs) fabricated by computer-aided designing/computer-aided manufacturing (CAD/CAM) and rapid prototyping (RP) technology. Methods: An electronic search was performed in the Google Scholar, PubMed, and Cochrane Library search engines, using Boolean operators. All articles published in English in the period from 1950 until April 2017 were eligible for inclusion in this review. All articles containing the search terms in any part of the article (including titles, abstracts, or article texts) were screened, which resulted in 214 articles. After exclusion of irrelevant and duplicated articles, 12 papers were included in this systematic review. Results: All the included studies were case reports, except one, which was a case series that recruited 10 study participants. Visual and tactile examination on the cast, or clinically in the patient's mouth, was the most-used method for assessing the fit of RPDs. Of all the included studies, only one assessed the internal fit between RPDs and oral tissues using silicone registration material. The vast majority of included studies found that the fit of RPDs ranged from satisfactory to excellent. Conclusion: Despite the lack of clinical trials that provide strong evidence, the available evidence supports the claim of a good fit for RPDs fabricated by new technologies using CAD/CAM.

  18. Fast Performance Computing Model for Smart Distributed Power Systems

    Directory of Open Access Journals (Sweden)

    Umair Younas

    2017-06-01

    Full Text Available Plug-in Electric Vehicles (PEVs) are becoming a more prominent alternative to fossil-fuel car technology due to their significant role in Greenhouse Gas (GHG) reduction, flexible storage, and ancillary service provision as a Distributed Generation (DG) resource in Vehicle to Grid (V2G) regulation mode. However, large-scale penetration of PEVs and the growing demand of energy-intensive Data Centers (DCs) bring undesirably high peaks in electricity demand and hence impose supply-demand imbalance and threaten the reliability of the wholesale and retail power market. In order to overcome these challenges, the proposed research considers a smart Distributed Power System (DPS) comprising conventional sources, renewable energy, V2G regulation, and flexible storage energy resources. Moreover, price- and incentive-based Demand Response (DR) programs are implemented to sustain the balance between net demand and available generating resources in the DPS. In addition, we adopt a novel strategy to implement the computationally intensive jobs of the proposed DPS model, including incoming load profiles, V2G regulation, battery State of Charge (SOC) indication, and fast computation in a decision-based automated DR algorithm, using the Fast Performance Computing resources of DCs. In response, the DPS provides economical and stable power to DCs under strict power quality constraints. Finally, the improved results are verified using a case study of ISO California integrated with hybrid generation.
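
    As a toy illustration of the balancing idea (not the paper's model; all profiles and the capacity limit below are invented), demand response can be viewed as shifting the excess of net demand over dispatchable capacity into off-peak hours, which is the role V2G discharge and flexible storage play in the proposed DPS:

        import numpy as np

        # Toy hourly profiles (MW): base load with an evening PEV/DC peak.
        load = np.array([50, 48, 47, 50, 58, 70, 85, 95, 90, 75, 62, 55.0])
        renewables = np.array([10, 10, 12, 20, 28, 30, 25, 15, 8, 5, 4, 4.0])
        capacity = 70.0                      # dispatchable generation limit

        net = load - renewables              # demand left for the grid
        excess = np.maximum(net - capacity, 0.0)   # peaks DR must remove

        # Naive DR: curtail the violating hours and shift that energy
        # uniformly into off-peak hours (a stand-in for V2G and storage).
        balanced = net - excess
        off_peak = balanced < 0.8 * capacity
        balanced[off_peak] += excess.sum() / off_peak.sum()

        print(f"peak before DR: {net.max():.1f} MW, "
              f"after: {balanced.max():.1f} MW")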

  19. HPCToolkit: performance tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  20. HPCToolkit: performance tools for scientific computing

    International Nuclear Information System (INIS)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  1. Computer task performance by subjects with Duchenne muscular dystrophy.

    Science.gov (United States)

    Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira

    2016-01-01

    Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group across the study. DMD participants improved their performance after practicing a computational task; however, the difference in MT was present in all attempts among DMD and control subjects. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not their age or overall functionality.

  2. The Advantages and Disadvantages of Computer Technology in Second Language Acquisition

    Science.gov (United States)

    Lai, Cheng-Chieh; Kritsonis, William Allan

    2006-01-01

    The purpose of this article is to discuss the advantages and disadvantages of computer technology and Computer Assisted Language Learning (CALL) programs for current second language learning. According to the National Clearinghouse for English Language Acquisition & Language Instruction Educational Programs' report (2002), more than nine million…

  3. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  4. Performative Computation-aided Design Optimization

    Directory of Open Access Journals (Sweden)

    Ming Tang

    2012-12-01

    Full Text Available This article discusses a collaborative research and teaching project between the University of Cincinnati, Perkins+Will’s Tech Lab, and the University of North Carolina Greensboro. The primary investigation focuses on the simulation, optimization, and generation of architectural designs using performance-based computational design approaches. The projects examine various design methods, including relationships between building form, performance and the use of proprietary software tools for parametric design.

  5. COMPUTER-ASSISTED CONTROL OF ACADEMIC PERFORMANCE IN ENGINEERING GRAPHICS WITHIN THE FRAMEWORK OF DISTANCE LEARNING PROGRAMMES

    Directory of Open Access Journals (Sweden)

    Tel'noy Viktor Ivanovich

    2012-10-01

    Full Text Available The development of computer-assisted technologies and their integration into academic activity for the purpose of controlling academic performance within distance learning programmes is the subject matter of the article. The article is a brief overview of a software programme designed for monitoring the academic performance of students enrolled in distance learning programmes. The software is developed in Delphi 7.0 for the Windows operating system. The strength of the proposed software lies in the availability of two modes of operation that differ in the principle of problem selection and in timing parameters. Interim academic performance assessment is performed through computerized testing procedures that use a database of testing assignments implemented in the eLearning Server environment. Students are identified by means of video cameras installed at their workplaces.

  6. Beyond Computer Literacy: Technology Integration and Curriculum Transformation

    Science.gov (United States)

    Safar, Ammar H.; AlKhezzi, Fahad A.

    2013-01-01

    Personal computers, the Internet, smartphones, and other forms of information and communication technology (ICT) have changed our world, our job, our personal lives, as well as how we manage our knowledge and time effectively and efficiently. Research findings in the past decades have acknowledged and affirmed that the content the ICT medium…

  7. Basic Technology Competencies, Attitude towards Computer Assisted Education and Usage of Technologies in Turkish Lesson: A Correlation

    Science.gov (United States)

    Özdemir, Serpil

    2017-01-01

    The present research was done to determine the basic technology competency of Turkish teachers, their attitude towards computer-assisted education, and their technology use in Turkish lessons, and to determine the relationships between these. 85 Turkish teachers working in public schools in Bartin participated in the research. The…

  8. Real-time Tsunami Inundation Prediction Using High Performance Computers

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2014-12-01

    Recently, off-shore tsunami observation stations based on cabled ocean-bottom pressure gauges have been actively deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines, for disaster mitigation purposes. To realize the benefits of these observations, real-time analysis techniques that make effective use of these data are necessary. A representative study is that of Tsushima et al. (2009), which proposed a method for instant tsunami source prediction based on observed tsunami waveform data. As time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green functions of the linear long wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation, improving the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although solving the non-linear shallow water equations for inundation prediction is computationally demanding, it has become feasible through recent developments in high-performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite difference method with nested staggered grids whose resolutions range from 405 m to 5 m. The resolution ratio of each nested domain was 1/3. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction up to 2 hours after the
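
    The numerical core being parallelized here is a staggered-grid scheme of the leap-frog family for the shallow water equations. A minimal one-dimensional linear sketch of that scheme (toy depth, grid spacing, and initial hump; the operational model is two-dimensional, non-linear, and nested) looks as follows:

        import numpy as np

        # 1-D linear shallow-water model: elevation and velocity live on
        # staggered points and are updated alternately, the pattern the
        # leap-frog finite difference method in the abstract generalizes.
        g, depth = 9.81, 4000.0          # gravity, uniform ocean depth (m)
        nx, dx = 400, 5000.0             # grid points, spacing (m)
        dt = 0.5 * dx / np.sqrt(g * depth)   # CFL-limited time step

        eta = np.exp(-((np.arange(nx) - 200.0) ** 2) / 50.0)  # initial hump
        u = np.zeros(nx + 1)             # staggered velocity, walls at ends

        for _ in range(500):
            # Momentum: u sits between eta points.
            u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
            # Continuity: eta updated from the divergence of the flux.
            eta -= dt * depth * (u[1:] - u[:-1]) / dx

        print("max elevation after 500 steps:", eta.max())

    In the parallel setting, the domain is decomposed across cores and only the halo rows of eta and u must be exchanged each step, which is why the scheme scales to thousands of cores.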

  9. COMPUTERS: Teraflops for Europe; EEC Working Group on High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1991-03-15

    In little more than a decade, simulation on high performance computers has become an essential tool for theoretical physics, capable of solving a vast range of crucial problems inaccessible to conventional analytic mathematics. In many ways, computer simulation has become the calculus for interacting many-body systems, a key to the study of transitions from isolated to collective behaviour.

  10. COMPUTERS: Teraflops for Europe; EEC Working Group on High Performance Computing

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    In little more than a decade, simulation on high performance computers has become an essential tool for theoretical physics, capable of solving a vast range of crucial problems inaccessible to conventional analytic mathematics. In many ways, computer simulation has become the calculus for interacting many-body systems, a key to the study of transitions from isolated to collective behaviour

  11. The Impact of Iranian Teachers Cultural Values on Computer Technology Acceptance

    Science.gov (United States)

    Sadeghi, Karim; Saribagloo, Javad Amani; Aghdam, Samad Hanifepour; Mahmoudi, Hojjat

    2014-01-01

    This study was conducted with the aim of testing the technology acceptance model and the impact of Hofstede cultural values (masculinity/femininity, uncertainty avoidance, individualism/collectivism, and power distance) on computer technology acceptance among teachers at Urmia city (Iran) using the structural equation modeling approach. From among…

  12. Understanding the Critics of Educational Technology: Gender Inequities and Computers 1983-1993.

    Science.gov (United States)

    Mangione, Melissa

    Although many view computers purely as technological tools to be utilized in the classroom and workplace, attention has been drawn to the social differences computers perpetuate, including those of race, class, and gender. This paper focuses on gender and computing by examining recent analyses in regards to content, form, and usage concerns. The…

  13. A Detailed Analysis over Some Important Issues towards Using Computer Technology into the EFL Classrooms

    Science.gov (United States)

    Gilakjani, Abbas Pourhosein

    2014-01-01

    Computer technology has changed the ways we work, learn, interact and spend our leisure time. Computer technology has changed every aspect of our daily life--how and where we get our news, how we order goods and services, and how we communicate. This study investigates some of the significant issues concerning the use of computer technology…

  14. Performance-Based Technology Selection Filter description report

    International Nuclear Information System (INIS)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL)

  15. Performance-Based Technology Selection Filter description report

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  16. Cloud computing and patient engagement: leveraging available technology.

    Science.gov (United States)

    Noblin, Alice; Cortelyou-Ward, Kendall; Servan, Rosa M

    2014-01-01

    Cloud computing technology has the potential to transform medical practices and improve patient engagement and quality of care. However, issues such as privacy and security and "fit" can make incorporation of the cloud an intimidating decision for many physicians. This article summarizes the four most common types of clouds and discusses their ideal uses, how they engage patients, and how they improve the quality of care offered. This technology also can be used to meet Meaningful Use requirements 1 and 2; and, if speculation is correct, the cloud will provide the necessary support needed for Meaningful Use 3 as well.

  17. Application of computer virtual simulation technology in 3D animation production

    Science.gov (United States)

    Mo, Can

    2017-11-01

    With the continuous development of computer technology, the application systems of virtual simulation technology have been further optimized and improved, and virtual simulation is now widely used in various fields of social development, such as city construction, interior design, industrial simulation, and tourism teaching. This paper mainly introduces the virtual simulation technology used in 3D animation. Based on an analysis of the characteristics of virtual simulation technology, the ways and means of applying this technology in 3D animation are researched. The purpose is to provide a reference for the future promotion of 3D effects.

  18. High performance parallel computers for science

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 Mflops (peak), 10 MByte single-board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction.

  19. Development of Integrated Assessment Technology of Risk and Performance

    International Nuclear Information System (INIS)

    Yang, Jun Eon; Kang, Dae Il; Kang, Hyun Gook

    2010-04-01

    The main ideas and contents are summarized below:
    1) Development of a new risk/performance assessment system innovating the old labor-intensive risk assessment structure
       - New consolidated risk assessment technology covering various hazards (flood, fire, seismic) in NPPs
       - BOP model development for performance monitoring
       - Consolidated risk/performance management system for consistency and efficiency of NPPs
    2) Resolution technology for pending issues in PSA
       - Base technology for PSA of digital I and C systems
       - Base technology for seismic PSA reflecting domestic seismic characteristics and aging effects
       - Uncertainty reduction technology for level 2 PSA and best estimation of containment failure frequency
    3) Next-generation risk/performance assessment technology
       - Human-induced error reduction technology for efficient operation of NPPs

  20. High performance statistical computing with parallel R: applications to biology and climate modelling

    International Nuclear Information System (INIS)

    Samatova, Nagiza F; Branstetter, Marcia; Ganguly, Auroop R; Hettich, Robert; Khan, Shiraj; Kora, Guruprasad; Li, Jiangtian; Ma, Xiaosong; Pan, Chongle; Shoshani, Arie; Yoginath, Srikanth

    2006-01-01

    Ultrascale computing and high-throughput experimental technologies have enabled the production of scientific data about complex natural phenomena. With this opportunity comes a new problem: the massive quantities of data so produced. Answers to fundamental questions about the nature of those phenomena remain largely hidden in the produced data. The goal of this work is to provide a scalable, high-performance statistical data analysis framework to help scientists perform interactive analyses of these raw data to extract knowledge. Towards this goal we have been developing an open source parallel statistical analysis package, called Parallel R, that lets scientists employ a wide range of statistical analysis routines on high performance shared and distributed memory architectures without having to deal with the intricacies of parallelizing these routines.
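
    Parallel R exposes parallel counterparts of R's statistical routines; the pattern it exploits can be illustrated independently of that package. The Python sketch below (invented data and worker count, not Parallel R's API) shows the same task-parallel shape for a bootstrap, where each worker evaluates the statistic on one independent resample:

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def bootstrap_mean(args):
            """One bootstrap replicate: resample and return the mean."""
            data, seed = args
            rng = np.random.default_rng(seed)
            return rng.choice(data, size=len(data), replace=True).mean()

        if __name__ == "__main__":
            data = np.random.default_rng(0).normal(loc=5.0, size=10_000)
            tasks = [(data, seed) for seed in range(1000)]
            with ProcessPoolExecutor(max_workers=8) as pool:
                reps = list(pool.map(bootstrap_mean, tasks))
            lo, hi = np.percentile(reps, [2.5, 97.5])
            print(f"95% CI for the mean: ({lo:.3f}, {hi:.3f})")

    Because the replicates are independent, a framework can distribute them across shared- or distributed-memory nodes without the analyst rewriting the statistic, which is the convenience the abstract emphasizes.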

  1. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  2. The Role of Wireless Computing Technology in the Design of Schools.

    Science.gov (United States)

    Nair, Prakash

    This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classroom. It then explains the features of a wireless local area network (WLAN) and…

  3. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment planning, which requires Monte Carlo dose calculations on all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant

  4. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)

    2015-06-15

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment planning, which requires Monte Carlo dose calculations on all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
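
    The diminishing return both versions of this record describe is the classic fixed-overhead speedup curve. The toy cost model below (all constants invented, not fitted to the paper's measurements) reproduces the qualitative behavior, with an optimum in roughly the 5-15 node range:

        import numpy as np

        # Toy cost model for a cloud Monte Carlo run on n nodes:
        # serial setup + parallelizable simulation + per-node overhead
        # (VM startup, data staging to and from the cloud).
        serial, parallel, overhead = 5.0, 600.0, 4.0   # minutes (invented)

        nodes = np.arange(1, 31)
        time = serial + parallel / nodes + overhead * nodes

        print("optimal node count for this model:", nodes[np.argmin(time)])
        for n in (1, 5, 10, 15, 30):
            print(f"{n:2d} nodes -> {time[n - 1]:6.1f} min")

    Once the per-node overhead term grows faster than the simulation term shrinks, adding nodes lengthens the run, which matches the study's recommendation to stay between 5 and 15 nodes.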

  5. Quantum Accelerators for High-Performance Computing Systems

    OpenAIRE

    Britt, Keith A.; Mohiyaddin, Fahd A.; Humble, Travis S.

    2017-01-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantu...

  6. The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC

    Science.gov (United States)

    Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan

    2016-04-01

    The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.

  7. Designing Pervasive Computing Technology - In a Nomadic Work Perspective

    DEFF Research Database (Denmark)

    Kristensen, Jannie Friis

    2002-01-01

    In my thesis work I am investigating how the design of pervasive/ubiquitous computing technology, relate to the flexible and individual work practice of nomadic workers. Through empirical studies and with an experimental systems development approach, the work is focused on: a) Supporting...

  8. In the Clouds: The Implications of Cloud Computing for Higher Education Information Technology Governance and Decision Making

    Science.gov (United States)

    Dulaney, Malik H.

    2013-01-01

    Emerging technologies challenge the management of information technology in organizations. Paradigm changing technologies, such as cloud computing, have the ability to reverse the norms in organizational management, decision making, and information technology governance. This study explores the effects of cloud computing on information technology…

  9. Learning Style and Task Performance in Synchronous Computer-Mediated Communication: A Case Study of Iranian EFL Learners

    Science.gov (United States)

    Hedayati, Mohsen; Foomani, Elham Mohammadi

    2015-01-01

    The study reported here explores whether English as a foreign Language (EFL) learners' preferred ways of learning (i.e., learning styles) affect their task performance in computer-mediated communication (CMC). As Ellis (2010) points out, while the increasing use of different sorts of technology is witnessed in language learning contexts, it is…

  10. A Model for the Acceptance of Cloud Computing Technology Using DEMATEL Technique and System Dynamics Approach

    Directory of Open Access Journals (Sweden)

    seyyed mohammad zargar

    2018-03-01

    Full Text Available Cloud computing is a new method to provide computing resources and increase computing power in organizations. Despite its many benefits, this method has not been universally adopted because of obstacles, including security issues, that have become a concern for IT managers in organizations. In this paper, the general definition of cloud computing is presented. In addition, having reviewed previous studies, the researchers identified the variables affecting technology acceptance and, especially, cloud computing technology acceptance. Then, using the DEMATEL technique, the influence and permeability of the variables were determined. The researchers also designed a model to show the dynamics of cloud computing technology adoption using the system dynamics approach. The validity of the model was confirmed through evaluation methods for dynamic models using the VENSIM software. Finally, based on different conditions of the proposed model, a variety of scenarios were designed and their implementation was simulated within the proposed model. The results showed that any increase in data security, government support, and user training can increase the adoption and use of cloud computing technology.
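
    The DEMATEL step mentioned above has a compact matrix form: a direct-influence matrix is normalized and expanded into a total-relation matrix that accumulates indirect influence paths. The sketch below uses an invented 4-factor matrix purely for illustration, not the paper's survey data:

        import numpy as np

        # Hypothetical direct-influence scores (0-4) among four factors:
        # security, government support, user training, perceived benefit.
        A = np.array([[0, 3, 2, 4],
                      [2, 0, 3, 3],
                      [1, 1, 0, 3],
                      [1, 0, 1, 0]], dtype=float)

        # Normalize by the largest row sum, then total relation matrix
        # T = N (I - N)^-1, which sums all indirect influence paths.
        N = A / A.sum(axis=1).max()
        T = N @ np.linalg.inv(np.eye(len(A)) - N)

        D = T.sum(axis=1)      # influence each factor exerts
        R = T.sum(axis=0)      # influence each factor receives
        print("prominence D+R:", (D + R).round(2))  # centrality of factor
        print("relation   D-R:", (D - R).round(2))  # cause (+) vs effect (-)

    Factors with positive D-R act as causes and are natural intervention points, which is how results such as "increase data security and user training" are typically read off a DEMATEL analysis.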

  11. Performance Aspects of Synthesizable Computing Systems

    DEFF Research Database (Denmark)

    Schleuniger, Pascal

    Embedded systems are used in a broad range of applications that demand high performance within severely constrained mechanical, power, and cost requirements. Embedded systems implemented in ASIC technology tend to provide the highest performance, lowest power consumption and lowest unit cost. How...

  12. The Technology Fix: The Promise and Reality of Computers in Our Schools

    Science.gov (United States)

    Pflaum, William D.

    2004-01-01

    During the technology boom of the 1980s and 1990s, computers seemed set to revolutionize education. Do any of these promises sound familiar? (1) Technology would help all students learn better, thanks to multimedia programs capable of adapting to individual needs, learning styles, and skill levels; (2) Technology would transform the teacher's role…

  13. Effect of Computer-Assisted Learning on Students' Dental Anatomy Waxing Performance.

    Science.gov (United States)

    Kwon, So Ran; Hernández, Marcela; Blanchette, Derek R; Lam, Matthew T; Gratton, David G; Aquilino, Steven A

    2015-09-01

    The aim of this study was to evaluate the impact of computer-assisted learning on first-year dental students' waxing abilities and self-evaluation skills. Additionally, this study sought to determine how well digital evaluation software performed compared to faculty grading with respect to students' technical scores on a practical competency examination. First-year students at one U.S. dental school were assigned to one of three groups: control (n=40), E4D Compare (n=20), and Sirona prepCheck (n=19). Students in the control group were taught by traditional teaching methodologies, and the technology-assisted groups received both traditional training and supplementary feedback from the corresponding digital system. Five outcomes were measured: visual assessment score, self-evaluation score, and digital assessment scores at 0.25 mm, 0.30 mm, and 0.35 mm tolerance. The scores from visual assessment and self-evaluation were examined for differences among groups using the Kruskal-Wallis test. Correlation between the visual assessment and digital scores was measured using Pearson and Spearman rank correlation coefficients. At completion of the course, students were asked to complete a survey on the use of these digital technologies. All 79 students in the first-year class participated in the study, for a 100% response rate. The results showed that the visual assessment and self-evaluation scores did not differ among groups (p>0.05). Overall correlations between visual and digital assessment scores were modest though statistically significant (5% level of significance). Analysis of survey responses completed by students in the technology groups showed that profiles for the two groups were similar and not favorable towards digital technology. The study concluded that technology-assisted training did not affect these students' waxing performance or self-evaluation skills and that visual scores given by faculty and digital assessment scores correlated moderately.
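
    The visual-versus-digital comparison reported here rests on two standard correlation measures. A minimal sketch of how such a check is typically computed (the paired scores below are invented, not the study's data):

        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        # Invented paired scores for 19 students: faculty visual grade and
        # a digital assessment score at one tolerance level.
        rng = np.random.default_rng(1)
        visual = rng.uniform(60, 95, size=19)
        digital = 0.6 * visual + rng.normal(scale=8.0, size=19)

        r, p_r = pearsonr(visual, digital)        # linear association
        rho, p_rho = spearmanr(visual, digital)   # rank association
        print(f"Pearson r = {r:.2f} (p = {p_r:.3f}), "
              f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")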

  14. Preparing for Further Introduction of Computing Technology in Vancouver Community College Instruction. Report of the Instructional Computing Committee.

    Science.gov (United States)

    Vancouver Community Coll., British Columbia.

    After examining the impact of changing technology on postsecondary instruction and on the tools needed for instruction, this report analyzes the status and offers recommendations concerning the future of instructional computing at Vancouver Community College (VCC) in British Columbia. Section I focuses on the use of computers in community college…

  15. Accessible high performance computing solutions for near real-time image processing for time critical applications

    Science.gov (United States)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming common place and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: 1. critical information can be provided faster and 2. more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs and (2) a CUDA enabled GPU workstation. The reference platform is a dual CPU-quad core workstation and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort is presented.
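
    The GLCM statistics underlying PANTEX are straightforward to compute. The sketch below builds a grey-level co-occurrence matrix for one displacement and returns its contrast; details here are illustrative, not the operational implementation. As described by the authors, the full index aggregates such anisotropic measures over several displacements for rotation invariance (taking the minimum over directions, with high values flagging built-up texture):

        import numpy as np

        def glcm_contrast(patch, dx=1, dy=0, levels=8):
            """GLCM contrast for one non-negative displacement (dx, dy).
            Assumes non-negative image intensities."""
            # Quantize the patch to a small number of grey levels.
            q = np.clip((patch * levels / (patch.max() + 1e-9)).astype(int),
                        0, levels - 1)
            a = q[: q.shape[0] - dy, : q.shape[1] - dx]  # reference pixels
            b = q[dy:, dx:]                              # displaced pixels
            glcm = np.zeros((levels, levels))
            np.add.at(glcm, (a.ravel(), b.ravel()), 1)   # co-occurrences
            glcm /= glcm.sum()
            i, j = np.indices(glcm.shape)
            return float(((i - j) ** 2 * glcm).sum())    # GLCM contrast

    Applied in a moving window over a large scene, this kernel is embarrassingly parallel across windows, which is why it maps well to both cluster blades and GPGPU hardware in the timing comparison above.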

  16. Extending the horizons advances in computing, optimization, and decision technologies

    CERN Document Server

    Joseph, Anito; Mehrotra, Anuj; Trick, Michael

    2007-01-01

    Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.

  17. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    Science.gov (United States)

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  18. Leading survey and research report for fiscal 1999. Survey and research on supercompiler technology; 1999 nendo supercompiler technology no chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Survey and research were conducted on global computing technology and on the compiler technology and programming-environment technology for next-generation parallel computers, in preparation of the basic key technologies needed to realize next-generation high-performance computing; efforts were exerted to extract and define technological problems and to deliberate a research system to achieve the goal. This fiscal year's achievements are mentioned below. Two territories were covered, respectively, by a Parallel Compiler Working Group and a Global Computing Working Group, whose activities centered on overseas surveys and the short-term reception of researchers from abroad. The Parallel Compiler Working Group was engaged in (1) a technological survey of the latest parallel compiler technology and, in its effort to execute research under the project, (2) materializing the contents of technology research and development and (3) materializing a technology research and development system. The Global Computing Working Group was engaged in (1) a technological survey of the latest high-performance global computing and (2) a survey of fields able to accept global computing applications. (NEDO)

  19. Leading survey and research report for fiscal 1999. Survey and research on supercompiler technology; 1999 nendo supercompiler technology no chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Survey and research were conducted on global computing technology and on the compiler technology and programming-environment technology for next-generation parallel computers, in preparation of the basic key technologies needed to realize next-generation high-performance computing; efforts were exerted to extract and define technological problems and to deliberate a research system to achieve the goal. This fiscal year's achievements are mentioned below. Two territories were covered, respectively, by a Parallel Compiler Working Group and a Global Computing Working Group, whose activities centered on overseas surveys and the short-term reception of researchers from abroad. The Parallel Compiler Working Group was engaged in (1) a technological survey of the latest parallel compiler technology and, in its effort to execute research under the project, (2) materializing the contents of technology research and development and (3) materializing a technology research and development system. The Global Computing Working Group was engaged in (1) a technological survey of the latest high-performance global computing and (2) a survey of fields able to accept global computing applications. (NEDO)

  20. CPE--A New Perspective: The Impact of the Technology Revolution. Proceedings of the Computer Performance Evaluation Users Group Meeting (19th, San Francisco, California, October 25-28, 1983). Final Report. Reports on Computer Science and Technology.

    Science.gov (United States)

    Mobray, Deborah, Ed.

    Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…

  1. Technology Evaluation Report 17. Videoconferencing in Theatre and Performance Studies

    Directory of Open Access Journals (Sweden)

    Mark Childs

    2003-04-01

    Previous reports in this series have indicated the growing acceptance of videoconferencing in education delivery. The current report compares a series of videoconferencing methods in an activity requiring precision of expression and communication: theatre and performance studies. The Accessing and Networking with National and International Expertise (ANNIE) project is a two-year project undertaken jointly by the University of Warwick and the University of Kent at Canterbury, running from March 2001 to March 2003. The project's aim is to enhance students' learning experience in theatre studies by enabling access to research-based teaching and to workshops led by practitioners of national and international standing. Various technologies have been used, particularly ISDN videoconferencing, computer-mediated conferencing, and the Internet. This report concludes that videoconferencing methods will gain acceptance in education as academic schools become able to operate commonly available technology without the assistance of specialised service units.

  2. Second language writing anxiety, computer anxiety, and performance in a classroom versus a web-based environment

    Directory of Open Access Journals (Sweden)

    Effie Dracopoulos

    2011-04-01

    This study examined the impact of writing anxiety and computer anxiety on language learning for 45 ESL adult learners enrolled in an English grammar and writing course. Two sections of the course were offered in a traditional classroom setting, whereas two others were given in a hybrid form that involved distance learning. Contrary to previous research, writing anxiety showed no correlation with learning performance, whereas computer anxiety yielded a positive correlation with performance only in the case of classroom learners. There were no significant differences across learning environments on any measure. These observations are discussed in light of the role computer technologies now play in our society, as well as the merging of socio-demographic profiles between classroom and distance learners. Our data suggest that comparisons of profiles between classroom and distance learners may no longer be an issue worth investigating in language studies, at least in developed countries.

  3. Efficacy of computer technology-based HIV prevention interventions: a meta-analysis.

    Science.gov (United States)

    Noar, Seth M; Black, Hulda G; Pierce, Larson B

    2009-01-02

    To conduct a meta-analysis of computer technology-based HIV prevention behavioral interventions aimed at increasing condom use among a variety of at-risk populations. Systematic review and meta-analysis of existing published and unpublished studies testing computer-based interventions. Meta-analytic techniques were used to compute and aggregate effect sizes for 12 randomized controlled trials that met inclusion criteria. Variables that had the potential to moderate intervention efficacy were also tested. The overall mean weighted effect size for condom use was d = 0.259 (95% confidence interval = 0.201, 0.317; Z = 8.74, P < .001); effects were also observed for outcomes including numbers of sexual partners and incident sexually transmitted diseases. In addition, interventions were significantly more efficacious when they were directed at men or women (versus mixed-sex groups), utilized individualized tailoring, used a Stages of Change model, and had more intervention sessions. Computer technology-based HIV prevention interventions have similar efficacy to more traditional human-delivered interventions. Given their low cost to deliver, ability to customize intervention content, and flexible dissemination channels, they hold much promise for the future of HIV prevention.
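
    To make the aggregation step concrete, the sketch below computes a fixed-effect (inverse-variance weighted) mean effect size with its 95% confidence interval and Z statistic; the study values are invented for illustration and are not the data behind the d = 0.259 result above.

        # Fixed-effect meta-analysis of standardized mean differences
        # (Cohen's d). Study inputs are hypothetical, not Noar et al.'s data.
        import math

        studies = [(0.31, 0.012), (0.18, 0.020), (0.27, 0.008)]  # (d, variance of d)

        weights = [1.0 / var for _, var in studies]         # inverse-variance weights
        d_bar = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))                  # SE of the pooled effect

        z = d_bar / se                                      # test of pooled d != 0
        ci = (d_bar - 1.96 * se, d_bar + 1.96 * se)         # 95% confidence interval
        print(f"d = {d_bar:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), Z = {z:.2f}")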

  4. High-Performance Java Codes for Computational Fluid Dynamics

    Science.gov (United States)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale, computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  5. Applications of computational intelligence in biomedical technology

    CERN Document Server

    Majernik, Jaroslav; Pancerz, Krzysztof; Zaitseva, Elena

    2016-01-01

    This book presents the latest results and selected applications of Computational Intelligence in Biomedical Technologies. Most of the contributions deal with problems of biomedical and medical informatics, ranging from theoretical considerations to practical applications. Various aspects of development methods and algorithms in biomedical and medical informatics are discussed, as well as algorithms for medical image processing and modeling methods. Individual contributions also cover medical decision-making support, estimation of treatment risks, reliability of medical systems, problems of practical clinical applications, and many other topics. This book is intended for scientists interested in problems of biomedical technologies, for researchers and academic staff, for all those dealing with biomedical and medical informatics, and for PhD students. Useful information is also offered to IT companies and to developers of equipment and/or software for medicine and medical professionals.

  6. Feasibility of school-based computer-assisted robotic gaming technology for upper limb rehabilitation of children with cerebral palsy.

    Science.gov (United States)

    Preston, Nick; Weightman, Andrew; Gallagher, Justin; Holt, Raymond; Clarke, Michael; Mon-Williams, Mark; Levesley, Martin; Bhakta, Bipinchandra

    2016-01-01

    We investigated the feasibility of using computer-assisted arm rehabilitation (CAAR) computer games in schools. Outcomes were children's preference for single-player or dual-player mode, and changes in arm activity and kinematics. Nine boys and two girls with cerebral palsy (6-12 years, mean 9 years) played assistive technology computer games in single-user mode or with school friends in an AB-BA design. Preference was determined by recording the time spent playing each mode and by qualitative feedback. We used the ABILHAND-kids and the Canadian Occupational Performance Measure (COPM) to evaluate activity limitation, and a portable laptop-based device to capture arm kinematics. No difference was recorded between single-user and dual-user modes (median daily use 9.27 versus 11.2 min, p = 0.214), though children reported that dual-user mode was preferable. There were no changes in activity limitation (ABILHAND-kids, p = 0.424; COPM, p = 0.484), but we found significant improvements in hand speed (p = 0.028), smoothness (p = 0.005) and accuracy (p = 0.007). School timetables prohibit extensive use of rehabilitation technology, but there is potential for its short-term use to supplement a rehabilitation program. The restricted access to the rehabilitation games was sufficient to improve arm kinematics but not arm activity. Implications for Rehabilitation: School premises and teaching staff present no obstacles to the installation of rehabilitation gaming technology. Twelve minutes per day is the average amount of time that the school timetable permits children to use rehabilitation gaming equipment without disruption to academic attendance. The use of rehabilitation gaming technology for an average of 12 minutes daily does not appear to benefit children's functional performance, but there are improvements in the kinematics of children's upper limb.

  7. Establishment of computer aided technology for operation, maintenance, and core management

    International Nuclear Information System (INIS)

    Iguchi, Masaki; Isomura, Kazutoshi; Okawa, Tsuyoshi; Sakurai, Naoto

    2003-01-01

    In Fugen, the accumulated know-how of skilled operators, maintenance engineers, and core management engineers has been systematized using the latest computer technology. These computerized systems have enhanced operation, maintenance, and core management technology. This report describes the development of a reactor feedwater control system with fuzzy logic, a refueling support system, and an automatic refueling planning system. Since reactor feedwater control at low power requires a delicate operational technique and the knowledge and experience of operators, the application of a fuzzy algorithm was deemed effective in Fugen, and performance comparable to that of experienced operators has been realized. Fuel handling requires planning, fuel management, and efficient operation by skilled operators, so AI technology was applied to the fuel-handling support system using past operation results and the experience of skilled operators; this system is as capable at fuel handling as skilled operators. Planning an adequate fuel loading pattern is time-consuming even for expert core management engineers. The Automatic Refueling Planning System (ARPS) was therefore developed using genetic algorithms (GA) and simulated annealing (SA). It has been verified that the long-term fuel loading patterns of the Fugen NPS produced by ARPS are equivalent to those of an expert core management engineer. (author)
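
    The abstract names the search techniques (GA and SA) but not their details; below is a minimal simulated-annealing sketch that searches over a loading pattern represented as a permutation of assemblies, with a made-up cost function standing in for the core-physics evaluation an ARPS-style tool would perform.

        # Minimal simulated annealing over a fuel loading pattern, modeled as
        # a permutation of assemblies over core positions. cost() is a toy
        # placeholder; a real planner would score power peaking, burnup, etc.
        import math
        import random

        def cost(pattern):
            # Invented objective: penalize high-index (fresh) assemblies
            # placed near the core center (low position indices).
            return sum(a / (pos + 1) for pos, a in enumerate(pattern))

        def anneal(n=20, steps=5000, t0=1.0, alpha=0.999):
            pattern = list(range(n))
            random.shuffle(pattern)
            cur = cost(pattern)
            best, best_cost, temp = pattern[:], cur, t0
            for _ in range(steps):
                i, j = random.sample(range(n), 2)
                pattern[i], pattern[j] = pattern[j], pattern[i]      # propose a swap
                new = cost(pattern)
                if new < cur or random.random() < math.exp((cur - new) / temp):
                    cur = new                                        # accept the move
                    if new < best_cost:
                        best, best_cost = pattern[:], new
                else:
                    pattern[i], pattern[j] = pattern[j], pattern[i]  # revert the swap
                temp *= alpha                                        # cool down
            return best, best_cost

        print(anneal())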

  8. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology.

    Science.gov (United States)

    Zao, John K; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time prediction of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system.
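
    The tiered data path described (sensor device, near-end fog server, far-end cloud server) can be caricatured in a single process; every component below is an invented stand-in, with the fog tier reducing bandwidth before the cloud tier classifies.

        # Toy single-process simulation of the three-tier data path above:
        # sensor -> fog (local feature extraction) -> cloud (model scoring).
        import random
        from statistics import mean

        def sensor_samples(n=256):
            """Front-end device: raw EEG-like samples (synthetic noise here)."""
            return [random.gauss(0.0, 1.0) for _ in range(n)]

        def fog_node(samples, window=32):
            """Near-end server: cut bandwidth by windowed averaging."""
            return [mean(samples[i:i + window]) for i in range(0, len(samples), window)]

        def cloud_classifier(features, threshold=0.1):
            """Far-end server: trivial stand-in for a cognitive-state model."""
            return "alert" if mean(abs(f) for f in features) > threshold else "calm"

        features = fog_node(sensor_samples())
        print(cloud_classifier(features))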

  9. Pervasive Brain Monitoring and Data Sharing based on Multi-tier Distributed Computing and Linked Data Technology

    Directory of Open Access Journals (Sweden)

    John Kar-Kin Zao

    2014-06-01

    EEG-based brain-computer interfaces (BCI) are facing grand challenges in their real-world applications. The technical difficulties in developing truly wearable multi-modal BCI systems that are capable of making reliable real-time prediction of users’ cognitive states under dynamic real-life situations may appear at times almost insurmountable. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies offered promising ways to bridge these chasms. In this paper, we report our attempt to develop a pervasive on-line BCI system by employing state-of-the-art technologies such as multi-tier fog and cloud computing, semantic Linked Data search and adaptive prediction/classification models. To verify our approach, we implement a pilot system using wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end fog servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end cloud servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch and the UCSD Movement Disorder Center to use our system in real-life personal stress and in-home Parkinson’s disease patient monitoring experiments. We shall proceed to develop a necessary BCI ontology and add automatic semantic annotation and progressive model refinement capability to our system.

  10. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology

    Science.gov (United States)

    Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time prediction of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system. PMID:24917804

  11. Computer-Assisted Language Learning : proceedings of the seventh Twente Workshop on Language Technology

    NARCIS (Netherlands)

    Appelo, L.; de Jong, Franciska M.G.

    1994-01-01

    TWLT is an acronym of Twente Workshop(s) on Language Technology. These workshops on natural language theory and technology are organised by Project Parlevink (sometimes with the help of others), a language theory and technology project conducted at the Department of Computer Science of the University of Twente.

  12. Computational Structures Technology for Airframes and Propulsion Systems

    International Nuclear Information System (INIS)

    Noor, A.K.; Housner, J.M.; Starnes, J.H. Jr.; Hopkins, D.A.; Chamis, C.C.

    1992-05-01

    This conference publication contains the presentations and discussions from the joint University of Virginia (UVA)/NASA workshops. The presentations included NASA Headquarters perspectives on High Speed Civil Transport (HSCT), goals and objectives of the UVA Center for Computational Structures Technology (CST), NASA and Air Force CST activities, CST activities for airframes and propulsion systems in industry, and CST activities at Sandia National Laboratories.

  13. Designing and Implementing Performance Technology for Teachers

    Directory of Open Access Journals (Sweden)

    Joi L. Moore

    2004-06-01

    This paper synthesizes research findings from a performance analysis of teacher tasks and the implementation of performance technology. These findings are aligned with design and implementation theories to provide an understanding of the complex factors and events that occur during the implementation process. The article describes the elements and conditions necessary for designing and implementing performance tools in school environments that encourage usage, efficient performance, and positive attitudes. Two models provide a visual representation of the causal relationships between the implementation factors and the technology user. Although the implementation process can become complex because of simultaneous events and phases, it can be properly managed through good communication and the strategic involvement of teachers during the design and development process. The models may assist technology designers and advocates in presenting innovations to teachers, who are frequently asked to try technical solutions for performance support or improvement.

  14. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the aerospace experiment requirements of real-time operation, reliability, and safety, a single-center cloud computing technology application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on an analysis of the test results, a preliminary conclusion is drawn: cloud computing platforms can support compute-intensive aerospace experiment workloads, whereas I/O-intensive workloads are better served by traditional physical machines.

  15. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel; ER is also widely used in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons, so the initial effort toward ER process optimization is made using computer models. A number of models have been developed for this purpose, but because their validation is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain. In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, the ERAD code. Several candidate surrogate materials were selected for the experiment in consideration of their chemical and physical properties.

  16. Grid computing and collaboration technology in support of fusion energy sciences

    International Nuclear Information System (INIS)

    Schissel, D.P.

    2005-01-01

    Science research in general, and magnetic fusion research in particular, continues to grow in size and complexity, resulting in a concurrent growth in collaborations between experimental sites and laboratories worldwide. The simultaneous increase in wide-area network speeds has made it practical to envision distributed working environments that are as productive as traditionally collocated work. In computing power, it has become reasonable to decouple production from consumption, making it possible to construct computing grids in a manner similar to the electrical power grid. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. For human interaction, advanced collaborative environments are being researched and deployed so that distributed group work can be as productive as traditional meetings. The DOE Scientific Discovery through Advanced Computing (SciDAC) initiative has sponsored several collaboratory projects, including the National Fusion Collaboratory Project, to apply recent advances in grid computing and advanced collaborative environments to research in several specific scientific domains. For fusion, the collaborative technology being deployed is used in present-day research and is also scalable to future research, in particular to the International Thermonuclear Experimental Reactor (ITER) experiment, which will require extensive collaboration capability worldwide. This paper briefly reviews the concepts of grid computing and advanced collaborative environments and gives specific examples of how these technologies are being used in fusion research today.

  17. Condition Monitoring Through Advanced Sensor and Computational Technology

    International Nuclear Information System (INIS)

    Kim, Jung Taek; Park, Won Man; Kim, Jung Soo; Seong, Soeng Hwan; Hur, Sub; Cho, Jae Hwan; Jung, Hyung Gue

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis of the processed sensor data indicates that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  18. Green Cloud Computing: A Literature Survey

    Directory of Open Access Journals (Sweden)

    Laura-Diana Radu

    2017-11-01

    Cloud computing is a dynamic field of information and communication technologies (ICTs), introducing new challenges for environmental protection. Cloud computing technologies have a variety of application domains, since they offer scalability, are reliable and trustworthy, and offer high performance at relatively low cost. The cloud computing revolution is redesigning modern networking, and offering promising environmental protection prospects as well as economic and technological advantages. These technologies have the potential to improve energy efficiency and to reduce carbon footprints and electronic waste (e-waste). These features can transform cloud computing into green cloud computing. In this survey, we review the main achievements of green cloud computing. First, an overview of cloud computing is given. Then, recent studies and developments are summarized, and environmental issues are specifically addressed. Finally, future research directions and open problems regarding green cloud computing are presented. This survey is intended to serve as up-to-date guidance for research with respect to green cloud computing.

  19. [Research and application of computer-aided technology in restoration of maxillary defect].

    Science.gov (United States)

    Cheng, Xiaosheng; Liao, Wenhe; Hu, Qingang; Wang, Qian; Dai, Ning

    2008-08-01

    This paper presents a new method of designing a restoration model for a maxillectomy defect using computer-aided technology. First, a 3D triangle-mesh model of the maxillectomy is constructed from helical CT data. Second, the triangle-mesh model is transformed into an initial computer-aided design (CAD) model of the maxillectomy using reverse-engineering software. Third, the 3D virtual restoration model of the maxillary defect is obtained by designing and adjusting the initial CAD model in CAD software according to the patient's actual condition, so that the 3D virtual restoration fits the defective part of the maxilla very well. The exported design data can be manufactured using rapid prototyping and foundry technology. Finally, the results showed that this method is effective and feasible.

  20. Embedded High Performance Scalable Computing Systems

    National Research Council Canada - National Science Library

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  1. An Exploratory Study of the Implementation of Computer Technology in an American Islamic Private School

    Science.gov (United States)

    Saleem, Mohammed M.

    2009-01-01

    This exploratory study of the implementation of computer technology in an American Islamic private school leveraged the case study methodology and ethnographic methods informed by symbolic interactionism and the framework of the Muslim Diaspora. The study focused on describing the implementation of computer technology and identifying the…

  2. Quantum Computers: A New Paradigm in Information Technology

    OpenAIRE

    Mahesh S. Raisinghani

    2001-01-01

    The word 'quantum' comes from the Latin quantus, meaning 'how much'. Quantum computing is a fundamentally new mode of information processing that can be performed only by harnessing physical phenomena unique to quantum mechanics (especially quantum interference). Paul Benioff of Argonne National Laboratory first applied quantum theory to computers in 1981, and David Deutsch of Oxford proposed quantum parallel computers in 1985, years before the realization of qubits in 1995. However, i...
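
    Quantum interference, singled out above as the phenomenon that powers quantum computing, can be demonstrated numerically: one Hadamard gate puts a qubit into an equal superposition, and a second recombines the amplitudes so they cancel back to the starting state. A small state-vector sketch:

        # State-vector illustration of quantum interference: H|0> yields an
        # equal superposition; applying H again cancels amplitudes back to |0>.
        import numpy as np

        H = np.array([[1, 1],
                      [1, -1]]) / np.sqrt(2)   # Hadamard gate

        ket0 = np.array([1.0, 0.0])            # basis state |0>

        superposed = H @ ket0                  # amplitudes (1/sqrt(2), 1/sqrt(2))
        probs = np.abs(superposed) ** 2        # 50/50 measurement outcomes
        recombined = H @ superposed            # interference restores |0>

        print(probs)        # [0.5 0.5]
        print(recombined)   # [1. 0.] up to rounding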

  3. Computational intelligence for technology enhanced learning

    Energy Technology Data Exchange (ETDEWEB)

    Xhafa, Fatos [Polytechnic Univ. of Catalonia, Barcelona (Spain). Dept. of Languages and Informatics Systems; Caballe, Santi; Daradoumis, Thanasis [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Computer Sciences Multimedia and Telecommunications; Abraham, Ajith [Machine Intelligence Research Labs (MIR Labs), Auburn, WA (United States). Scientific Network for Innovation and Research Excellence; Juan Perez, Angel Alejandro (eds.) [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Information Sciences

    2010-07-01

    E-Learning has become one of the most widespread ways of distance teaching and learning. Technologies such as Web, Grid, and mobile and wireless networks are pushing teaching and learning communities to find new and intelligent ways of using these technologies to enhance teaching and learning activities. Indeed, these new technologies can play an important role in increasing support for teachers and learners and in shortening the time needed for learning and teaching; yet intelligent techniques are necessary to take advantage of these new technologies, achieve the desired support for teachers and learners, and enhance learners' performance in distributed learning environments. The chapters of this volume present advances in the use of intelligent techniques for technology-enhanced learning, as well as the development of e-Learning applications based on such techniques and supported by technology. Such intelligent techniques include clustering and classification for personalization of learning, intelligent context-aware techniques, adaptive learning, data mining techniques, and ontologies in e-Learning systems, among others. Academics, scientists, software developers, teachers, tutors, and students interested in e-Learning will find this book useful for their academic, research, and practice activity. (orig.)

  4. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    Science.gov (United States)

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
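
    A back-of-the-envelope version of the cost comparison runs as follows; every price, runtime, and utilization figure below is assumed for illustration rather than taken from the paper.

        # Invented numbers: amortized cost of an owned workstation versus
        # on-demand cloud instances for a batch of single-point energy jobs.
        jobs = 500                      # single-point energies to run
        hours_per_job = 2.0

        cloud_rate = 0.10               # $/instance-hour (assumed)
        cloud_cost = jobs * hours_per_job * cloud_rate

        machine_price = 6000.0          # workstation purchase price (assumed)
        lifetime_hours = 3 * 365 * 24   # 3-year service life
        utilization = 0.5               # fraction of life spent computing
        local_rate = machine_price / (lifetime_hours * utilization)
        local_cost = jobs * hours_per_job * local_rate

        print(f"cloud: ${cloud_cost:.0f}  local: ${local_cost:.0f}")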

  5. 12th International Conference on Computing and Information Technology

    CERN Document Server

    Boonkrong, Sirapat; Unger, Herwig

    2016-01-01

    This proceedings book presents recent research work and results in the area of communication and information technologies. The chapters of this book contain the main, well-selected and reviewed contributions of scientists who met at the 12th International Conference on Computing and Information Technology (IC2IT), held during 7th-8th July 2016 in Khon Kaen, Thailand. The book is divided into three parts: “User Centric Data Mining and Text Processing”, “Data Mining Algorithms and their Applications” and “Optimization of Complex Networks”.

  6. Heterogeneous Gpu&Cpu Cluster For High Performance Computing In Cryptography

    Directory of Open Access Journals (Sweden)

    Michał Marks

    2012-01-01

    This paper addresses issues associated with distributed computing systems and the application of mixed GPU&CPU technology to data encryption and decryption algorithms. We describe a heterogeneous cluster HGCC formed by two types of nodes: Intel processor with NVIDIA graphics processing unit and AMD processor with AMD graphics processing unit (formerly ATI), and a novel software framework that hides the heterogeneity of our cluster and provides tools for solving complex scientific and engineering problems. Finally, we present the results of numerical experiments. The considered case study is concerned with parallel implementations of selected cryptanalysis algorithms. The main goal of the paper is to show the wide applicability of the GPU&CPU technology to large-scale computation and data processing.
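
    As a flavor of the embarrassingly parallel cryptanalysis workloads described, the sketch below partitions a brute-force preimage search across CPU worker processes; the hashed 4-digit PIN is invented, and a GPU version would run the same key-subrange search as device kernels.

        # Brute-force search for a hashed 4-digit PIN, partitioned across CPU
        # worker processes. A toy stand-in for the cluster workloads above.
        import hashlib
        from multiprocessing import Pool

        TARGET = hashlib.sha256(b"7351").hexdigest()   # invented secret

        def search(block):
            start, stop = block
            for pin in range(start, stop):             # scan this key subrange
                candidate = f"{pin:04d}".encode()
                if hashlib.sha256(candidate).hexdigest() == TARGET:
                    return candidate.decode()
            return None

        if __name__ == "__main__":
            blocks = [(i, i + 2500) for i in range(0, 10000, 2500)]
            with Pool(4) as pool:                      # one block per worker
                hits = [h for h in pool.map(search, blocks) if h]
            print(hits)   # ['7351']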

  7. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others]

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  8. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    National Research Council Canada - National Science Library

    Edge, Harris

    1999-01-01

    ...), computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  9. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Science.gov (United States)

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  10. Enabling high performance computational science through combinatorial algorithms

    International Nuclear Information System (INIS)

    Boman, Erik G; Bozdag, Doruk; Catalyurek, Umit V; Devine, Karen D; Gebremedhin, Assefaw H; Hovland, Paul D; Pothen, Alex; Strout, Michelle Mills

    2007-01-01

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation
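
    Graph coloring, one of the combinatorial kernels named above, assigns colors so that adjacent vertices differ; the sketch below is the basic sequential greedy scheme, not CSCAPES' parallel implementation, which additionally colors blocks speculatively and repairs conflicts.

        # Greedy graph coloring: each vertex takes the smallest color not
        # already used by a colored neighbor. Parallel variants color
        # independent blocks speculatively and then resolve conflicts.
        def greedy_coloring(adj):
            colors = {}
            for v in adj:                               # visit order affects quality
                taken = {colors[u] for u in adj[v] if u in colors}
                c = 0
                while c in taken:
                    c += 1
                colors[v] = c
            return colors

        # 4-cycle with a chord: needs 3 colors.
        graph = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
        print(greedy_coloring(graph))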

  11. Enabling high performance computational science through combinatorial algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Erik G [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Bozdag, Doruk [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Catalyurek, Umit V [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Devine, Karen D [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Gebremedhin, Assefaw H [Computer Science and Center for Computational Science, Old Dominion University (United States); Hovland, Paul D [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Pothen, Alex [Computer Science and Center for Computational Science, Old Dominion University (United States); Strout, Michelle Mills [Computer Science, Colorado State University (United States)

    2007-07-15

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation.

  12. Structured Innovation of High-Performance Wave Energy Converter Technology: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Jochem W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Laird, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2018-01-25

    Wave energy converter (WEC) technology development has not yet delivered the desired commercial maturity nor, more importantly, the desired techno-economic performance. The reasons for this have been recognized, and fundamental requirements for successful WEC technology development have been identified. This paper describes a multi-year project pursued in collaboration by the National Renewable Energy Laboratory and Sandia National Laboratories to innovate and develop new WEC technology. It specifies the project strategy, shows how this differs from the state-of-the-art approach, and presents some early project results. Based on the specification of fundamental functional requirements of WEC technology, structured innovation and systemic problem-solving methodologies are applied to invent and identify new WEC technology concepts. Using Technology Performance Level (TPL) as an assessment metric of techno-economic performance potential, high-performance technology concepts are identified and selected for further development. System performance is numerically modelled and optimized, and key performance aspects are empirically validated. The project deliverables are WEC technology specifications of high techno-economic performance technologies of TPL 7 or higher at TRL 3, with some key technology challenges investigated at higher TRL. These wave energy converter technology specifications will be made available to industry for further full development and commercialisation (TRL 4 - TRL 9).

  13. Impact of Collaborative Work on Technology Acceptance: A Case Study from Virtual Computing

    Science.gov (United States)

    Konak, Abdullah; Kulturel-Konak, Sadan; Nasereddin, Mahdi; Bartolacci, Michael R.

    2017-01-01

    Aim/Purpose: This paper utilizes the Technology Acceptance Model (TAM) to examine the extent to which acceptance of Remote Virtual Computer Laboratories (RVCLs) is affected by students' technological backgrounds and the role of collaborative work. Background: RVCLs are widely used in information technology and cyber security education to provide…

  14. Integrating Human Performance and Technology

    International Nuclear Information System (INIS)

    Farris, Ronald K.; Medema, Heather

    2012-01-01

    Human error is a significant factor in the cause and/or complication of events that occur in the commercial nuclear industry. In recent years, great gains have been made using Human Performance (HU) tools focused on targeting individual behaviors. However, the cost of improving HU is growing and resistance to add yet another HU tool certainly exists, particularly for those tools that increase the paperwork for operations. Improvements in HU that are the result of leveraging existing technology, such as hand-held mobile technologies, have the potential to reduce human error in controlling system configurations, safety tag-outs, and other verifications. Operator rounds, valve lineup verifications, containment closure verifications, safety and equipment protection, and system tagging can be supported by field-deployable wireless technologies. These devices can also support the availability of critical component data in the main control room and other locations. This research pilot project reviewing wireless hand-held technology is part of the Light Water Reactor Sustainability Program (LWRSP), a research and development (R and D) program sponsored by the U. S. Department of Energy (DOE). The project is being performed in close collaboration with industry R and D programs to provide the technical foundations for licensing, and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRSP vision is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of the current nuclear reactor fleet. (author)

  15. Integrating Human Performance and Technology

    Energy Technology Data Exchange (ETDEWEB)

    Ronald K. Farris; Heather Medema

    2012-05-01

    Human error is a significant factor in the cause and/or complication of events that occur in the commercial nuclear industry. In recent years, great gains have been made using Human Performance (HU) tools focused on targeting individual behaviors. However, the cost of improving HU is growing and resistance to add yet another HU tool certainly exists, particularly for those tools that increase the paperwork for operations. Improvements in HU that are the result of leveraging existing technology, such as hand-held mobile technologies, have the potential to reduce human error in controlling system configurations, safety tag-outs, and other verifications. Operator rounds, valve line-up verifications, containment closure verifications, safety & equipment protection, and system tagging can be supported by field-deployable wireless technologies. These devices can also support the availability of critical component data in the main control room and other locations. This research pilot project reviewing wireless hand-held technology is part of the Light Water Reactor Sustainability Program (LWRSP), a research and development (R&D) program sponsored by the U. S. Department of Energy (DOE). The project is being performed in close collaboration with industry R&D programs to provide the technical foundations for licensing, and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRSP vision is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of the current nuclear reactor fleet.

  16. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    There are performance bottlenecks and scalability problems when a traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data-mining system, this platform is highly scalable, has massive data processing capacity, is service-oriented, and has low hardware cost. This platform can support the design and application of a wide range of distributed data-mining systems.

  17. Computer technology for self-management: a scoping review.

    Science.gov (United States)

    Jacelon, Cynthia S; Gibbs, Molly A; Ridgway, John Ve

    2016-05-01

    The purpose of this scoping review of literature is to explore the types of computer-based systems used for self-management of chronic disease, the goals and success of these systems, the value added by technology integration, and the target audience for these systems. Technology is changing the way health care is provided and the way that individuals manage their health. Individuals with chronic diseases are now able to use computer-based systems to self-manage their health. These systems have the ability to remind users of daily activities and to help them recognise when symptoms are worsening and intervention is indicated. However, there are many questions about the types of systems available, the goals of these systems, and the success with which individuals with chronic illness are using them. This is a scoping review in which the Cumulative Index of Nursing and Allied Health Literature, PubMed, and IEEE Xplore databases were searched. A total of 303 articles were reviewed, 89 articles were read in depth, and 30 were included in the scoping review. The Substitution, Augmentation, Modification, Redefinition model was used to evaluate the value added by the technology integration. Research on technology for self-management was conducted in 13 countries. Data analysis identified five kinds of platforms on which the systems were based; some systems were focused on specific disease-management processes, others were not. For individuals to effectively use systems to maintain maximum wellness, the systems must have a strong component of self-management and provide the user with meaningful information regarding their health states. Clinicians should choose systems for their clients based on the design, components, and goals of the systems. © 2016 John Wiley & Sons Ltd.

  18. Fusion of smart, multimedia and computer gaming technologies research, systems and perspectives

    CERN Document Server

    Favorskaya, Margarita; Jain, Lakhmi; Howlett, Robert

    2015-01-01

    This monograph book is focused on the recent advances in smart, multimedia and computer gaming technologies. The contributions include: Smart Gamification and Smart Serious Games; fusion of secure IPsec-based Virtual Private Network, mobile computing and rich multimedia technology; teaching and promoting smart Internet of Things solutions using the serious-game approach; evaluation of student knowledge using an e-Learning framework; the iTEC Eduteka; 3D virtual worlds as a fusion of immersing, visualizing, recording, and replaying technologies; and fusion of multimedia and mobile technology in audioguides for museums and exhibitions, from Bluetooth Push to Web Pull. The book is directed to researchers, students and software developers working in the areas of education and information technologies.

  19. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied…
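
    One simple way to fold multiple design and evaluation criteria into a single evolutionary processing model is a weighted-sum fitness over per-criterion scores; the criteria, weights, and random scoring below are purely illustrative.

        # Weighted-sum scoring of candidate designs against several
        # performance criteria, as one simple driver for an evolutionary
        # search with multiple objectives. Criteria and weights are invented.
        import random

        WEIGHTS = {"thermal": 0.5, "daylight": 0.3, "material_use": 0.2}

        def evaluate(candidate):
            # Placeholder per-criterion scores in [0, 1]; a real model would
            # simulate each performance aspect of the design.
            return {k: random.random() for k in WEIGHTS}

        def fitness(candidate):
            scores = evaluate(candidate)
            return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

        population = [f"design-{i}" for i in range(8)]
        print(max(population, key=fitness))   # best-scoring candidate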

  20. Computer Self-Efficacy, Computer Anxiety, Performance and Personal Outcomes of Turkish Physical Education Teachers

    Science.gov (United States)

    Aktag, Isil

    2015-01-01

    The purpose of this study is to determine the computer self-efficacy, performance outcome, personal outcome, and affect and anxiety level of physical education teachers. Influence of teaching experience, computer usage and participation of seminars or in-service programs on computer self-efficacy level were determined. The subjects of this study…

  1. Examining the Relationship between Technological, Organizational, and Environmental Factors and Cloud Computing Adoption

    Science.gov (United States)

    Tweel, Abdeneaser

    2012-01-01

    High uncertainties related to cloud computing adoption may hinder IT managers from making solid decisions about adopting cloud computing. The problem addressed in this study was the lack of understanding of the relationship between factors related to the adoption of cloud computing and IT managers' interest in adopting this technology. In…

  2. Improving student retention in computer engineering technology

    Science.gov (United States)

    Pierozinski, Russell Ivan

    The purpose of this research project was to improve student retention in the Computer Engineering Technology program at the Northern Alberta Institute of Technology by reducing the number of dropouts and increasing the graduation rate. This action research project utilized a mixed methods approach of a survey and face-to-face interviews. The participants were male and female, with a large majority ranging from 18 to 21 years of age. The research found that participants recognized their skills and capability, but their capacity to remain in the program was dependent on understanding and meeting the demanding pace and rigour of the program. The participants recognized that curriculum delivery along with instructor-student interaction had an impact on student retention. To be successful in the program, students required support in four domains: academic, learning management, career, and social.

  3. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
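
    Provisioning the nodes of such a virtual cluster on EC2 can be scripted through the AWS SDK; the boto3 sketch below uses placeholder image, key, and instance-type values and is not the SCC toolset's actual interface.

        # Launch a small batch of EC2 instances to serve as the nodes of a
        # virtual compute cluster, via the boto3 SDK. Requires boto3 and
        # configured AWS credentials; ImageId, KeyName and InstanceType are
        # placeholders, not values from the paper.
        import boto3

        ec2 = boto3.resource("ec2", region_name="us-east-1")

        nodes = ec2.create_instances(
            ImageId="ami-0123456789abcdef0",   # image with the science codes installed
            InstanceType="c5.xlarge",
            MinCount=4, MaxCount=4,            # four worker nodes
            KeyName="my-cluster-key",
        )

        for node in nodes:
            node.wait_until_running()
            node.reload()                      # refresh to pick up the public IP
            print(node.id, node.public_ip_address)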

  4. Load Disaggregation Technologies: Real World and Laboratory Performance

    Energy Technology Data Exchange (ETDEWEB)

    Mayhorn, Ebony T.; Sullivan, Greg P.; Petersen, Joseph M.; Butner, Ryan S.; Johnson, Erica M.

    2016-09-28

    Low cost interval metering and communication technology improvements over the past ten years have enabled the maturity of load disaggregation (or non-intrusive load monitoring) technologies to better estimate and report energy consumption of individual end-use loads. With the appropriate performance characteristics, these technologies have the potential to enable many utility and customer facing applications such as billing transparency, itemized demand and energy consumption, appliance diagnostics, commissioning, energy efficiency savings verification, load shape research, and demand response measurement. However, there has been much skepticism concerning the ability of load disaggregation products to accurately identify and estimate energy consumption of end-uses; which has hindered wide-spread market adoption. A contributing factor is that common test methods and metrics are not available to evaluate performance without having to perform large scale field demonstrations and pilots, which can be costly when developing such products. Without common and cost-effective methods of evaluation, more developed disaggregation technologies will continue to be slow to market and potential users will remain uncertain about their capabilities. This paper reviews recent field studies and laboratory tests of disaggregation technologies. Several factors are identified that are important to consider in test protocols, so that the results reflect real world performance. Potential metrics are examined to highlight their effectiveness in quantifying disaggregation performance. This analysis is then used to suggest performance metrics that are meaningful and of value to potential users and that will enable researchers/developers to identify beneficial ways to improve their technologies.
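
    At its simplest, load disaggregation detects step changes in the aggregate power trace and matches each step to the nearest known appliance signature; the sketch below implements that naive event-based scheme with invented wattages.

        # Naive event-based load disaggregation: detect step changes in an
        # aggregate power trace and label them with the nearest appliance
        # signature. Signatures and the trace are invented.
        SIGNATURES = {150: "fridge compressor", 500: "microwave", 1200: "kettle"}
        TOLERANCE = 60  # watts; steps smaller than this are treated as noise

        def disaggregate(trace):
            events = []
            for t in range(1, len(trace)):
                step = trace[t] - trace[t - 1]
                if abs(step) < TOLERANCE:
                    continue                       # ignore noise
                label = min(SIGNATURES, key=lambda w: abs(abs(step) - w))
                state = "on" if step > 0 else "off"
                events.append((t, SIGNATURES[label], state))
            return events

        aggregate = [100, 100, 250, 250, 1450, 1460, 260, 250, 100]
        print(disaggregate(aggregate))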

  5. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  6. Proceedings of the Twelfth Seminar on Computation in Nuclear Science and Technology

    International Nuclear Information System (INIS)

    Arbie, Bakri; Ardisasmita, Syamsa; Bunyamin, M.; Karsono, M.; Sangadji; Aziz, Ferhat; Marsodi; Su'ud, Zaki; Suhartanto, Heru

    2001-07-01

    The Seminar on Computation in Nuclear Science and Technology is a regular activity held by the Center for Development of Informatics and Computation Technology. The aim of the proceedings is to enable the exchange of information among those interested in computation, modelling, and simulation. The seminar was attended by BATAN and university researchers active in nuclear science, and these proceedings may serve as a reference for further research. There are 26 papers, which have a separate index

  7. Dynamic Capabilities for Managing Emerging Technologies : Organizational and Managerial Antecedents of Effective Adoption of Cloud Computing

    NARCIS (Netherlands)

    S. Khanagha (Saeed)

    2015-01-01

    The advancement of information and communication technologies has brought a digital age, where massive computing power, high-speed and ubiquitous access to the internet, and more recently Cloud Computing Technology are expected to transform a wide range of organizations,

  8. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    Science.gov (United States)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for recasting traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted on the development of a specialized computer architecture for the real-time algorithmic execution of an avionics guidance and control problem is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
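
    The paper's allocation algorithm itself is not reproduced in this record; below is a minimal sketch of the general idea it names, critical-path-based list scheduling, which ranks tasks by the length of their longest downstream path and assigns each ready task to the earliest-free processing element. The task graph, durations, and two-processor setup are invented for illustration.

```python
# Sketch of critical-path list scheduling: rank tasks by the longest path
# from the task to a sink (its "critical path length"), then greedily map
# each ready task to the earliest-available processing element (PE).
from functools import lru_cache

tasks = {"integrate": 4, "estimate": 3, "guidance": 5, "command": 2}   # durations
succ = {"integrate": ["guidance"], "estimate": ["guidance"],
        "guidance": ["command"], "command": []}                        # task graph

@lru_cache(maxsize=None)
def cp_length(task):
    # Duration of the task plus the longest chain of successors below it.
    return tasks[task] + max((cp_length(s) for s in succ[task]), default=0)

def schedule(num_pes=2):
    pred_count = {t: 0 for t in tasks}
    for t, ss in succ.items():
        for s in ss:
            pred_count[s] += 1
    pe_free = [0.0] * num_pes          # time each PE becomes idle
    finish = {}                        # finish time of each scheduled task
    ready = [t for t, c in pred_count.items() if c == 0]
    order = []
    while ready:
        ready.sort(key=cp_length, reverse=True)   # most critical task first
        t = ready.pop(0)
        pe = min(range(num_pes), key=lambda i: pe_free[i])
        preds_done = max((finish[p] for p, ss in succ.items() if t in ss), default=0.0)
        start = max(pe_free[pe], preds_done)
        finish[t] = start + tasks[t]
        pe_free[pe] = finish[t]
        order.append((t, pe, start, finish[t]))
        for s in succ[t]:
            pred_count[s] -= 1
            if pred_count[s] == 0:
                ready.append(s)
    return order

for row in schedule():
    print(row)   # (task, PE index, start time, finish time)
```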

  9. 3rd ACIS International Conference on Applied Computing and Information Technology

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 3rd International Conference on Applied Computing and Information Technology (ACIT 2015), which was held on July 12-16, 2015 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences, and to exchange new ideas and information in a meaningful way. The book presents research results about all aspects (theory, applications and tools) of computer and information science, and discusses the practical challenges encountered along the way and the solutions adopted to solve them.

  10. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature

  11. Advanced intelligent computational technologies and decision support systems

    CERN Document Server

    Kountchev, Roumen

    2014-01-01

    This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems, which can be applied to fields like healthcare to assist humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies in themes like: intelligent predictive diagnosis; intelligent analyzing of medical images; new formats for coding single medical images and image sequences; Medical Decision Support Systems; diagnosis of Down's syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks for solving real-life problems; present state and perspectives for Electronic Healthcare Record Systems; adaptive approaches for noise reduction in sequences of CT images, etc.

  12. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751. High Performance Computing Modernization Program Kerberos Throughput Test Report. Daniel G. Gdula et al.

  13. Survey of LWR environmental control technology performance and cost

    International Nuclear Information System (INIS)

    Heeb, C.M.; Aaberg, R.L.; Cole, B.M.; Engel, R.L.; Kennedy, W.E. Jr.; Lewallen, M.A.

    1980-03-01

    This study attempts to establish a ranking for species that are routinely released to the environment for a projected nuclear power growth scenario. Unlike comparisons made to existing standards, which are subject to frequent revision, the ranking of releases can be used to form a more logical basis for identifying the areas where further development of control technology could be required. This report describes projections of releases for several fuel cycle scenarios, identifies areas where alternative control technologies may be implemented, and discusses the available alternative control technologies. The release factors were used in a computer code system called ENFORM, which calculates the annual release of any species from any part of the LWR nuclear fuel cycle given a projection of installed nuclear generation capacity. This survey of fuel cycle releases was performed for three reprocessing scenarios (stowaway, reprocessing without recycle of Pu and reprocessing with full recycle of U and Pu) for a 100-year period beginning in 1977. The radioactivity releases were ranked on the basis of a relative ranking factor. The relative ranking factor is based on the 100-year summation of the 50-year population dose commitment from an annual release of radioactive effluents. The nonradioactive releases were ranked on the basis of dilution factor. The twenty highest ranking radioactive releases were identified and each of these was analyzed in terms of the basis for calculating the release and a description of the currently employed control method. Alternative control technology is then discussed, along with the available capital and operating cost figures for alternative control methods
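
    To make the ranking basis concrete, here is a minimal sketch of the relative ranking factor as described: the 100-year summation of the 50-year population dose commitments produced by each year's release. The dose-commitment factor and release series are invented placeholders, not ENFORM data or results.

```python
import numpy as np

def relative_ranking_factor(annual_releases, dose_commitment_per_unit):
    """100-year summation of the 50-year population dose commitment
    produced by each year's release of one radionuclide.

    annual_releases: length-100 array of activity released each year (Ci/yr).
    dose_commitment_per_unit: 50-year population dose commitment per unit
    release (person-rem/Ci); a single constant factor here for simplicity.
    """
    releases = np.asarray(annual_releases, dtype=float)
    assert releases.shape == (100,), "one value per year of the 100-year period"
    return float(np.sum(releases * dose_commitment_per_unit))

# Toy placeholder: constant annual releases starting in 1977, made-up factor.
releases = np.full(100, 2.0e3)                      # Ci/yr, hypothetical
print(relative_ranking_factor(releases, 1.5e-2))    # person-rem over 100 years
```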

  14. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    Science.gov (United States)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks), some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for designing, developing, implementing, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by

  15. Chinese-English Automation and Computer Technology Dictionary, Volume 2.

    Science.gov (United States)

    1980-08-01

    Chinese-English Automation and Computer Technology Dictionary, Vol. 2. This document has been approved for public release; its distribution is unlimited. Sample entries: ...zhuangbei - information link; tongxin lianjie zhuangzhi - communication link; tongxin shebei - communications equipment; communications facility

  16. Didactics of Teaching of Histology, Cytology and Embryology: the Role of Computer Technologies

    Directory of Open Access Journals (Sweden)

    Ye.F. Barinov

    2013-10-01

    It must be emphasized that the ability to use computer technology in teaching can be considered formed only if it is based on the professional competence of the teacher, on knowledge of the main provisions of cognitive psychology concerning the cognitive process and the factors affecting its efficiency, and on the correct use of methods and means of processing teaching material. Only in this case can we hope that the use of computer technology in the educational process will be systematic and effective.

  17. APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES IN COMPUTER AIDED LANGUAGE LEARNING

    Directory of Open Access Journals (Sweden)

    I. B. Tampel

    2013-01-01

    Full Text Available The article deals with various ways of applying automatic speech recognition, Text-to-Speech technology, pronunciation and communication skills training, learner vocabulary assessment, and listening skills training in computer-aided language learning (CALL) systems. In spite of some constraints, the application of such technologies is effective both for simplifying educational tasks and for making the system more comfortable to use.

  18. Predicting Cloud Computing Technology Adoption by Organizations: An Empirical Integration of Technology Acceptance Model and Theory of Planned Behavior

    Science.gov (United States)

    Ekufu, ThankGod K.

    2012-01-01

    Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…

  19. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computation power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of research has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. (topical review)
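
    As an illustrative sketch only (not code from the review): the kind of embarrassingly parallel, voxel-wise arithmetic that maps well onto GPUs, written here with the CuPy library as a stand-in for the hand-written CUDA kernels such applications typically use. Array sizes and the toy dose computation are invented.

```python
import numpy as np
import cupy as cp  # GPU array library; assumes a CUDA-capable device is present

# Toy voxel-wise dose accumulation: scale and sum per-beam dose grids.
# Real radiotherapy codes do far more physics; this only shows the
# data-parallel access pattern that makes GPUs attractive.
shape = (256, 256, 128)                      # invented CT-like voxel grid
beams_cpu = [np.random.rand(*shape).astype(np.float32) for _ in range(4)]
weights = np.array([0.4, 0.3, 0.2, 0.1], dtype=np.float32)

beams_gpu = [cp.asarray(b) for b in beams_cpu]   # host -> device transfer
total_dose = cp.zeros(shape, dtype=cp.float32)
for w, beam in zip(weights, beams_gpu):
    total_dose += w * beam                   # executes as a GPU kernel

print(float(total_dose.max()), float(cp.asnumpy(total_dose).mean()))
```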

  20. Feedback in Computer Assisted Pronunciation Training: When technology meets pedagogy

    NARCIS (Netherlands)

    Neri, A.; Cucchiarini, C.; Strik, H.

    2002-01-01

    Computer Assisted Language Learning (CALL) has now established itself as a prolific and fast growing area whose advantages are already well-known to educators. Yet, many authors lament the lack of a reliable integrated conceptual framework linking technology advances and second language acquisition

  1. Proceedings of seventh symposium on sharing of computer programs and technology in nuclear medicine, computer assisted data processing

    International Nuclear Information System (INIS)

    Howard, B.Y.; McClain, W.J.; Landay, M.

    1977-01-01

    The Council on Computers (CC) of the Society of Nuclear Medicine (SNM) annually publishes the Proceedings of its Symposium on the Sharing of Computer Programs and Technology in Nuclear Medicine. This is the seventh such volume and has been organized by topic, with the exception of the invited papers and the discussion following them. An index arranged by author and by subject is included

  2. Proceedings of seventh symposium on sharing of computer programs and technology in nuclear medicine, computer assisted data processing

    Energy Technology Data Exchange (ETDEWEB)

    Howard, B.Y.; McClain, W.J.; Landay, M. (comps.)

    1977-01-01

    The Council on Computers (CC) of the Society of Nuclear Medicine (SNM) annually publishes the Proceedings of its Symposium on the Sharing of Computer Programs and Technology in Nuclear Medicine. This is the seventh such volume and has been organized by topic, with the exception of the invited papers and the discussion following them. An index arranged by author and by subject is included.

  3. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  4. 4th International Conference on Computational Collective Intelligence Technologies and Applications (ICCCI 2012)

    CERN Document Server

    Trawiński, Bogdan; Katarzyniak, Radosław; Jo, Geun-Sik; Advanced Methods for Computational Collective Intelligence

    2013-01-01

    The book consists of 35 extended chapters which have been selected and invited from the submissions to the 4th International Conference on Computational Collective Intelligence Technologies and Applications (ICCCI 2012) held on November 28-30, 2012 in Ho Chi Minh City, Vietnam. The book is organized into six parts, which are semantic web and ontologies, social networks and e-learning, agent and multiagent systems, data mining methods and applications, soft computing, and optimization and control, respectively. All chapters in the book discuss theoretical and practical issues connected with computational collective intelligence and related technologies. The editors hope that the book can be useful for graduate and Ph.D. students in Computer Science, in particular participants in courses on Soft Computing, Multiagent Systems, and Data Mining. This book can also be useful for researchers working on the concept of computational collective intelligence in artificial populations. It is the hope of the editors that ...

  5. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  6. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
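
    A minimal sketch of the modeling approach described (not the authors' actual model): a discrete-event simulation in which service requests arrive according to a probability distribution and compete for a constrained pool of virtualized servers, written with the SimPy library. All rates and capacities are invented.

```python
import random
import simpy

RNG = random.Random(42)
WAITS = []   # queueing delay observed by each completed request

def request(env, servers, service_mean):
    arrival = env.now
    with servers.request() as slot:          # queue for a server slot
        yield slot
        WAITS.append(env.now - arrival)      # time spent waiting in queue
        yield env.timeout(RNG.expovariate(1.0 / service_mean))

def generator(env, servers, arrival_mean, service_mean):
    while True:
        yield env.timeout(RNG.expovariate(1.0 / arrival_mean))
        env.process(request(env, servers, service_mean))

env = simpy.Environment()
servers = simpy.Resource(env, capacity=8)    # invented resource constraint
env.process(generator(env, servers, arrival_mean=0.15, service_mean=1.0))
env.run(until=10_000)
print(f"requests served: {len(WAITS)}, mean queue wait: {sum(WAITS)/len(WAITS):.3f}")
```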

  7. Quantum Computers: A New Paradigm in Information Technology

    Directory of Open Access Journals (Sweden)

    Mahesh S. Raisinghani

    2001-01-01

    Full Text Available The word 'quantum' comes from the Latin word quantus meaning 'how much'. Quantum computing is a fundamentally new mode of information processing that can be performed only by harnessing physical phenomena unique to quantum mechanics (especially quantum interference). Paul Benioff of the Argonne National Laboratory first applied quantum theory to computers in 1981 and David Deutsch of Oxford proposed quantum parallel computers in 1985, years before the realization of qubits in 1995. However, it may be well into the 21st century before we see quantum computing used at a commercial level for a variety of reasons discussed in this paper. The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This paper discusses some of the current advances, applications, and challenges of quantum computing as well as its impact on corporate computing and implications for management. It shows how quantum computing can be utilized to process and store information, as well as impact cryptography for perfectly secure communication, algorithmic searching, factorizing large numbers very rapidly, and simulating quantum-mechanical systems efficiently. A broad interdisciplinary effort will be needed if quantum computers are to fulfill their destiny as the world's fastest computing devices.
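
    As an illustration of the interference and quantum parallelism the article describes (not code from the article): a small NumPy state-vector simulation of Deutsch's algorithm, the textbook example in which a single oracle query decides whether a one-bit function is constant or balanced.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)

def oracle(f):
    """Unitary U_f: |x>|y> -> |x>|y XOR f(x)>, with basis index = 2*x + y."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.zeros(4)
    state[1] = 1.0                              # start in |0>|1>
    state = np.kron(H, H) @ state               # superposition over both inputs
    state = oracle(f) @ state                   # a single oracle query
    state = np.kron(H, I2) @ state              # interference on first qubit
    p_first_is_0 = state[0] ** 2 + state[1] ** 2
    return "constant" if p_first_is_0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))   # constant function -> "constant"
print(deutsch(lambda x: x))   # balanced function -> "balanced"
```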

  8. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  9. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    Science.gov (United States)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  10. US Geological Survey National Computer Technology Meeting; Proceedings, Phoenix, Arizona, November 14-18, 1988

    Science.gov (United States)

    Balthrop, Barbara H.; Terry, J.E.

    1991-01-01

    The U.S. Geological Survey National Computer Technology Meetings (NCTM) are sponsored by the Water Resources Division and provide a forum for the presentation of technical papers and the sharing of ideas or experiences related to computer technology. This report serves as a proceedings of the meeting held in November, 1988 at the Crescent Hotel in Phoenix, Arizona. The meeting was attended by more than 200 technical and managerial people representing all Divisions of the U.S. Geological Survey. Scientists in every Division of the U.S. Geological Survey rely heavily upon state-of-the-art computer technology (both hardware and software). Today the goals of each Division are pursued in an environment where high speed computers, distributed communications, distributed data bases, high technology input/output devices, and very sophisticated simulation tools are used regularly. Therefore, information transfer and the sharing of advances in technology are very important issues that must be addressed regularly. This report contains complete papers and abstracts of papers that were presented at the 1988 NCTM. The report is divided into topical sections that reflect common areas of interest and application. In each section, papers are presented first followed by abstracts. For these proceedings, the publication of a complete paper or only an abstract was at the discretion of the author, although complete papers were encouraged. Some papers presented at the 1988 NCTM are not published in these proceedings.

  11. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
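
    As a flavor of what such analytical models look like (an illustration, not an excerpt from the book): the classic M/M/1 queue, where utilization, mean population, and mean response time follow directly from the arrival rate and service rate, tied together by Little's law.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form M/M/1 queue results: utilization, mean number in
    system (L), and mean response time (W), with Little's law L = lambda * W."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be < service rate")
    rho = arrival_rate / service_rate          # utilization
    L = rho / (1 - rho)                        # mean jobs in system
    W = 1 / (service_rate - arrival_rate)      # mean response time
    return rho, L, W

# Example: 8 requests/s arriving at a server that completes 10 requests/s.
print(mm1_metrics(8.0, 10.0))   # (0.8, 4.0, 0.5) -> 0.5 s mean response time
```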

  12. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  13. Acting with Technology: Rehearsing for Mixed-Media Live Performances

    DEFF Research Database (Denmark)

    Barkhuus, Louise; Rossitto, Chiara

    2016-01-01

    Digital technologies provide theater with new possibilities for combining traditional stage-based performances with interactive artifacts, for streaming remote parallel performances, and for other device-facilitated audience interaction. Compared to traditional theater, mixed-media performances require a different type of engagement from the actors, and rehearsing is challenging, as it can be impossible to rehearse with all the functional technology and interaction. Here, we report experiences from a case study of two mixed-media performances; we studied the rehearsal practices of two actors, and we discuss critical factors of implementing technology into rehearsal practices.

  14. Model of training of computer science teachers by means of distant education technologies

    Directory of Open Access Journals (Sweden)

    Т А Соловьева

    2009-03-01

    Full Text Available The training of future computer science teachers under conditions of the informatization of education is analyzed. Distance educational technologies (DET) and the traditional training process are considered, together with their advantages and disadvantages, and the active functions of DET as the basis of the model of training by means of DET are stressed. It is shown that blended education, combining both distance and traditional technologies, takes place on the basis of the created model. Practical use of the model is shown on the example of the course «Recursion» for future computer science teachers.

  15. Integration of computer technology into the medical curriculum: the King's experience

    Directory of Open Access Journals (Sweden)

    Vickie Aitken

    1997-12-01

    Full Text Available Recently, there have been major changes in the requirements of medical education which have set the scene for the revision of medical curricula (Towle, 1991; GMC, 1993). As part of the new curriculum at King's, the opportunity has been taken to integrate computer technology into the course through Computer-Assisted Learning (CAL), and to train graduates in core IT skills. Although the use of computers in the medical curriculum has up to now been limited, recent studies have shown encouraging steps forward (see Boelen, 1995). One area where there has been particular interest is the use of notebook computers to allow students increased access to IT facilities (Maulitz et al., 1996).

  16. DEVICE TECHNOLOGY. Nanomaterials in transistors: From high-performance to thin-film applications.

    Science.gov (United States)

    Franklin, Aaron D

    2015-08-14

    For more than 50 years, silicon transistors have been continuously shrunk to meet the projections of Moore's law but are now reaching fundamental limits on speed and power use. With these limits at hand, nanomaterials offer great promise for improving transistor performance and adding new applications through the coming decades. With different transistors needed in everything from high-performance servers to thin-film display backplanes, it is important to understand the targeted application needs when considering new material options. Here the distinction between high-performance and thin-film transistors is reviewed, along with the benefits and challenges to using nanomaterials in such transistors. In particular, progress on carbon nanotubes, as well as graphene and related materials (including transition metal dichalcogenides and X-enes), outlines the advances and further research needed to enable their use in transistors for high-performance computing, thin films, or completely new technologies such as flexible and transparent devices. Copyright © 2015, American Association for the Advancement of Science.

  17. Impact of computer-aided design and drafting on operator perceptions and performance intentions

    Energy Technology Data Exchange (ETDEWEB)

    Mandeville, D.E.

    1987-01-01

    The goal of the introduction of computer-aided design and drafting (CADD) technology is to improve engineering quantity and quality outcomes. What is not so obvious is that the manner in which the operator views the CADD job may affect these outcomes. The study found that: (1) the subjects hold job characteristic beliefs about their jobs that are similar to those held by individuals in related professional and technical positions in other organizations; (2) subjects in routine jobs who had little experience with CADD believe that the characteristics of the job with CADD will be less desirable than their present job utilizing manual drafting; however, experienced users believe that there is little difference; (3) the subjects performing technologically routine jobs indicate that the quantity and quality of their jobs decrease with CADD, while holders of technologically nonroutine jobs indicate the opposite; (4) significant but weak relationships existed between job characteristic beliefs, attitudes, and job intentions; however, the model relationships and strengths varied when subgroups were analyzed; and (5) the correlations of social referent group beliefs with subject beliefs are significant but weak.

  18. Detection Performance of Packet Arrival under Downclocking for Mobile Edge Computing

    Directory of Open Access Journals (Sweden)

    Zhimin Wang

    2018-01-01

    Full Text Available Mobile edge computing (MEC) enables battery-powered mobile nodes to acquire information technology services at the network edge. These nodes want to receive those services while saving power. The sampling rate invariant detection (SRID) is the first downclocking WiFi technique that can achieve this objective. With SRID, a node detects a packet arrival at a downclocked rate. Upon a successful detection, the node reverts to a full-clocked rate to receive the packet immediately. To ensure that a node acquires its service immediately, the detection performance (namely, the miss-detection probability and the false-alarm probability) of SRID is of importance. This paper is the first to theoretically study the crucial impact of SRID attributes (e.g., tolerance threshold, correlation threshold, and energy ratio threshold) on the packet detection performance. Extensive Monte Carlo experiments show that our theoretical model is very accurate. This study can help system developers set reasonable system parameters for WiFi downclocking.
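
    A minimal Monte Carlo sketch of the style of experiment the paper describes (with invented parameters, not the authors' SRID model): a known preamble is detected in noise via a correlation statistic, and the miss-detection and false-alarm probabilities are estimated as functions of the correlation threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
PREAMBLE = rng.choice([-1.0, 1.0], size=64)      # known training sequence

def detect(received, corr_threshold):
    """Declare 'packet present' when the normalized correlation of the
    received samples with the known preamble exceeds the threshold."""
    corr = np.dot(received, PREAMBLE) / len(PREAMBLE)
    return corr > corr_threshold

def estimate_probabilities(snr_db=0.0, corr_threshold=0.3, trials=20_000):
    noise_std = 10 ** (-snr_db / 20)             # unit-power preamble assumed
    misses = false_alarms = 0
    for _ in range(trials):
        noise = rng.normal(0.0, noise_std, size=PREAMBLE.size)
        if not detect(PREAMBLE + noise, corr_threshold):   # packet present
            misses += 1
        if detect(noise, corr_threshold):                  # packet absent
            false_alarms += 1
    return misses / trials, false_alarms / trials

p_miss, p_fa = estimate_probabilities()
print(f"miss-detection: {p_miss:.4f}, false-alarm: {p_fa:.4f}")
```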

  19. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archivable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others that are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting Model (WRF), then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  20. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises if computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room, and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior. Low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The reason for the longer processing time when using the paper-pencil version might be due to the time needed to write the answer down, controlling for transferring the answer correctly. It is still not known why students using the computer version (particularly low-performing

  1. Third International Conference on Computational Science, Engineering and Information Technology (CCSEIT-2013), v.1

    CERN Document Server

    Kumar, Ashok; Annamalai, Annamalai

    2013-01-01

    This book is the proceedings of the Third International Conference on Computational Science, Engineering and Information Technology (CCSEIT-2013) that was held in Konya, Turkey, on June 7-9. CCSEIT-2013 provided an excellent international forum for sharing knowledge and results in theory, methodology and applications of computational science, engineering and information technology. This book contains research results, projects, survey work and industrial experiences representing significant advances in the field. The different contributions collected in this book cover five main areas: algorithms, data structures and applications; wireless and mobile networks; computer networks and communications; natural language processing and information theory; cryptography and information security.

  2. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general.

    Science.gov (United States)

    Zander, Thorsten O; Kothe, Christian

    2011-04-01

    Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.

  3. Computer Technology Directory.

    Science.gov (United States)

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  4. Blueprint for a microwave trapped-ion quantum computer

    DEFF Research Database (Denmark)

    Lekitsch, B.; Weidt, S.; Fowler, A. G.

    2017-01-01

    The modules are constructed using silicon microfabrication techniques and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength-radiation-based quantum gate technology. To scale this microwave quantum computer architecture to an arbitrary size we

  5. Computer-Supported Instruction in Enhancing the Performance of Dyscalculics

    Science.gov (United States)

    Kumar, S. Praveen; Raja, B. William Dharma

    2010-01-01

    The use of instructional media is an essential component of teaching-learning process which contributes to the efficiency as well as effectiveness of the teaching-learning process. Computer-supported instruction has a very important role to play as an advanced technological instruction as it employs different instructional techniques like…

  6. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  7. Parallel distributed computing in modeling of the nanomaterials production technologies

    NARCIS (Netherlands)

    Krzhizhanovskaya, V.V.; Korkhov, V.V.; Zatevakhin, M.A.; Gorbachev, Y.E.

    2008-01-01

    Simulation of physical and chemical processes occurring in the nanomaterial production technologies is a computationally challenging problem, due to the great number of coupled processes, time and length scales to be taken into account. To solve such complex problems with a good level of detail in a

  8. FY 1998 Blue Book: Computing, Information, and Communications: Technologies for the 21st Century

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — As the 21st century approaches, the rapid convergence of computing, communications, and information technology promises unprecedented opportunities for scientific...

  9. Cost and performance of innovative remediation technologies

    International Nuclear Information System (INIS)

    Cummings, J.B.; Kingscott, J.W.; Fiedler, L.D.

    1995-01-01

    The selection and use of more cost-effective remedies requires better access to data on the performance and cost of technologies used in the field. To make data more widely available, the US Environmental Protection Agency is working jointly with member agencies of the Federal Remediation Technologies Roundtable to publish case studies of full-scale remediation and demonstration projects. EPA, DoD, and DOE have published case studies of cleanup projects primarily consisting of bioremediation, soil vapor extraction, and thermal desorption. Within the limits of this initial data set, the paper evaluates technology performance and cost. In the analysis of cost factors, the paper shows the use of a standardized Work Breakdown Structure (WBS). Use of the WBS will be important in future reporting of completed projects to facilitate cost comparison. The paper notes the limits to normalization, and thus cross-site comparison, which can be achieved using the WBS. The paper identifies conclusions from initial efforts to compile cost and performance data, highlights the importance of such efforts to the overall remediation effort, and discusses future cost and performance documentation efforts

  10. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    Science.gov (United States)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon on our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.
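
    As a small illustration of the GIS-service style of integration described (the endpoint and layer name are hypothetical, and this is not the authors' system): requesting a rendered map from an OGC Web Map Service over HTTP, the kind of general-purpose map service the architecture builds on.

```python
import requests

# Hypothetical OGC Web Map Service endpoint and layer; real deployments
# advertise their layers via a GetCapabilities request.
WMS_URL = "https://gis.example.org/wms"

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "seismic_events",          # hypothetical layer name
    "bbox": "-125.0,32.0,-114.0,42.0",   # lon/lat bounding box (California)
    "srs": "EPSG:4326",
    "width": 800,
    "height": 600,
    "format": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("map.png", "wb") as fh:
    fh.write(response.content)           # rendered map image for display
```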

  11. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    Science.gov (United States)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making available to LSC scientists compute resources at sites across the United States and Europe. We use digital credentials for strong and secure authentication and authorization to compute resources and data. Building on top of products from the Globus project for high-speed data transfer and information discovery we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together these Grid Computing technologies and infrastructure have formed the LSC DataGrid--a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work, however, remains in order to scale current analyses and recent lessons learned need to be integrated into the next generation of Grid middleware.

  12. Study on key technologies of optimization of big data for thermal power plant performance

    Science.gov (United States)

    Mao, Mingyang; Xiao, Hong

    2018-06-01

    Thermal power generation accounts for 70% of China's electricity generation, and its pollutants account for 40% of emissions of the same kind. Optimizing thermal power efficiency requires monitoring and understanding the whole process of coal combustion and pollutant migration, while power system performance data show an explosive growth trend. The purpose is to study the integration of numerical simulation with big data technology, and to develop a thermal power plant efficiency data optimization platform and a nitrogen oxide emission reduction system, providing reliable technical support for improving plant efficiency, saving energy, and reducing emissions. The method is big data technology, represented by "multi-source heterogeneous data integration", "large data distributed storage", and "high-performance real-time and off-line computing", which can greatly enhance a plant's capacity to analyze energy consumption and raise its level of intelligent decision-making; data mining algorithms are then used to establish a mathematical model of boiler combustion, mine boiler efficiency data, and, combined with numerical simulation technology, find the laws of boiler combustion and pollutant generation and the influence of combustion parameters on both. The result is optimized boiler combustion parameters, which can achieve energy savings.
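
    A minimal sketch of the data-mining step described, fitting a model of boiler efficiency from combustion parameters; the synthetic data, feature names, and the use of scikit-learn's random forest are illustrative assumptions, not the study's actual model or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000

# Synthetic combustion parameters (stand-ins for plant historian tags).
excess_air = rng.uniform(1.05, 1.40, n)       # excess air ratio
flue_gas_temp = rng.uniform(110, 160, n)      # exit flue gas temperature, deg C
coal_fineness = rng.uniform(0.65, 0.85, n)    # fraction passing 200 mesh

# Invented ground-truth relationship plus noise, for illustration only.
efficiency = (94.0 - 6.0 * (excess_air - 1.2) ** 2
              - 0.03 * (flue_gas_temp - 120)
              + 2.0 * (coal_fineness - 0.75)
              + rng.normal(0, 0.2, n))

X = np.column_stack([excess_air, flue_gas_temp, coal_fineness])
X_tr, X_te, y_tr, y_te = train_test_split(X, efficiency, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
# The fitted model can then be searched for combustion settings that
# maximize predicted efficiency, subject to emission constraints.
```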

  13. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    Science.gov (United States)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  14. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware as a proof of principle.
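
    The 1/n completion-time and cost behavior quoted above follows from per-machine billing that rounds up to whole hours; a small sketch (with an invented hourly rate) makes the claim concrete.

```python
import math

def cloud_run(total_cpu_hours, machines, rate_per_machine_hour=0.10):
    """Wall-clock time and cost when a perfectly parallel simulation of
    total_cpu_hours is split across n machines billed per started hour."""
    wall_hours = total_cpu_hours / machines                 # ~ 1/n scaling
    billed_hours = machines * math.ceil(wall_hours)         # round up per machine
    return wall_hours, billed_hours * rate_per_machine_hour

# 24 CPU-hours of Monte Carlo histories, invented $0.10/machine-hour rate.
for n in (1, 4, 6, 9, 24):
    t, cost = cloud_run(24, n)
    print(f"n={n:2d}  wall={t:5.2f} h  cost=${cost:.2f}")
# Cost is minimal ($2.40) exactly when n divides the 24-hour total,
# matching the observation that n should be a factor of the run time in hours.
```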

  15. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware as a proof of principle.

  16. THE ROLE OF COMPUTER TECHNOLOGY IN TEACHING ENGLISH LANGUAGE

    Directory of Open Access Journals (Sweden)

    Батагоз Талгатовна Керимбаева

    2017-12-01

    Full Text Available In this article an attempt is made to define the role and to study the peculiarities of the functioning of the English language in higher education. The state of education in the Republic of Kazakhstan and the trends of social development make priority development of the education system on the basis of computer technology, and the creation of a unified educational information environment, pressing problems. With the rapid development of science and the fast updating of information, it is impossible to learn everything in one lifetime, so it is important to develop an interest in acquiring knowledge through continuous self-education. Intense changes in society caused by the development of modern educational technologies have led to the need to change the education system. The main objective of training is to achieve a new, modern quality of education. Modernization of Kazakhstan's education system defines the main goal of professional education as the training of qualified professionals of the appropriate level and profile who are fluent in their profession, capable of effective work in their specialty at the level of world standards, and ready for professional growth and mobility. Modern trends in the modernization of educational programs demand the introduction of modern teaching methods. The increasing introduction of new computer technology and the application of the competence approach in the educational process of H.A. Yasawi International Kazakh-Turkish University improve the efficiency of teaching English.

  17. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati

  18. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  19. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

    Out-pile tests with a full-scale fuel assembly are used to verify the design and to evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 .deg. C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform the bending and vibration tests. The verification of the developed facilities was carried out by comparison with reference data for a fuel assembly obtained at the Westinghouse Co. The compared data showed good agreement within uncertainties. FRETONUS, a simulator for high-temperature, high-pressure fretting wear and performance testing, was developed. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. Technologies for computational turbulent flow analysis and finite element analysis were also developed. Through the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  20. The Voice as Computer Interface: A Look at Tomorrow's Technologies.

    Science.gov (United States)

    Lange, Holley R.

    1991-01-01

    Discussion of voice as the communications device for computer-human interaction focuses on voice recognition systems for use within a library environment. Voice technologies are described, including voice response and voice recognition; examples of voice systems in use in libraries are examined; and further possibilities, including use with…

  1. Parallel algorithms and cluster computing

    CERN Document Server

    Hoffmann, Karl Heinz

    2007-01-01

    This book presents major advances in high performance computing as well as major advances due to high performance computing. It contains a collection of papers in which results achieved in the collaboration of scientists from computer science, mathematics, physics, and mechanical engineering are presented. From the science problems to the mathematical algorithms and on to the effective implementation of these algorithms on massively parallel and cluster computers we present state-of-the-art methods and technology as well as exemplary results in these fields. This book shows that problems which seem superficially distinct become intimately connected on a computational level.

  2. Computational fluid dynamics for propulsion technology: Geometric grid visualization in CFD-based propulsion technology research

    Science.gov (United States)

    Ziebarth, John P.; Meyer, Doug

    1992-01-01

    The coordination of the resources, facilities, and specialized personnel needed to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology is examined. This involves coordinating CFD activities between government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established for use in the analysis of CFD design methodologies.

  3. Photovoltaic technology, performance, manufacturing cost and markets

    International Nuclear Information System (INIS)

    Maycock, P.D.

    1999-01-01

    A comprehensive discussion of key aspects of photovoltaic energy conversion systems will provide the basis for forecasting PV module shipments from 1999 to 2010. Principal areas covered include: (1) Technology and Performance Status: The module efficiency and performance are described for commercial cell technologies including single crystal silicon, polycrystal silicon, ribbon silicon, film silicon on low cost substrate, amorphous silicon, copper indium diselenide, and cadmium telluride; (2) Manufacturing cost: 1999 costs for PV technologies in production (single crystal silicon, polycrystal silicon, and amorphous silicon) are developed. Manufacturing costs for 10--25 MW plants and 100 MW plants will be estimated; (3) The world PV market is summarized by region, top ten companies, and technology; and (4) Forecast of the World Market (seven market sectors) to 2010 will be presented. Key assumptions, price of modules, incentive programs, and the price of competing electricity generation will be detailed.

  4. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, ''Classified Computer Security,'' requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab

  5. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  6. Monitoring SLAC High Performance UNIX Computing Systems

    International Nuclear Information System (INIS)

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process taken in creating and implementing the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface
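    The record describes replacing Ganglia's round-robin database with a script-driven SQL store. Below is a minimal sketch of that idea, assuming gmond's standard XML telemetry format; the paper's system targeted MySQL, but sqlite3 is used here so the example is self-contained, and the table and column names are illustrative, not the paper's schema.

```python
# Hedged sketch: persist Ganglia gmond XML metrics into a SQL table.
# The paper's system used MySQL; sqlite3 stands in here so the example
# runs without a server. Table and column names are invented.
import sqlite3
import time
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<GANGLIA_XML VERSION="3.1">
  <CLUSTER NAME="hpc" LOCALTIME="0" OWNER="" LATLONG="" URL="">
    <HOST NAME="node01" IP="10.0.0.1" REPORTED="0">
      <METRIC NAME="load_one" VAL="0.42" TYPE="float" UNITS=""/>
      <METRIC NAME="mem_free" VAL="1048576" TYPE="uint32" UNITS="KB"/>
    </HOST>
  </CLUSTER>
</GANGLIA_XML>
"""

def store_metrics(xml_text: str, conn: sqlite3.Connection) -> None:
    """Walk the gmond XML tree and insert one row per (host, metric)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS metrics "
        "(ts REAL, host TEXT, name TEXT, value TEXT, units TEXT)"
    )
    root = ET.fromstring(xml_text)
    now = time.time()
    for host in root.iter("HOST"):
        for metric in host.iter("METRIC"):
            conn.execute(
                "INSERT INTO metrics VALUES (?, ?, ?, ?, ?)",
                (now, host.get("NAME"), metric.get("NAME"),
                 metric.get("VAL"), metric.get("UNITS")),
            )
    conn.commit()

conn = sqlite3.connect(":memory:")
store_metrics(SAMPLE_XML, conn)
print(conn.execute("SELECT host, name, value, units FROM metrics").fetchall())
```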

  7. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
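    The correlation described here amounts to a simple predictive fit. The sketch below estimates ray-trace throughput from a LINPACK score with a least-squares line; the numbers are invented for illustration and are not data from the study.

```python
# Hedged sketch: estimate ray-trace speed from LINPACK performance via a
# least-squares fit, as the paper's correlation suggests. The sample
# points below are invented, not measurements from the study.
import numpy as np

linpack_mflops = np.array([0.5, 1.2, 4.0, 9.5, 25.0])        # benchmark scores
rays_per_sec = np.array([120.0, 300.0, 950.0, 2300.0, 6100.0])  # measured speeds

# Fit rays/s = a * MFLOPS + b in the least-squares sense.
a, b = np.polyfit(linpack_mflops, rays_per_sec, deg=1)

def estimate_ray_speed(mflops: float) -> float:
    """Predict ray-trace throughput given only a machine's LINPACK score."""
    return a * mflops + b

print(f"fit: rays/s = {a:.1f} * MFLOPS + {b:.1f}")
print(f"estimated speed at 15 MFLOPS: {estimate_ray_speed(15.0):.0f} rays/s")
```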

  8. Micromagnetics on high-performance workstation and mobile computational platforms

    Science.gov (United States)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite-element-method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of the embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  9. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    Science.gov (United States)

    Brown, Gail Laverne

    The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9--11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys, and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other levels indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skills levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.

  10. Research and realization implementation of monitor technology on illegal external link of classified computer

    Science.gov (United States)

    Zhang, Hong

    2017-06-01

    In recent years, with the continuous development and application of network technology, network security has gradually entered people's field of vision. Unauthorized external connections from hosts on an internal network are an important source of network security threats. At present, most organizations pay a certain degree of attention to network security and have adopted many means and methods to prevent security problems, such as physically isolating the internal network and installing firewalls at the exits. However, these measures are often undermined by user behavior that violates the safety rules. For example, a host connected to a wireless network, or a machine with dual network cards accessing the Internet, inadvertently forms a two-way connection between the external network and internal computers [1]. As a result, important and confidential documents may leak even while the user remains completely unaware. Out-of-band monitoring technology for unauthorized external connections of classified computers can largely prevent such violations by monitoring the behavior of the offending connection. In this paper, we mainly research and discuss this monitoring technology for classified computers.

  11. Performance Analyses in an Assistive Technology Service Delivery Process

    DEFF Research Database (Denmark)

    Petersen, Anne Karin

    Performance Analyses in an Assistive Technology Service Delivery Process. Keywords: process model, occupational performance, assistive technologies. The poster is about teaching students, using models and theory in education and practice. It is related to the occupational therapy process and professional ... top-down, client-centred, and activity-based interventions (ERGO/Munksgaard). Fisher, A. & Griswold, L. A. (2014), Performance Skills, in B. Schell (ed.), Willard & Spackman's Occupational Therapy, 12th ed., pp. 249-264; Cook, A. M. & Polgar, J. M. (2015), Assistive Technologies

  12. Female Students' Experiences of Computer Technology in Single- versus Mixed-Gender School Settings

    Science.gov (United States)

    Burke, Lee-Ann; Murphy, Elizabeth

    2006-01-01

    This study explores how female students compare learning computer technology in a single- versus a mixed-gender school setting. Twelve females participated, all of whom were enrolled in a grade 12 course in Communications Technology. Data collection included a questionnaire, a semi-structured interview and focus groups. Participants described…

  13. Cloud Computing: Exploring the scope

    OpenAIRE

    Maurya, Brajesh Kumar

    2010-01-01

    Cloud computing refers to a paradigm shift in overall IT solutions, raising accessibility, scalability and effectiveness through its enabling technologies. However, the cost benefits and performance of migrated cloud platforms and services are neither clear nor well summarized. Globalization and the recessionary economic times have not only raised the bar for better IT delivery models but have also given access to technology-enabled services via the Internet. Cloud computing has va...

  14. Computer technologies of future teachers of fine art training as an object of scientific educational research

    Directory of Open Access Journals (Sweden)

    Bohdan Cherniavskyi

    2017-03-01

    The article deals with computer technology training, highlights the current state of computerization of the educational process in teacher training colleges, and reveals the specific techniques of professional training of teachers of fine arts to use computer technology in their teaching careers. Key words: methods of professional training, professional activities, computer technology training of future teachers of Fine Arts, the subject of research.

  15. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a computational fluid dynamics (CFD) program for room air distribution is introduced to improve the predictions of both energy consumption and the indoor environment. The building energy performance
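    The coupling this record introduces can be pictured as an iterative exchange of boundary data between the two programs. The toy sketch below, with one-line surrogates standing in for the energy simulation and the CFD solver, shows only the exchange-and-converge structure; the models and constants are invented, not real building physics.

```python
# Hedged toy sketch of the coupling idea: the energy program sizes the
# heating load assuming some vertical temperature stratification, and a
# CFD stand-in returns the stratification that load would produce; the
# two are iterated until they agree. All constants are illustrative.
UA = 150.0                   # envelope loss coefficient (W/K), invented
T_SET, T_OUT = 21.0, 0.0     # setpoint and outdoor temperature (degrees C)

def energy_model(stratification_k: float) -> float:
    """Heating power (W) to hold the setpoint, penalized by stratification."""
    return UA * (T_SET + 0.5 * stratification_k - T_OUT)

def cfd_model(power_w: float) -> float:
    """Vertical temperature gradient (K) produced by that heating power."""
    return 0.001 * power_w   # crude surrogate for a room-air simulation

gradient = 0.0               # initial guess: well-mixed room air
for iteration in range(100):
    load = energy_model(gradient)      # energy side: size the load
    new_gradient = cfd_model(load)     # CFD side: resulting air distribution
    if abs(new_gradient - gradient) < 1e-9:
        break                          # the two programs now agree
    gradient = new_gradient

print(f"converged in {iteration} iterations: "
      f"load = {load:.0f} W, stratification = {gradient:.2f} K")
```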

  16. The Influence of Personal Characteristics, Interaction: (Computer/Individual), Computer Self-efficacy, Personal Innovativeness in Information Technology to Computer Anxiety in use of Mind your Own Business Accounting Software

    OpenAIRE

    Mayasari, Mega; ., Gudono

    2015-01-01

    The purpose of this study was to identify the factors that cause computer anxiety in the use of Mind Your Own Business (MYOB) accounting software, i.e., to assess the influence of age, gender, amount of training, ownership (usage of accounting software on a regular basis), computer self-efficacy, and personal innovativeness in information technology (IT) on computer anxiety. The study also examined whether trait anxiety and negative affect are related to computer self-eff...

  17. The influence of audio communications technology on computer-supported collaborative learning

    Directory of Open Access Journals (Sweden)

    Denise Whitelock

    1996-12-01

    In recent years there has been an appreciation of the benefits that can be obtained by students working in groups or teams using computers (Eraut and Hoyles, 1988). But there is a difference of opinion as to how a partner enhances learning, and in particular how well adult learners do in the collaborative setting. In order to capitalize on the opportunities offered by new technologies, we need to understand more fully how the process of collaboration is affected by different communication technologies, and how the technologies themselves might be used to best advantage for the benefit of distance learners. However, another factor, apart from the technology itself, that is thought to influence computer-supported collaborative learning is the gender distribution of the group. A number of classroom studies have shown gender differences when children work together with computers, and these have been reported from a number of subject domains, including science (Scanlon et al., 1993; Littleton et al., 1992). Since our own expertise is in the area of science learning, we selected a non-trivial physics task for the subjects to work with, namely elastic collisions. Previous studies (Villani and Pacca, 1990; Whitelock et al., 1993) have shown that both adults and school-children have difficulty in predicting the subsequent motion of balls or ice pucks after they collide. Villani's study stresses that even postgraduate students tend to revert to their informal commonsense notions unless they are cued to use formal representations of these types of problem. These studies have demonstrated that the topic of elastic collisions is a complex yet fruitful one in which to engage students in group work.

  18. An application of interactive computer graphics technology to the design of dispersal mechanisms

    Science.gov (United States)

    Richter, B. J.; Welch, B. H.

    1977-01-01

    Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.

  19. Monte Carlo computation in the applied research of nuclear technology

    International Nuclear Information System (INIS)

    Xu Shuyan; Liu Baojie; Li Qin

    2007-01-01

    This article briefly introduces Monte Carlo methods and their properties. It describes the Monte Carlo methods with emphasis on their applications to several domains of nuclear technology. Monte Carlo simulation methods and several commonly used computer software packages that implement them are also introduced. The proposed methods are demonstrated by a real example. (authors)
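    As a concrete instance of the kind of computation surveyed here, the sketch below estimates gamma-photon transmission through a slab shield by sampling exponential free paths and compares the result with the analytic answer. The attenuation coefficient and slab thickness are illustrative values, not data from the article.

```python
# Hedged sketch of a basic Monte Carlo shielding calculation: estimate
# the fraction of photons crossing a slab. Constants are illustrative.
import math
import random

def transmission_probability(mu: float, thickness: float, n_photons: int) -> float:
    """Estimate the fraction of photons crossing a slab of given thickness.

    Each photon travels a free path sampled from an exponential
    distribution with mean 1/mu (cm); a shorter path means absorption.
    """
    transmitted = 0
    for _ in range(n_photons):
        # 1 - random() lies in (0, 1], so the log is always defined.
        path = -math.log(1.0 - random.random()) / mu
        if path >= thickness:
            transmitted += 1
    return transmitted / n_photons

random.seed(42)
mu, thickness = 0.5, 4.0     # attenuation (1/cm) and thickness (cm), invented
estimate = transmission_probability(mu, thickness, n_photons=100_000)
print(f"Monte Carlo estimate: {estimate:.4f}")
print(f"Analytic exp(-mu*x):  {math.exp(-mu * thickness):.4f}")
```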

  20. Global Journal of Computer Science and Technology. Volume 1.2

    Science.gov (United States)

    Dixit, R. K.

    2009-01-01

    Articles in this issue of "Global Journal of Computer Science and Technology" include: (1) Input Data Processing Techniques in Intrusion Detection Systems--Short Review (Suhair H. Amer and John A. Hamilton, Jr.); (2) Semantic Annotation of Stock Photography for CBIR Using MPEG-7 standards (R. Balasubramani and V. Kannan); (3) An Experimental Study…

  1. A Proposed Model for Improving Performance and Reducing Costs of IT Through Cloud Computing of Egyptian Business Enterprises

    OpenAIRE

    Mohamed M.El Hadi; Azza Monir Ismail

    2016-01-01

    Information technologies affect today's big business enterprises, from data processing and transactions to achieving goals efficiently and effectively; they create new business opportunities and new competitive advantages, so services must be sufficient to match recent IT trends such as cloud computing. Cloud computing technology can provide all IT services. Cloud computing therefore offers an adaptable alternative to the current technology model, creating reduci...

  2. 4th International Conference on Applied Computing and Information Technology

    CERN Document Server

    2017-01-01

    This edited book presents scientific results of the 4th International Conference on Applied Computing and Information Technology (ACIT 2016) which was held on December 12–14, 2016 in Las Vegas, USA. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way. The aim of this conference was also to bring out the research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those papers accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the Program Committee, and underwent further rigorous rounds of review. Th...

  3. From chalkboard, slides, and paper to e-learning: How computing technologies have transformed anatomical sciences education.

    Science.gov (United States)

    Trelease, Robert B

    2016-11-01

    Until the late-twentieth century, primary anatomical sciences education was relatively unenhanced by advanced technology and dependent on the mainstays of printed textbooks, chalkboard- and photographic projection-based classroom lectures, and cadaver dissection laboratories. But over the past three decades, diffusion of innovations in computer technology transformed the practices of anatomical education and research, along with other aspects of work and daily life. Increasing adoption of first-generation personal computers (PCs) in the 1980s paved the way for the first practical educational applications, and visionary anatomists foresaw the usefulness of computers for teaching. While early computers lacked high-resolution graphics capabilities and interactive user interfaces, applications with video discs demonstrated the practicality of programming digital multimedia linking descriptive text with anatomical imaging. Desktop publishing established that computers could be used for producing enhanced lecture notes, and commercial presentation software made it possible to give lectures using anatomical and medical imaging, as well as animations. Concurrently, computer processing supported the deployment of medical imaging modalities, including computed tomography, magnetic resonance imaging, and ultrasound, that were subsequently integrated into anatomy instruction. Following its public birth in the mid-1990s, the World Wide Web became the ubiquitous multimedia networking technology underlying the conduct of contemporary education and research. Digital video, structural simulations, and mobile devices have been more recently applied to education. Progressive implementation of computer-based learning methods interacted with waves of ongoing curricular change, and such technologies have been deemed crucial for continuing medical education reforms, providing new challenges and opportunities for anatomical sciences educators. Anat Sci Educ 9: 583-602. © 2016 American

  4. THE ISSUE OF FORMING FUTURE MUSIC TEACHERS’ PROFESSIONAL COMPETENCE BY COMPUTER TECHNOLOGY TOOLS IN THE THEORY OF NATIONAL ART

    Directory of Open Access Journals (Sweden)

    Lyudmila Gavrilova

    2017-04-01

    The article deals with theoretical aspects of forming future music teachers' professional competence with computer technology tools. The concept of professional competence has become a major criterion of preparing students for professional activities. The issue of the article is relevant, as the competence approach has become a basis for implementing computer technologies in future music teachers' training. The authors give a detailed analysis of implementing computer technologies in musical education. Special attention is paid to using a computer in musical education and creating electronic pedagogical resources. The aim of the article is to outline the directions of national art research in the process of implementing computer tools, which is one of the most efficient ways of updating future music teachers' training. The authors point out that implementing musical and computer technologies in music art practice proceeds in several directions: using a computer as a new musical instrument in the activities of composers, sound engineers, and arrangers; using a computer for studying the quality of musical sound, analysing sounds and music compositions, and spectral analysis of the acoustic characteristics of singers' voices; studying ancient music manuscripts with digital technology; and developing hardware and software for music education. A distinct direction of research is the pedagogical aspect of using a computer in music education (the use of special software for recording and editing music, the use of multimedia to enhance visibility in education, the development of e-learning resources, etc.). The authors conclude that implementing computer technologies makes future music teachers' training more efficient. In the authors' opinion the widespread introduction of distance learning

  5. Neuroanatomical correlates of brain-computer interface performance.

    Science.gov (United States)

    Kasahara, Kazumi; DaSalla, Charles Sayo; Honda, Manabu; Hanakawa, Takashi

    2015-04-15

    Brain-computer interfaces (BCIs) offer a potential means to replace or restore lost motor function. However, BCI performance varies considerably between users, the reasons for which are poorly understood. Here we investigated the relationship between sensorimotor rhythm (SMR)-based BCI performance and brain structure. Participants were instructed to control a computer cursor using right- and left-hand motor imagery, which primarily modulated their left- and right-hemispheric SMR powers, respectively. Although most participants were able to control the BCI with success rates significantly above chance level even at the first encounter, they also showed substantial inter-individual variability in BCI success rate. Participants also underwent T1-weighted three-dimensional structural magnetic resonance imaging (MRI). The MRI data were subjected to voxel-based morphometry using BCI success rate as an independent variable. We found that BCI performance correlated with gray matter volume of the supplementary motor area, supplementary somatosensory area, and dorsal premotor cortex. We suggest that SMR-based BCI performance is associated with development of non-primary somatosensory and motor areas. Advancing our understanding of BCI performance in relation to its neuroanatomical correlates may lead to better customization of BCIs based on individual brain structure. Copyright © 2015 Elsevier Inc. All rights reserved.
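    The SMR feature underlying this study can be illustrated with a band-power computation. The sketch below synthesizes two EEG-like channels, computes 8-13 Hz mu-band power with Welch's method, and applies a toy lateralization rule. The channel names, signal parameters, and decision rule are assumptions for illustration, not the study's actual pipeline.

```python
# Hedged sketch of an SMR feature: mu-band (8-13 Hz) power over left (C3)
# vs. right (C4) sensorimotor channels, on synthetic EEG-like signals.
import numpy as np
from scipy.signal import welch

fs = 250                              # sampling rate (Hz), illustrative
t = np.arange(0, 4.0, 1.0 / fs)       # a 4-second trial
rng = np.random.default_rng(0)

# Synthetic trial: right-hand imagery suppresses the contralateral
# (left-hemisphere, C3) mu rhythm relative to the C4 rhythm.
c3 = 0.3 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
c4 = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

def mu_band_power(signal: np.ndarray) -> float:
    """Integrate the Welch power spectrum over the 8-13 Hz mu band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    band = (freqs >= 8) & (freqs <= 13)
    return float(np.trapz(psd[band], freqs[band]))

p3, p4 = mu_band_power(c3), mu_band_power(c4)
# Toy control rule: lateralized mu suppression drives the cursor.
decision = "right-hand imagery" if p3 < p4 else "left-hand imagery"
print(f"C3 mu power = {p3:.3f}, C4 mu power = {p4:.3f} -> {decision}")
```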

  6. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers
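    The co-word cluster network analysis mentioned here starts from keyword co-occurrence counts. A minimal sketch follows, with invented keyword lists standing in for the study's large-scale publication metadata; the clustering step itself is omitted.

```python
# Hedged sketch of the co-word analysis step: build keyword co-occurrence
# counts from per-paper keyword lists and read off the strongest links.
# The paper lists below are invented examples.
from collections import Counter
from itertools import combinations

papers = [
    ["quantum-key-distribution", "photons", "secret-messages"],
    ["quantum-key-distribution", "decoy-state", "photons"],
    ["quantum-entanglement", "bell-state", "photons"],
    ["quantum-key-distribution", "network-protocols"],
]

# Count how often each pair of keywords appears in the same paper.
cooccurrence = Counter()
for keywords in papers:
    for a, b in combinations(sorted(set(keywords)), 2):
        cooccurrence[(a, b)] += 1

# The heaviest edges of the resulting network indicate dominant themes.
for (a, b), weight in cooccurrence.most_common(3):
    print(f"{a} -- {b}: {weight}")
```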

  7. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Directory of Open Access Journals (Sweden)

    Nicholas V Olijnyk

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology

  8. Labour market expectation of Nigerian computer science ...

    African Journals Online (AJOL)

    ... of Nigerian computer science / Information Communication Technology (ICT) graduates. ... It also examines women's performance in Computer Science. ... key players were analyzed using variables such as competence, creativity, innovation, ...

  9. Technical Data Management Center: a focal point for meteorological and other environmental transport computing technology

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Trubey, D.K.

    1981-01-01

    The Technical Data Management Center (TDMC), which collects, packages, analyzes, and distributes information, computing technology, and data, including meteorological and other environmental transport work, is located at the Oak Ridge National Laboratory within the Engineering Physics Division. Major activities include maintaining a collection of computing technology and associated literature citations to provide capabilities for meteorological and environmental work. Details of the activities on behalf of TDMC's sponsoring agency, the US Nuclear Regulatory Commission, are described

  10. Cloud Computing Technologies in Writing Class: Factors Influencing Students’ Learning Experience

    Directory of Open Access Journals (Sweden)

    Jenny WANG

    2017-07-01

    The interactive online group within cloud computing technologies proposed as the main contribution of this paper provides easy and simple access to the cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and teacher in after-class group writing assignment activities. This study therefore addresses the implementation of one of the most commonly used cloud applications, Google Docs, in a higher education course. The learning environment integrating Google Docs, which students used to develop and deploy writing assignments between classes, was subjected to learning-experience assessment. Using a questionnaire administered to the study participants (n=28), the system was found to provide an effective learning environment between classes for the students and the instructor to stay connected. Factors influencing students' learning experience with cloud applications include the frequency of online interaction and students' technology experience. Suggestions for coping with challenges regarding their use in higher education, including technical issues, are also presented. Educators are therefore encouraged to embrace cloud computing technologies as they design course curricula, in the hope of effectively enriching students' learning.

  11. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Presentation by Dr Happy Sithole and Dr Onno Ubbink. Strategic context: high-performance computing (HPC) combined with machine learning and artificial intelligence presents opportunities to non...

  12. Computers for Your Classroom: CAI and CMI.

    Science.gov (United States)

    Thomas, David B.; Bozeman, William C.

    1981-01-01

    The availability of compact, low-cost computer systems provides a means of assisting classroom teachers in the performance of their duties. Computer-assisted instruction (CAI) and computer-managed instruction (CMI) are two applications of computer technology with which school administrators should become familiar. CAI is a teaching medium in which…

  13. Symposium on computational fluid dynamics: technology and applications

    International Nuclear Information System (INIS)

    1988-01-01

    A symposium on the technology and applications of computational fluid dynamics (CFD) was held in Pretoria from 21-23 Nov 1988. The following aspects were covered: multilevel adaptive methods and multigrid solvers in CFD, a symbolic processing approach to CFD, interplay between CFD and analytical approximations, CFD on a transputer array, the application of CFD in high speed aerodynamics, numerical simulation of laminar blood flow, two-phase flow modelling in nuclear accident analysis, and the finite difference scheme for the numerical solution of fluid flow

  14. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    Science.gov (United States)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. As an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services continuously evolve and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore service integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage protocol declaration required for PanDA Pilot site movers, and others. Improvements of the information model and general updates are also shown; in particular, we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.
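    The catalogue role AGIS plays can be illustrated with a small queryable topology structure. The schema, entries, and query helper below are invented for illustration only; the real AGIS data model is far richer and is served through middleware interfaces rather than an in-memory dictionary.

```python
# Hedged sketch of a resource-topology catalogue in the spirit of AGIS:
# sites expose typed services with configuration that clients can query.
# The schema and all entries are hypothetical.
sites = {
    "SITE-A": {
        "services": [
            {"type": "storage", "protocol": "objectstore",
             "endpoint": "https://os.site-a.example"},
            {"type": "compute", "flavor": "grid",
             "queues": ["analysis", "production"]},
        ],
    },
    "SITE-B": {
        "services": [
            {"type": "compute", "flavor": "cloud",
             "queues": ["opportunistic"]},
        ],
    },
}

def find_services(service_type: str, **filters):
    """Yield (site, service) pairs matching a type and attribute filters."""
    for site, info in sites.items():
        for svc in info["services"]:
            if svc["type"] == service_type and all(
                svc.get(key) == value for key, value in filters.items()
            ):
                yield site, svc

# Example query: locate every cloud-flavored compute resource.
for site, svc in find_services("compute", flavor="cloud"):
    print(site, svc["queues"])
```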

  15. An Overview of Computer Network security and Research Technology

    OpenAIRE

    Rathore, Vandana

    2016-01-01

    The rapid development in the field of computer networks and systems brings both convenience and security threats for users. Security threats include network security and data security. Network security refers to the reliability, confidentiality, integrity and availability of the information in the system. The main objective of network security is to maintain the authenticity, integrity, confidentiality, availability of the network. This paper introduces the details of the technologies used in...

  16. Methods of computer experiment in gamma-radiation technologies using new radiation sources

    CERN Document Server

    Bratchenko, M I; Rozhkov, V V

    2001-01-01

    Presented is the methodology of applying computer modeling to the physical substantiation of new irradiation technologies and to the design workflow of irradiators. Modeling tasks for irradiation technologies are structured, along with computerized methods for their solution and appropriate types of software. A comparative analysis of available packages for Monte Carlo modeling of electromagnetic processes in media is given, concerning their application to irradiation technology problems. The results of code approbation and preliminary data on gamma-radiation absorbed dose distributions for nuclides of conventional sources and prospective europium-based gamma sources are presented.

  17. Technologies for Large Data Management in Scientific Computing

    CERN Document Server

    Pace, A

    2014-01-01

    In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data required in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  18. The ongoing investigation of high performance parallel computing in HEP

    CERN Document Server

    Peach, Kenneth J; Böck, R K; Dobinson, Robert W; Hansroul, M; Norton, Alan Robert; Willers, Ian Malcolm; Baud, J P; Carminati, F; Gagliardi, F; McIntosh, E; Metcalf, M; Robertson, L; CERN. Geneva. Detector Research and Development Committee

    1993-01-01

    Past and current exploitation of parallel computing in High Energy Physics is summarized and a list of R & D projects in this area is presented. The applicability of new parallel hardware and software to physics problems is investigated, in the light of the requirements for computing power of LHC experiments and the current trends in the computer industry. Four main themes are discussed (possibilities for a finer grain of parallelism; fine-grain communication mechanism; usable parallel programming environment; different programming models and architectures, using standard commercial products). Parallel computing technology is potentially of interest for offline and vital for real time applications in LHC. A substantial investment in applications development and evaluation of state of the art hardware and software products is needed. A solid development environment is required at an early stage, before mainline LHC program development begins.

  19. Comparing the performance of SIMD computers by running large air pollution models

    DEFF Research Database (Denmark)

    Brown, J.; Hansen, Per Christian; Wasniewski, J.

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on these computers. Using a realistic large-scale model, we gained detailed insight into the performance of the computers involved when used to solve large-scale scientific problems that involve several types of numerical computations. The computers used in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216.

  20. Use of computer technologies in physical education of women of the first mature age

    Directory of Open Access Journals (Sweden)

    Julia Tomilina

    2016-08-01

    Purpose: to improve the process of physical education of women of the first mature age by means of Pilates with the use of information technologies. Material & Methods: the experience of developing and deploying computer technologies in the physical education of women of mature age was systematized through analysis of the scientific, methodical, and specialized literature and of the best pedagogical and coaching practices presented in the mass media. Ways of applying computer programs were identified, and the computer program "Pilates" was developed in Visual Basic. Results: the computer program "Pilates", which consists of reference, calculation, and recreational blocks, is offered with the aim of improving the physical education of women of the first mature age by means of Pilates and increasing their motivation for physical exercise. Conclusions: it was revealed that the application of computer programs has a positive effect in the practice of physical education of women of the first mature age. The computer program "Pilates" is presented.