WorldWideScience

Sample records for research unit computer

  1. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement for and use of various levels of computers, including supercomputers, for the CFD activities are described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  2. Computing in Research.

    Science.gov (United States)

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960's is reviewed; it is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  3. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  4. Uranium chemistry research unit

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The initial field of research of this Unit, established in 1973, was the basic co-ordination chemistry of uranium, thorium, copper, cobalt and nickel. Subsequently the interest of the Unit extended to extractive metallurgy relating to these metals. The term 'co-ordination chemistry' refers to the interaction of the central transition metal ion with the surrounding atoms in its immediate vicinity (within bonding distance) and the influence they have on each other - for example, structural studies for determining the number and arrangement of co-ordinated atoms, and spectrophotometric studies to establish how the f electron energy levels of uranium are influenced by the environment. New types of uranium compounds have been synthesized and studied, and the behaviour of uranium ions in non-aqueous systems has also received attention. This work can be applied to the development and study of extractants and new extractive processes for uranium.

  5. Optical Computing - Research Trends

    Indian Academy of Sciences (India)

    Optical Computing - Research Trends. Debabrata Goswami. General Article. Resonance – Journal of Science Education, Volume 8, Issue 7, July 2003, pp. 8-21. Permanent link: https://www.ias.ac.in/article/fulltext/reso/008/07/0008-0021

  6. Citizens unite for computational immunology!

    Science.gov (United States)

    Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M

    2015-07-01

    Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in! Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Optical Computing Research.

    Science.gov (United States)

    1987-10-30

    Optical Computing Research. Stanford University Electronics Laboratories, Stanford, CA; report AFOSR-TR-87-1635, 30 October 1987 (Unclassified). The record text is an OCR fragment of the report documentation page and reference list; the legible references include W.T. Welford and R. Winston, The Optics of Nonimaging Concentrators, Academic Press, New York, N.Y., 1978.

  8. How Do We Really Compute with Units?

    Science.gov (United States)

    Fiedler, B. H.

    2010-01-01

    The methods that we teach students for computing with units of measurement are often not consistent with the practice of professionals. For professionals, the vast majority of computations with quantities of measure are performed within programs on electronic computers, for which an accounting for the units occurs only once, in the design of the…

  9. Application Technology Research Unit

    Data.gov (United States)

    Federal Laboratory Consortium — To conduct fundamental and developmental research on new and improved application technologies to protect floricultural, nursery, landscape, turf, horticultural, and...

  10. Quantum computing for physics research

    International Nuclear Information System (INIS)

    Georgeot, B.

    2006-01-01

    Quantum computers hold great promises for the future of computation. In this paper, this new kind of computing device is presented, together with a short survey of the status of research in this field. The principal algorithms are introduced, with an emphasis on the applications of quantum computing to physics. Experimental implementations are also briefly discussed

  11. Research in computer forensics

    OpenAIRE

    Wai, Hor Cheong

    2002-01-01

    Approved for public release; distribution is unlimited. Computer Forensics involves the preservation, identification, extraction and documentation of computer evidence stored in the form of magnetically encoded information. With the proliferation of E-commerce initiatives and the increasing criminal activities on the web, this area of study is catching on in the IT industry and among the law enforcement agencies. The objective of the study is to explore the techniques of computer forensics ...

  12. Multidisciplinary Computational Research

    National Research Council Canada - National Science Library

    Visbal, Miguel R

    2006-01-01

    The purpose of this work is to develop advanced multidisciplinary numerical simulation capabilities for aerospace vehicles with emphasis on highly accurate, massively parallel computational methods...

  13. A computer controlled tele-cobalt unit

    International Nuclear Information System (INIS)

    Brace, J.A.

    1982-01-01

    A computer controlled cobalt treatment unit was commissioned for treating patients in January 1980. Initially the controlling computer was a minicomputer, but now the control of the therapy unit is by a microcomputer. The treatment files, which specify the movement and configurations necessary to deliver the prescribed dose, are produced on the minicomputer and then transferred to the microcomputer using minitape cartridges. The actual treatment unit is based on a standard cobalt unit with a few additional features e.g. the drive motors can be controlled either by the computer or manually. Since the treatment unit is used for both manual and automatic treatments, the operational procedure under computer control is made to closely follow the manual procedure for a single field treatment. The necessary safety features which protect against human, hardware and software errors as well as the advantages and disadvantages of computer controlled radiotherapy are discussed

  14. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support the Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment including a desktop personal computer, PC-486 DX2 with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith were also purchased. A reading room has been converted to a research computer lab by adding some furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  15. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  16. The Uranium Chemistry Research Unit

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    The article discusses the research work done at the Uranium Chemistry Research Unit of the University of Port Elizabeth. The initial research programme dealt with fundamental aspects of uranium chemistry. New uranium compounds were synthesized and their chemical properties were studied. Research was also done to assist the mining industry, as well as on nuclear medicine. Special mentioning is made of the use of technetium for medical diagnosis and therapy

  17. Computational chemistry research

    Science.gov (United States)

    Levin, Eugene

    1987-01-01

    Task 41 is composed of two parts: (1) analysis and design studies related to the Numerical Aerodynamic Simulation (NAS) Extended Operating Configuration (EOC) and (2) computational chemistry. During the first half of 1987, Dr. Levin served as a member of an advanced system planning team to establish the requirements, goals, and principal technical characteristics of the NAS EOC. A paper entitled 'Scaling of Data Communications for an Advanced Supercomputer Network' is included. The high temperature transport properties (such as viscosity, thermal conductivity, etc.) of the major constituents of air (oxygen and nitrogen) were correctly determined. The results of prior ab initio computer solutions of the Schroedinger equation were combined with the best available experimental data to obtain complete interaction potentials for both neutral and ion-atom collision partners. These potentials were then used in a computer program to evaluate the collision cross-sections from which the transport properties could be determined. A paper entitled 'High Temperature Transport Properties of Air' is included.

  18. Computer supported qualitative research

    CERN Document Server

    Reis, Luís; Sousa, Francislê; Moreira, António; Lamas, David

    2017-01-01

    This book contains an edited selection of the papers accepted for presentation and discussion at the first International Symposium on Qualitative Research (ISQR2016), held in Porto, Portugal, July 12th-14th, 2016. The book and the symposium feature four main application fields (Education, Health, Social Sciences, and Engineering and Technology) and seven main subjects: Rationale and Paradigms of Qualitative Research (theoretical studies, critical reflection about epistemological, ontological and axiological dimensions); Systematization of Approaches with Qualitative Studies (literature review, integrating results, aggregation studies, meta-analysis, meta-analysis of qualitative meta-synthesis, meta-ethnography); Qualitative and Mixed Methods Research (emphasis on research processes that build on mixed methodologies but with priority to qualitative approaches); Data Analysis Types (content analysis, discourse analysis, thematic analysis, narrative analysis, etc.); Innovative processes of Qualitative ...

  19. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  20. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  1. Computer science and operations research

    CERN Document Server

    Balci, Osman

    1992-01-01

    The interface of Operations Research and Computer Science - although elusive to a precise definition - has been a fertile area of both methodological and applied research. The papers in this book, written by experts in their respective fields, convey the current state-of-the-art in this interface across a broad spectrum of research domains which include optimization techniques, linear programming, interior point algorithms, networks, computer graphics in operations research, parallel algorithms and implementations, planning and scheduling, genetic algorithms, heuristic search techniques and dat

  2. Computational mechanics research at ONR

    International Nuclear Information System (INIS)

    Kushner, A.S.

    1986-01-01

    Computational mechanics is not an identified program at the Office of Naval Research (ONR), but rather plays a key role in the Solid Mechanics, Fluid Mechanics, Energy Conversion, and Materials Science programs. The basic philosophy of the Mechanics Division at ONR is to support fundamental research which expands the basis for understanding, predicting, and controlling the behavior of solid and fluid materials and systems at the physical and geometric scales appropriate to the phenomena of interest. It is shown in this paper that a strong commonality of computational mechanics drivers exists for the forefront research areas in both solid and fluid mechanics.

  3. The United Nuclear Research Institute

    International Nuclear Information System (INIS)

    Kiss, D.

    1978-01-01

    The UNRI, the only joint institute of the socialist countries, was founded in 1956 in Dubna. Scientists from small countries have the opportunity to take part in fundamental research with very expensive devices which are usually not available to them. There are six research laboratories and one department in the UNRI, namely: the theoretical physics laboratory; the laboratory of high energies, with a synchrophasotron of 10 GeV; the laboratory of nuclear problems, with a synchrocyclotron of 680 MeV; the laboratory of nuclear reactions, with the cyclotron U-300 which can accelerate heavy ions; the neutron physics laboratory, with the pulsed reactor IBR-30; the laboratory of computation and automation, with two big computers; and the department of new acceleration methods. The main results obtained by Hungarian scientists in Dubna are described. (V.N.)

  4. Control of peripheral units by satellite computer

    International Nuclear Information System (INIS)

    Tran, K.T.

    1974-01-01

    A computer system was developed allowing the control of nuclear physics experiments, and use of the results by means of graphical and conversational assemblies. This system, which is made of two computers, one IBM-370/135 and one Telemecanique Electrique T1600, controls the conventional IBM peripherals and also the special ones made in the laboratory, such as data acquisition display and graphics units. The visual display is implemented by a scanning-type television, equipped with a light-pen. These units are in themselves universal, but their specifications were established to meet the requirements of nuclear physics experiments. The input-output channels of the two computers have been connected together by an interface, designed and implemented in the laboratory. This interface allows the exchange of control signals and data (the data are converted from bytes into words and vice-versa). The T1600 controls the peripherals mentioned above according to the commands of the IBM 370. Hence the T1600 plays here the part of a satellite computer which allows conversation with the main computer and also ensures the control of its special peripheral units [fr]

  5. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Vol. 33, September (2012), pp. 160-167. ISSN 0893-6080. R&D Projects: GA ČR GAP202/11/1368. Institutional research plan: CEZ:AV0Z10300504. Institutional support: RVO:67985807. Keywords: neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.927, year: 2012

  6. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Earl C. [IDC Research Inc., Framingham, MA (United States); Conway, Steve [IDC Research Inc., Framingham, MA (United States); Dekate, Chirag [IDC Research Inc., Framingham, MA (United States)

    2013-09-30

    This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.

  7. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  8. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  9. Research Computing and Data for Geoscience

    OpenAIRE

    Smith, Preston

    2015-01-01

    This presentation will discuss the data storage and computational resources available for GIS researchers at Purdue.

  10. Clinical Epidemiology Unit - overview of research areas

    Science.gov (United States)

    Clinical Epidemiology Unit (CEU) conducts etiologic research with potential clinical and public health applications, and leads studies evaluating population-based early detection and cancer prevention strategies

  11. Research on cloud computing solutions

    Directory of Open Access Journals (Sweden)

    Liudvikas Kaklauskas

    2015-07-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the Public Cloud. A Private Cloud is suited for sensitive data, where the customer is dependent on a certain degree of security. According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service), and SaaS (software as a service). Main cloud computing solutions: web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications with the objective of better server utilization; without a clustering solution, a service may fail at the moment the server crashes. DOI: 10.15181/csat.v2i2.914

  12. Computing with impure numbers - Automatic consistency checking and units conversion using computer algebra

    Science.gov (United States)

    Stoutemyer, D. R.

    1977-01-01

    The computer algebra language MACSYMA enables the programmer to include symbolic physical units in computer calculations, and features automatic detection of dimensionally-inhomogeneous formulas and conversion of inconsistent units in a dimensionally homogeneous formula. Some examples illustrate these features.
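
    The same capability can be sketched in a modern setting. The snippet below is a minimal illustration in Python using the pint units library rather than MACSYMA (which is what the record actually describes): quantities carry symbolic units, a dimensionally inhomogeneous expression is rejected automatically, and consistent units are converted on request.

        import pint

        ureg = pint.UnitRegistry()

        length = 3.0 * ureg.meter
        time = 2.0 * ureg.second

        speed = length / time                          # units propagate: 1.5 meter / second
        print(speed.to(ureg.kilometer / ureg.hour))    # conversion between consistent units

        try:
            bad = length + time                        # dimensionally inhomogeneous formula
        except pint.DimensionalityError as err:
            print("rejected:", err)                    # inconsistency detected automatically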

  13. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applied many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It also looked in depth at the original hardware and the limitations of existing PC graphics adapters with respect to system performance, especially with graphics-intensive application programs and user interfaces.

  14. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  15. Computer science research and technology volume 3

    CERN Document Server

    Bauer, Janice P

    2011-01-01

    This book presents leading-edge research from across the globe in the field of computer science research, technology and applications. Each contribution has been carefully selected for inclusion based on the significance of the research to this fast-moving and diverse field. Some topics included are: network topology; agile programming; virtualization; and reconfigurable computing.

  16. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the broad field of accelerator science based on high energy accelerators. Within the new organization, the Applied Research Laboratory comprises four centers that support research activities common to the organization and carry out related research and development (R and D), integrating the previous four centers and their related sections in Tanashi. These centers are expected to provide not only general support of research activities but also the preparation and R and D of the systems required to promote the research and its future plans. Computer technology is essential to the development of this research and can be shared across the various research programmes of the organization. In response to these expectations, the new Computing Research Center is to carry out its duties in collaboration and cooperation with researchers, covering everything from R and D on the data analysis of the various experiments to computational physics requiring powerful computing capacity such as supercomputers. The first chapter of this report describes the work and present state of the Data Processing Center of KEK, the second chapter that of the computer room of INS, and future problems for the Computing Research Center are also discussed. (G.K.)

  17. New Generation General Purpose Computer (GPC) compact IBM unit

    Science.gov (United States)

    1991-01-01

    New Generation General Purpose Computer (GPC) compact IBM unit replaces a two-unit earlier generation computer. The new IBM unit is documented in table top views alone (S91-26867, S91-26868), with the onboard equipment it supports including the flight deck CRT screen and keypad (S91-26866), and next to the two earlier versions it replaces (S91-26869).

  18. Research at the Dairy and Functional Foods Research Unit

    Science.gov (United States)

    Dr. Peggy Tomasula is Research Leader of the Dairy and Functional Foods Research Unit (DFFRU), ARS, USDA, Wyndmoor, PA, a group that includes 11 Research Scientists, 4 of whom are Lead Scientists (LS), 13 support scientists, and 3 Retired Collaborators. The mission of the DFFRU is to solve critical ...

  19. Research directions in computer engineering. Report of a workshop

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, H

    1982-09-01

    The results of a workshop held in November 1981 in Washington, DC, to outline research directions for computer engineering are reported upon. The purpose of the workshop was to provide guidance to government research funding agencies, as well as to universities and industry, as to the directions which computer engineering research should take for the next five to ten years. A select group of computer engineers was assembled, drawn from all over the United States and with expertise in virtually every aspect of today's computer technology. Industrial organisations and universities were represented in roughly equal numbers. The panel proceeded to provide a sharper definition of computer engineering than had been in popular use previously, to identify the social and national needs which provide the basis for encouraging research, to probe for obstacles to research and seek means of overcoming them and to delineate high-priority areas in which computer engineering research should be fostered. These included experimental software engineering, architectures in support of programming style, computer graphics, pattern recognition, VLSI design tools, machine intelligence, programmable automation, architectures for speech and signal processing, computer architecture and robotics. 13 references.

  20. Bubble Chamber Research Group Microcomputer Unit

    International Nuclear Information System (INIS)

    Bairstow, R.; Barlow, J.; Mace, P.R.; Seller, P.; Waters, M.; Watson, J.G.

    1982-05-01

    A distributed data acquisition system has been developed by the Bubble Chamber Research Group at the Rutherford Appleton laboratory for use with their film measuring machines. The system is based upon a set of microcomputers linked together with a VAX 11/780 computer, in a local area computer network. This network is of the star type and uses a packet switching technique. Each film measuring machine is equipped with a microcomputer which controls the function of the table, buffers data and enhances the interface between operators and machines. This paper provides a detailed description of each microcomputer and can be used as a reference manual for these computers. (author)

  1. The Microcomputer in the Clinical Nursing Research Unit

    Science.gov (United States)

    Schwirian, Patricia M.; Byers, Sandra R.

    1982-01-01

    This paper discusses the microcomputer in clinical nursing research. There are six general areas in which computers have been useful to nurses: nursing notes and charting; patient care plans; automated monitoring of high-tech nursing units; HIS and MIS systems; personnel distribution systems; and education. Three alternative models for the conduct of clinical nursing research in a hospital are described. The first is a centralized model relying on the bureaucratic structure of the hospital. Second is a decentralized network of professional nurses and research support personnel woven together by a Clinical Nurse Researcher, and third is a dedicated clinical nursing research unit. Microcomputers have five characteristics which make them vital tools for nurse researchers: user-friendliness; environment friendliness; low cost; ease of interface with other information systems; and range and quality of software.

  2. Computer research in teaching geometry future bachelors

    Directory of Open Access Journals (Sweden)

    Aliya V. Bukusheva

    2017-12-01

    The article is devoted to the problem of using educational studies and experiments in the geometric education of IT specialists. We consider a research method applied in teaching Computer Geometry to Bachelor's students in `Mathematics and Computer Science` (programme 02.03.01). Examples are given of educational and research geometric problems that require computer tools to be solved. These tasks are treated as variants of educational and research tasks that pose problems demanding experiments with dynamic models of mathematical objects in order to be solved.

  3. Energy efficiency of computer power supply units - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Aebischer, B. [cepe - Centre for Energy Policy and Economics, Swiss Federal Institute of Technology Zuerich, Zuerich (Switzerland); Huser, H. [Encontrol GmbH, Niederrohrdorf (Switzerland)

    2002-11-15

    This final report for the Swiss Federal Office of Energy (SFOE) takes a look at the efficiency of computer power supply units, which decreases rapidly during average computer use. The background and the purpose of the project are examined. The power supplies for personal computers are discussed and the testing arrangement used is described. Efficiency, power-factor and operating points of the units are examined. Potentials for improvement and measures to be taken are discussed. Also, action to be taken by those involved in the design and operation of such power units is proposed. Finally, recommendations for further work are made.

  4. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  5. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  6. Spacecraft computer technology at Southwest Research Institute

    Science.gov (United States)

    Shirley, D. J.

    1993-01-01

    Southwest Research Institute (SwRI) has developed and delivered spacecraft computers for a number of different near-Earth-orbit spacecraft including shuttle experiments and SDIO free-flyer experiments. We describe the evolution of the basic SwRI spacecraft computer design from those weighing in at 20 to 25 lb and using 20 to 30 W to newer models weighing less than 5 lb and using only about 5 W, yet delivering twice the processing throughput. Because of their reduced size, weight, and power, these newer designs are especially applicable to planetary instrument requirements. The basis of our design evolution has been the availability of more powerful processor chip sets and the development of higher density packaging technology, coupled with more aggressive design strategies in incorporating high-density FPGA technology and use of high-density memory chips. In addition to reductions in size, weight, and power, the newer designs also address the necessity of survival in the harsh radiation environment of space. Spurred by participation in such programs as MSTI, LACE, RME, Delta 181, Delta Star, and RADARSAT, our designs have evolved in response to program demands to be small, low-powered units, radiation tolerant enough to be suitable for both Earth-orbit microsats and for planetary instruments. Present designs already include MIL-STD-1750 and Multi-Chip Module (MCM) technology with near-term plans to include RISC processors and higher-density MCM's. Long term plans include development of whole-core processors on one or two MCM's.

  7. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications network... Army High Performance Computing Research Center, www.ahpcrc.org.

  8. The Computer Backgrounds of Soldiers in Army Units: FY01

    National Research Council Canada - National Science Library

    Singh, Harnam

    2002-01-01

    A multi-year research effort was instituted in FY99 to examine soldiers' experiences with computers, self-perceptions of their computer skill, and their ability to identify frequently used, Windows-based icons...

  9. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  10. Parallel computing in genomic research: advances and applications

    Directory of Open Access Journals (Sweden)

    Ocaña K

    2015-11-01

    Kary Ocaña (National Laboratory of Scientific Computing, Petrópolis, Rio de Janeiro) and Daniel de Oliveira (Institute of Computing, Fluminense Federal University, Niterói, Brazil). Abstract: Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. Keywords: high-performance computing, genomic research, cloud computing, grid computing, cluster computing, parallel computing

  11. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  12. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering that are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  13. Installation of new Generation General Purpose Computer (GPC) compact unit

    Science.gov (United States)

    1991-01-01

    In the Kennedy Space Center's (KSC's) Orbiter Processing Facility (OPF) high bay 2, Spacecraft Electronics technician Ed Carter (right), wearing clean suit, prepares for (26864) and installs (26865) the new Generation General Purpose Computer (GPC) compact IBM unit in Atlantis', Orbiter Vehicle (OV) 104's, middeck avionics bay as Orbiter Systems Quality Control technician Doug Snider looks on. Both men work for NASA contractor Lockheed Space Operations Company. All three orbiters are being outfitted with the compact IBM unit, which replaces a two-unit earlier generation computer.

  14. Research Directions for AI in Computer Games

    OpenAIRE

    Fairclough, Chris; Fagan, Michael; Cunningham, Padraig; Mac Namee, Brian

    2001-01-01

    The computer games industry is now bigger than the film industry. Until recently, technology in games was driven by a desire to achieve real-time, photo-realistic graphics. To a large extent, this has now been achieved. As game developers look for new and innovative technologies to drive games development, AI is coming to the fore. This paper will examine how sophisticated AI techniques, such as those being used in mainstream academic research, can be applied to computer games ...

  15. Toxic Hazards Research Unit Annual Report: 1986

    Science.gov (United States)

    1987-04-01

    Toxic Hazards Research Unit Annual Report: 1986. William E. Houston, Ph.D., and Raymond S. Kutzman, Ph.D. Reports AAMRL-TR-87-020 and NMRI-87-2, Harry G. Armstrong Aerospace Medical Research Laboratory, Toxic Hazards Division. The record text is an OCR fragment of the report front matter and reference list; the legible references include a Journal of Applied Toxicology citation (Volume 6, Number 5, pages 336-370, 1986) on hemolysis mediated by mercapto compounds and hydrophobic tributyltin (TBT).

  16. Closing Symposium of the DFG Research Unit FOR 1066

    CERN Document Server

    Niehuis, Reinhard; Kroll, Norbert; Behrends, Kathrin

    2016-01-01

    The book reports on advanced solutions to the problem of simulating wing and nacelle stall, as presented and discussed by internationally recognized researchers at the Closing Symposium of the DFG Research Unit FOR 1066. Reliable simulations of flow separation on airfoils, wings and powered engine nacelles at high Reynolds numbers represent great challenges in defining suitable mathematical models, computing numerically accurate solutions and providing comprehensive experimental data for the validation of numerical simulations. Additional problems arise from the need to consider airframe-engine interactions and inhomogeneous onset flow conditions, as real aircraft operate in atmospheric environments with often-large distortions. The findings of fundamental and applied research into these and other related issues are reported in detail in this book, which targets all readers, academics and professionals alike, interested in the development of advanced computational fluid dynamics modeling for the simulation of...

  17. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.

  18. A Computational Architecture for Programmable Automation Research

    Science.gov (United States)

    Taylor, Russell H.; Korein, James U.; Maier, Georg E.; Durfee, Lawrence F.

    1987-03-01

    This short paper describes recent work at the IBM T. J. Watson Research Center directed at developing a highly flexible computational architecture for research on sensor-based programmable automation. The system described here has been designed with a focus on dynamic configurability, layered user interfaces and incorporation of sensor-based real time operations into new commands. It is these features which distinguish it from earlier work. The system is currently being implemented at IBM for research purposes and internal use and is an outgrowth of programmable automation research which has been ongoing since 1972 [e.g., 1, 2, 3, 4, 5, 6].

  19. Architecture, systems research and computational sciences

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 1) issue of the Nexus Network Journal is dedicated to the theme “Architecture, Systems Research and Computational Sciences”. This is an outgrowth of the session by the same name which took place during the eighth international, interdisciplinary conference “Nexus 2010: Relationships between Architecture and Mathematics, held in Porto, Portugal, in June 2010. Today computer science is an integral part of even strictly historical investigations, such as those concerning the construction of vaults, where the computer is used to survey the existing building, analyse the data and draw the ideal solution. What the papers in this issue make especially evident is that information technology has had an impact at a much deeper level as well: architecture itself can now be considered as a manifestation of information and as a complex system. The issue is completed with other research papers, conference reports and book reviews.

  20. Cloud Computing Technologies Facilitate Earth Research

    Science.gov (United States)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  1. Computer systems for the control of teletherapy units

    International Nuclear Information System (INIS)

    Brace, J.A.

    1985-01-01

    This paper describes a computer-controlled tracking cobalt unit installed at the Royal Free Hospital. It is based on a standard TEM MS90 unit and operates at 90-cm source-axis distance with a geometric field size of 45 x 45 cm at that distance. It has been modified so that it can be used either manually or under computer control. There are nine parameters that can be controlled positionally and two that can be controlled in rate mode; these are presented in a table

  2. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.

  3. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to improve the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to NDE community, namely heat diffusion and elastic wave propagation. The implementations are for two-dimensions. Performance improvement of the GPU implementation against serial CPU implementation is then discussed.
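
    For reference, the following is a minimal NumPy sketch of one explicit finite-difference time step for the two-dimensional heat diffusion problem mentioned above; it is not the authors' CUDA code, in which each GPU thread would typically compute the update for a single grid point.

        import numpy as np

        def heat_step(u, alpha, dx, dt):
            """One explicit finite-difference step of u_t = alpha * (u_xx + u_yy)
            on a uniform grid with spacing dx; boundary values are held fixed."""
            lap = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                   - 4.0 * u[1:-1, 1:-1]) / dx**2
            u_next = u.copy()
            u_next[1:-1, 1:-1] = u[1:-1, 1:-1] + alpha * dt * lap
            return u_next

        # Example: a hot spot diffusing on a 64 x 64 plate (stable while alpha*dt/dx**2 <= 0.25)
        u = np.zeros((64, 64))
        u[32, 32] = 100.0
        for _ in range(100):
            u = heat_step(u, alpha=1.0, dx=1.0, dt=0.2)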

  4. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
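
    As an illustration of the worked example the review is built around, here is a minimal CPU-side sketch of the all-pairs Euclidean distance calculation in Python/NumPy (the article's GPU versions are written in CUDA); array libraries with a NumPy-compatible interface, such as CuPy, can run essentially the same expression on a GPU.

        import numpy as np

        def all_pairs_distances(x):
            """Return the (n, n) matrix of Euclidean distances between the
            n instances (rows) of the (n, d) data array x."""
            diff = x[:, None, :] - x[None, :, :]          # (n, n, d) pairwise differences
            return np.sqrt(np.sum(diff ** 2, axis=-1))    # reduce over the d features

        # Example: 1000 instances with 16 features each
        rng = np.random.default_rng(0)
        data = rng.standard_normal((1000, 16))
        dist = all_pairs_distances(data)                  # dist[i, j] is the distance between instances i and j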

  5. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by the Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and the need for hard copy. Procedures and problems relating to academic involvement are…

  6. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  7. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization

  8. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  9. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  10. Cloud Computing : Research Issues and Implications

    OpenAIRE

    Marupaka Rajenda Prasad; R. Lakshman Naik; V. Bapuji

    2013-01-01

    Cloud computing is a rapidly developing and highly promising technology. It has attracted the attention of the computing community around the world. Cloud computing is Internet-based computing, whereby shared information, resources, and software are provided to terminals and portable devices on demand, like the energy grid. Cloud computing is the product of the combination of grid computing, distributed computing, parallel computing, and ubiquitous computing. It aims to build and forecast sophisti...

  11. Research of Houjiayao Unit in North China

    Science.gov (United States)

    Ji, Y.

    2012-12-01

    "Houjiayao Group" is the standard stratigraphic unit of late Pleistocene in northern China, which was created by Jia Lanpo and Wei Qi during their research on Houjiayao site. Based on the mammal, ancient human fossils and Paleolithic features, "Houjiayao Group" was thought as late Pleistocene sediments. "Houjiayao Group" was defined as late Pleistocene stratigraphic units. However, the problems of the age of "Houjiayao Group", stratigraphic division and other issues, have not yet been well resolved. These issues include: the differences of age-dating results, the unclear comparison between stratigraphic units and regional contrast, the uncertain relationship between "Houjiayao Group" and "Nihewan Layer ", and so on. Houjiayao site which located in the southeast of Houjiayao village in Dongjingji town Yangyuan County, Hebei province of China, is a very important paleolithic site. But some researches show that Houjiayao site is located at the 3th terrace of Liyigou valley and there are many opinions about the age of Houjiayao site, which varies from 20-500 thousand years. Combined with former research results and many research methods, our study was mainly focused on the key problems existing in the study of "Houjiayao Group". Through the use of sequence stratigraphy, chronostratigraphy, biostratigraphy and other theoretical methods, stratigraphic section was studied in the late Pleistocene stratigraphy and sedimentary environment. Through environmental indicators and the age-dating tests, the evolution of ancient geography and environment were identified elementarily. After analyzing informations of this area, geomorphologic investigation and stratum comparation in and around Houjiayao site were done. Houjiayao site is located on the west bank of Liyigou river, which has a tributary named Black Stone River. Two or three layers of volcanic materials were found in this area, those sediments are from a buried paleovolcano in upstream of Black Stone River. The volcanic

  12. A computer control system for a research reactor

    International Nuclear Information System (INIS)

    Crawford, K.C.; Sandquist, G.M.

    1987-01-01

    Most reactor applications until now have not required computer control of core output. Commercial reactors are generally operated at a constant power output to provide baseline power. However, if commercial reactor cores are to become load following over a wide range, then centralized digital computer control is required to make the entire facility respond as a single unit to continual changes in power demand. Navy and research reactors are much smaller and simpler and are operated at constant power levels as required, without concern for the number of operators required to operate the facility. For navy reactors, centralized digital computer control may provide space savings and reduced personnel requirements. Computer control offers research reactors the versatility to efficiently change a system to develop new ideas. The operation of any reactor facility would be enhanced by a controller that does not panic and continually monitors all facility parameters. Eventually, very sophisticated computer control systems may be developed which will sense operational problems, diagnose the problem, and, depending on the severity of the problem, immediately activate safety systems or consult with operators before taking action.

  13. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  14. Parameterized algorithmics for computational social choice : nine research challenges

    NARCIS (Netherlands)

    Bredereck, R.; Chen, J.; Faliszewski, P.; Guo, J.; Niedermeier, R.; Woeginger, G.J.

    2014-01-01

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in

  15. Implicit Theories of Creativity in Computer Science in the United States and China

    Science.gov (United States)

    Tang, Chaoying; Baer, John; Kaufman, James C.

    2015-01-01

    To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…

  16. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  17. The computational future for climate change research

    International Nuclear Information System (INIS)

    Washington, Warren M

    2005-01-01

    The development of climate models has a long history, starting with the building of atmospheric models and later ocean models. The early researchers were well aware of the goal of building climate models that could integrate our knowledge of the complex physical interactions between atmospheric, land-vegetation, hydrology, ocean, cryospheric processes, and sea ice. The transition from climate models to earth system models is already underway with the coupling of active biochemical cycles. Progress is limited by present computer capability, which is needed for increasingly complex and higher-resolution versions of climate models. It would be a mistake to make models too complex or of too high a resolution. Arriving at a 'feasible' and useful model is the challenge for the climate model community. Some of the climate change history, scientific successes, and difficulties encountered with supercomputers will be presented

  18. Computer network for experimental research using ISDN

    International Nuclear Information System (INIS)

    Ida, Katsumi; Nakanishi, Hideya

    1997-01-01

    This report describes the development of a computer network that uses the Integrated Services Digital Network (ISDN) for real-time analysis in experimental plasma physics and nuclear fusion research. The communication speed, 64/128 kbps (INS64) or 1.5 Mbps (INS1500) per connection, is independent of how busy the network is. When INS1500 is used, the communication speed, which is proportional to the public telephone connection fee, can be varied dynamically from 64 kbps to 1472 kbps, depending on how much data are being transferred, using the Bandwidth-on-Demand (BOD) function in the ISDN router. On-demand dial-up and time-out disconnection reduce the public telephone connection fee by 10%-97%. (author)
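      
    To make the bandwidth-on-demand figures concrete, the small calculation below (the 10 MB payload is an invented example and protocol overhead is ignored) shows how transfer time changes between the 64 kbps floor and the 1472 kbps ceiling quoted for INS1500.

      def transfer_seconds(megabytes, kbps):
          """Time to move a payload at a given line rate, ignoring protocol overhead."""
          bits = megabytes * 8_000_000      # using 1 MB = 10^6 bytes for simplicity
          return bits / (kbps * 1000)

      if __name__ == "__main__":
          payload_mb = 10                   # illustrative data set size
          for rate in (64, 128, 1472):      # kbps rates quoted for INS64/INS1500 with BOD
              print(f"{payload_mb} MB at {rate} kbps: {transfer_seconds(payload_mb, rate):.0f} s")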

  19. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Perret-Gallix, D.; Wojcik, W.

    1990-01-01

    These proceedings relate, in a pragmatic way, the use of methods and techniques of software engineering and artificial intelligence in high energy and nuclear physics. Such fundamental research can only be done through the design, building and running of equipment and systems that are among the most complex ever undertaken by mankind. The use of these new methods is mandatory in such an environment. However, their proper integration in these real applications raises some unsolved problems. Their solution, beyond the research field, will lead to a better understanding of some fundamental aspects of software engineering and artificial intelligence. Here is a sample of the subjects covered in the proceedings: software engineering in a multi-user, multi-version, multi-system environment; project management; software validation and quality control; data structure and management; object-oriented languages; multi-language applications; interactive data analysis; expert systems for diagnosis; expert systems for real-time applications; neural networks for pattern recognition; and symbolic manipulation for automatic computation of complex processes.

  20. Research on Key Technologies of Cloud Computing

    Science.gov (United States)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them concentrate on their own business, which encourages innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services, and applications as a public facility, so that people can use computer resources just as they use water, electricity, gas, and the telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and programming models.

  1. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service) are used as business models for cloud computing. Nowadays, the adoption and deployment of cloud computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  2. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

    The problems of reliability standardization in computer-aided manufacturing at NPP units are analyzed, considering the following approaches: computer-aided manufacturing of NPP units as a part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the composition of reliability indices for computer-aided manufacturing of NPP units under each of the approaches considered is substantiated.

  3. Computer Drawing Method for Operating Characteristic Curve of PV Power Plant Array Unit

    Science.gov (United States)

    Tan, Jianbin

    2018-02-01

    For the engineering design of large-scale grid-connected photovoltaic power stations, and for the research and development of the many simulation and analysis systems that support it, it is necessary to draw the operating characteristic curves of photovoltaic array units by computer and to propose a suitable piecewise non-linear interpolation algorithm. In the calculation method, component performance parameters serve as the main design basis, from which the computer can obtain five PV module performance values. Combined with the series and parallel connection of the PV array, the computer drawing of the performance curve of the PV array unit can then be realized. At the same time, the specific data can be passed to the PV development software module, and the practical operation of the PV array unit can be improved.
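      
    The abstract does not give the interpolation formulas, so the sketch below only illustrates the series/parallel scaling step with a crude, invented module model (not the paper's five-point method): the module I-V curve is stretched in voltage by the number of modules in series and in current by the number of strings in parallel.

      import numpy as np

      def module_iv_curve(v, i_sc=9.0, v_oc=45.0, sharpness=12.0):
          # Crude illustrative module I-V shape: current near I_sc at low voltage, dropping to zero at V_oc.
          return i_sc * (1.0 - np.exp(sharpness * (v - v_oc) / v_oc))

      def array_iv_curve(n_series=20, n_parallel=4, points=200):
          # Voltages add across modules in series; currents add across parallel strings.
          v_mod = np.linspace(0.0, 45.0, points)
          i_mod = np.clip(module_iv_curve(v_mod), 0.0, None)
          return v_mod * n_series, i_mod * n_parallel

      if __name__ == "__main__":
          v, i = array_iv_curve()
          p = v * i
          k = int(np.argmax(p))
          print(f"approximate maximum power point: {v[k]:.0f} V, {i[k]:.1f} A, {p[k] / 1000:.1f} kW")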

  4. Computer Presentation Programs and Teaching Research Methodologies

    OpenAIRE

    Motamedi, Vahid

    2015-01-01

    Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as the transcription of numbers to the board and the handwriting of the instructor, can be resolved through careful construction of computer presentations. The use of computer pres...

  5. Norwegian computers in European energy research project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    16 NORD computers have been ordered for the JET data acquisition and storage system. The computers will be arranged in a 'double star' configuration, developed by CERN. Two control consoles each have their own computer. All computers for communication, control, diagnostics, consoles and testing are NORD-100s, while the computer for data storage and analysis is a NORD-500. The operating system is SINTRAN. CAMAC Serial Highway, with fibre optics, is to be used for long communication paths. The programming languages FORTRAN, NODAL, NORD PL, PASCAL and BASIC may be used. The JET project and TOKAMAK type machines are briefly described. (JIW)

  6. Affective computing: A reverence for a century of research

    NARCIS (Netherlands)

    Broek, E.L. van den

    2012-01-01

    To bring affective computing a leap forward, it is best to start with a step back. A century of research has been conducted on topics that are crucial for affective computing. Understanding this vast amount of research will accelerate progress on affective computing. Therefore, this article

  7. Affective computing: a reverence for a century of research

    NARCIS (Netherlands)

    van den Broek, Egon; Esposito, Anna; Esposito, Antonietta M.; Vinciarelli, Alessandro; Hoffmann, Rüdiger; Müller, Vincent C.

    2012-01-01

    To bring affective computing a leap forward, it is best to start with a step back. A century of research has been conducted on topics that are crucial for affective computing. Understanding this vast amount of research will accelerate progress on affective computing. Therefore, this article

  8. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  9. Sandia's computer support units: The first three years

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.N. [Sandia National Labs., Albuquerque, NM (United States). Labs. Computing Dept.

    1997-11-01

    This paper describes the method by which Sandia National Laboratories has deployed information technology to the line organizations and to the desktop, as part of the integrated information services organization under the direction of the Chief Information Officer. This deployment has been done by the Computer Support Unit (CSU) Department. The CSU approach is based on the principle of providing local customer service with a corporate perspective. Success required an approach that was both customer compelled at times and market or corporate focused in most cases. Above all, a complete solution was required that included a comprehensive method of technology choices and development, process development, technology implementation, and support. It is the author's hope that this information will be useful in the development of a customer-focused business strategy for information technology deployment and support. Descriptions of current status reflect the status as of May 1997.

  10. Basic Research in the United States.

    Science.gov (United States)

    Handler, Philip

    1979-01-01

    Presents a discussion of the development of basic research in the U.S. since World War II. Topics include the creation of the federal agencies, physics and astronomy, chemistry, earth science, life science, the environment, and social science. (BB)

  11. Patterns of research utilization on patient care units

    Directory of Open Access Journals (Sweden)

    Lander Janice

    2008-06-01

    Full Text Available Abstract Background Organizational context plays a central role in shaping the use of research by healthcare professionals. The largest group of professionals employed in healthcare organizations is nurses, putting them in a position to influence patient and system outcomes significantly. However, investigators have often limited their study of the determinants of research use to individual factors rather than organizational or contextual factors. Methods The purpose of this study was to examine the determinants of research use among nurses working in acute care hospitals, with an emphasis on identifying contextual determinants of research use. A comparative ethnographic case study design was used to examine seven patient care units (two adult and five pediatric) in four hospitals in two Canadian provinces (Ontario and Alberta). Data were collected over a six-month period by means of quantitative and qualitative approaches using an array of instruments and extensive fieldwork. The patient care unit was the unit of analysis. Drawing on the quantitative data and using correspondence analysis, relationships between various factors were mapped using the coefficient of variation. Results Units with the highest mean research utilization scores clustered together on factors such as nurse critical thinking dispositions, unit culture (as measured by work creativity, work efficiency, questioning behavior, co-worker support, and the importance nurses place on access to continuing education), environmental complexity (as measured by changing patient acuity and re-sequencing of work), and nurses' attitudes towards research. Units with moderate research utilization clustered on organizational support, belief suspension, and intent to use research. Higher nursing workloads and lack of people support clustered more closely to units with the lowest research utilization scores. Conclusion Modifiable characteristics of organizational context at the patient care unit

  12. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    Science.gov (United States)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  13. [Research activities in applied mathematics, fluid mechanics, and computer science

    Science.gov (United States)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  14. Parallel computing in genomic research: advances and applications.

    Science.gov (United States)

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  15. Computer aided design of fast neutron therapy units

    International Nuclear Information System (INIS)

    Gileadi, A.E.; Gomberg, H.J.; Lampe, I.

    1980-01-01

    Conceptual design of a radiation-therapy unit using fusion neutrons is presently being considered by KMS Fusion, Inc. As part of this effort, a powerful and versatile computer code, TBEAM, has been developed which enables the user to determine physical characteristics of the fast neutron beam generated in the facility under consideration, using certain given design parameters of the facility as inputs. TBEAM uses the method of statistical sampling (Monte Carlo) to solve the space, time and energy dependent neutron transport equation relating to the conceptual design described by the user-supplied input parameters. The code traces the individual source neutrons as they propagate throughout the shield-collimator structure of the unit, and it keeps track of each interaction by type, position and energy. In its present version, TBEAM is applicable to homogeneous and laminated shields of spherical geometry, to collimator apertures of conical shape, and to neutrons emitted by point sources or such plate sources as are used in neutron generators of various types. TBEAM-generated results comparing the performance of point or plate sources in otherwise identical shield-collimator configurations are presented in numerical form. (H.K.)
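      
    TBEAM itself is not reproduced in the abstract, so the following toy Monte Carlo, written as a minimal Python sketch with a single energy group, a purely absorbing spherical shield, and an invented cross-section, only illustrates the statistical-sampling idea of tracing source neutrons and tallying those that leave through a conical collimator aperture.

      import math
      import random

      def collimated_transmission(n_neutrons=100_000, sigma_t=0.2, radius_cm=30.0,
                                  aperture_half_angle_deg=5.0, seed=1):
          # Toy single-group Monte Carlo: isotropic point source at the centre of a purely
          # absorbing spherical shield with a conical aperture along +z (all values illustrative).
          rng = random.Random(seed)
          cos_cut = math.cos(math.radians(aperture_half_angle_deg))
          in_beam = leaked = 0
          for _ in range(n_neutrons):
              mu = rng.uniform(-1.0, 1.0)                  # cosine of the polar emission angle
              if mu >= cos_cut:
                  in_beam += 1                             # leaves through the open collimator aperture
              else:
                  free_path = -math.log(1.0 - rng.random()) / sigma_t
                  if free_path > radius_cm:                # survives the full shield thickness
                      leaked += 1
          return in_beam / n_neutrons, leaked / n_neutrons

      if __name__ == "__main__":
          beam, leak = collimated_transmission()
          print(f"fraction in collimated beam: {beam:.4f}, leakage through shield: {leak:.6f}")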

  16. 16 NORD computers for Europeen fusion research

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    16 computers of the NORD type, manufactured by Nordata's French subsidiary, have been ordered for the JET data acquisition and storage system. 15 of the computers are of the NORD-100 model and the 16th is a NORD-500. The computers are to be arranged in a 'double star' configuration, developed by CERN. CAMAC Serial Highway is to be used for long communication paths, using fibre optics. The operating system is SINTRAN, and the programming languages FORTRAN, NODAL, NORD PL, PASCAL and BASIC may be used. (JIW)

  17. Status of computational fluid dynamics in the United States

    International Nuclear Information System (INIS)

    Kutler, P.; Steger, J.L.; Bailey, F.R.

    1987-01-01

    CFD-related progress in U.S. aerospace industries and research institutions is evaluated with respect to methods employed, their applications, and the computer technologies employed in their implementation. Goals for subsonic CFD are primarily aimed at greater fuel efficiency; those of supersonic CFD involve the achievement of high sustained cruise efficiency. Transatmospheric/hypersonic vehicles are noted to have recently become important concerns for CFD efforts. Attention is given to aspects of discretization, Euler and Navier-Stokes general purpose codes, zonal equation methods, internal and external flows, and the impact of supercomputers and their networks in advancing the state-of-the-art. 91 references

  18. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Full Text Available Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as the transcription of numbers to the board and the handwriting of the instructor, can be resolved through careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  19. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  20. Eye-tracking research in computer-mediated language learning

    NARCIS (Netherlands)

    Michel, Marije; Smith, Bryan

    2017-01-01

    Though eye-tracking technology has been used in reading research for over 100 years, researchers have only recently begun to use it in studies of computer-assisted language learning (CALL). This chapter provides an overview of eye-tracking research to date, which is relevant to computer-mediated

  1. Senior Computational Scientist | Center for Cancer Research

    Science.gov (United States)

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  2. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and Ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  3. Social things : design research on social computing

    NARCIS (Netherlands)

    Hu, J.; Luen, P.; Rau, P.

    2016-01-01

    In the era of social networking and computing, things and people are more and more interconnected, giving rise to not only new opportunities but also new challenges in designing new products that are networked, and services that are adaptive to their human users and context aware in their physical

  4. Computers in Language Testing: Present Research and Some Future Directions.

    Science.gov (United States)

    Brown, James Dean

    1997-01-01

    Explores recent developments in the use of computers in language testing in four areas: (1) item banking; (2) computer-assisted language testing; (3) computerized-adaptive language testing; and (4) research on the effectiveness of computers in language testing. Examines educational measurement literature in an attempt to forecast the directions…

  5. Amorphous Computing: A Research Agenda for the Near Future

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    2012-01-01

    Vol. 11, No. 1 (2012), pp. 59-63. ISSN 1567-7818. R&D Projects: GA ČR GAP202/10/1333. Institutional research plan: CEZ:AV0Z10300504. Keywords: amorphous computing * nano-machines * flying amorphous computer. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.683, year: 2012

  6. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  7. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication in numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  8. Interim research assessment 2003-2005 - Computer Science

    NARCIS (Netherlands)

    Mouthaan, A.J.; Hartel, Pieter H.

    This report primarily serves as a source of information for the 2007 Interim Research Assessment Committee for Computer Science at the three technical universities in the Netherlands. The report also provides information for others interested in our research activities.

  9. Archaeomagnetic research in the United States midcontinent

    Science.gov (United States)

    Lengyel, Stacey Nicole

    This dissertation combines archaeomagnetic and independent chronometric data from 240 archaeological features to develop a regional secular variation curve for the U.S. midcontinent. These data were obtained from features located between 31.5--40.5° N latitude and 82.5--93.5° W longitude that have been dated to between 60 and 10,700 cal BP. The archaeomagnetic samples were collected from 41 sites within this region over the past 35 years under the direction of four different researchers: Robert DuBois (University of Oklahoma), Daniel Wolfman (University of Arkansas and New Mexico State Museum), Wulf Gose (University of Texas at Austin), and myself. In this project, the data are initially smoothed through the moving windows method to form the first approximation of the curve. Outlier analyses and pairwise statistical comparisons are utilized to refine the smoothed curve, and the results are compared to other Holocene-aged secular variation records from North America. These analyses indicate that the final curve should be treated as three distinct segments with different precision and use recommendations. First, the 850--75 cal BP segment can be used to date archaeomagnetic sample from the project area with expected temporal precision of 100--200 years. Second, the 2528--850 cal BP segment can be used cautiously to date archaeomagnetic samples with an expected temporal precision of 200--300 years. Third, the 9755--4650 cal BP segment should be used for contextual dating purposes only, in that an undated sample can be put into a regional context through comparison with the segment's constituent samples. Finally, three archaeological problems are addressed through the archaeomagnetic data. First, archaeomagnetic data are used to resolve the temporal conflict between an eastern Tennessee structure's morphology and a much earlier radiocarbon date obtained for the structure. Then, archaeomagnetic data are used to address a number of internal chronology questions

  10. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  11. Digital computer control of a research nuclear reactor

    International Nuclear Information System (INIS)

    Crawford, Kevan

    1986-01-01

    Currently, the use of digital computers in energy-producing systems has been limited to data acquisition functions. These computers have greatly reduced human involvement in the moment-to-moment decision process and the crisis decision process, thereby improving the safety of dynamic energy-producing systems. However, in addition to data acquisition, control of energy-producing systems also includes data comparison, decision making, and control actions. The majority of the latter functions are accomplished through the use of analog computers in a distributed configuration. The lack of cooperation, and hence inefficiency, in distributed control, and the extent of human interaction in critical phases of control, have provided the incentive to improve the latter three functions of energy systems control. Properly applied, centralized control by digital computers can increase efficiency by making the system react as a single unit and by implementing efficient power changes to match demand. Additionally, safety will be improved by further limiting human involvement to action only in the case of a failure of the centralized control system. This paper presents a hardware and software design for the centralized control of a research nuclear reactor by a digital computer. Current nuclear reactor control philosophies, which include redundancy, inherent safety in failure, and conservative yet operational scram initiation, were used as the bases of the design. The control philosophies were applied to the power monitoring system, the fuel temperature monitoring system, the area radiation monitoring system, and the overall system interaction. Unlike the single-function analog computers that are currently used to control research and commercial reactors, this system will be driven by a multifunction digital computer. Specifically, the system will perform control rod movements to conform with operator requests, automatically log the required physical parameters during reactor
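      
    The abstract gives no control logic, so the following Python sketch, with invented setpoints and limits, only illustrates the kind of centralized loop described: compare measured power with the operator's request, step the control rod accordingly, and trip a scram when a safety limit is exceeded.

      def control_step(measured_power_kw, requested_power_kw, rod_position,
                       scram_limit_kw=1100.0, dead_band_kw=5.0, rod_step=0.5):
          """One pass of a toy reactor control loop (all numbers illustrative, not a real design)."""
          if measured_power_kw >= scram_limit_kw:
              return 0.0, "SCRAM"                       # fully insert rods immediately
          error = requested_power_kw - measured_power_kw
          if error > dead_band_kw:
              rod_position = min(rod_position + rod_step, 100.0)   # withdraw slightly to raise power
          elif error < -dead_band_kw:
              rod_position = max(rod_position - rod_step, 0.0)     # insert slightly to lower power
          return rod_position, "OK"

      if __name__ == "__main__":
          rod = 42.0
          for power in (950.0, 980.0, 1005.0, 1150.0):  # simulated power readings in kW
              rod, status = control_step(power, requested_power_kw=1000.0, rod_position=rod)
              print(f"power={power:7.1f} kW  rod={rod:5.1f}%  status={status}")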

  12. Educational Technology Network: a computer conferencing system dedicated to applications of computers in radiology practice, research, and education.

    Science.gov (United States)

    D'Alessandro, M P; Ackerman, M J; Sparks, S M

    1993-11-01

    Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.

  13. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Becks, Karl-Heinz; Perret-Gallix, Denis

    1994-01-01

    New techniques were highlighted by the ''Third International Workshop on Software Engineering, Artificial Intelligence and Expert Systems for High Energy and Nuclear Physics'' in Oberammergau, Bavaria, Germany, from October 4 to 8. It was the third workshop in the series; the first was held in Lyon in 1990 and the second at the France-Telecom site near La Londe les Maures in 1992. This series of workshops covers a broad spectrum of problems. New, highly sophisticated experiments demand new techniques in computing, in hardware as well as in software. Software engineering techniques could in principle satisfy the needs of forthcoming accelerator experiments. The growing complexity of detector systems demands new techniques in experimental error diagnosis and repair suggestions; expert systems seem to offer a way of assisting the experimental crew during data-taking.

  14. Computational research on lithium ion battery materials

    Science.gov (United States)

    Tang, Ping

    Crystals of LiFePO4 and related materials have recently received a lot of attention due to their very promising use as cathodes in rechargeable lithium ion batteries. This thesis studied the electronic structures of FePO4 and LiMPO4, where M = Mn, Fe, Co and Ni, within the framework of density-functional theory. The first study compared the electronic structures of the LiMPO4 and FePO4 materials in their electrochemically active olivine form, using the LAPW (linear augmented plane wave) method [1]. A comparison of results for various spin configurations suggested that the ferromagnetic configuration can serve as a useful approximation for studying general features of these systems. The partial densities of states for the LiMPO4 materials are remarkably similar to each other, showing the transition metal 3d states forming narrow bands above the O 2p band. By contrast, in the absence of Li, the majority spin transition metal 3d states are well-hybridized with the O 2p band in FePO4. The second study compared the electronic structures of FePO4 in several crystal structures including an olivine, monoclinic, quartz-like, and CrVO4-like form [2,3]. For this work, in addition to the LAPW method, PAW (Projector Augmented Wave) [4] and PWscf (plane-wave pseudopotential) [5] methods were used. By carefully adjusting the computational parameters, very similar results were achieved for the three independent computational methods. Results for the relative stability of the four crystal structures are reported. In addition, partial densities of state analyses show qualitative information about the crystal field splittings and bond hybridizations and help rationalize the understanding of the electrochemical and stability properties of these materials.

  15. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  16. PS3 CELL Development for Scientific Computation and Research

    Science.gov (United States)

    Christiansen, M.; Sevre, E.; Wang, S. M.; Yuen, D. A.; Liu, S.; Lyness, M. D.; Broten, M.

    2007-12-01

    The Cell processor is one of the most powerful processors on the market, and researchers in the earth sciences may find its parallel architecture to be very useful. A Cell processor, with 7 cores, can easily be obtained for experimentation by purchasing a PlayStation 3 (PS3) and installing Linux and the IBM SDK. Each core of the PS3 is capable of 25 GFLOPS, giving a potential limit of 150 GFLOPS when using all 6 SPUs (synergistic processing units) with vectorized algorithms. We have used the Cell's computational power to create a program which takes simulated tsunami datasets, parses them, and returns a colorized height field image using ray casting techniques. As expected, the time required to create an image is inversely proportional to the number of SPUs used. We believe that this trend will continue when multiple PS3s are chained using OpenMP functionality, and we are in the process of researching this. By using the Cell to visualize tsunami data, we have found that its greatest feature is its power. This fits well with the needs of the scientific community, where the limiting factor is time. Any algorithm, such as the heat equation, that can be subdivided into multiple parts can take advantage of the PS3 Cell's ability to split the computations across the 6 SPUs, reducing the required run time to one sixth. Further vectorization of the code can allow 4 simultaneous floating point operations using the SIMD (single instruction multiple data) capabilities of the SPU, increasing efficiency 24 times.

  17. The application of computed tomography in wound ballistics research

    International Nuclear Information System (INIS)

    Tsiatis, Nick; Moraitis, Konstantinos; Papadodima, Stavroula; Spiliopoulou, Chara; Kelekis, Alexis; Kelesis, Christos; Efstathopoulos, Efstathios; Kordolaimi, Sofia; Ploussi, Agapi

    2015-01-01

    In wound ballistics research there is a relationship between the data that characterize a bullet and the injury that results when it perforates the human body. The bullet path in the human body following skin perforation, as well as the damaging effect, cannot always be predicted, as they depend on various factors such as the bullet's characteristics (velocity, distance, type of firearm and so on) and the tissue types that the bullet passes through. The purpose of this presentation is to highlight the contribution of Computed Tomography (CT) to wound ballistics research. Using CT technology and studying virtual “slices” of specific areas of scanned human bodies allows the evaluation of the density and thickness of the skin, the subcutaneous tissue, the muscles, the vital organs and the bones. Density data given in Hounsfield units can be converted to g/ml by using the appropriate software. Based on the results of this study, the anatomy of the human body will be reproduced in ballistic gel in order to simulate the path that a bullet follows. The biophysical analysis in wound ballistics provides another application of CT technology, which is commonly used for diagnostic and therapeutic purposes in various medical disciplines. (paper)
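      
    The study converts Hounsfield values to density with its own software; the snippet below is only an illustrative two-point linear calibration (air and water), not the calibration used in the paper, and the tissue HU values are typical textbook figures rather than measurements from the study.

      def hu_to_density(hu):
          """Approximate density in g/ml from a CT Hounsfield value using a two-point linear
          calibration: air (-1000 HU, ~0 g/ml) and water (0 HU, 1.0 g/ml).  Illustrative only."""
          return 1.0 + hu / 1000.0

      if __name__ == "__main__":
          for label, hu in (("air", -1000), ("fat", -80), ("water", 0), ("muscle", 45)):
              print(f"{label:>6}: {hu:5d} HU  ->  ~{hu_to_density(hu):.2f} g/ml")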

  18. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing the computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
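      
    A minimal sketch of how such a co-authorship network can be built is shown below; it uses the networkx library and a tiny invented article list in place of the Scopus records, and ranks authors by their number of distinct co-authors.

      from itertools import combinations
      import networkx as nx

      # Invented stand-in for the Scopus records: one author list per article.
      articles = [
          ["Ahn S", "Jung Y", "Kim H"],
          ["Ahn S", "Park J"],
          ["Jung Y", "Kim H", "Park J"],
      ]

      graph = nx.Graph()
      for authors in articles:
          for a, b in combinations(sorted(set(authors)), 2):
              # Edge weight counts how many articles the two authors share.
              w = graph[a][b]["weight"] + 1 if graph.has_edge(a, b) else 1
              graph.add_edge(a, b, weight=w)

      # Rank authors by how many distinct co-authors they have (degree).
      for author, degree in sorted(graph.degree, key=lambda t: -t[1]):
          print(author, "co-authors:", degree)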

  19. Hand held control unit for controlling a display screen-oriented computer game, and a display screen-oriented computer game having one or more such control units

    NARCIS (Netherlands)

    2001-01-01

    A hand-held control unit is used to control a display screen-oriented computer game. The unit comprises a housing with a front side, a set of control members lying generally flush with the front side for through actuating thereof controlling actions of in-game display items, and an output for

  20. A Computational Lens on Design Research

    Science.gov (United States)

    Hoyles, Celia; Noss, Richard

    2015-01-01

    In this commentary, we briefly review the collective effort of design researchers to weave theory with empirical results, in order to gain a better understanding of the processes of learning. We seek to respond to this challenging agenda by centring on the evolution of one sub-field: namely that which involves investigations within a…

  1. Computer Science Research Review 1974-75

    Science.gov (United States)

    1975-08-01

    Faculty and Visitors: Mario Barbacci, Research Associate. B.S., Universidad Nacional de Ingenieria, Lima, Peru (1966); Engineer, Universidad Nacional de Ingenieria, Lima, Peru (1968); Ph.D., Carnegie-Mellon University (1974). Carnegie, 1969: Design Automation

  2. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  3. Highlight: Research Chair unites four West African universities in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-14

    Local and regional experts including researchers, consultants, and academics convened in Cotonou, Benin, on February 26, 2015 to launch a Research Chair on EcoHealth. The Chair unites four West African universities that have pledged to reduce air pollution and non-communicable respiratory ...

  4. Future Directions for Urban Forestry Research in the United States

    Science.gov (United States)

    John F. Dwyer; David J. Nowak; Gary W. Watson

    2002-01-01

    Urban forestry research promises to continue to be an integral part of the growth and development of forestry in urban and urbanizing areas of the United States. The future is expected to bring increased emphasis on research in support of the care of trees and other plants, ecological restoration, and comprehensive and adaptive management across the landscape....

  5. Infusing Active Learning into the Research Methods Unit

    Science.gov (United States)

    Bluestone, Cheryl

    2007-01-01

    The research methods unit of survey psychology classes introduces important concepts of scientific reasoning and fluency, making it an ideal course in which to deliver enhanced curricula. To increase interest and engagement, the author developed an expanded research methods and statistics module to give students the opportunity to explore…

  6. [Activities of Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  7. Comparison of Tissue Density in Hounsfield Units in Computed Tomography and Cone Beam Computed Tomography.

    Science.gov (United States)

    Varshowsaz, Masoud; Goorang, Sepideh; Ehsani, Sara; Azizi, Zeynab; Rahimian, Sepideh

    2016-03-01

    Bone quality and quantity assessment is one of the most important steps in implant treatment planning. Different methods such as computed tomography (CT) and the recently suggested cone beam computed tomography (CBCT), with lower radiation dose and less time and cost, are used for bone density assessment. This in vitro study aimed to compare the tissue density values in Hounsfield units (HUs) in CBCT and CT scans of different tissue phantoms with two different thicknesses, two different image acquisition settings and in three locations in the phantoms. Four different tissue phantoms, namely hard tissue, soft tissue, air and water, were scanned by three different CBCT systems and one CT system at two thicknesses (full and half) and two image acquisition settings (high and low kVp and mA). The images were analyzed at three sites (middle, periphery and intermediate) using eFilm software. The difference in density values was analyzed by ANOVA and correction coefficient test (P<0.05). There was a significant difference between density values in CBCT and CT scans in most situations, and CBCT values were not similar to CT values in any of the phantoms in different thicknesses and acquisition parameters or the three different sites. The correction coefficients confirmed the results. CBCT is not reliable for tissue density assessment. The results were not affected by changes in thickness, acquisition parameters or locations.
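
    The statistical comparison reported (a significant difference between CT and CBCT Hounsfield values) is essentially a one-way ANOVA per condition. A minimal sketch with invented readings is shown below; it is not the study's data or exact analysis.

    """Illustrative only: comparing Hounsfield-unit readings of the same phantom
    on a CT scanner and a CBCT scanner with a one-way ANOVA, as the study does.
    The readings below are made up for the sketch."""
    from scipy import stats

    hu_ct   = [ 2, -1,  0,  3, -2,  1]    # water phantom, CT
    hu_cbct = [85, 96, 74, 90, 102, 88]   # same phantom, one CBCT unit

    f_stat, p_value = stats.f_oneway(hu_ct, hu_cbct)
    print(f"F = {f_stat:.1f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Significant difference between CT and CBCT values (alpha = 0.05)")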

  8. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  9. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  10. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  11. Computer processing techniques in digital radiography research

    International Nuclear Information System (INIS)

    Pickens, D.R.; Kugel, J.A.; Waddill, W.B.; Smith, G.D.; Martin, V.N.; Price, R.R.; James, A.E. Jr.

    1985-01-01

    In the Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center, and the Center for Medical Imaging Research, Nashville, TN, there are several activities which are designed to increase the information available from film-screen acquisition as well as from direct digital acquisition of radiographic information. Two of the projects involve altering the display of images after acquisition, either to remove artifacts present as a result of the acquisition process or to change the manner in which the image is displayed to improve the perception of details in the image. These two projects use methods which can be applied to any type of digital image, but are being implemented with images digitized from conventional x-ray film. One of these research endeavors involves mathematical alteration of the image to correct for motion artifacts or registration errors between images that will be subtracted. Another applies well-known image processing methods to digital radiographic images to improve the image contrast and enhance subtle details in the image. A third project involves the use of dual energy imaging with a digital radiography system to reconstruct images which demonstrate either soft tissue details or the osseous structures. These projects are discussed in greater detail in the following sections of this communication
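
    Purely as an illustration of the two display-side operations mentioned (correcting registration errors before subtraction, and enhancing contrast), the sketch below applies a brute-force integer-shift correction and a linear contrast window to synthetic arrays; it is not the Vanderbilt group's code.

    """Illustrative only: digital subtraction of a 'mask' image from a 'contrast'
    image after a simple integer-pixel shift correction, followed by a linear
    contrast window.  Arrays are synthetic; the real work used digitized film."""
    import numpy as np

    rng = np.random.default_rng(0)
    mask = rng.normal(100.0, 5.0, (256, 256))            # pre-contrast image
    vessel = np.zeros_like(mask); vessel[100:110, :] = 40.0
    live = np.roll(mask, (2, 3), axis=(0, 1)) + vessel    # post-contrast, shifted by (2, 3)

    def register_and_subtract(live, mask, max_shift=5):
        """Brute-force search for the integer shift that minimises the squared
        difference, then subtract the shifted mask (a crude registration fix)."""
        best = (np.inf, (0, 0))
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                err = np.mean((live - np.roll(mask, (dy, dx), axis=(0, 1))) ** 2)
                best = min(best, (err, (dy, dx)))
        dy, dx = best[1]
        return live - np.roll(mask, (dy, dx), axis=(0, 1)), (dy, dx)

    diff, shift = register_and_subtract(live, mask)
    print("estimated shift:", shift)

    # Linear contrast window: map [low, high] to the full 0-255 display range.
    low, high = np.percentile(diff, [5, 99])
    display = np.clip((diff - low) / (high - low), 0.0, 1.0) * 255.0
    print("display range:", display.min(), display.max())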

  12. Fiction as an Introduction to Computer Science Research

    Science.gov (United States)

    Goldsmith, Judy; Mattei, Nicholas

    2014-01-01

    The undergraduate computer science curriculum is generally focused on skills and tools; most students are not exposed to much research in the field, and do not learn how to navigate the research literature. We describe how fiction reviews (and specifically science fiction) are used as a gateway to research reviews. Students learn a little about…

  13. Developing a Research Agenda for Ubiquitous Computing in Schools

    Science.gov (United States)

    Zucker, Andrew

    2004-01-01

    Increasing numbers of states, districts, and schools provide every student with a computing device; for example, the middle schools in Maine maintain wireless Internet access and the students receive laptops. Research can provide policymakers with better evidence of the benefits and costs of 1:1 computing and establish which factors make 1:1…

  14. Model and Computing Experiment for Research and Aerosols Usage Management

    Directory of Open Access Journals (Sweden)

    Daler K. Sharipov

    2012-09-01

    The article deals with a mathematical model for the study and management of aerosols released into the atmosphere, as well as a numerical algorithm implemented as hardware and software systems for conducting computational experiments.

  15. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction (HCI) and Information Management (IM) have the potential to enable humans to better manage their lives through the use...

  16. Research on the Use of Computer-Assisted Instruction.

    Science.gov (United States)

    Craft, C. O.

    1982-01-01

    Reviews recent research studies related to computer assisted instruction (CAI). The studies concerned program effectiveness, teaching of psychomotor skills, tool availability, and factors affecting the adoption of CAI. (CT)

  17. Department of Materials Research by Computers - Overview

    International Nuclear Information System (INIS)

    Parlinski, K.

    2000-01-01

    Full text: During 1999 the main activity of the Department gradually moved to ab initio calculations. For these we have used density functional theory with either the local density approximation (LDA) or the generalized gradient approximation (GGA). This approach allows one to find the structure and dynamics of any system which can be represented by a supercell with periodic boundary conditions. Our interests were limited to the study of the structure and dynamics of crystals. We have used two different software packages, CASTEP and VASP, and the pseudopotentials delivered with these programs. This method is parameter-free, which means that one needs to know only the physical constants, like the Planck constant, element masses and electron charge, in order to get a quantitative result. We have concentrated our efforts on four subjects: calculation of phonon dispersion curves for polar crystals with LO/TO splitting, calculations of lattice dynamics of chalcopyrites, calculations of energy barriers in molecular crystals, and calculations of elastic properties and phase transitions in geologically important materials. We have calculated the phonon dispersion curves of the ionic cubic MgO crystal. The phonon modes at the Γ point are split into LO and TO modes. We have proposed a method to calculate this splitting using an elongated supercell. The results agree very well with coherent inelastic neutron scattering data. Similar effects have been considered in hexagonal GaN, rhombohedral LiNbO3, and tetragonal SnO2. In the two last crystals soft modes, responsible for the phase transitions, were found. Intensive calculations were carried out for the tetragonal chalcopyrite structure. Each unit cell contains 16 atoms. By using an enlarged supercell of 2 x 2 x 1 size with 64 atoms we could obtain valid phonon dispersion curves for CuInSe2, AgGaSe2 and AgGaTe2, which agree with neutron data and Raman scattering results. Studies of the molecular motion in KSCN crystal were

  18. The care unit in nursing home research: evidence in support of a definition.

    Science.gov (United States)

    Estabrooks, Carole A; Morgan, Debra G; Squires, Janet E; Boström, Anne-Marie; Slaughter, Susan E; Cummings, Greta G; Norton, Peter G

    2011-04-14

    Defining what constitutes a resident care unit in nursing home research is both a conceptual and practical challenge. The aim of this paper is to provide evidence in support of a definition of care unit in nursing homes by demonstrating: (1) its feasibility for use in data collection, (2) the acceptability of aggregating individual responses to the unit level, and (3) the benefit of including unit level data in explanatory models. An observational study design was used. Research (project) managers, healthcare aides, care managers, nursing home administrators and directors of care from thirty-six nursing homes in the Canadian prairie provinces of Alberta, Saskatchewan and Manitoba provided data for the study. A definition of care unit was developed and applied in data collection and analyses. A debriefing session was held with research managers to investigate their experiences with using the care unit definition. In addition, survey responses from 1258 healthcare aides in 25 of the 36 nursing homes in the study, that had more than one care unit, were analyzed using a multi-level modeling approach. Trained field workers administered the Alberta Context Tool (ACT), a 58-item self-report survey reflecting 10 organizational context concepts, to healthcare aides using computer assisted personal interviews. To assess the appropriateness of obtaining unit level scores, we assessed aggregation statistics (ICC(1), ICC(2), η², and ω²), and to assess the value of using the definition of unit in explanatory models, we performed multi-level modeling. In 10 of the 36 nursing homes, the care unit definition developed was used to align the survey data (for analytic purposes) to specific care units as designated by our definition, from that reported by the facility administrator. The aggregation statistics supported aggregating the healthcare aide responses on the ACT to the realigned unit level. Findings from the multi-level modeling further supported unit level aggregation. A
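
    The aggregation statistics the authors cite, such as ICC(1) and ICC(2), follow from a one-way ANOVA decomposition of individual scores grouped by care unit. A generic sketch with invented scores is given below; it does not use the ACT survey data.

    """Illustrative only: one-way ANOVA-based ICC(1) and ICC(2) for judging
    whether individual survey responses can be aggregated to the care-unit
    level.  The scores below are invented; the study used ACT survey data."""
    import numpy as np

    units = {                      # healthcare-aide scores grouped by care unit
        "unit_A": [3.2, 3.5, 3.6, 3.1],
        "unit_B": [4.1, 4.4, 4.0, 4.3],
        "unit_C": [2.8, 3.0, 2.7, 3.1],
    }

    groups = [np.asarray(v, dtype=float) for v in units.values()]
    k = np.mean([len(g) for g in groups])            # (average) group size
    grand = np.mean(np.concatenate(groups))

    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_between = ss_between / (len(groups) - 1)
    ms_within = ss_within / (sum(len(g) for g in groups) - len(groups))

    icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc2 = (ms_between - ms_within) / ms_between
    print(f"ICC(1) = {icc1:.2f}, ICC(2) = {icc2:.2f}")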

  19. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  20. Noncontrast computed tomographic Hounsfield unit evaluation of cerebral venous thrombosis: a quantitative evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Besachio, David A. [University of Utah, Department of Radiology, Salt Lake City (United States); United States Navy, Bethesda, MD (United States); Quigley, Edward P.; Shah, Lubdha M.; Salzman, Karen L. [University of Utah, Department of Radiology, Salt Lake City (United States)

    2013-08-15

    Our objective is to determine the utility of noncontrast Hounsfield unit values, Hounsfield unit values corrected for the patient's hematocrit, and venoarterial Hounsfield unit difference measurements in the identification of intracranial venous thrombosis on noncontrast head computed tomography. We retrospectively reviewed noncontrast head computed tomography exams performed in both normal patients and those with cerebral venous thrombosis, acquiring Hounsfield unit values in normal and thrombosed cerebral venous structures. Also, we acquired Hounsfield unit values in the internal carotid artery for comparison to thrombosed and nonthrombosed venous structures and compared the venous Hounsfield unit values to the patient's hematocrit. A significant difference is identified between Hounsfield unit values in thrombosed and nonthrombosed venous structures. Applying Hounsfield unit threshold values of greater than 65, a Hounsfield unit to hematocrit ratio of greater than 1.7, and venoarterial difference values greater than 15 alone and in combination, the majority of cases of venous thrombosis are identifiable on noncontrast head computed tomography. Absolute Hounsfield unit values, Hounsfield unit to hematocrit ratios, and venoarterial Hounsfield unit value differences are a useful adjunct in noncontrast head computed tomographic evaluation of cerebral venous thrombosis. (orig.)
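
    The three cut-offs reported (HU > 65, HU-to-hematocrit ratio > 1.7, venoarterial difference > 15) translate directly into a simple screening rule. The sketch below merely encodes those thresholds, assumes hematocrit is expressed as a percentage, and is in no way a validated clinical tool.

    """Illustrative only: encoding the three noncontrast-CT cut-offs reported in
    the abstract as a simple screening rule.  This is not a clinical tool."""

    def suspicious_for_venous_thrombosis(venous_hu: float,
                                         hematocrit_pct: float,
                                         arterial_hu: float) -> dict:
        """Apply the paper's thresholds: HU > 65, HU/Hct > 1.7 (Hct taken as a
        percentage here), and venoarterial HU difference > 15."""
        checks = {
            "absolute_hu": venous_hu > 65,
            "hu_to_hematocrit": venous_hu / hematocrit_pct > 1.7,
            "venoarterial_difference": (venous_hu - arterial_hu) > 15,
        }
        checks["any_positive"] = any(checks.values())
        return checks

    print(suspicious_for_venous_thrombosis(venous_hu=72, hematocrit_pct=40, arterial_hu=45))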

  1. Noncontrast computed tomographic Hounsfield unit evaluation of cerebral venous thrombosis: a quantitative evaluation

    International Nuclear Information System (INIS)

    Besachio, David A.; Quigley, Edward P.; Shah, Lubdha M.; Salzman, Karen L.

    2013-01-01

    Our objective is to determine the utility of noncontrast Hounsfield unit values, Hounsfield unit values corrected for the patient's hematocrit, and venoarterial Hounsfield unit difference measurements in the identification of intracranial venous thrombosis on noncontrast head computed tomography. We retrospectively reviewed noncontrast head computed tomography exams performed in both normal patients and those with cerebral venous thrombosis, acquiring Hounsfield unit values in normal and thrombosed cerebral venous structures. Also, we acquired Hounsfield unit values in the internal carotid artery for comparison to thrombosed and nonthrombosed venous structures and compared the venous Hounsfield unit values to the patient's hematocrit. A significant difference is identified between Hounsfield unit values in thrombosed and nonthrombosed venous structures. Applying Hounsfield unit threshold values of greater than 65, a Hounsfield unit to hematocrit ratio of greater than 1.7, and venoarterial difference values greater than 15 alone and in combination, the majority of cases of venous thrombosis are identifiable on noncontrast head computed tomography. Absolute Hounsfield unit values, Hounsfield unit to hematocrit ratios, and venoarterial Hounsfield unit value differences are a useful adjunct in noncontrast head computed tomographic evaluation of cerebral venous thrombosis. (orig.)

  2. Summary of researches being performed in the Institute of Mathematics and Computer Science on computer science and information technologies

    Directory of Open Access Journals (Sweden)

    Artiom Alhazov

    2008-07-01

    Evolution of the notion of informatization (which assumes the automation of the majority of human activities using computers, computer networks and information technologies) towards the notion of the Global Information Society (GIS) challenges society to define new paradigms: automation and intellectualization of production, a new level of education and teaching, formation of new styles of work, active participation in decision making, etc. Assuring the transition to the GIS for any society, including that of the Republic of Moldova, requires both special training and the broad application of progressive technologies and information systems. The methodological aspects concerning the impact of GIS creation on the citizen, the economic unit, and the national economy as a whole demand profound study. Without a systematic approach to these aspects, the creation of the GIS would confront great difficulties. The collective of researchers of the Institute of Mathematics and Computer Science (IMCS) of the Academy of Sciences of Moldova who work in the field of computer science constitutes a centre of advanced research and is active in those directions of computer science research which facilitate the technologies and applications without which the development of the GIS cannot be assured.

  3. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program ''CRITEST'' in the language PL/1 for the EC computer, intended for automatic processing of the results of radioimmunological research, has been developed. The program works under the operating system of the EC computer and runs in a 60 kbyte region. Aitken's modified algorithm was used in writing the program. The program was clinically validated in determining a number of hormones: CTH, T 4 , T 3 , TSH. Automatic processing of the radioimmunological research data on the computer makes it possible to simplify the labour-intensive analysis and to raise its accuracy
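
    The record says only that a modified Aitken algorithm was used; purely as an illustration of iterated (Aitken-Neville) interpolation applied to a radioimmunoassay standard curve, a sketch with invented calibration points is shown below. The actual CRITEST procedure is not described in the record.

    """Illustrative only: Aitken-Neville iterated linear interpolation used to read
    an unknown concentration off an RIA standard curve (counts vs. concentration).
    The calibration points are invented; CRITEST's actual algorithm is not given
    in the record."""

    def aitken_interpolate(xs, ys, x):
        """Evaluate the interpolating polynomial through (xs, ys) at x by
        successive linear interpolations (Aitken-Neville scheme)."""
        p = list(ys)
        n = len(xs)
        for level in range(1, n):
            for i in range(n - level):
                p[i] = ((x - xs[i + level]) * p[i] + (xs[i] - x) * p[i + 1]) / (xs[i] - xs[i + level])
        return p[0]

    # Standard curve: bound counts (descending) measured at known concentrations.
    counts        = [9000.0, 7200.0, 5100.0, 3300.0, 1900.0]
    concentration = [   0.5,    1.0,    2.0,    4.0,    8.0]   # e.g. ng/ml

    # Interpolate concentration as a function of counts for an unknown sample.
    unknown_counts = 4200.0
    print(f"estimated concentration: {aitken_interpolate(counts, concentration, unknown_counts):.2f} ng/ml")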

  4. Computer Backgrounds of Soldiers in Army Units: FY00

    National Research Council Canada - National Science Library

    Fober, Gene

    2001-01-01

    .... Soldiers from four Army installations were given a survey that examined their experiences with computers, self-perceptions of their skill, and an objective test of their ability to identify Windows-based icons...

  5. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    Science.gov (United States)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of users through the advantages of openness and low cost, and it has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. On this basis, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can be used to deploy a computer-room cloud efficiently and conveniently, with stable performance and good functional value.

  6. Economics of conservation systems research in the Southeastern United States

    Science.gov (United States)

    The use of conservation systems in crop production is not a new concept in the southeastern United States. In 1978, researchers from across the Southeast met in Griffin, Georgia for the first annual Southern Conservation Agricultural Systems Conference. Four of the ten presentations specifically men...

  7. Conducting qualitative research within Clinical Trials Units: avoiding potential pitfalls.

    Science.gov (United States)

    Cooper, Cindy; O'Cathain, Alicia; Hind, Danny; Adamson, Joy; Lawton, Julia; Baird, Wendy

    2014-07-01

    The value of using qualitative research within or alongside randomised controlled trials (RCTs) is becoming more widely accepted. Qualitative research may be conducted concurrently with pilot or full RCTs to understand the feasibility and acceptability of the interventions being tested, or to improve trial conduct. Clinical Trials Units (CTUs) in the United Kingdom (UK) manage large numbers of RCTs and, increasingly, manage the qualitative research or collaborate with qualitative researchers external to the CTU. CTUs are beginning to explicitly manage the process, for example, through the use of standard operating procedures for designing and implementing qualitative research with trials. We reviewed the experiences of two UK Clinical Research Collaboration (UKCRC) registered CTUs of conducting qualitative research concurrently with RCTs. Drawing on experiences gained from 15 studies, we identify the potential for the qualitative research to undermine the successful completion or scientific integrity of RCTs. We show that potential problems can arise from feedback of interim or final qualitative findings to members of the trial team or beyond, in particular reporting qualitative findings whilst the trial is on-going. We make recommendations for improving the management of qualitative research within CTUs. Copyright © 2014. Published by Elsevier Inc.

  8. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  9. Measuring Impact of EPAs Computational Toxicology Research (BOSC)

    Science.gov (United States)

    Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...

  10. United States Domestic Research Reactor Infrastructure TRIGA Reactor Fuel Support

    International Nuclear Information System (INIS)

    Morrell, Douglas

    2011-01-01

    The United States Domestic Research Reactor Infrastructure Program at the Idaho National Laboratory manages and provides project management, technical, quality engineering, quality inspection and nuclear material support for the United States Department of Energy sponsored University Reactor Fuels Program. This program provides fresh, unirradiated nuclear fuel to Domestic University Research Reactor Facilities and is responsible for the return of the DOE-owned, irradiated nuclear fuel over the life of the program. This presentation will introduce the program management team, the universities supported by the program, the status of the program and focus on the return process of irradiated nuclear fuel for long term storage at DOE managed receipt facilities. It will include lessons learned from research reactor facilities that have successfully shipped spent fuel elements to DOE receipt facilities.

  11. Research and photovoltaic industry at the United States

    International Nuclear Information System (INIS)

    Lerouge, Ch.; Herino, R.; Delville, R.; Allegre, R.

    2006-06-01

    For a large country such as the United States, solar energy can be a solution for improving air quality, fighting greenhouse gas emissions and reducing dependence on imported petroleum, and also a driver of economic growth through increased employment in the solar industry sector. This document takes stock of photovoltaics in the United States in the industrial and research domains. The American photovoltaic industry is the third largest, behind Japan and Germany. (A.L.B.)

  12. A 1.5 GFLOPS Reciprocal Unit for Computer Graphics

    DEFF Research Database (Denmark)

    Nannarelli, Alberto; Rasmussen, Morten Sleth; Stuart, Matthias Bo

    2006-01-01

    The reciprocal operation 1/d is a frequent operation performed in graphics processors (GPUs). In this work, we present the design of a radix-16 reciprocal unit based on the algorithm combining the traditional digit-by-digit algorithm and the approximation of the reciprocal by one Newton-Raphson i...
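
    The Newton-Raphson refinement referred to is the classical iteration x_{k+1} = x_k (2 - d*x_k), which converges quadratically to 1/d. A scalar software sketch is shown below; the unit described in the paper is radix-16 hardware, not software.

    """Illustrative only: the Newton-Raphson iteration for 1/d that reciprocal
    units use to refine an initial digit-by-digit approximation.  This is a
    scalar software sketch, not the radix-16 hardware design."""

    def reciprocal(d: float, x0: float, iterations: int = 3) -> float:
        """Refine an initial guess x0 ~ 1/d; each step roughly doubles the number
        of correct bits because the iteration converges quadratically."""
        x = x0
        for _ in range(iterations):
            x = x * (2.0 - d * x)
        return x

    d = 1.75
    x0 = 0.5                      # crude seed (e.g. from a small lookup table)
    for k in range(4):
        approx = reciprocal(d, x0, iterations=k)
        print(f"{k} iterations: {approx:.12f} (error {abs(approx - 1/d):.2e})")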

  13. Computational chemistry in pharmaceutical research: at the crossroads.

    Science.gov (United States)

    Bajorath, Jürgen

    2012-01-01

    Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity and so is, unfortunately, the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.

  14. Natural computing for mechanical systems research: A tutorial overview

    Science.gov (United States)

    Worden, Keith; Staszewski, Wieslaw J.; Hensman, James J.

    2011-01-01

    A great many computational algorithms developed over the past half-century have been motivated or suggested by biological systems or processes, the most well-known being the artificial neural networks. These algorithms are commonly grouped together under the terms soft or natural computing. A property shared by most natural computing algorithms is that they allow exploration of, or learning from, data. This property has proved extremely valuable in the solution of many diverse problems in science and engineering. The current paper is intended as a tutorial overview of the basic theory of some of the most common methods of natural computing as they are applied in the context of mechanical systems research. The application of some of the main algorithms is illustrated using case studies. The paper also attempts to give some indication as to which of the algorithms emerging now from the machine learning community are likely to be important for mechanical systems research in the future.

  15. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD
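
    The claim that LQCD is "by definition always bandwidth limited" is a roofline-style argument: attainable performance is the smaller of the compute peak and arithmetic intensity times memory bandwidth. The sketch below illustrates that arithmetic with figures roughly approximating the Radeon HD 7970 mentioned in the abstract; treat the numbers as assumptions, not values from the thesis.

    """Illustrative only: a roofline-style estimate showing why a memory-bound
    kernel such as the Wilson-Dirac operator cannot reach a GPU's peak FLOP rate.
    All hardware numbers below are assumptions for the sketch."""

    def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
        """Roofline model: performance is capped by whichever is smaller,
        the compute peak or arithmetic intensity times memory bandwidth."""
        return min(peak_gflops, flops_per_byte * bandwidth_gbs)

    peak = 947.0        # assumed double-precision peak, GFLOPS
    bw = 264.0          # assumed memory bandwidth, GB/s
    for intensity in (0.25, 0.5, 1.0, 2.0, 4.0):
        perf = attainable_gflops(peak, bw, intensity)
        bound = "memory-bound" if perf < peak else "compute-bound"
        print(f"intensity {intensity:4.2f} flop/byte -> {perf:6.1f} GFLOPS ({bound})")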

  16. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  17. Science Policy Research Unit annual report 1984/1985

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    The report covers the principal research programmes of the Unit, and also describes its graduate and undergraduate teaching (listing subjects of postgraduate research) and library services. A list of 1984 published papers and staff is presented. The principal research programmes include: the setting up of the Designated Research Centre on Science, Technology and Energy Policy in British Economic Development; policy for technology and industrial innovation in industrialised countries; energy economics, technology and policy (with a sub-section on coal); European science and industrial policy; science policy and research evaluation; technical change and employment opportunities in the UK economy; new technology, manpower and skills; technology and social change; science and technology policy in developing countries; military technology and arms limitation. Short-term projects and consultancy are also covered.

  18. Ethical Guidelines for Computer Security Researchers: "Be Reasonable"

    Science.gov (United States)

    Sassaman, Len

    For most of its existence, the field of computer science has been lucky enough to avoid ethical dilemmas by virtue of its relatively benign nature. The subdisciplines of programming methodology research, microprocessor design, and so forth have little room for the greater questions of human harm. Other, more recently developed sub-disciplines, such as data mining, social network analysis, behavioral profiling, and general computer security, however, open the door to abuse of users by practitioners and researchers. It is therefore the duty of the men and women who chart the course of these fields to set rules for themselves regarding what sorts of actions on their part are to be considered acceptable and what should be avoided or handled with caution out of ethical concerns. This paper deals solely with the issues faced by computer security researchers, be they vulnerability analysts, privacy system designers, malware experts, or reverse engineers.

  19. A Survey of Comics Research in Computer Science

    Directory of Open Access Journals (Sweden)

    Olivier Augereau

    2018-06-01

    Graphic novels such as comic books and mangas are well known all over the world. The digital transition has started to change the way people read comics: more and more on smartphones and tablets, and less and less on paper. In recent years, a wide variety of research about comics has been proposed and might change the way comics are created, distributed and read in the future. Early work focused on low-level document image analysis. Comic books are complex; they contain text, drawings, balloons, panels, onomatopoeia, etc. Different fields of computer science, such as multimedia, artificial intelligence and human–computer interaction, have covered research about user interaction and content generation, each with different sets of values. We review the previous research about comics in computer science to state what has been done and give some insights about the main outlooks.

  20. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational sciences and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information provided in the AMS/BES is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments; these assessments are included in the next section. In the following sections, brief assessments are given for specific areas, a conceptual model is proposed that the entire computational effort for energy research is best viewed as one giant nation-wide computer, and then specific recommendations are made for the appropriate evolution of the system

  1. [General practice research units in Denmark: multidisciplinary research in support of practical work].

    Science.gov (United States)

    Reventlow, Susanne; Broholm, Katalin Alexa Király; Mäkelä, Marjukka

    2014-01-01

    In Denmark the general practice research units operating in connection with universities provide a home base, training and methodology support for researchers in the field from medical students to general practitioners carrying out practical work. Research issues frequently require a multidisciplinary approach and use of different kinds of materials. Problems arising from the practical work of general practitioners take priority in the wide selection of topics. The units have networked efficiently with organizations of general practitioners and medical education. The combination of research environments has created synergy benefiting everybody and increased the scientific productivity and visibility of the field.

  2. Computational Science Research in Support of Petascale Electromagnetic Modeling

    International Nuclear Information System (INIS)

    Lee, L.-Q.

    2008-01-01

    Computational science research components were vital parts of the SciDAC-1 accelerator project and are continuing to play a critical role in the newly-funded SciDAC-2 accelerator project, the Community Petascale Project for Accelerator Science and Simulation (ComPASS). Recent advances and achievements in the area of computational science research in support of petascale electromagnetic modeling for accelerator design analysis are presented; these include shape determination of superconducting RF cavities, a mesh-based multilevel preconditioner for solving highly indefinite linear systems, a moving window using h- or p-refinement for time-domain short-range wakefield calculations, and improved scalable application I/O

  3. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  4. Neuromorphic Computing – From Materials Research to Systems Architecture Roundtable

    Energy Technology Data Exchange (ETDEWEB)

    Schuller, Ivan K. [Univ. of California, San Diego, CA (United States); Stevens, Rick [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Chicago, IL (United States); Pino, Robinson [Dept. of Energy (DOE) Office of Science, Washington, DC (United States); Pechan, Michael [Dept. of Energy (DOE) Office of Science, Washington, DC (United States)

    2015-10-29

    Computation in its many forms is the engine that fuels our modern civilization. Modern computation—based on the von Neumann architecture—has allowed, until now, the development of continuous improvements, as predicted by Moore’s law. However, computation using current architectures and materials will inevitably—within the next 10 years—reach a limit because of fundamental scientific reasons. DOE convened a roundtable of experts in neuromorphic computing systems, materials science, and computer science in Washington on October 29-30, 2015 to address the following basic questions: Can brain-like (“neuromorphic”) computing devices based on new material concepts and systems be developed to dramatically outperform conventional CMOS based technology? If so, what are the basic research challenges for materials science and computing? The overarching answer that emerged was: The development of novel functional materials and devices incorporated into unique architectures will allow a revolutionary technological leap toward the implementation of a fully “neuromorphic” computer. To address this challenge, the following issues were considered: the main differences between neuromorphic and conventional computing as related to signaling models, timing/clock, non-volatile memory, architecture, fault tolerance, integrated memory and compute, noise tolerance, analog vs. digital, and in situ learning; new neuromorphic architectures needed to produce lower energy consumption, potential novel nanostructured materials, and enhanced computation; device and materials properties needed to implement functions such as hysteresis, stability, and fault tolerance; and comparisons of different implementations (spin torque, memristors, resistive switching, phase change, and optical schemes) for enhanced breakthroughs in performance, cost, fault tolerance, and/or manufacturability.

  5. The aging of biomedical research in the United States.

    Directory of Open Access Journals (Sweden)

    Kirstin R W Matthews

    In the past 30 years, the average age of biomedical researchers has steadily increased. The average age of an investigator at the National Institutes of Health (NIH) rose from 39 to 51 between 1980 and 2008. The aging of the biomedical workforce was even more apparent when looking at first-time NIH grantees. The average age of a new investigator was 42 in 2008, compared to 36 in 1980. To determine if the rising barriers at NIH for entry into biomedical research might impact innovative ideas and research, we analyzed the research and publications of Nobel Prize winners from 1980 to 2010 to assess the age at which their pioneering research occurred. We established that in the 30-year period, 96 scientists won the Nobel Prize in medicine or chemistry for work related to biomedicine, and that their groundbreaking research was conducted at an average age of 41, one year younger than the average age of a new investigator at NIH. Furthermore, 78% of the Nobel Prize winners conducted their research before the age of 51, the average age of an NIH principal investigator. This suggested that limited access to NIH might inhibit research potential and novel projects, and could impact biomedicine and the next generation of scientists in the United States.

  6. Computer Use and Vision-Related Problems Among University Students In Ajman, United Arab Emirate

    OpenAIRE

    Shantakumari, N; Eldeeb, R; Sreedharan, J; Gopal, K

    2014-01-01

    Background: The extensive use of computers as a medium of teaching and learning in universities necessitates introspection into the extent of computer-related health disorders among the student population. Aim: This study was undertaken to assess the pattern of computer usage and related visual problems among university students in Ajman, United Arab Emirates. Materials and Methods: A total of 500 students studying in Gulf Medical University, Ajman and Ajman University of Science and Technology we...

  7. Computers, Laptops and Tools. ACER Research Monograph No. 56.

    Science.gov (United States)

    Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian

    In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…

  8. Results of a Research Evaluating Quality of Computer Science Education

    Science.gov (United States)

    Záhorec, Ján; Hašková, Alena; Munk, Michal

    2012-01-01

    The paper presents the results of an international research on a comparative assessment of the current status of computer science education at the secondary level (ISCED 3A) in Slovakia, the Czech Republic, and Belgium. Evaluation was carried out based on 14 specific factors gauging the students' point of view. The authors present qualitative…

  9. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  10. Units for on-line control with the ES computer in physical investigations

    International Nuclear Information System (INIS)

    Efimov, L.G.

    1983-01-01

    The peripheral part of the complex of means created for organizing on-line ES computer operation with experimental devices, comprising two units, is described. The first unit is employed as part of a universal driver of the CAMAC branch for connection with the microprogram ES computer channel controller and provides multioperational device software service (up to 44 record varieties). Bilateral data exchange between the device and the computer can be performed by bytes as well as by 16- or 24-bit words using CAMAC group modes, with a maximum rate of 1.25 Mbyte/s. The second unit is meant for synchronizing the data acquisition process with the device starting system and for supporting the dialogue between the device operator and the computer

  11. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomenon and code development performance; supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
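
    As a small illustration of the response surface methodology mentioned, the sketch below fits a full second-order model to a synthetic two-factor, central-composite-style design by ordinary least squares; the data are invented and unrelated to the coaxial free jet experiment.

    """Illustrative only: fitting a second-order response surface
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    to a synthetic two-factor, central-composite-style design by least squares."""
    import numpy as np

    rng = np.random.default_rng(1)

    # Coded factor settings (a small face-centred central composite design).
    x1 = np.array([-1, -1,  1,  1, -1, 1,  0, 0, 0, 0, 0], dtype=float)
    x2 = np.array([-1,  1, -1,  1,  0, 0, -1, 1, 0, 0, 0], dtype=float)

    # Synthetic response with a known quadratic truth plus noise.
    y = 10 + 2*x1 - 3*x2 + 1.5*x1**2 + 0.5*x2**2 + 1.0*x1*x2 + rng.normal(0, 0.2, x1.size)

    # Design matrix for the full second-order model.
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    names = ["b0", "x1", "x2", "x1^2", "x2^2", "x1*x2"]
    for name, b in zip(names, coef):
        print(f"{name:5s} {b:7.3f}")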

  12. Review of research on advanced computational science in FY2016

    International Nuclear Information System (INIS)

    2017-12-01

    Research on advanced computational science for nuclear applications, based on “Plan to Achieve Medium- to Long-term Objectives of the Japan Atomic Energy Agency (Medium- to Long-term Plan)”, has been performed at the Center for Computational Science and e-Systems (CCSE), Japan Atomic Energy Agency. CCSE established a committee consisting of outside experts and authorities, which evaluates the research and provides advice in support of the research and development. This report summarizes the following: (1) results of the R and D performed at CCSE in FY 2016 (April 1st, 2016 - March 31st, 2017), and (2) results of the evaluation of the R and D by the committee in FY 2016. (author)

  13. Review of research on advanced computational science in FY2015

    International Nuclear Information System (INIS)

    2017-01-01

    Research on advanced computational science for nuclear applications, based on the 'Plan to Achieve Medium- to Long-term Objectives of the Japan Atomic Energy Agency (Medium- to Long-term Plan)', has been performed at the Center for Computational Science and e-Systems (CCSE), Japan Atomic Energy Agency. CCSE has established a committee of outside experts and authorities that evaluates the research and provides advice in support of the research and development. This report summarizes the following: (1) results of the R and D performed at CCSE in FY 2015 (April 1st, 2015 - March 31st, 2016), and (2) results of the evaluation of the R and D by the committee in FY 2015 (April 1st, 2015 - March 31st, 2016). (author)

  14. Computing for magnetic fusion energy research: An updated vision

    International Nuclear Information System (INIS)

    Henline, P.; Giarrusso, J.; Davis, S.; Casper, T.

    1993-01-01

    This Fusion Computing Council perspective is written to present the primary concerns of the fusion computing community at the time of publication of the report, necessarily as a summary of the information contained in the individual sections. These concerns reflect FCC discussions during final review of contributions from the various working groups and portray our latest information. This report itself should be considered dynamic, requiring periodic updating in an attempt to track the rapid evolution of the computer industry relevant to requirements for magnetic fusion research. The most significant common concern among the Fusion Computing Council working groups is networking capability. All groups see an increasing need for network services due to the use of workstations, distributed computing environments, increased use of graphics services, X-window usage, remote experimental collaborations, remote data access for specific projects, and other collaborations. Other areas of concern include support for workstations, enhanced infrastructure to support collaborations, the User Service Centers, NERSC and future massively parallel computers, and FCC-sponsored workshops.

  15. Grid computing : enabling a vision for collaborative research

    International Nuclear Information System (INIS)

    von Laszewski, G.

    2002-01-01

    In this paper the authors provide a motivation for Grid computing based on a vision to enable a collaborative research environment. The authors' vision goes beyond the connection of hardware resources. They argue that with an infrastructure such as the Grid, new modalities for collaborative research are enabled. They provide an overview showing why Grid research is difficult, and they present a number of management-related issues that must be addressed to make Grids a reality. They list projects that provide solutions to subsets of these issues.

  16. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared-resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  17. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    Full Text Available The authors analyse the research articles published in SACJ over its first 40 volumes, using the ACM Computing Classification Scheme as basis. In their analysis the authors divided the publications into three cycles of more or less six years in order to identify...

  18. A high arctic experience of uniting research and monitoring

    Science.gov (United States)

    Schmidt, Niels Martin; Christensen, Torben R.; Roslin, Tomas

    2017-07-01

    Monitoring is science keeping our thumb on the pulse of the environment to detect any changes of concern for societies. Basic science is the question-driven search for fundamental processes and mechanisms. Given the firm root of monitoring in human interests and needs, basic sciences have often been regarded as scientifically "purer"—particularly within university-based research communities. We argue that the dichotomy between "research" and "monitoring" is an artificial one, and that this artificial split clouds the definition of scientific goals and leads to suboptimal use of resources. We claim that the synergy between the two scientific approaches is well distilled by science conducted under extreme logistic constraints, when scientists are forced to take full advantage of both the data and the infrastructure available. In evidence of this view, we present our experiences from two decades of uniting research and monitoring at the remote research facility Zackenberg in High Arctic Greenland. For this site, we show how the combination of insights from monitoring with the mechanistic understanding obtained from basic research has yielded the most complete understanding of the system—to the benefit of all, and as an example to follow. We therefore urge scientists from across the continuum from monitoring to research to come together, to disregard old division lines, and to work together to expose a comprehensive picture of ecosystem change and its consequences.

  19. Radioactivity and United Kingdom estuaries: an overview identifying research priorities

    International Nuclear Information System (INIS)

    Hamilton, E.I.; Clifton, R.J.; Stevens, H.E.

    1985-05-01

    The report consists of the results of an evaluation of research priorities for the environmental radioactivity of estuaries (and near-shore waters) of the United Kingdom. The format of this report is: (i) general conclusions on the future requirements for research in the field of environmental radioactivity; (ii) an overview of some specific recommendations for research; and (iii) an appendix in which a comprehensive evaluation of the research priorities for specific areas of research is given. On the basis that man is the prime target for concern and protection, special attention has been given to the environment in the vicinity of the British Nuclear Fuels (BNFL) reprocessing plant at Sellafield, Cumbria, which is the source of major releases of a variety of radionuclides into the natural environment. Subjects covered in the Appendix are: site factors; pathways to man; source term; hot particles; terrestrial inputs; surveys and monitoring; analysis; organics; field versus laboratory data; biology; bioaccumulation factors; some bioaccumulators of radioactivity; bioturbation; bacteria; genetics; natural change; sediment; resuspension; surfaces; K_d factors; pore liquids; diagenesis and the ageing processes; airborne transport of radionuclides; models; natural radioactivity; public opinion; recreation; the ICRP; the ALARA principle; decommissioning of nuclear power stations; identification of research requirements; environmental radioactivity - the national effort. (U.K.)

  20. The United States of America and scientific research.

    Directory of Open Access Journals (Sweden)

    Gregory J Hather

    2010-08-01

    Full Text Available To gauge the current commitment to scientific research in the United States of America (US), we compared federal research funding (FRF) with the US gross domestic product (GDP) and industry research spending during the past six decades. In order to address the recent globalization of scientific research, we also focused on four key indicators of research activities: research and development (R&D) funding, total science and engineering doctoral degrees, patents, and scientific publications. We compared these indicators across three major population and economic regions: the US, the European Union (EU) and the People's Republic of China (China) over the past decade. We discovered a number of interesting trends with direct relevance for science policy. The level of US FRF has varied between 0.2% and 0.6% of the GDP during the last six decades. Since the 1960s, the US FRF contribution has fallen from twice that of industrial research funding to roughly equal. Also, in the last two decades, the portion of the US government R&D spending devoted to research has increased. Although well below the US and the EU in overall funding, the current growth rate for R&D funding in China greatly exceeds that of both. Finally, the EU currently produces more science and engineering doctoral graduates and scientific publications than the US in absolute terms, but not per capita. This study's aim is to facilitate a serious discussion of key questions by the research community and federal policy makers. In particular, our results raise two questions with respect to: a) the increasing globalization of science: "What role is the US playing now, and what role will it play in the future of international science?"; and b) the ability to produce beneficial innovations for society: "How will the US continue to foster its strengths?"

  1. The United States of America and scientific research.

    Science.gov (United States)

    Hather, Gregory J; Haynes, Winston; Higdon, Roger; Kolker, Natali; Stewart, Elizabeth A; Arzberger, Peter; Chain, Patrick; Field, Dawn; Franza, B Robert; Lin, Biaoyang; Meyer, Folker; Ozdemir, Vural; Smith, Charles V; van Belle, Gerald; Wooley, John; Kolker, Eugene

    2010-08-16

    To gauge the current commitment to scientific research in the United States of America (US), we compared federal research funding (FRF) with the US gross domestic product (GDP) and industry research spending during the past six decades. In order to address the recent globalization of scientific research, we also focused on four key indicators of research activities: research and development (R&D) funding, total science and engineering doctoral degrees, patents, and scientific publications. We compared these indicators across three major population and economic regions: the US, the European Union (EU) and the People's Republic of China (China) over the past decade. We discovered a number of interesting trends with direct relevance for science policy. The level of US FRF has varied between 0.2% and 0.6% of the GDP during the last six decades. Since the 1960s, the US FRF contribution has fallen from twice that of industrial research funding to roughly equal. Also, in the last two decades, the portion of the US government R&D spending devoted to research has increased. Although well below the US and the EU in overall funding, the current growth rate for R&D funding in China greatly exceeds that of both. Finally, the EU currently produces more science and engineering doctoral graduates and scientific publications than the US in absolute terms, but not per capita. This study's aim is to facilitate a serious discussion of key questions by the research community and federal policy makers. In particular, our results raise two questions with respect to: a) the increasing globalization of science: "What role is the US playing now, and what role will it play in the future of international science?"; and b) the ability to produce beneficial innovations for society: "How will the US continue to foster its strengths?"

  2. Software for a magnetic disk drive unit connected with a computer TPA-1001-i

    International Nuclear Information System (INIS)

    Elizarov, O.I.; Mateeva, A.; Salamatin, I.M.

    1977-01-01

    The disk drive unit, with a capacity of 1250 K and a minimal addressable part of memory of 1 sector (128 12-bit words), is connected with a TPA-1001-i computer. The operation regimes of the controller and the functions and formats of the commands used are described, as well as the software. The data transfer between the computer and the magnetic disk drive unit is realized by means of programs relocatable in binary form. These are inserted in a standard program library with modular structure. The manner of control handling and data transfer between programs stored in the library on a magnetic disk drive is described. The resident program (100₈ words) inserted in the monitor takes into account special features of the disk drive unit being used. The algorithms of the correction programs for the disk drive unit, of the program for rewriting the library from paper tape to the disk drive unit, and of the program for writing and reading the monitor are described.

  3. Exploring the SCOAP3 Research Contributions of the United States

    Science.gov (United States)

    Marsteller, Matthew

    2016-03-01

    The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is a successful global partnership of libraries, funding agencies and research centers. This presentation will inform the audience about SCOAP3 and also delve into descriptive statistics of the United States' intellectual contribution to particle physics via these open access journals. Exploration of the SCOAP3 particle physics literature using a variety of metrics tools such as Web of Science™, InCites™, Scopus® and SciVal will be shared. ORA or Sci2 will be used to visualize author collaboration networks.

  4. Intelligent Buildings and pervasive computing - research perspectives and discussions

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Krogh, Peter Gall; Kyng, Morten

    2001-01-01

    Intelligent Buildings have been the subject of research and commercial interest for more than two decades. The different perspectives range from monitoring and controlling energy consumption over interactive rooms supporting work in offices and leisure in the home, to buildings providing... information to by-passers in plazas and urban environments. This paper puts forward the hypothesis that the coming decade will witness a dramatic increase in both quality and quantity of intelligent buildings due to the emerging field of pervasive computing: the next generation computing environments where... computers are everywhere, for everyone, at all times, where IT becomes a still more integrated part of our environments with processors, sensors, and actuators connected via high-speed networks and combined with new visualization devices ranging from projections directly in the eye to large panorama...

  5. Research progress on quantum informatics and quantum computation

    Science.gov (United States)

    Zhao, Yusheng

    2018-03-01

    Quantum informatics is an emerging interdisciplinary subject that developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology. At present, applying quantum information technology has become the direction of sustained effort. The preparation, storage, purification and regulation, transmission, and quantum coding and decoding of quantum states have become hot topics for scientists and engineers, with a profound impact on the national economy, people's livelihood, and defense technology. This paper first summarizes the background of quantum information science and quantum computers and the current state of research at home and abroad, then introduces the basic knowledge and concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor's quantum algorithm, and quantum phase estimation.
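
    None of the named algorithms is spelled out in the abstract; as a minimal illustrative sketch of the quantum Fourier transform that underlies Shor's algorithm and phase estimation, the snippet below builds the QFT matrix with NumPy, checks that it is unitary, and applies it to a period-4 superposition to show the characteristic frequency peaks. This is a classical simulation for illustration, not an efficient quantum-circuit implementation.

```python
# Classical NumPy illustration of the quantum Fourier transform (QFT) on n qubits.
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Return the N x N QFT matrix, N = 2**n_qubits, entries exp(2*pi*i*j*k/N)/sqrt(N)."""
    N = 2 ** n_qubits
    row, col = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * row * col / N) / np.sqrt(N)

n = 3
N = 2 ** n
F = qft_matrix(n)

# Unitarity check: F times its conjugate transpose should be the identity
assert np.allclose(F @ F.conj().T, np.eye(N))

# A superposition of |0> and |4>, i.e. a state with period 4 over 8 basis states
state = np.zeros(N, dtype=complex)
state[[0, 4]] = 1 / np.sqrt(2)

# The QFT concentrates probability on multiples of N/period = 2,
# the effect exploited by period finding in Shor's algorithm
probabilities = np.abs(F @ state) ** 2
print(np.round(probabilities, 3))
```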

  6. UPTF test instrumentation. Measurement system identification, engineering units and computed parameters

    International Nuclear Information System (INIS)

    Sarkar, J.; Liebert, J.; Laeufer, R.

    1992-11-01

    This updated version of the previous report /1/ contains, besides the additional instrumentation needed for the 2D/3D Programme, the supplementary instrumentation in the inlet plenum of the SG simulator, the hot and cold legs of the broken loop, the cold legs of the intact loops, and the upper plenum, to meet the requirements (Test Phase A) of the UPTF Programme TRAM, sponsored by the Federal Minister of Research and Technology (BMFT) of the Federal Republic of Germany. To aid understanding, the derivation and description of the identification codes for the entire conventional and advanced measurement systems, classifying the function and the equipment unit key as adopted in conventional power plants, have been included. Amendments have also been made to the appendices. In particular, the list of measurement systems covering the measurement identification code, instrument, measured quantity, measuring range, bandwidth, uncertainty and sensor location has been updated and extended to include the supplementary instrumentation. Beyond these amendments, the uncertainties of the measurements have been precisely specified. The measurement identification codes, which also identify the corresponding measured quantities in engineering units, and the identification codes derived from them for the computed parameters, have been adequately detailed. (orig.)

  7. A low cost computer-controlled electrochemical measurement system for education and research

    International Nuclear Information System (INIS)

    Cottis, R.A.

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs

  8. A low cost computer-controlled electrochemical measurement system for education and research

    Energy Technology Data Exchange (ETDEWEB)

    Cottis, R A [Manchester Univ. (UK). Inst. of Science and Technology]

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs.
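
    Neither version of the abstract gives implementation details; purely as an illustration of how a software-defined sweep can replace the hardware sweep generator and chart recorder described above, the sketch below generates a triangular potentiodynamic waveform and logs a simulated current response. The scan parameters and the set_potential/read_current functions are hypothetical stand-ins for the real potentiostat interface.

```python
# Hypothetical sketch: software-generated potentiodynamic sweep with data logging.
# set_potential()/read_current() stand in for the real potentiostat interface,
# which the abstract does not describe; here the current is simply simulated.
import numpy as np

def set_potential(volts: float) -> None:
    pass  # a real system would write to the potentiostat's DAC here

def read_current(volts: float) -> float:
    # crude simulated polarization response (A/cm^2), for illustration only
    return 1e-6 * (np.exp(volts / 0.05) - np.exp(-volts / 0.05))

start_v, vertex_v = -0.25, 0.25      # V, sweep limits (invented)
scan_rate, dt = 0.001, 0.1           # V/s and sampling interval in s (invented)

# Triangular sweep: ramp up to the vertex potential, then back down
forward = np.arange(start_v, vertex_v, scan_rate * dt)
sweep = np.concatenate([forward, forward[::-1]])

log = []
for v in sweep:
    set_potential(v)
    log.append((v, read_current(v)))

# The logged (potential, current) pairs replace the chart-recorder trace
for v, i in log[::1000]:
    print(f"E = {v:+.3f} V   i = {i:+.3e} A/cm^2")
```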

  9. Directions of ICF research in the United States

    International Nuclear Information System (INIS)

    Hogan, W.J.; Campbell, E.M.

    1997-01-01

    Inertial confinement fusion (ICF) research in the United States is in a dramatic upswing. Technical progress continues at a rapid pace and with the start of the construction of the National Ignition Facility (NIF) this year the total U.S. budget for ICF for fiscal year 1997 stands at $380 million. The NIF is being built as an essential component of the U.S. Stockpile Stewardship and Management Program which has been formulated to assure the continued safety, reliability, and performance of the downsized nuclear weapons stockpile in the absence of nuclear tests. This paper will discuss some of the directions that the ICF research is now taking. (AIP) copyright 1997 American Institute of Physics

  10. United States Domestic Research Reactor Infrastructure - TRIGA Reactor Fuel Support

    International Nuclear Information System (INIS)

    Morrell, Douglas

    2008-01-01

    The purpose of the United States Domestic Research Reactor Infrastructure Program is to provide fresh nuclear reactor fuel to United States universities at no, or low, cost to the university. Title to the fuel remains with the United States government, and when universities are finished with the fuel, it is returned to the United States government. The program is funded by the United States Department of Energy - Nuclear Energy division, managed by the Department of Energy - Idaho Field Office, and contracted to the Idaho National Laboratory's Management and Operations Contractor - Battelle Energy Alliance. The program has been at Idaho since 1977, and INL subcontracts with 26 United States domestic reactor facilities (13 TRIGA facilities, 9 plate fuel facilities, 2 AGN facilities, 1 Pulstar fuel facility, 1 critical facility). The university has not shipped fuel since 1968 and as such has no present procedures for shipping spent fuel. In addition: the floor loading rate is unknown, many interferences must be removed to allow direct access to the reactor tank, floor space in the reactor cell is very limited, and pavement ends inside the fence; some of the surface is not finished. The whole approach is narrow, curving and downhill. A truck large enough to transport the cask cannot pull into the lot and then back out (nearly impossible / refused by drivers); a large-capacity (100 ton), long-boom crane would have to be used due to loading dock obstructions. Access to the entrance door is on a sidewalk. The campus uses it as a road for construction equipment, deliveries and security response. Large trees are on both sides of the sidewalk. Spent fuel shipments have never been done, no procedures are approved or in place, there are no approved casks, and there is no accident or safety analysis for spent fuel loading. Any cask assembly used in this facility will have to be removed from one crane, moved on the floor and then attached to another crane to get from the staging area to the reactor room. Reactor

  11. Computer codes for problems of isotope and radiation research

    International Nuclear Information System (INIS)

    Remer, M.

    1986-12-01

    A survey is given of computer codes for problems in isotope and radiation research. Altogether 44 codes are described as titles with abstracts. 17 of them are in the INIS scope and are processed individually. The subjects are indicated in the chapter headings: 1) analysis of tracer experiments, 2) spectrum calculations, 3) calculations of ion and electron trajectories, 4) evaluation of gamma irradiation plants, and 5) general software

  12. Molecular image in biomedical research. Molecular imaging unit of the National Cancer Research Center

    International Nuclear Information System (INIS)

    Perez Bruzon, J.; Mulero Anhiorte, F.

    2010-01-01

    This article has two basic objectives. Firstly, it briefly reviews the most important imaging techniques used in biomedical research, indicating the most significant aspects related to their application in the preclinical stage. Secondly, it presents a practical application of these techniques in a pure biomedical research centre (not associated with a clinical facility). Practical aspects such as the organisation, equipment, work norms and shielding of the Spanish National Cancer Research Centre (CNIO) Imaging Unit are shown. This is a pioneering facility in the application of these techniques in research centres without any dependence on, or direct relationship with, hospital Nuclear Medicine services. (Author) 7 refs.

  13. Computing at the leading edge: Research in the energy sciences

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.; Van Dyke, P.T. [eds.]

    1994-02-01

    The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both to the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys.

  14. Computing at the leading edge: Research in the energy sciences

    International Nuclear Information System (INIS)

    Mirin, A.A.; Van Dyke, P.T.

    1994-01-01

    The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both to the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys

  15. Computer Science Research Institute 2004 annual report of activities.

    Energy Technology Data Exchange (ETDEWEB)

    DeLap, Barbara J.; Womble, David Eugene; Ceballos, Deanna Rose

    2006-03-01

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2004 to December 31, 2004. During this period the CSRI hosted 166 visitors representing 81 universities, companies and laboratories. Of these, 65 were summer students or faculty. The CSRI partially sponsored 2 workshops and also organized and was the primary host for 4 workshops. These 4 CSRI-sponsored workshops had 140 participants--74 from universities, companies and laboratories, and 66 from Sandia. Finally, the CSRI sponsored 14 long-term collaborative research projects and 5 sabbaticals.

  16. Computer Science Research Institute 2003 annual report of activities.

    Energy Technology Data Exchange (ETDEWEB)

    DeLap, Barbara J.; Womble, David Eugene; Ceballos, Deanna Rose

    2006-03-01

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2003 to December 31, 2003. During this period the CSRI hosted 164 visitors representing 78 universities, companies and laboratories. Of these, 78 were summer students or faculty members. The CSRI partially sponsored 5 workshops and also organized and was the primary host for 3 workshops. These 3 CSRI-sponsored workshops had 178 participants--137 from universities, companies and laboratories, and 41 from Sandia. Finally, the CSRI sponsored 18 long-term collaborative research projects and 5 sabbaticals.

  17. Computer Science Research Institute 2005 annual report of activities.

    Energy Technology Data Exchange (ETDEWEB)

    Watts, Bernadette M.; Collis, Samuel Scott; Ceballos, Deanna Rose; Womble, David Eugene

    2008-04-01

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2005 to December 31, 2005. During this period, the CSRI hosted 182 visitors representing 83 universities, companies and laboratories. Of these, 60 were summer students or faculty. The CSRI partially sponsored 2 workshops and also organized and was the primary host for 3 workshops. These 3 CSRI sponsored workshops had 105 participants, 78 from universities, companies and laboratories, and 27 from Sandia. Finally, the CSRI sponsored 12 long-term collaborative research projects and 3 Sabbaticals.

  18. ElectroEncephaloGraphics: Making waves in computer graphics research.

    Science.gov (United States)

    Mustafa, Maryam; Magnor, Marcus

    2014-01-01

    Electroencephalography (EEG) is a novel modality for investigating perceptual graphics problems. Until recently, EEG has predominantly been used for clinical diagnosis, in psychology, and by the brain-computer-interface community. Researchers are extending it to help understand the perception of visual output from graphics applications and to create approaches based on direct neural feedback. Researchers have applied EEG to graphics to determine perceived image and video quality by detecting typical rendering artifacts, to evaluate visualization effectiveness by calculating the cognitive load, and to automatically optimize rendering parameters for images and videos on the basis of implicit neural feedback.

  19. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type of results you may get at the end, using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  20. Can Nuclear Installations and Research Centres Adopt Cloud Computing Platform?

    International Nuclear Information System (INIS)

    Pichan, A.; Lazarescu, M.; Soh, S.T.

    2015-01-01

    Cloud Computing is arguably one of the most recent and highly significant advances in information technology today. It produces transformative changes in the history of computing and presents many promising technological and economic opportunities. The pay-per-use model, the computing power, the abundance of storage, the skilled resources, the fault tolerance and the economy of scale it offers provide significant advantages for enterprises to adopt the cloud platform for their business needs. However, customers, especially those dealing with national security, high-end scientific research institutions, and critical national infrastructure service providers (such as power and water), remain very reluctant to move their business systems to the cloud. One of the main concerns is the question of information security in the cloud and the threat of the unknown. Cloud Service Providers (CSPs) indirectly encourage this perception by not letting their customers see what is behind their virtual curtain. Jurisdiction (information assets being stored elsewhere), data duplication, multi-tenancy, virtualisation and the decentralized nature of data processing are the default characteristics of cloud computing. Therefore the traditional approach of enforcing and implementing security controls remains a big challenge and largely depends upon the service provider. The other major challenge and open issue is the ability to perform digital forensic investigations in the cloud in case of security breaches. Traditional approaches to evidence collection and recovery are no longer practical, as they rely on unrestricted access to the relevant systems and user data, something that is not available in the cloud model. This continues to fuel high insecurity for cloud customers. In this paper we analyze the cyber security and digital forensics challenges, issues and opportunities for nuclear facilities adopting cloud computing. We also discuss the due diligence process and applicable industry best practices which shall be

  1. Research in the United States relative to geochemistry and health

    Science.gov (United States)

    Petrie, W.L.; Cannon, H.L.

    1979-01-01

    Increasing concern regarding the effects of the geochemical environment on health in the United States has fostered research studies in a number of universities and government agencies. The necessity to evaluate the effects of natural and man-made elemental excesses in the environment on health requires the establishment of requirements and tolerance limits for the various elements in water and crops. Maps of the geographic distribution of these elements in rocks, surficial materials and ground and surface waters are also essential for comparison with the occurrence of disease. Funding support for research projects that relate to various parameters of these problems emanates largely from a few federal agencies, and much of the work is conducted at government, university and private facilities. An example of the latter is the National Academy of Sciences-National Research Council, which has several components that are addressing a variety of comparative studies of the geochemical environment related to health; studies involve specific trace elements (like selenium and magnesium), diseases (like cancer, urolithiasis and cardiovascular disease), other health factors (like aging and nutrition) and links with timely major problems (like the health effects of greatly increasing the use of coal). © 1979.

  2. Progress report of a research program in computational physics

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1990-01-01

    Task D's research is focused on the understanding of elementary particle physics through the techniques of quantum field theory. We make intensive use of computers to aid our research. During the last year we have made significant progress in understanding the weak interactions through the use of Monte Carlo methods as applied to the equations of quenched lattice QCD. We have launched a program to understand full (not quenched) lattice QCD on relatively large lattices using massively parallel computers. Because of our awareness that Monte Carlo methods might not be able to give a good solution to field theories with the computer power likely to be available to us for the foreseeable future, we have launched an entirely different numerical approach to study these problems. This "Source Galerkin" method is based on an algebraic approach to the field theoretic equations of motion and is (somewhat) related to variational and finite element techniques applied to a source rather than a coordinate space. The results for relatively simple problems are sensationally good. In particular, fermions can be treated in a way which allows them to retain their status as independent dynamical entities in the theory. 8 refs

  3. Centralized digital computer control of a research nuclear reactor

    International Nuclear Information System (INIS)

    Crawford, K.C.

    1987-01-01

    A hardware and software design for the centralized control of a research nuclear reactor by a digital computer is presented, as well as an investigation of automatic-feedback control. Current reactor-control philosophies, including redundancy, inherent safety in failure, and conservative-yet-operational scram initiation, were used as the bases of the design. The control philosophies were applied to the power-monitoring system, the fuel-temperature monitoring system, the area-radiation monitoring system, and the overall system interaction. Unlike the single-function analog computers currently used to control research and commercial reactors, this system will be driven by a multifunction digital computer. Specifically, the system will perform control-rod movements to conform with operator requests, automatically log the required physical parameters during reactor operation, perform the required system tests, and monitor facility safety and security. Reactor power control is based on signals received from ion chambers located near the reactor core. Absorber-rod movements are made to control the rate of power increase or decrease during power changes and to control the power level during steady-state operation. Additionally, the system incorporates a rudimentary level of artificial intelligence.
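
    The abstract describes the control approach only qualitatively; the toy loop below, which is not the author's design, merely illustrates the idea of moving an absorber rod in proportion to the error between the operator-requested power level and the power signal from the ion chambers. All numbers and the one-line plant model are invented, and the rate-of-rise limits and safety interlocks of a real reactor controller are deliberately omitted.

```python
# Toy illustration (not the paper's design): a proportional control loop that moves
# an absorber rod based on the error between requested and measured reactor power.
power = 10.0        # kW, simulated ion-chamber power signal
setpoint = 100.0    # kW, operator-requested power level
rod = 0.0           # absorber-rod withdrawal, arbitrary units (invented)
gain = 0.05         # rod movement per kW of power error (invented)

for step in range(25):
    error = setpoint - power
    rod += gain * error            # small rod adjustment each control cycle
    power = 10.0 + 10.0 * rod      # crude, instantaneous plant response (invented)
    if step % 5 == 0:
        print(f"step {step:2d}: rod = {rod:5.2f}, power = {power:6.1f} kW")
```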

  4. Status of reactor shielding research in the United States

    International Nuclear Information System (INIS)

    Bartine, D.E.

    1983-01-01

    Shielding research in the United States continues to place emphasis on: (1) the development and refinement of shielding design calculational methods and nuclear data; and (2) the performance of confirmation experiments, both to evaluate specific design concepts and to verify specific calculational techniques and input data. The successful prediction of the radiation levels observed within the now-operating Fast Flux Test Facility (FFTF) has demonstrated the validity of this two-pronged approach, which has since been applied to US fast breeder reactor programs and is now being used to determine radiation levels and possible further shielding needs at operating light water reactors, especially under accident conditions. A similar approach is being applied to the back end of the fission fuel cycle to verify that radiation doses at fuel element storage and transportation facilities and within fuel reprocessing plants are kept at acceptable levels without undue economic penalties

  5. The research of computer multimedia assistant in college English listening

    Science.gov (United States)

    Zhang, Qian

    2012-04-01

    With the development of network information technology, education faces more and more serious questions. The application of computer multimedia breaks with traditional foreign language teaching and brings new challenges and opportunities for education. Through the application of multiple media, the teaching process is full of animation, images, voice and text, which can improve learning initiative and motivation and greatly develop learning efficiency. In traditional foreign language teaching, people learn through written characters; with this method, theoretical performance is good but practical application is poor. Even after long use of computer multimedia in foreign language teaching, many teachers still hold prejudices against it, and therefore the method has not achieved its full effect. For all of the above reasons, this research has significant meaning for improving the teaching quality of foreign language.

  6. Public and nonprofit funding for research on mental disorders in France, the United Kingdom, and the United States.

    Science.gov (United States)

    Chevreul, Karine; McDaid, David; Farmer, Carrie M; Prigent, Amélie; Park, A-La; Leboyer, Marion; Kupfer, David J; Durand-Zaleski, Isabelle

    2012-07-01

    To document the investments made in research on mental disorders by both government and nonprofit nongovernmental organizations in France, the United Kingdom, and the United States. An exhaustive survey was conducted of primary sources of public and nonprofit organization funding for mental health research for the year 2007 in France and the United Kingdom and for fiscal year 2007-2008 in the United States, augmented with an examination of relevant Web sites and publications. In France, all universities and research institutions were identified using the Public Finance Act. In the United Kingdom, we scrutinized Web sites and hand searched annual reports and grant lists for the public sector and nonprofit charitable medical research awarding bodies. In the United States, we included the following sources: the National Institutes of Health, other administrative entities within the Department of Health and Human Services (eg, Centers for Disease Control and Prevention), the Department of Education, the Department of Veterans Affairs, the Department of Defense, and the National Science Foundation and, for nonprofit funding, The Foundation Center. We included research on all mental disorders and substance-related disorders using the same keywords. We excluded research on mental retardation and dementia and on the promotion of mental well-being. We used the same algorithm in each country to obtain data for only mental health funding in situations in which funding had a broader scope. France spent $27.6 million (2%) of its health research budget on mental disorders, the United Kingdom spent $172.6 million (7%), and the United States spent $5.2 billion (16%). Nongovernmental funding ranged from 1% of total funding for mental health research in France and the United States to 14% in the United Kingdom. Funding for research on mental disorders accounts for low proportions of research budgets compared with funding levels for research on other major health problems, whereas

  7. Application of the partitive analytical forecasting (PAF) technique to the United States controlled thermonuclear research effort

    International Nuclear Information System (INIS)

    Nichols, S.P.

    1975-01-01

    The Partitive Analytical Forecasting (PAF) technique is applied to the overall long-term program plans for the Division of Controlled Thermonuclear Research (DCTR) of the United States Energy Research and Development Administration (ERDA). As part of the PAF technique, the Graphical Evaluation and Review Technique (GERTS) IIIZ computer code is used to perform simulations on a logic network describing the DCTR long-term program plan. Logic networks describing the tokamak, mirror, and theta-pinch developments are simulated individually and then together to form an overall DCTR program network. The results of the simulation of the overall network using various funding schemes and strategies are presented. An economic sensitivity analysis is provided for the tokamak logic networks. An analysis is also performed of the fusion-fission hybrid concept in the context of the present DCTR goals. The results mentioned above as well as the PAF technique itself are evaluated, and recommendations for further research are discussed
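
    The abstract names the GERT(S) IIIZ code but gives no detail of the simulations; the sketch below is only a generic illustration of the kind of Monte Carlo evaluation such logic networks support: sampling uncertain durations for a few milestones of a tiny, invented development network and estimating the spread of the completion time under two funding assumptions. The network structure, the distributions and the funding factors are all hypothetical.

```python
# Generic illustration of Monte Carlo evaluation of a small activity network
# (in the spirit of GERT-type simulations); the network and numbers are invented.
import random

random.seed(1)

def duration(low, mode, high):
    """Sample an activity duration (years) from a triangular distribution."""
    return random.triangular(low, high, mode)

def one_realization(funding_factor=1.0):
    # Two parallel development paths that must both finish before a demonstration
    # phase can start; funding_factor stretches every activity (invented effect).
    confinement = duration(4, 6, 9) * funding_factor
    technology = duration(3, 5, 8) * funding_factor
    demo = duration(5, 7, 11) * funding_factor
    return max(confinement, technology) + demo

n_trials = 20_000
for funding_factor in (1.0, 1.25):       # nominal vs. constrained funding
    totals = sorted(one_realization(funding_factor) for _ in range(n_trials))
    mean = sum(totals) / n_trials
    p90 = totals[int(0.9 * n_trials)]
    print(f"funding factor {funding_factor}: mean {mean:.1f} y, 90th pct {p90:.1f} y")
```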

  8. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Full Text Available Planaria is a member of the phylum Platyhelminthes, the flatworms. Planarians possess the unique ability of regeneration from adult stem cells, or neoblasts, and are important as a model organism for regeneration and developmental studies. Although research is being actively carried out globally through conventional methods to understand the process of regeneration from neoblasts and the biology of development, neurobiology and immunology of Planaria, there are many thought-provoking questions related to stem cell plasticity and the uniqueness of regenerative potential in planarians amongst other members of the phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair, and responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but their availability and analysis is a challenging task. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, the application of algorithms in deciphering changes of morphology by RNA interference (RNAi) approaches, and understanding regeneration experiments are a new venture in Planaria research that is helping researchers across the globe in understanding the biology. We highlight the applications of Hidden Markov models (HMMs) in the design of computational tools and their applications in Planaria, decoding their complex biology.
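
    The review mentions HMMs only in passing; as a minimal illustration of the underlying machinery (not of any specific Planaria tool), the sketch below scores a short nucleotide sequence under a two-state HMM with the standard forward algorithm. The states, probabilities and example sequence are invented.

```python
# Minimal forward-algorithm sketch for a two-state HMM (state 0 ~ "GC-rich",
# state 1 ~ "AT-rich"); all probabilities and the example sequence are invented.
import numpy as np

symbols = {"A": 0, "C": 1, "G": 2, "T": 3}

start = np.array([0.5, 0.5])                  # initial state probabilities
trans = np.array([[0.9, 0.1],                 # P(next state | current state)
                  [0.2, 0.8]])
emit = np.array([[0.15, 0.35, 0.35, 0.15],    # state 0 emissions (GC-rich)
                 [0.30, 0.20, 0.20, 0.30]])   # state 1 emissions (AT-rich)

def forward_likelihood(seq: str) -> float:
    """Return P(sequence | model) via the forward recursion."""
    obs = [symbols[c] for c in seq]
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return float(alpha.sum())

print(forward_likelihood("GCCGCGTAAT"))  # likelihood of one short example sequence
```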

  9. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    Science.gov (United States)

    Winstead, Wayland H.; And Others

    An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner of computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…
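
    The abstract is truncated before the alternative method is fully described, so the toy numbers below only contrast the traditional proration it mentions with one simple department-average variant; the "alternative" shown is an assumption for illustration, not necessarily the paper's actual formula.

```python
# Toy illustration of per-course-level unit costs for one hypothetical department.
# Traditional approach: prorate each faculty member's salary by his or her own
# mix of credit hours taught at each level. Simple alternative shown here
# (an assumption): apply one department-average cost per credit hour to every level.
faculty = [
    # (salary, {course level: credit hours taught})  -- invented data
    (90_000, {"lower": 6, "upper": 3, "graduate": 3}),
    (60_000, {"lower": 9, "upper": 3, "graduate": 0}),
    (75_000, {"lower": 0, "upper": 6, "graduate": 6}),
]
levels = ["lower", "upper", "graduate"]

# Traditional: prorate each salary by that individual's teaching mix
prorated = {lvl: 0.0 for lvl in levels}
hours = {lvl: 0.0 for lvl in levels}
for salary, load in faculty:
    total = sum(load.values())
    for lvl in levels:
        prorated[lvl] += salary * load[lvl] / total
        hours[lvl] += load[lvl]

# Alternative (illustrative): one department-wide average cost per credit hour
avg_rate = sum(s for s, _ in faculty) / sum(sum(l.values()) for _, l in faculty)

for lvl in levels:
    print(f"{lvl:9s}  prorated: {prorated[lvl] / hours[lvl]:8.0f} per credit hour"
          f"   averaged: {avg_rate:8.0f} per credit hour")
```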

  10. Whole Language, Computers and CD-ROM Technology: A Kindergarten Unit on "Benjamin Bunny."

    Science.gov (United States)

    Balajthy, Ernest

    A kindergarten teacher, two preservice teachers, and a college consultant on educational computer technology designed and developed a 10-day whole-language integrated unit on the theme of Beatrix Potter's "Benjamin Bunny." The project was designed as a demonstration of the potential of integrating the CD-ROM-based version of…

  11. Using Videos and 3D Animations for Conceptual Learning in Basic Computer Units

    Science.gov (United States)

    Cakiroglu, Unal; Yilmaz, Huseyin

    2017-01-01

    This article draws on a one-semester study to investigate the effect of videos and 3D animations on students' conceptual understandings about basic computer units. A quasi-experimental design was carried out in two classrooms; videos and 3D animations were used in classroom activities in one group and those were used for homework in the other…

  12. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    Science.gov (United States)

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

    College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  13. Correlation between crystallographic computing and artificial intelligence research

    Energy Technology Data Exchange (ETDEWEB)

    Feigenbaum, E A [Stanford Univ., CA]; Engelmore, R S; Johnson, C K

    1977-01-01

    Artificial intelligence research, as a part of computer science, has produced a variety of programs of experimental and applications interest: programs for scientific inference, chemical synthesis, planning, robot control, extraction of meaning from English sentences, speech understanding, interpretation of visual images, and so on. The symbolic manipulation techniques used in artificial intelligence provide a framework for analyzing and coding the knowledge base of a problem independently of an algorithmic implementation. A possible application of artificial intelligence methodology to protein crystallography is described. 2 figures, 2 tables.

  14. Computer Science Teacher Professional Development in the United States: A Review of Studies Published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-01-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher…

  15. Ethics in published brain-computer interface research

    Science.gov (United States)

    Specker Sullivan, L.; Illes, J.

    2018-02-01

    Objective. Sophisticated signal processing has opened the doors to more research with human subjects than ever before. The increase in the use of human subjects in research comes with a need for increased human subjects protections. Approach. We quantified the presence or absence of ethics language in published reports of brain-computer interface (BCI) studies that involved human subjects and qualitatively characterized ethics statements. Main results. Reports of BCI studies with human subjects that are published in neural engineering and engineering journals are anchored in the rationale of technological improvement. Ethics language is markedly absent, omitted from 31% of studies published in neural engineering journals and 59% of studies in biomedical engineering journals. Significance. As the integration of technological tools with the capacities of the mind deepens, explicit attention to ethical issues will ensure that broad human benefit is embraced and not eclipsed by technological exclusiveness.

  16. Connecting Arctic Research Across Boundaries through the Arctic Research Consortium of the United States (ARCUS)

    Science.gov (United States)

    Rich, R. H.; Myers, B.; Wiggins, H. V.; Zolkos, J.

    2017-12-01

    The complexities inherent in Arctic research demand a unique focus on making connections across the boundaries of discipline, institution, sector, geography, knowledge system, and culture. Since 1988, ARCUS has been working to bridge these gaps through communication, coordination, and collaboration. Recently, we have worked with partners to create a synthesis of the Arctic system, to explore the connectivity across the Arctic research community and how to strengthen it, to enable the community to have an effective voice in research funding policy, to implement a system for Arctic research community knowledge management, to bridge between global Sea Ice Prediction Network researchers and the science needs of coastal Alaska communities through the Sea Ice for Walrus Outlook, to strengthen ties between Polar researchers and educators, and to provide essential intangible infrastructure that enables cost-effective and productive research across boundaries. Employing expertise in managing for collaboration and interdisciplinarity, ARCUS complements and enables the work of its members, who constitute the Arctic research community and its key stakeholders. As a member-driven organization, everything that ARCUS does is achieved through partnership, with strong volunteer leadership of each activity. Key organizational partners in the United States include the U.S. Arctic Research Commission, Interagency Arctic Research Policy Committee, National Academy of Sciences Polar Research Board, and the North Slope Science Initiative. Internationally, ARCUS maintains strong bilateral connections with similarly focused groups in each Arctic country (and those interested in the Arctic), as well as with multinational organizations including the International Arctic Science Committee, the Association of Polar Early Career Educators, the University of the Arctic, and the Arctic Institute of North America. Currently, ARCUS is applying the best practices of the science of team science

  17. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  18. Computer Based Procedures for Field Workers - FY16 Research Activities

    International Nuclear Information System (INIS)

    Oxstrand, Johanna; Bly, Aaron

    2016-01-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is in improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing time spent by the field worker to evaluate plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or incorrectly assessed applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages - Enterprise Requirements initiative, the development of a design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study for how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  19. Computer Based Procedures for Field Workers - FY16 Research Activities

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is in improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing time spent by the field worker to evaluate plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or incorrectly assessed applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages – Enterprise Requirements initiative, the development of a design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study for how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  20. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    Energy Technology Data Exchange (ETDEWEB)

    Arbanas, G.; Dunn, M.E.; Wiarda, D., E-mail: arbanasg@ornl.gov, E-mail: dunnme@ornl.gov, E-mail: wiardada@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2011-07-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The ²³⁵U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them promising candidates for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)

  1. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    International Nuclear Information System (INIS)

    Arbanas, G.; Dunn, M.E.; Wiarda, D.

    2011-01-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The ²³⁵U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them promising candidates for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)
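
    As a rough illustration of the approach described above (not SAMMY itself), the sketch below times one large matrix-matrix product routed through an optimized BLAS via NumPy; the matrix dimensions follow the abstract, while everything else (data, library choice) is an assumption for demonstration.

```python
import time

import numpy as np

# Illustrative sketch only, not SAMMY code: a single optimized GEMM call
# (NumPy's "@" dispatches to whatever BLAS the installation links against,
# e.g. MKL or OpenBLAS) replaces a triple-nested loop. The 16,000 x 20,000
# dimensions follow the abstract; shrink them for a quick test on a machine
# with little memory.
m, k = 16_000, 20_000
A = np.random.rand(m, k)
B = np.random.rand(k, m)

t0 = time.perf_counter()
C = A @ B  # one vendor-optimized matrix-matrix multiplication
print(f"GEMM {A.shape} x {B.shape} took {time.perf_counter() - t0:.1f} s")
```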

  2. Simplified techniques of cerebral angiography using a mobile X-ray unit and computed radiography

    International Nuclear Information System (INIS)

    Gondo, Gakuji; Ishiwata, Yusuke; Yamashita, Toshinori; Iida, Takashi; Moro, Yutaka

    1989-01-01

    Simplified techniques of cerebral angiography using a mobile X-ray unit and computed radiography (CR) are discussed. Computed radiography is a digital radiography system in which an imaging plate is used as an X-ray detector and the final image is displayed on film. In angiograms performed with CR, the spatial frequency components can be enhanced for easier analysis of fine blood vessels. Computed radiography has an automatic sensitivity- and latitude-setting mechanism, thus serving as an 'automatic camera.' This mechanism is useful for radiography with a mobile X-ray unit in hospital wards, intensive care units, or operating rooms where the appropriate setting of exposure conditions is difficult. We applied this mechanism to direct percutaneous carotid angiography and intravenous digital subtraction angiography with a mobile X-ray unit. Direct percutaneous carotid angiograms using CR and a mobile X-ray unit were taken after the manual injection of a small amount of contrast material through a fine needle. We performed direct percutaneous carotid angiography with this method 68 times on 25 cases from August 1986 to December 1987. Of the 68 angiograms, 61 were evaluated as good, compared with conventional angiography. Though the remaining seven were evaluated as poor, they were still diagnostically effective. This method was found useful for carotid angiography in emergency rooms, intensive care units, or operating rooms. Cerebral venography using CR and a mobile X-ray unit was performed after the manual injection of contrast material through the bilateral cubital veins. The cerebral venous system could be visualized from 16 to 24 seconds after the beginning of the injection of the contrast material. We performed cerebral venography with this method 14 times on six cases. These venograms were better than conventional angiograms in all cases. This method may be useful in managing patients suffering from cerebral venous thrombosis. (J.P.N.)

  3. Online Secondary Research in the Advertising Research Class: A Friendly Introduction to Computing.

    Science.gov (United States)

    Adler, Keith

    In an effort to promote computer literacy among advertising students, an assignment was devised that required the use of online database search techniques to find secondary research materials. The search program, chosen for economical reasons, was "Classroom Instruction Program" offered by Dialog Information Services. Available for a…

  4. Educational Technology Research Journals: "Journal of Educational Computing Research," 2003-2012

    Science.gov (United States)

    Nyland, Rob; Anderson, Noelle; Beckstrom, Tyler; Boren, Michael; Thomas, Rebecca; West, Richard E.

    2015-01-01

    This article analyzes articles published in the "Journal of Educational Computing Research" ("JECR") from 2003 to 2012. The authors analyzed the articles looking for trends in article types and methodologies, the most common topics addressed in the articles, the top-cited articles, and the top authors during the period. The…

  5. Distributed and grid computing projects with research focus in human health.

    Science.gov (United States)

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus on human health. Eleven active projects with more than 2,000 Processing Units (PUs) each were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not always explicitly stated, common target diseases include HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer, and influenza; others include malaria, anthrax, and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, to raise citizens' awareness and encourage participation in order to create a culture of internet volunteering altruism.

  6. The Medical Research Council (UK)/Uganda Virus Research Institute Uganda Research Unit on AIDS--'25 years of research through partnerships'.

    Science.gov (United States)

    Kaleebu, P; Kamali, A; Seeley, J; Elliott, A M; Katongole-Mbidde, E

    2015-02-01

    For the past 25 years, the Medical Research Council/Uganda Virus Research Institute Uganda Research Unit on AIDS has conducted research on HIV-1, coinfections and, more recently, on non-communicable diseases. Working with various partners, the research findings of the Unit have contributed to the understanding and control of the HIV epidemic both in Uganda and globally, and informed the future development of biomedical HIV interventions, health policy and practice. In this report, as we celebrate our silver jubilee, we describe some of these achievements and the Unit's multidisciplinary approach to research. We also discuss the future direction of the Unit; an exemplar of a partnership that has been largely funded from the north but led in the south. © 2014 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  7. Reactor safety research - visible demonstrations and credible computations

    Energy Technology Data Exchange (ETDEWEB)

    Loewenstein, W B; Divakaruni, S M

    1985-11-01

    EPRI has been conducting nuclear safety research for a number of years with the primary goal of assuring the safety and reliability of nuclear plants. Visibility is emphasized by sponsoring or participating in large-scale test demonstrations to credibly support the complex computations that are the basis for quantification of safety margins. Recognizing the success of the airline industry in achieving favorable public perception, the authors compare the design and operation practices of the airline industry with those of the nuclear industry to identify the elements contributing to public concerns and unfavorable perceptions. In this paper, the authors emphasize the importance of proper communication of research results to the public in a manner that non-specialists understand. Further, EPRI-supported research and results in the areas of source term, seismic and structural engineering research, analysis using probabilistic risk assessment (PRA), quantification of safety margins, digital technology development and implementation, and plant transient and performance evaluations are discussed in the paper. (orig./HP).

  8. Reactor safety research - visible demonstrations and credible computations

    International Nuclear Information System (INIS)

    Loewenstein, W.B.; Divakaruni, S.M.

    1985-01-01

    EPRI has been conducting nuclear safety research for a number of years with the primary goal of assuring the safety and reliability of nuclear plants. Visibility is emphasized by sponsoring or participating in large-scale test demonstrations to credibly support the complex computations that are the basis for quantification of safety margins. Recognizing the success of the airline industry in achieving favorable public perception, the authors compare the design and operation practices of the airline industry with those of the nuclear industry to identify the elements contributing to public concerns and unfavorable perceptions. In this paper, the authors emphasize the importance of proper communication of research results to the public in a manner that non-specialists understand. Further, EPRI-supported research and results in the areas of source term, seismic and structural engineering research, analysis using probabilistic risk assessment (PRA), quantification of safety margins, digital technology development and implementation, and plant transient and performance evaluations are discussed in the paper. (orig./HP)

  9. Technology of research of hydroturbine unit work using seismic methods

    Science.gov (United States)

    Seleznev, V. S.; Liseikin, A. V.; Gromyko, P. V.; Soloviev, V. M.

    2013-05-01

    On August 17, 2009, one of the most significant accidents in hydropower engineering happened at the Sayano-Shushenskaya Hydroelectric Power Station. Specialists of the Geophysical Survey SB RAS took part in the State Committee investigating the cause of the accident at the Sayano-Shushenskaya HPS. It was determined that the cause of the accident was a break of the stud-bolts on the turbine cover. Why did the stud-bolts not withstand the load? There were assumptions that a hydraulic shock provoked the accident; if so, the seismic station "Cheremushky", situated 4 km away from the HPS, should have a record of this event. Examination of the record obtained at the seismic station at the moment of the accident showed that the strength of the recorded seismic waves did not exceed that of the waves from a 500 g trotyl explosion at a distance of 4 km, so the hydraulic-shock version was not confirmed. Low-frequency oscillations were distinguished, and it was determined that the hydroturbine unit (HU) had been raised more than 10 m in height over about 10 s. Analysis of the seismic station records over more than a year before the accident, together with records of the operating modes of the different HUs, showed that the oscillations radiated by the second (damaged) HU were approximately 1.5 times more intense than the oscillations from all the other HUs. After the accident at the Sayano-Shushenskaya HPS, the hydroturbine units were restarted in turn: first the units of the old construction (3, 4, 5, 6), then the units of the new construction (1, 7, 8, 9). We installed 10-15 three-component seismic stations at different points around an HU and studied the field of seismic oscillations produced by its operation. It was determined that an HU radiates a set of monochromatic oscillations at multiples of its rotation speed of 2.381 Hz. Changes in the amplitude of these signals are connected with changes in the HU operating modes. Research of changes in oscillations spectral

  10. Medical researchers unite for study on cancer intervention

    Directory of Open Access Journals (Sweden)

    Editorial Office

    2016-08-01

    Full Text Available We introduce Drs. Antoine Snijders and Jian-Hua Mao, whose article is published in this issue of AMOR, and discuss their views on cancer genetics, targeted therapy, and personalized medicine. Having worked together in numerous joint investigations that have yielded significant results, Dr. Snijders and Dr. Mao would most definitely agree that two heads are better than one. “Researchers these days need to have the ability to collaborate across many different disciplines,” said the duo in an exclusive interview with AMOR. Dr. Snijders and Dr. Mao, both with PhDs in cancer genetics and genomics, are currently based at the Biological Systems and Engineering Division of Lawrence Berkeley National Laboratory, California, which is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. The Berkeley Lab is well known for producing excellent scholars: thirteen Nobel Prize winners are affiliated with the Lab, and seventy of its scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Dr. Snijders, a Dutch scientist who has conducted his research at Berkeley Lab for the past eight years, did his Masters in Science (Medical Biology) at the Vrije Universiteit Amsterdam, Netherlands – an institute with a strong focus on scientific research and home to five Spinoza Prize (a.k.a. the “Dutch Nobel”) winners. Dr. Snijders’s PhD (cum laude) in cancer and molecular biology was awarded by University Utrecht in the Netherlands, but his research work was carried out at the University of California San Francisco. Subsequently, he continued his postdoctoral research in molecular cytogenetics at the same institution. A prolific author of 114 publications (with 3,851 citations according to ResearchGate), Dr. Snijders – who also volunteers with California’s Contra Costa County Search and Rescue team for missing persons – has interests in

  11. Development of computational small animal models and their applications in preclinical imaging and therapy research

    NARCIS (Netherlands)

    Xie, Tianwu; Zaidi, Habib

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal

  12. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  13. Changing the Translational Research Landscape: A Review of the Impacts of Biomedical Research Units in England.

    Science.gov (United States)

    Marjanovic, Sonja; Soper, Bryony; Ismail, Sharif; Reding, Anais; Ling, Tom

    2012-01-01

    This article describes a review of the Biomedical Research Units (BRU) scheme, undertaken for the Department of Health. This review was a perceptions audit of senior executives involved in the scheme, and explored what impact they felt the scheme is having on the translational research landscape. More specifically, we investigated whether and how institutional relationships between NHS and academic partners, industry and other health research system players are changing because of the scheme; how the scheme is helping build critical mass in specific priority disease areas; and the effects of any changes on efforts to deliver the broader goals set out in Best Research for Best Health. The views presented are those of study informants only. The information obtained through our interviews suggests that the BRU scheme is significantly helping shape the health research system to pursue translational research and innovation, with the clear goal of realising patient benefit. The BRUs are already contributing to observable changes in institutional relationships between the NHS and academic partners: trusts and medical schools are collaborating more closely than in the past, have signed up to the same vision of translational research from bench to bedside, and are managing and governing targeted research resources more professionally and transparently than in the past. There is also a stronger emphasis on engaging industry and more strategic thinking about strengthening regional and national collaboration with other hospital trusts, PCTs, research organisations, networks and development agencies. The scheme is also transforming capacity building in the health research system. This includes (i) developing and modernising facilities and equipment for translation; (ii) building a critical mass of human resources through recruitment and training, as well as improving retention of existing expertise; and (iii) helping ensure a steady flow of funds needed to sustain research

  14. Computation studies into architecture and energy transfer properties of photosynthetic units from filamentous anoxygenic phototrophs

    Energy Technology Data Exchange (ETDEWEB)

    Linnanto, Juha Matti [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu (Estonia); Freiberg, Arvi [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu, Estonia and Institute of Molecular and Cell Biology, University of Tartu, Riia 23, 51010 Tartu (Estonia)

    2014-10-06

    We have used different computational methods to study structural architecture, and light-harvesting and energy transfer properties of the photosynthetic unit of filamentous anoxygenic phototrophs. Due to the huge number of atoms in the photosynthetic unit, a combination of atomistic and coarse methods was used for electronic structure calculations. The calculations reveal that the light energy absorbed by the peripheral chlorosome antenna complex transfers efficiently via the baseplate and the core B808–866 antenna complexes to the reaction center complex, in general agreement with the present understanding of this complex system.

  15. Development of DUST: A computer code that calculates release rates from a LLW disposal unit

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1992-01-01

    Performance assessment of a Low-Level Waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the disposal unit source term). The major physical processes that influence the source term are water flow, container degradation, waste form leaching, and radionuclide transport. A computer code, DUST (Disposal Unit Source Term), has been developed which incorporates these processes in a unified manner. The DUST code improves upon existing codes as it has the capability to model multiple container failure times, multiple waste form release properties, and radionuclide-specific transport properties. Verification studies performed on the code are discussed.
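
    The abstract above names the processes that such a code couples. As a loose, hypothetical illustration of how a release rate can be assembled from container failure times and first-order leaching (this is not the DUST formulation; all names and numbers are assumed example values), one might write:

```python
import math

# Toy illustration of a disposal-unit source term, NOT the DUST model:
# each container has its own failure time and first-order leach constant,
# and the facility release rate is summed over containers.
def release_rate(t_years, inventories_ci, failure_times_yr, leach_constants_per_yr):
    rate = 0.0
    for q0, t_fail, lam in zip(inventories_ci, failure_times_yr, leach_constants_per_yr):
        if t_years >= t_fail:  # no release before the container fails
            rate += lam * q0 * math.exp(-lam * (t_years - t_fail))
    return rate  # Ci/yr

# Two hypothetical waste forms with different failure times and leach rates.
print(release_rate(50.0, inventories_ci=[100.0, 250.0],
                   failure_times_yr=[10.0, 40.0],
                   leach_constants_per_yr=[0.01, 0.002]))
```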

  16. Computer codes for the operational control of the research reactors

    International Nuclear Information System (INIS)

    Kalker, K.J.; Nabbi, R.; Bormann, H.J.

    1986-01-01

    Four small computer codes developed by ZFR are presented, which have been used for several years during operation of the research reactors FRJ-1, FRJ-2, AVR (all in Juelich) and DR-2 (Riso, Denmark). Because of interest from other reactor stations, the codes are documented within the framework of the IAEA Research Contract No. 3634/FG. The zero-dimensional burnup program CREMAT is used for reactor cores in which flux measurements at each individual fuel element are carried out during operation. The program yields burnup data for each fuel element and for the whole core. On the basis of these data, fuel reloading is prepared for the next operational period, taking into account the permitted minimum shutdown reactivity of the system. The program BURNY calculates burnup for fuel elements inaccessible for flux measurements, but for which 'position weighting factors' have been measured or calculated during zero-power operation of the core, and which are assumed to be constant in all operational situations. The code CURIAX calculates post-irradiation data for discharged fuel elements needed in their manipulation and transport. These three programs have been written for highly enriched fuel and take into account U-235 only. The modification of CREMAT for LEU cores and its combination with ORIGEN is in preparation. KINIK is an inverse kinetics code widely used for absorber rod calibration at the above-mentioned research reactors. It includes a special polynomial subroutine which can easily be used in other codes. (orig.)

  17. Collaborative research on fluidization employing computer-aided particle tracking

    International Nuclear Information System (INIS)

    Chen, M.M.

    1990-01-01

    The objective of this work is to obtain unique, fundamental information on fluidization dynamics over a wide range of flow regimes using a Transportable Computer-Aided Particle Tracking Apparatus (TCAPTA). The contractor will design and fabricate a transportable version of the Computer-Aided Particle Tracking Facility (CAPTF) he has previously developed. The contractor will install and operate the TCAPTA at the METC fluidization research facilities. Quantitative data on particle motion will be obtained and reduced. The data will be used to provide needed information for modeling of bed dynamics and prediction of bed performance, including erosion. A radioactive tracer particle, identical in size, shape, and mass to the bed particles under study, is mixed into the bed. The radiation emitted by the tracer particle, monitored continuously by 16 scintillation detectors, allows its position to be determined as a function of time. Stochastic mixing processes intrinsic to fluidization further cause the particle to travel to all active regions of the bed, thus sampling the motion in these regions. After a test run long enough to ensure that a sufficient sample has been acquired, time-differentiation and other statistical processing will yield the mean velocity distribution, the fluctuating velocity distribution, many types of auto- and cross-correlations, as well as mean fluxes, including the mean momentum fluxes due to random motion, which represent the kinetic contributions to the mean stress tensor.
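
    The processing chain sketched in the abstract (positions, time-differentiation, mean and fluctuating velocities, correlations) can be illustrated on synthetic data; the trajectory and all numbers below are assumptions, not CAPTF/TCAPTA output.

```python
import numpy as np

# Illustrative post-processing of a tracer trajectory (synthetic data):
# time-differentiation gives instantaneous velocities, from which mean and
# fluctuating components and a simple autocorrelation are estimated.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 6001)                            # time, s
z = 0.2 * np.sin(2 * np.pi * 0.2 * t) + 0.01 * rng.standard_normal(t.size)  # axial position, m

v = np.gradient(z, t)                                       # instantaneous velocity, m/s
v_mean = v.mean()
v_fluct = v - v_mean
v_rms = np.sqrt((v_fluct ** 2).mean())

# normalized autocorrelation of the velocity fluctuations
acf = np.correlate(v_fluct, v_fluct, mode="full")[v_fluct.size - 1:]
acf /= acf[0]
print(f"mean velocity {v_mean:.3f} m/s, rms fluctuation {v_rms:.3f} m/s, acf[1] {acf[1]:.3f}")
```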

  18. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  19. 77 FR 31026 - Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...

    Science.gov (United States)

    2012-05-24

    ...] Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency... entitled: ``Use of Computer Simulation of the United States Blood Supply in Support of Planning for... and panel discussions with experts from academia, regulated industry, government, and other...

  20. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  1. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  2. 24 CFR 290.21 - Computing annual number of units eligible for substitution of tenant-based assistance or...

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Computing annual number of units eligible for substitution of tenant-based assistance or alternative uses. 290.21 Section 290.21 Housing and... Multifamily Projects § 290.21 Computing annual number of units eligible for substitution of tenant-based...

  3. Research on Anoplophora glabripennis in the United States

    Science.gov (United States)

    Robert A. Haack

    2003-01-01

    In the mid-1990s it was estimated that more than 400 exotic (non-native) forest insects had already become established in the United States (HAACK and BYLER, 1993; MATTSON et al., 1994; NIEMELA and MATTSON, 1996). This number has continued to grow with new exotics discovered annually in the United States (HAACK, 2002; HAACK and POLAND, 2001; HAACK et al., 2002). One...

  4. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people’s lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
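
    To make the split-then-merge pattern behind MapReduce concrete, here is a minimal analogue in plain Python (an illustration only; it is not Hadoop, MPI or OpenMP, and the input chunks are invented):

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

# Minimal map/reduce analogue: the "map" phase counts words in each chunk in
# parallel, and the "reduce" phase merges the partial counts.
def map_count(chunk: str) -> Counter:
    return Counter(chunk.split())

def merge_counts(a: Counter, b: Counter) -> Counter:
    a.update(b)
    return a

if __name__ == "__main__":
    chunks = ["cloud computing grid computing",
              "parallel computing cloud",
              "mpi openmp mapreduce"]
    with Pool() as pool:
        partial = pool.map(map_count, chunks)              # map phase
    total = reduce(merge_counts, partial, Counter())       # reduce phase
    print(total.most_common(3))
```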

  5. Initial quantitative evaluation of computed radiography in an intensive care unit

    International Nuclear Information System (INIS)

    Hillis, D.J.; McDonald, I.G.; Kelly, W.J.

    1996-01-01

    The first computed radiography (CR) unit in Australia was installed at St Vincent's Hospital, Melbourne, in February 1994. An initial qualitative evaluation of the attitude of the intensive care unit (ICU) physicians to the CR unit was conducted by use of a survey. The results of the survey of ICU physicians indicated that images were available faster than under the previous system and that the use of the CR system was preferred to evaluate chest tubes and line placements. While it is recognized that a further detailed radiological evaluation of the CR system is required to establish the diagnostic performance of CR compared with conventional film, some comments on the implementation of the system and ICU physician attitudes to the CR system are put forward for consideration by other hospitals examining the possible use of CR systems. 11 refs., 1 tab

  6. All-optical quantum computing with a hybrid solid-state processing unit

    International Nuclear Information System (INIS)

    Pei Pei; Zhang Fengyang; Li Chong; Song Heshan

    2011-01-01

    We develop an architecture of a hybrid quantum solid-state processing unit for universal quantum computing. The architecture allows distant and nonidentical solid-state qubits in distinct physical systems to interact and work collaboratively. All the quantum computing procedures are controlled by optical methods using classical fields and cavity QED. Our methods have the prominent advantage of insensitivity to dissipation processes, benefiting from the virtual excitation of the subsystems. Moreover, quantum nondemolition measurements and state transfer for the solid-state qubits are proposed. The architecture opens promising perspectives for implementing scalable quantum computation, in the broader sense that different solid-state systems can be merged and integrated into one quantum processor.

  7. Unit cell-based computer-aided manufacturing system for tissue engineering

    International Nuclear Information System (INIS)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-01-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering. (paper)

  8. Unit cell-based computer-aided manufacturing system for tissue engineering.

    Science.gov (United States)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-03-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering.
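
    As a hypothetical sketch of the ideas in the two records above (a data structure for a single pore unit cell and a tiling step that repeats it into a scaffold lattice; this is not the authors' CAM software, and all dimensions are assumed):

```python
from dataclasses import dataclass
from typing import List, Tuple

# One cell describes a single pore; a scaffold is laid out by repeating the
# cell on a regular grid before tool paths are generated for each cell.
@dataclass
class UnitCell:
    size_um: float   # edge length of the cubic cell, micrometres
    pore_um: float   # pore (void) size inside the cell, micrometres

    def strut_um(self) -> float:
        return self.size_um - self.pore_um

def tile_scaffold(cell: UnitCell, nx: int, ny: int, nz: int) -> List[Tuple[float, float, float]]:
    """Origins (micrometres) of every repeated cell in the scaffold lattice."""
    return [(i * cell.size_um, j * cell.size_um, k * cell.size_um)
            for i in range(nx) for j in range(ny) for k in range(nz)]

cell = UnitCell(size_um=300.0, pore_um=200.0)
print(len(tile_scaffold(cell, nx=10, ny=10, nz=5)), cell.strut_um())
```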

  9. Computing the Density Matrix in Electronic Structure Theory on Graphics Processing Units.

    Science.gov (United States)

    Cawkwell, M J; Sanville, E J; Mniszewski, S M; Niklasson, Anders M N

    2012-11-13

    The self-consistent solution of a Schrödinger-like equation for the density matrix is a critical and computationally demanding step in quantum-based models of interatomic bonding. This step was tackled historically via the diagonalization of the Hamiltonian. We have investigated the performance and accuracy of the second-order spectral projection (SP2) algorithm for the computation of the density matrix via a recursive expansion of the Fermi operator in a series of generalized matrix-matrix multiplications. We demonstrate that owing to its simplicity, the SP2 algorithm [Niklasson, A. M. N. Phys. Rev. B 2002, 66, 155115] is exceptionally well suited to implementation on graphics processing units (GPUs). The performance in double and single precision arithmetic of hybrid GPU/central processing unit (CPU) and full GPU implementations of the SP2 algorithm exceeds that of a CPU-only implementation of the SP2 algorithm and traditional matrix diagonalization when the dimensions of the matrices exceed about 2000 × 2000. Padding schemes for arrays allocated in the GPU memory that optimize the performance of the CUBLAS implementations of the level 3 BLAS DGEMM and SGEMM subroutines for generalized matrix-matrix multiplications are described in detail. The analysis of the relative performance of the hybrid CPU/GPU and full GPU implementations indicates that the transfer of arrays between the GPU and CPU constitutes only a small fraction of the total computation time. The errors measured in the self-consistent density matrices computed using the SP2 algorithm are generally smaller than those measured in matrices computed via diagonalization. Furthermore, the errors in the density matrices computed using the SP2 algorithm do not exhibit any dependence on system size, whereas the errors increase linearly with the number of orbitals when diagonalization is employed.
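
    A compact sketch of the SP2 recursion with dense NumPy matrices follows; in the GPU setting described above, the matrix products per iteration would be routed through GEMM (e.g. cuBLAS). The random symmetric "Hamiltonian", the occupation count, and the simple trace-based stopping rule are assumptions made for illustration only.

```python
import numpy as np

def sp2_density_matrix(H, n_occ, eps_min, eps_max, tol=1e-9, max_iter=100):
    n = H.shape[0]
    # Map the spectrum of H into [0, 1], with occupied states pushed toward 1.
    X = (eps_max * np.eye(n) - H) / (eps_max - eps_min)
    for _ in range(max_iter):
        X2 = X @ X                      # the dominant GEMM cost
        if np.trace(X) > n_occ:
            X = X2                      # purification step that lowers the trace
        else:
            X = 2.0 * X - X2            # step that raises the trace
        if abs(np.trace(X) - n_occ) < tol:
            break
    return X

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 200))
H = 0.5 * (A + A.T)                     # toy symmetric "Hamiltonian"
eigs = np.linalg.eigvalsh(H)
P = sp2_density_matrix(H, n_occ=100, eps_min=eigs[0], eps_max=eigs[-1])
print(np.trace(P), np.linalg.norm(P @ P - P))   # trace ~ n_occ, P nearly idempotent
```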

  10. Model Selection in Historical Research Using Approximate Bayesian Computation

    Science.gov (United States)

    Rubio-Campillo, Xavier

    2016-01-01

    Formal Models and History: Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study: This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact: Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
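
    A minimal Approximate Bayesian Computation rejection sketch illustrates the general method described above (it is not the paper's implementation): parameters are drawn from each model's prior, the models are simulated, simulations close to the observed summary statistic are accepted, and the ratio of acceptance rates approximates a Bayes factor. The observed value, priors, and toy models are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
observed = 4.2   # toy observed summary statistic

def simulate_linear(theta):
    return 2.0 * theta + rng.normal(0.0, 0.5)

def simulate_squared(theta):
    return theta ** 2 + rng.normal(0.0, 0.5)

def abc_acceptance_rate(simulator, n=50_000, tol=0.1):
    thetas = rng.uniform(0.0, 5.0, n)                 # prior draws
    sims = np.array([simulator(t) for t in thetas])
    return np.mean(np.abs(sims - observed) < tol)     # fraction accepted

bf = abc_acceptance_rate(simulate_linear) / abc_acceptance_rate(simulate_squared)
print(f"approximate Bayes factor, linear vs squared: {bf:.2f}")
```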

  11. Research and realization implementation of monitor technology on illegal external link of classified computer

    Science.gov (United States)

    Zhang, Hong

    2017-06-01

    In recent years, with the continuous development and application of network technology, network security has gradually entered people's field of vision. Illegal external connections from hosts on an internal network to an external network are an important source of network security threats. At present, most work units pay a certain degree of attention to network security and have adopted many means and methods to prevent network security problems, such as physically isolating the internal network and installing a firewall at the exit. However, these measures are often undermined by human behavior that does not comply with the security rules. For example, a host that accesses the Internet through a wireless connection or a dual network card inadvertently forms a two-way connection between the external network and the internal computers [1]. As a result, important documents and confidential information can leak even without the user being aware of it at all. Monitoring technology for illegal external connections of classified computers can largely prevent such violations by monitoring the behavior of the offending connection. In this paper, we mainly research and discuss this monitoring technology for classified computers.

  12. Research on the Current Telecommuting Trends in United States and European Union Markets

    Directory of Open Access Journals (Sweden)

    Catalina Georgiana PICU

    2016-12-01

    Full Text Available In the context of globalization and due to the accelerated progress made in information and communication technology, more and more companies offer their employees the option of telecommuting. For the past twenty years, telecommuting has been on an ascending trend, with an increasing number of people embracing the ability to work from home, using a computer and an internet connection to communicate for their jobs. The goal of the paper is to explain the overall notion of telecommuting and to analyze the current trends in the United States and European Union markets. Another objective is to assess the advantages and disadvantages of telecommuting and the important role played by this concept within corporations' efforts to achieve their strategic targets. The study is based upon specialized literature reviews regarding the emergence of telecommuting and the role it plays in organizations. A comparative analysis was conducted by the authors on two regional markets, the United States and the European Union, in order to assess the trend in telecommuting and the factors that influence it. The outcome of the research shows that although the benefits of telecommuting are numerous, it does not come without challenges, applicable both to the employer and to the employee. The research results of this study can be used by organizations when considering offering their employees flexible work opportunities, which can positively influence long-term business performance.

  13. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States); Bell, Greg [ESnet, Berkeley, CA (United States); Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [ESnet, Berkeley, CA (United States); Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States); Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tierney, Brian [ESnet, Berkeley, CA (United States)

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  14. Computer use and vision-related problems among university students in ajman, United arab emirate.

    Science.gov (United States)

    Shantakumari, N; Eldeeb, R; Sreedharan, J; Gopal, K

    2014-03-01

    The extensive use of computers as a medium of teaching and learning in universities necessitates introspection into the extent of computer-related health disorders among the student population. This study was undertaken to assess the pattern of computer usage and related visual problems among university students in Ajman, United Arab Emirates. A total of 500 students studying at Gulf Medical University, Ajman and Ajman University of Science and Technology were recruited into this study. Demographic characteristics, pattern of usage of computers and associated visual symptoms were recorded in a validated self-administered questionnaire. The chi-square test was used to determine the significance of the observed differences between the variables; the level of statistical significance was set at P < 0.05. The most common symptoms reported by computer users were headache - 53.3% (251/471), burning sensation in the eyes - 54.8% (258/471) and tired eyes - 48% (226/471). Female students were found to be at a higher risk. Nearly 72% of students reported frequent interruption of computer work. Headache caused interruption of work in 43.85% (110/168) of the students, while tired eyes caused interruption of work in 43.5% (98/168) of the students. When the screen was viewed at a distance of more than 50 cm, the prevalence of headaches decreased by 38% (50-100 cm - OR: 0.62, 95% confidence interval [CI]: 0.42-0.92). The prevalence of tired eyes increased by 89% when screen filters were not used (OR: 1.894, 95% CI: 1.065-3.368). A high prevalence of vision-related problems was noted among university students. Sustained periods of close screen work without screen filters were found to be associated with the occurrence of the symptoms and increased interruptions of the students' work. There is a need to increase ergonomic awareness among students, and corrective measures need to be implemented to reduce the impact of computer-related vision problems.

  15. Reactor aging research. United States Nuclear Regulatory Commission

    International Nuclear Information System (INIS)

    Vassilaros, M.G.

    1998-01-01

    The reactor ageing research activities in the USA described here are focused on research into reactor vessel integrity, including regulatory issues and technical aspects. Current emphases are described for fracture analysis, embrittlement research, inspection capabilities, validation of the annealing rule, and revision of the regulatory guide.

  16. Research Note Topographical units and soil types prove more ...

    African Journals Online (AJOL)

    The floristic data (species presence at each site) were grouped into Land Types, topographical units and broad soil types. Each group was analysed independently using multivariate detrended correspondence analysis (DCA) and the mean similarity test. The floristic data in each Land Type showed a 42% range of ...

  17. On dosimetry of radiodiagnosis facilities, mainly focused on computed tomography units

    International Nuclear Information System (INIS)

    Ghitulescu, Zoe

    2008-01-01

    The 'talk' refers to the dosimetry of computed tomography units and is structured in three parts, each stressed to a greater or lesser degree: 1) basics of image acquisition using the computed tomography technique; 2) effective dose calculation for a patient and its assessment using the BERT concept; and 3) recommended actions for reaching a good compromise between the delivered dose and the image quality. The aim of the first part is to acquaint the reader with the CT technique so as to make the worked example of effective dose calculation, and its conversion into time units using the BERT concept, easier to follow. The conclusion drawn is that the effective dose, calculated by the medical physicist (using dedicated software for the CT scanner and the examination type) and converted into time units through the BERT concept, could then be communicated by the radiologist together with the diagnostic notes. Thus, a minimum of information for patients regarding the nature and type of radiation is clearly necessary, for instance with the help of leaflets. The third part discusses the factors that lead to good image quality while taking into account the ALARA principle of radiation protection, which states that the dose should be 'as low as reasonably achievable'. (author)
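
    The BERT conversion itself is simple arithmetic: the effective dose divided by the annual natural background gives an equivalent exposure time. The sketch below uses the commonly quoted world-average background of about 2.4 mSv/year and an assumed 8 mSv example CT dose; neither figure is taken from the talk.

```python
# Toy conversion of an effective dose into Background Equivalent Radiation Time (BERT).
ANNUAL_BACKGROUND_MSV = 2.4  # assumed world-average natural background, mSv/year

def bert_months(effective_dose_msv: float) -> float:
    """Months of natural background radiation equivalent to the given dose."""
    return 12.0 * effective_dose_msv / ANNUAL_BACKGROUND_MSV

print(f"8 mSv is roughly {bert_months(8.0):.0f} months of natural background")
```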

  18. IV International Conference on Computer Algebra in Physical Research. Collection of abstracts

    International Nuclear Information System (INIS)

    Rostovtsev, V.A.

    1990-01-01

    The abstracts of the reports presented at the IV International Conference on Computer Algebra in Physical Research are collected. The possibilities of applying computers to algebraic computations in high energy physics and quantum field theory are discussed. Particular attention is paid to software for the REDUCE computer algebra system.

  19. Computer-aided assessment of diagnostic images for epidemiological research

    Directory of Open Access Journals (Sweden)

    Gange Stephen J

    2009-11-01

    Full Text Available Abstract Background: Diagnostic images are often assessed for clinical outcomes using subjective methods, which are limited by the skill of the reviewer. Computer-aided diagnosis (CAD) algorithms that assist reviewers in their decisions concerning outcomes have been developed to increase sensitivity and specificity in the clinical setting. However, these systems have not been well utilized in research settings to improve the measurement of clinical endpoints. Reductions in bias through their use could have important implications for etiologic research. Methods: Using the example of cortical cataract detection, we developed an algorithm for assisting a reviewer in evaluating digital images for the presence and severity of lesions. Available image processing and statistical methods that were easily implementable were used as the basis for the CAD algorithm. The performance of the system was compared to the subjective assessment of five reviewers using 60 simulated images. Cortical cataract severity scores from 0 to 16 were assigned to the images by the reviewers and the CAD system, with each image assessed twice to obtain a measure of variability. Image characteristics that affected reviewer bias were also assessed by systematically varying the appearance of the simulated images. Results: The algorithm yielded severity scores with smaller bias on images where cataract severity was mild to moderate (approximately ≤ 6/16ths). On high severity images, the bias of the CAD system exceeded that of the reviewers. The variability of the CAD system was zero on repeated images but ranged from 0.48 to 1.22 for the reviewers. The direction and magnitude of the bias exhibited by the reviewers was a function of the number of cataract opacities, the shape and the contrast of the lesions in the simulated images. Conclusion: CAD systems are feasible to implement with available software and can be valuable when medical images contain exposure or outcome information for

  20. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Almgren, Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); DeMar, Phil [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Riley, Katherine [Argonne Leadership Computing Facility, Argonne, IL (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bosilca, George [Univ. of Tennessee, Knoxville, TN (United States); Cappello, Frank [Argonne National Lab. (ANL), Argonne, IL (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Hill, Judy [Oak Ridge Leadership Computing Facility, Oak Ridge, TN (United States); Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States); McInnes, Lois Curfman [Argonne National Lab. (ANL), Argonne, IL (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moore, Shirley [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moreland, Ken [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roser, Rob [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Shende, Sameer [Univ. of Oregon, Eugene, OR (United States); Shipman, Galen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-06-20

    The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing and Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  1. Research on the Teaching System of the University Computer Foundation

    Directory of Open Access Journals (Sweden)

    Ji Xiaoyun

    2016-01-01

    Full Text Available In teaching the college computer foundation course, the teaching contents are classified and taught hierarchically according to the specific needs of students in different professional programs, combined with professional-level training; comprehensive after-class training methods are promoted for top-notch students; and an online Q&A and testing platform is established to strengthen the integration of professional education and computer education. Through study and exploration of the training system for the college computer foundation course, and the popularization and application of the basic programming course, the cultivation of university students' computing foundations, thinking methods, and innovative practical ability is promoted, achieving the goal of individualized education.

  2. First Author Research Productivity of United States Radiation Oncology Residents: 2002-2007

    International Nuclear Information System (INIS)

    Morgan, Peter B.; Sopka, Dennis M.; Kathpal, Madeera; Haynes, Jeffrey C.; Lally, Brian E.; Li, Linna

    2009-01-01

    Purpose: Participation in investigative research is a required element of radiation oncology residency in the United States. Our purpose was to quantify the first author research productivity of recent U.S. radiation oncology residents during their residency training. Methods and Materials: We performed a computer-based search of PubMed and a manual review of the proceedings of the annual meetings of the American Society for Therapeutic Radiology and Oncology to identify all publications and presented abstracts with a radiation oncology resident as the first author between 2002 and 2007. Results: Of 1,098 residents trained at 81 programs, 50% published ≥1 article (range, 0-9), and 53% presented ≥1 abstract (range, 0-3) at an American Society for Therapeutic Radiology and Oncology annual meeting. The national average was 1.01 articles published and 1.09 abstracts presented per resident during 4 years of training. Of 678 articles published, 82% represented original research and 18% were review articles. Residents contributed 15% of all abstracts at American Society for Therapeutic Radiology and Oncology annual meetings, and the resident contribution to orally presented abstracts increased from 12% to 21% during the study period. Individuals training at programs with >6 residents produced roughly twice as many articles and abstracts. Holman Research Pathway residents produced double the national average of articles and abstracts. Conclusion: Although variability exists among individuals and among training programs, U.S. radiation oncology residents routinely participate in investigative research suitable for publication or presentation at a scientific meeting. These data provide national research benchmarks that can assist current and future radiation oncology residents and training programs in their self-assessment and research planning.

  3. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  4. Tribal wilderness research needs and issues in the United States and Canada

    Science.gov (United States)

    Dan McDonald; Tom McDonald; Leo H. McAvoy

    2000-01-01

    This paper represents a dialogue between tribal wilderness managers and researchers on the primary research needs of tribal wilderness in the United States and Canada. The authors identify a number of research priorities for tribal wildlands. The paper also discusses some major issues and challenges faced by researchers conducting research in areas that are culturally...

  5. Computer in surgery | Bode | Nigerian Journal of Surgical Research

    African Journals Online (AJOL)

    How has the advent of the computer impacted the field of surgery? Is it worth embracing for the older practitioners? What does the future portend for our ancient noble profession? This paper reviews current applications of computer technology in the field of surgery and the hopes it holds out to surgeons in developing ...

  6. Computer-Assisted Language Learning: Diversity in Research and Practice

    Science.gov (United States)

    Stockwell, Glenn, Ed.

    2012-01-01

    Computer-assisted language learning (CALL) is an approach to teaching and learning languages that uses computers and other technologies to present, reinforce, and assess material to be learned, or to create environments where teachers and learners can interact with one another and the outside world. This book provides a much-needed overview of the…

  7. Computer - based modeling in extract sciences research -III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have been of great applicability in the study of the biological sciences and other exact science fields like agriculture, mathematics, computer science and the like. In this write up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  8. SU-D-BRD-03: A Gateway for GPU Computing in Cancer Radiotherapy Research

    Energy Technology Data Exchange (ETDEWEB)

    Jia, X; Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); Shi, F; Yan, H; Yan, Y; Jiang, S [UT Southwestern Medical Center, Dallas, TX (United States); Sivagnanam, S; Majumdar, A [University of California San Diego, La Jolla, CA (United States)

    2014-06-01

    Purpose: The Graphics Processing Unit (GPU) has become increasingly important in radiotherapy. However, it is still difficult for general clinical researchers to access GPU codes developed by other researchers, and for developers to objectively benchmark their codes. Moreover, repeated effort is often spent on developing low-quality GPU codes. The goal of this project is to establish an infrastructure for testing GPU codes, cross-comparing them, and facilitating code distribution in the radiotherapy community. Methods: We developed a system called Gateway for GPU Computing in Cancer Radiotherapy Research (GCR2). A number of GPU codes developed by our group and other developers can be accessed via a web interface. To use the services, researchers first upload their test data or use the standard data provided by our system. They then select the GPU device on which the code will be executed. Our system offers all mainstream GPU hardware for code benchmarking purposes. After the run is complete, the system automatically summarizes and displays the computing results. We also released an SDK to allow developers to build their own algorithm implementations and submit their binary codes to the system. The submitted code is then systematically benchmarked using a variety of GPU hardware and representative data provided by our system. The developers can also compare their codes with others and generate benchmarking reports. Results: The developed system is fully functional. Through a user-friendly web interface, researchers are able to test various GPU codes. Developers also benefit from this platform by comprehensively benchmarking their codes on various GPU platforms and representative clinical data sets. Conclusion: We have developed an open platform allowing clinical researchers and developers to access GPUs and GPU codes. This development will facilitate the utilization of GPUs in the radiation therapy field.
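
    A minimal sketch of the kind of timing harness such a benchmarking service might wrap around a submitted code, assuming a Python entry point and a NumPy workload standing in for the actual GPU kernel; the function name run_dose_kernel, the data shape, and the repetition count are illustrative assumptions, not part of GCR2.

        import time
        import numpy as np

        def run_dose_kernel(volume):
            # Placeholder workload standing in for a submitted GPU dose-calculation code.
            return np.sqrt(volume) + volume.mean()

        def benchmark(workload, data, repeats=5):
            """Run the submitted workload several times and summarize the timings."""
            timings = []
            for _ in range(repeats):
                start = time.perf_counter()
                workload(data)
                timings.append(time.perf_counter() - start)
            return {"runs": repeats, "best_s": min(timings), "mean_s": sum(timings) / repeats}

        if __name__ == "__main__":
            phantom = np.random.rand(128, 128, 128).astype(np.float32)  # stand-in test data
            print(benchmark(run_dose_kernel, phantom))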

  9. Computer-aided modeling of aluminophosphate zeolites as packings of building units

    KAUST Repository

    Peskov, Maxim

    2012-03-22

    New building schemes of aluminophosphate molecular sieves from packing units (PUs) are proposed. We have investigated 61 framework types discovered in zeolite-like aluminophosphates and have identified important PU combinations using a recently implemented computational algorithm of the TOPOS package. All PUs whose packing completely determines the overall topology of the aluminophosphate framework were described and catalogued. We have enumerated 235 building models for the aluminophosphates belonging to 61 zeolite framework types, from ring- or cage-like PU clusters. It is indicated that PUs can be considered as precursor species in the zeolite synthesis processes. © 2012 American Chemical Society.

  10. Key considerations for the success of Medical Education Research and Innovation units in Canada: unit director perceptions.

    Science.gov (United States)

    Varpio, Lara; Bidlake, Erin; Humphrey-Murto, Sue; Sutherland, Stephanie; Hamstra, Stanley J

    2014-08-01

    Growth in the field of medical education is evidenced by the proliferation of units dedicated to advancing Medical Education Research and Innovation (MERI). While a review of the literature discovered narrative accounts of MERI unit development, we found no systematic examinations of the dimensions of and structures that facilitate the success of these units. We conducted qualitative interviews with the directors of 12 MERI units across Canada. Data were analyzed using qualitative description (Sandelowski in Res Nurs Health 23:334-340, 2000). Final analysis drew on Bourdieu's (Outline of a theory of practice. Cambridge University Press, Cambridge, 1977; Media, culture and society: a critical reader. Sage, London, 1986; Language and symbolic power. Harvard University Press, Cambridge, 1991) concepts of field, habitus, and capital, and more recent research investigating the field of MERI (Albert in Acad Med 79:948-954, 2004; Albert et al. in Adv Health Sci Educ 12:103-115, 2007). When asked about the metrics by which they define their success, directors cited: teaching, faculty mentoring, building collaborations, delivering conference presentations, winning grant funding, and disseminating publications. Analyzed using Bourdieu's concepts, these metrics are discussed as forms of capital that have been legitimized in the MERI field. All directors, with the exception of one, described success as comprising elements (capital) at both ends of the service-research spectrum (i.e., Albert's PP-PU structure). Our analysis highlights the forms of habitus (i.e., behaviors, attitudes, demeanors) directors use to negotiate, strategize and position the unit within their local context. These findings may assist institutions in developing a new, or reorganizing an existing, MERI unit. We posit that a better understanding of these complex social structures can help units become savvy participants in the MERI field. With such insight, units can improve their academic output and

  11. Embedded, everywhere: a research agenda for networked systems of embedded computers

    National Research Council Canada - National Science Library

    Committee on Networked Systems of Embedded Computers; National Research Council Staff; Division on Engineering and Physical Sciences; Computer Science and Telecommunications Board; National Academy of Sciences

    2001-01-01

    .... Embedded, Everywhere explores the potential of networked systems of embedded computers and the research challenges arising from embedding computation and communications technology into a wide variety of applications ...

  12. [Research Conducted at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1997-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.

  13. Research in progress at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1987-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.

  14. Magnetic fusion energy and computers: the role of computing in magnetic fusion energy research and development

    International Nuclear Information System (INIS)

    1979-10-01

    This report examines the role of computing in the Department of Energy magnetic confinement fusion program. The present status of the MFECC and its associated network is described. The third part of this report examines the role of computer models in the main elements of the fusion program and discusses their dependence on the most advanced scientific computers. A review of requirements at the National MFE Computer Center was conducted in the spring of 1976. The results of this review led to the procurement of the CRAY 1, the most advanced scientific computer available, in the spring of 1978. The utilization of this computer in the MFE program has been very successful and is also described in the third part of the report. A new study of computer requirements for the MFE program was conducted during the spring of 1979, and the results of this analysis are presented in the fourth part of this report.

  15. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    Energy Technology Data Exchange (ETDEWEB)

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  16. Feasibility Study and Cost Benefit Analysis of Thin-Client Computer System Implementation Onboard United States Navy Ships

    National Research Council Canada - National Science Library

    Arbulu, Timothy D; Vosberg, Brian J

    2007-01-01

    The purpose of this MBA project was to conduct a feasibility study and a cost benefit analysis of using thin-client computer systems instead of traditional networks onboard United States Navy ships...

  17. Software architecture for a multi-purpose real-time control unit for research purposes

    Science.gov (United States)

    Epple, S.; Jung, R.; Jalba, K.; Nasui, V.

    2017-05-01

    A new, freely programmable, scalable control system for academic research purposes was developed. The intention was to have a control unit capable of handling multiple PT1000 temperature sensors at reasonable accuracy and temperature range, as well as handling digital input signals and providing powerful output signals. To take full advantage of the system, control loops are run in real time. The whole eight-bit system, with very limited memory, runs independently of a personal computer. The two on-board RS232 connectors allow further units or other equipment to be connected, as required, in real time. This paper describes the software architecture for the third prototype, which now provides stable measurements and an improvement in accuracy compared to the previous designs. As a test case, a thermal solar system to produce hot tap water and assist heating in a single-family house was implemented. The solar fluid pump was power-controlled, and several temperatures at different points in the hydraulic system were measured and used in the control algorithms. The software architecture proved suitable for testing several different control strategies and their corresponding algorithms for the thermal solar system.
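
    For illustration only (this is not the authors' firmware), the sketch below shows two ingredients such a unit combines: converting a PT1000 resistance reading to temperature via the standard IEC 60751 coefficients, and a simple proportional rule for setting the solar-pump power. The function names, the gain, and the example readings are assumptions.

        # Hypothetical sketch of a PT1000 read-out and a proportional pump-control rule.
        A = 3.9083e-3    # IEC 60751 coefficient (1/degC)
        B = -5.775e-7    # IEC 60751 coefficient (1/degC^2)
        R0 = 1000.0      # PT1000 resistance at 0 degC (ohm)

        def pt1000_to_celsius(resistance_ohm):
            """Invert R(T) = R0*(1 + A*T + B*T^2) for temperatures at or above 0 degC."""
            c = 1.0 - resistance_ohm / R0
            return (-A + (A * A - 4.0 * B * c) ** 0.5) / (2.0 * B)

        def pump_power_percent(collector_c, tank_c, gain=10.0):
            """Proportional control: more pump power the hotter the collector is vs. the tank."""
            power = gain * (collector_c - tank_c)
            return max(0.0, min(100.0, power))

        # A collector reading of 1385.1 ohm corresponds to roughly 100 degC.
        print(round(pt1000_to_celsius(1385.1), 1), pump_power_percent(100.0, 55.0))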

  18. The research of high voltage switchgear detecting unit

    Science.gov (United States)

    Ji, Tong; Xie, Wei; Wang, Xiaoqing; Zhang, Jinbo

    2017-07-01

    In order to understand the status of a high voltage switch over its whole life cycle, the mechanical and electrical parameters that affect device health must be monitored. This paper therefore presents a new high voltage switchgear detecting unit based on ARM technology. It can measure the closing-opening mechanical wave, the storage motor current wave and the contactor temperature to judge the device’s health status. When something goes wrong, it raises an alert and gives advice. Practice showed that it meets the requirements for online detection of circuit breaker mechanical properties and temperature.

  19. The research of computer network security and protection strategy

    Science.gov (United States)

    He, Jian

    2017-05-01

    With the widespread popularity of computer network applications, network security has also received a high degree of attention. The factors affecting network safety are complex, and doing a good job of network security is systematic work that poses a high challenge. Addressing the safety and reliability problems of computer network systems, this paper, drawing on practical work experience, offers suggestions and measures covering network security threats, security technology, and system design principles, so that users of computer networks can enhance their safety awareness and master certain network security techniques.

  20. The United States Advanced Reactor Technologies Research and Development Program

    International Nuclear Information System (INIS)

    O’Connor, Thomas J.

    2014-01-01

    The following aspects are addressed: • Nuclear energy mission; • Reactor research, development and deployment (RD&D) programs: - Light Water Reactor Sustainability Program; - Small Modular Reactor Licensing Technical Support; - Advanced Reactor Technologies (ART)

  1. Market research of window units and doors industry in Russia

    OpenAIRE

    Grishankova, Elena

    2010-01-01

    The purpose of this research is to analyze macro-environmental and competitive forces in the Russian market and to determine possible entry modes for a new company. Some practical information on legal issues and regulatory organizations is also included in the paper in order to create a comprehensive overview of any potentially influential factors. The conceptual framework is based on the macro-environmental market research approach, Michael Porter’s five forces framework and internationa...

  2. United States Crystalline Repository Project - key research areas

    International Nuclear Information System (INIS)

    Patera, E.S.

    1986-01-01

    The Crystalline Repository Project is responsible for siting the second high-level nuclear waste repository in crystalline rock for the US Department of Energy. A methodology is being developed to define data and information needs and a way to evaluate that information. The areas of research the Crystalline Repository Project is involved in include fluid flow in a fractured network; coupled thermal, chemical and flow processes; and cooperation in other nations' and OECD research programs.

  3. Uniting Resilience Research and Practice With an Inequalities Approach

    Directory of Open Access Journals (Sweden)

    Angie Hart

    2016-12-01

    Full Text Available The concept of resilience has evolved, from an individual-level characteristic to a wider ecological notion that takes into account broader person–environment interactions, generating an increased interest in health and well-being research, practice and policy. At the same time, the research and policy-based attempts to build resilience are increasingly under attack for responsibilizing individuals and maintaining, rather than challenging, the inequitable structure of society. When adversities faced by children and young people result from embedded inequality and social disadvantage, resilience-based knowledge has the potential to influence the wider adversity context. Therefore, it is vital that conceptualizations of resilience encompass this potential for marginalized people to challenge and transform aspects of their adversity, without holding them responsible for the barriers they face. This article outlines and provides examples from an approach that we are taking in our research and practice, which we have called Boingboing resilience. We argue that it is possible to bring resilience research and practice together with a social justice approach, giving equal and simultaneous attention to individuals and to the wider system. To achieve this goal, we suggest future research should have a co-produced and inclusive research design that overcomes the dilemma of agency and responsibility, contains a socially transformative element, and has the potential to empower children, young people, and families.

  4. United Kingdom health research analyses and the benefits of shared data.

    Science.gov (United States)

    Carter, James G; Sherbon, Beverley J; Viney, Ian S

    2016-06-24

    To allow research organisations to co-ordinate activity to the benefit of national and international funding strategies requires assessment of the funding landscape; this, in turn, relies on a consistent approach for comparing expenditure on research. Here, we discuss the impact and benefits of the United Kingdom's Health Research Classification System (HRCS) in national landscaping analysis of health research and the pros and cons of performing large-scale funding analyses. The first United Kingdom health research analysis (2004/2005) brought together the 11 largest public and charity funders of health research to develop the HRCS and use this categorisation to examine United Kingdom health research. The analysis was revisited in 2009/2010 and again in 2014. The most recent quinquennial analysis in 2014 compiled data from 64 United Kingdom research organisations, accounting for 91% of all public/charitable health research funding in the United Kingdom. The three analyses summarise the United Kingdom's health research expenditure in 2004/2005, 2009/2010 and 2014, and can be used to identify changes in research activity and disease focus over this 10 year period. The 2004/2005 analysis provided a baseline for future reporting and evidence for a United Kingdom Government review that recommended the co-ordination of United Kingdom health research should be strengthened to accelerate the translation of basic research into clinical and economic benefits. Through the second and third analyses, we observed strategic prioritisation of certain health research activities and disease areas, with a strong trend toward increased funding for more translational research, and increases in specific areas such as research on prevention. The use of HRCS in the United Kingdom to analyse the research landscape has provided benefit both to individual participatory funders and in coordinating initiatives at a national level. A modest amount of data for each project is sufficient for a

  5. Original Research Intra-abdominal fat: Comparison of computed ...

    African Journals Online (AJOL)

    advantage for composition measurement of no radiation exposure ... Computed Tomography (CT) fat segmentation represents a defined method of quantifying intra-abdominal fat, with ... spiral CT scan with 3-mm slices covering the abdomen.

  6. Research on application of computer technologies in jewelry process

    Directory of Open Access Journals (Sweden)

    Junbo Xia

    2017-06-01

    Full Text Available Jewelry production is a process that works precious raw materials with low losses in processing. The traditional manual mode is unable to meet the real needs of enterprises, while the involvement of computer technology can solve this practical problem. At present, the main problem restricting the application of computers in jewelry production is the failure to find a production model that can serve the whole industry chain with the computer as the core of production. This paper designs a “synchronous and diversified” production model with “computer aided design technology” and “rapid prototyping technology” as the core, tests it with actual production cases, and achieves certain results, which are forward-looking and advanced.

  7. Computer-aided system for cryogenic research facilities

    International Nuclear Information System (INIS)

    Gerasimov, V.P.; Zhelamsky, M.V.; Mozin, I.V.; Repin, S.S.

    1994-01-01

    A computer-aided system has been developed for the more effective choice and optimization of the design and manufacturing technologies of the superconductor for the magnet system of the International Thermonuclear Experimental Reactor (ITER), with the aim of ensuring superconductor certification. The computer-aided system provides acquisition, processing, storage and display of data describing the ongoing tests, together with the detection and analysis of any parameter deviations. In addition, it generates commands for switching off the equipment in emergency situations. ((orig.))
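
    A toy sketch of the deviation-detection step such a system implies, with named measurement channels, fixed limits, and a text command standing in for the real emergency switch-off signal; the channel names, limits, and command are illustrative assumptions, not the actual ITER test-facility interface.

        # Hypothetical monitor: flag out-of-limit parameters and issue a switch-off command.
        LIMITS = {
            "helium_inlet_K": (4.2, 5.0),      # assumed acceptable coolant temperature range
            "strand_current_kA": (0.0, 46.0),  # assumed conductor current limit
            "voltage_tap_mV": (-0.1, 0.1),     # assumed quench-detection window
        }

        def deviations(sample):
            """Return the channels whose values fall outside their configured limits."""
            return [name for name, value in sample.items()
                    if not (LIMITS[name][0] <= value <= LIMITS[name][1])]

        def process(sample):
            bad = deviations(sample)
            if bad:
                # In the real facility this would trigger the equipment switch-off.
                return "EMERGENCY_SWITCH_OFF: " + ", ".join(bad)
            return "OK"

        print(process({"helium_inlet_K": 4.5, "strand_current_kA": 47.2, "voltage_tap_mV": 0.02}))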

  8. Computing for magnetic fusion energy research: The next five years

    International Nuclear Information System (INIS)

    Mann, L.; Glasser, A.; Sauthoff, N.

    1991-01-01

    This report considers computing needs in magnetic fusion for the next five years. It is the result of two and a half years of effort by representatives of all aspects of the magnetic fusion community. The report also factors in the results of a survey that was distributed to the laboratories and universities that support fusion. There are four areas of computing support discussed: theory, experiment, engineering, and systems

  9. Increasing the trustworthiness of research results: the role of computers in qualitative text analysis

    Science.gov (United States)

    Lynne M. Westphal

    2000-01-01

    By using computer packages designed for qualitative data analysis a researcher can increase trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...

  10. Computer science handbook. Vol. 13.3. Environmental computer science. Computer science methods for environmental protection and environmental research

    International Nuclear Information System (INIS)

    Page, B.; Hilty, L.M.

    1994-01-01

    Environmental computer science is a new partial discipline of applied computer science, which makes use of methods and techniques of information processing in environmental protection. Thanks to the inter-disciplinary nature of environmental problems, computer science acts as a mediator between numerous disciplines and institutions in this sector. The handbook reflects the broad spectrum of state-of-the-art environmental computer science. The following important subjects are dealt with: environmental databases and information systems, environmental monitoring, modelling and simulation, visualization of environmental data, and knowledge-based systems in the environmental sector. (orig.) [de]

  11. Mapping Investments and Published Outputs in Norovirus Research: A Systematic Analysis of Research Funded in the United States and United Kingdom During 1997-2013.

    Science.gov (United States)

    Head, Michael G; Fitchett, Joseph R; Lichtman, Amos B; Soyode, Damilola T; Harris, Jennifer N; Atun, Rifat

    2016-02-01

    Norovirus accounts for a considerable portion of the global disease burden. Mapping national or international investments relating to norovirus research is limited. We analyzed the focus and type of norovirus research funding awarded to institutions in the United States and United Kingdom during 1997-2013. Data were obtained from key public and philanthropic funders across both countries, and norovirus-related research was identified from study titles and abstracts. Included studies were further categorized by the type of scientific investigation, and awards related to vaccine, diagnostic, and therapeutic research were identified. Norovirus publication trends are also described using data from Scopus. In total, US and United Kingdom funding investment for norovirus research was £97.6 million across 349 awards; 326 awards (amount, £84.9 million) were received by US institutions, and 23 awards (£12.6 million) were received by United Kingdom institutions. Combined, £81.2 million of the funding (83.2%) was for preclinical research, and £16.4 million (16.8%) was for translational science. Investments increased from £1.7 million in 1997 to £11.8 million in 2013. Publication trends showed a consistent temporal increase from 48 in 1997 to 182 in 2013. Despite increases over time, trends in US and United Kingdom funding for norovirus research clearly demonstrate insufficient translational research and limited investment in diagnostics, therapeutics, or vaccine research. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  12. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research

    OpenAIRE

    Stodden, Victoria; Miguez, Sheila

    2014-01-01

    The goal of this article is to coalesce a discussion around best practices for scholarly research that utilizes computational methods, by providing a formalized set of best practice recommendations to guide computational scientists and other stakeholders wishing to disseminate reproducible research, facilitate innovation by enabling data and code re-use, and enable broader communication of the output of computational scientific research. Scholarly dissemination and communication standards are...

  13. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    Science.gov (United States)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  14. Chemical Equilibrium, Unit 2: Le Chatelier's Principle. A Computer-Enriched Module for Introductory Chemistry. Student's Guide and Teacher's Guide.

    Science.gov (United States)

    Jameson, A. Keith

    Presented are the teacher's guide and student materials for one of a series of self-instructional, computer-based learning modules for an introductory, undergraduate chemistry course. The student manual for this unit on Le Chatelier's principle includes objectives, prerequisites, pretest, instructions for executing the computer program, and…
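
    As a worked illustration of the principle the module covers (this example is not taken from the module itself), consider the ammonia synthesis equilibrium:

        \mathrm{N_2(g) + 3\,H_2(g) \rightleftharpoons 2\,NH_3(g)},
        \qquad K_c = \frac{[\mathrm{NH_3}]^2}{[\mathrm{N_2}]\,[\mathrm{H_2}]^3}

    By Le Chatelier's principle, increasing the pressure shifts the equilibrium toward the side with fewer moles of gas (toward NH3), while raising the temperature shifts this exothermic reaction back toward N2 and H2.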

  15. Uncited Research Articles in Popular United States General Radiology Journals.

    Science.gov (United States)

    Rosenkrantz, Andrew B; Chung, Ryan; Duszak, Richard

    2018-05-03

    This study aimed to characterize articles in popular general radiology journals that go uncited for a decade after publication. Using the Web of Science database, we identified annual citation counts for 13,459 articles published in Radiology, American Journal of Roentgenology, and Academic Radiology between 1997 and 2006. From this article cohort, we then identified all original research articles that accrued zero citations within a decade of publication. A concurrent equal-sized cohort of most cited articles was created. Numerous characteristics of the uncited and most cited articles were identified and compared. Only 47 uncited articles went uncited for a decade after publication. When compared to the 47 most cited articles over that same window, the uncited articles were significantly (P articles, uncited articles also had significantly (P articles published in popular general radiology journals, only a very small number of original research investigations remained uncited a decade after publication. Given that citations reflect the impact of radiology research, this observation suggests that journals are appropriately selecting meaningful work. Investigators seeking to avoid futile publication might consider their research initiatives in light of these characteristics. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  16. Toxic Hazards Research Unit Annual Technical Report: 1985

    Science.gov (United States)

    1985-09-01

    varnish makers’ and painters’ naphtha, Toxicol. Appl. Pharmacol., 32:263-281. Carpenter, C. P., E. R. Kinkead, D. L. Geary, L. J. Sullivan, Jr., and J...and Pharmacology of Inorganic and Fluorine Containing Compounds, AMRL-TR-67-224, Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base

  17. Developing information technology at the Medical Research Unit of the Albert Schweitzer Hospital in Lambaréné, Gabon.

    Science.gov (United States)

    Dibacka, Paterne Lessihuin; Bounda, Yann; Nguema, Davy Ondo; Lell, Bertrand

    2010-03-01

    Information technology has become a key resource for research institutions, providing services such as hardware, software and network maintenance, as well as data management services. The IT department of the Medical Research Unit (MRU) of the Albert Schweitzer Hospital in Lambaréné, Gabon is a good example of how IT has developed at African Research Centres in recent years and demonstrates the scope of work that a modern research centre needs to offer. It illustrates the development over the past 15 years--from single computers maintained by investigators to the present situation of a group of well-trained local IT personnel who are in charge of a variety of hardware and software and who also develop applications for use in a research environment. Open source applications are particularly suited for these needs and various applications are used in data management, data analysis, accounting, administration and quality management.

  18. The possible usability of three-dimensional cone beam computed dental tomography in dental research

    Science.gov (United States)

    Yavuz, I.; Rizal, M. F.; Kiswanjaya, B.

    2017-08-01

    The innovations and advantages of three-dimensional cone beam computed dental tomography (3D CBCT) continue to grow, as does its potential use in dental research. Imaging techniques are important for planning research in dentistry. Newly improved 3D CBCT imaging systems and accessory computer programs have recently been proven effective for use in dental research. The aim of this study is to introduce 3D CBCT and open a window to future research possibilities that should be given attention in dental research.

  19. Computer interfacing of the unified systems for personnel supervising in nuclear units

    International Nuclear Information System (INIS)

    Staicu, M.

    1997-01-01

    Dosimetric supervision of personnel working in nuclear units is based on the information supplied by: 1) dosimetric data obtained by the method of thermoluminescence; 2) dosimetric data obtained by the method of photo dosimetry; 3) the records from periodic medical control. To create a unified supervision system the following elements were combined: a) an Automatic System of TLD Reading and Data Processing (SACDTL), whose data are transmitted 'on line' to the computer; b) the measuring line for the optical density of exposed dosimetric films; the interface developed within the general SACDTL ensemble could be adapted to this measurement line, and its data are likewise transmitted 'on line' to the computer; c) the medical surveillance data for each person, transmitted 'off line' to the database computer. The unified system resulting from the unification of the three supervising systems achieves the following general functions: - registering of the personnel working in the nuclear field; - recording the dosimetric data; - processing and presentation of the data; - issuing of measurement bulletins. Thus, by means of the unified database, dosimetric intercomparison and correlative studies can be undertaken. (author)
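
    A toy sketch of the per-person record such a unified database implies, combining thermoluminescence, film, and medical-control entries under one identifier; the field names, units, and worker ID are assumptions made purely for illustration.

        from dataclasses import dataclass, field

        @dataclass
        class PersonnelRecord:
            """One worker's unified record, combining the three supervision data sources."""
            person_id: str
            tld_doses_msv: list = field(default_factory=list)    # thermoluminescence readings
            film_doses_msv: list = field(default_factory=list)   # film (photo) dosimetry readings
            medical_checks: list = field(default_factory=list)   # periodic medical-control notes

            def summary(self):
                return {"person": self.person_id,
                        "tld_total_msv": round(sum(self.tld_doses_msv), 3),
                        "film_total_msv": round(sum(self.film_doses_msv), 3),
                        "medical_checks": len(self.medical_checks)}

        registry = {}   # person_id -> PersonnelRecord

        def add_tld_reading(person_id, dose_msv):
            registry.setdefault(person_id, PersonnelRecord(person_id)).tld_doses_msv.append(dose_msv)

        add_tld_reading("W-042", 0.12)
        add_tld_reading("W-042", 0.09)
        print(registry["W-042"].summary())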

  20. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  1. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  2. Toxic Hazards Research Unit Annual Technical Report: 1984

    Science.gov (United States)

    1984-09-01

    exposed to TOCP exhibited the classic lesions of delayed neuropathy (Smith and Lillie, 1931; Barnes and Denz, 1953; Cavanagh, 1954; and Fenton, 1955...Safety of Chemicals in Food, Drugs, and Cosmetics, The staff of the Division of Pharmacology of the Federal Food and Drug Administration, Austin...Annual Technical Report: 1967, AMRL-TR-67-137 (AD 834723), Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, Ohio. Fenton, J. C

  3. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    Science.gov (United States)

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    indicated that the security infrastructure in cloud services may be more compliant with the Health Insurance Portability and Accountability Act of 1996 regulations than traditional methods. To gauge the DoD's adoption of cloud technologies proposed metrics included cost factors, ease of use, automation, availability, accessibility, security, and policy compliance. Since 2009, plans and policies were developed for the use of cloud technology to help consolidate and reduce the number of data centers which were expected to reduce costs, improve environmental factors, enhance information technology security, and maintain mission support for service members. Cloud technologies were also expected to improve employee efficiency and productivity. Federal cloud computing policies within the last decade also offered increased opportunities to advance military healthcare. It was assumed that these opportunities would benefit consumers of healthcare and health science data by allowing more access to centralized cloud computer facilities to store, analyze, search and share relevant data, to enhance standardization, and to reduce potential duplications of effort. We recommend that cloud computing be considered by DoD biomedical researchers for increasing connectivity, presumably by facilitating communications and data sharing, among the various intra- and extramural laboratories. We also recommend that policies and other guidances be updated to include developing additional metrics that will help stakeholders evaluate the above mentioned assumptions and expectations. Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States 2018. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  4. Computer algebra as a research tool in physics

    International Nuclear Information System (INIS)

    Drouffe, J.M.

    1985-04-01

    The progress of computer algebra observed during these last years has certainly had an impact in physics. I want to clarify the role of these new techniques in this application domain and to analyze their present limitations. In Section 1, I describe briefly the use of algebraic manipulation programs at the elementary level. The numerical and symbolic solutions of problems are compared in Section 2. Section 3 is devoted to a prospective view of the use of computer algebra at the highest level, as an ''intelligent'' system. I recall in Section 4 what is required of a system to be used in physics.
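
    As a small, present-day illustration of algebraic manipulation programs at the elementary level (the SymPy library is used here merely as a stand-in; it postdates the report):

        import sympy as sp

        x = sp.symbols('x')

        # Simplification recognizes the identity sin(x)**2 + cos(x)**2 = 1.
        print(sp.simplify(sp.sin(x)**2 + sp.cos(x)**2 + x * sp.exp(x)))   # x*exp(x) + 1

        # Symbolic differentiation and integration of the same expression.
        print(sp.diff(x * sp.exp(x), x))                   # x*exp(x) + exp(x)
        print(sp.integrate(sp.exp(x) * (x + 1), x))        # x*exp(x)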

  5. Research of scatter correction on industry computed tomography

    International Nuclear Information System (INIS)

    Sun Shaohua; Gao Wenhuan; Zhang Li; Chen Zhiqiang

    2002-01-01

    In the scanning process of industrial computed tomography, scatter blurs the reconstructed image: the grey values of pixels deviate from their true values, and this effect needs to be corrected. With the conventional method of deconvolution, many iteration steps are needed and the computing time is unsatisfactory. The authors discuss a method combining the Ordered Subsets Convex algorithm with a scatter model to implement scatter correction, and promising results are obtained in both speed and image quality.
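
    For orientation only, the sketch below shows the kind of conventional iterative deconvolution the paper contrasts its method with: a Van Cittert-style update against a simple additive Gaussian scatter model, not the Ordered Subsets Convex algorithm itself; the kernel width, scatter fraction, and iteration count are arbitrary assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def add_scatter(image, sigma=5.0, fraction=0.3):
            """Simple scatter model: measured = true + a smooth scatter background."""
            return image + fraction * gaussian_filter(image, sigma)

        def van_cittert(measured, iterations=50):
            """Iterative correction: estimate <- estimate + (measured - model(estimate))."""
            estimate = measured.copy()
            for _ in range(iterations):
                estimate = estimate + (measured - add_scatter(estimate))
            return estimate

        phantom = np.zeros((64, 64))
        phantom[24:40, 24:40] = 1.0                      # toy object
        measured = add_scatter(phantom)
        restored = van_cittert(measured)
        print(float(np.abs(restored - phantom).max()))   # residual error after many iterations

    The many full-image updates illustrate why conventional iterative deconvolution is slow, which is the motivation for the accelerated approach discussed in the paper.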

  6. Wind Energy in the United States: Market and Research Update

    International Nuclear Information System (INIS)

    Goldman, P.R.; Thresher, R.W.; Hock, S.M.

    1999-01-01

    U.S. market activity has increased over the last two years. In 1998, new capacity totaled about 150 MW and projected 1999 capacity additions are over 600 MW. As the electricity market continues to evolve under restructuring, the U.S. Department of Energy (U.S. DOE) Wind Energy Program has positioned itself to work with industry to meet current challenges and opportunities, and prepare for the market of tomorrow. Some opportunities include green power markets and distributed applications, although a primary challenge involves the fact that avoided cost payments to renewable generators are not high enough to economically support projects. A recently incorporated power exchange in California, APX, Inc., has demonstrated that green power does attract a premium over prices on the conventional power exchange. The key elements of the U.S. DOE Wind Program are (1) Applied Research, which is critical for achieving advanced turbine designs capable of competing in a restructured market that emphasizes low cost generation; (2) Turbine Research, which supports the U.S. industry in developing competitive, high performance, reliable wind turbine technology for global energy markets; and (3) Cooperative Research and Testing, under which standards development and certification testing are the key activities for the current year

  7. An Overview of Research Infrastructure for Medieval Studies in the United States: Associations, Institutes, and Universities

    Directory of Open Access Journals (Sweden)

    Zan Kocher

    2011-11-01

    Full Text Available This overview of research infrastructure in the United States briefly mentions some institutes, universities, associations, conferences, sources of funding, types of courses, research databases, academic journals and book publishers. It intends to make American medievalist resources better accessible to colleagues from other countries, and to encourage those who wish to study in the United States and those who are using the Internet to seek printed or digital materials for their teaching or research.

  8. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research

  9. Cone beam computed tomography image guidance system for a dedicated intracranial radiosurgery treatment unit.

    Science.gov (United States)

    Ruschin, Mark; Komljenovic, Philip T; Ansell, Steve; Ménard, Cynthia; Bootsma, Gregory; Cho, Young-Bin; Chung, Caroline; Jaffray, David

    2013-01-01

    Image guidance has improved the precision of fractionated radiation treatment delivery on linear accelerators. Precise radiation delivery is particularly critical when high doses are delivered to complex shapes with steep dose gradients near critical structures, as is the case for intracranial radiosurgery. To reduce potential geometric uncertainties, a cone beam computed tomography (CT) image guidance system was developed in-house to generate high-resolution images of the head at the time of treatment, using a dedicated radiosurgery unit. The performance and initial clinical use of this imaging system are described. A kilovoltage cone beam CT system was integrated with a Leksell Gamma Knife Perfexion radiosurgery unit. The X-ray tube and flat-panel detector are mounted on a translational arm, which is parked above the treatment unit when not in use. Upon descent, a rotational axis provides 210° of rotation for cone beam CT scans. Mechanical integrity of the system was evaluated over a 6-month period. Subsequent clinical commissioning included end-to-end testing of targeting performance and subjective image quality performance in phantoms. The system has been used to image 2 patients, 1 of whom received single-fraction radiosurgery and 1 who received 3 fractions, using a relocatable head frame. Images of phantoms demonstrated soft tissue contrast visibility and submillimeter spatial resolution. A contrast difference of 35 HU was easily detected at a calibration dose of 1.2 cGy (center of head phantom). The shape of the mechanical flex vs scan angle was highly reproducible. The cone beam CT image guidance system was successfully adapted to a radiosurgery unit. The system is capable of producing high-resolution images of bone and soft tissue. The system is in clinical use and provides excellent image guidance without invasive frames. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Monte Carlo computation in the applied research of nuclear technology

    International Nuclear Information System (INIS)

    Xu Shuyan; Liu Baojie; Li Qin

    2007-01-01

    This article briefly introduces Monte Carlo methods and their properties. It describes Monte Carlo methods with emphasis on their applications to several domains of nuclear technology. Monte Carlo simulation methods and several commonly used software packages that implement them are also introduced. The proposed methods are demonstrated by a real example. (authors)
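
    A minimal example of the flavor of Monte Carlo computation used in nuclear technology (not the real example referred to in the article): estimating the uncollided transmission of photons through a slab by sampling exponential free paths and comparing with the analytic value exp(-mu*d); the attenuation coefficient and thickness are arbitrary assumptions.

        import math
        import random

        def transmission_mc(mu_per_cm, thickness_cm, n_histories=100_000):
            """Fraction of photons whose first interaction lies beyond the slab."""
            transmitted = 0
            for _ in range(n_histories):
                # Sample the free path from an exponential distribution with mean 1/mu.
                path = -math.log(1.0 - random.random()) / mu_per_cm
                if path > thickness_cm:
                    transmitted += 1
            return transmitted / n_histories

        mu, d = 0.2, 5.0   # assumed attenuation coefficient (1/cm) and slab thickness (cm)
        print(transmission_mc(mu, d), math.exp(-mu * d))   # both should be close to 0.368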

  11. ND6600 computer in fusion-energy research

    International Nuclear Information System (INIS)

    Young, K.G.

    1982-12-01

    The ND6600, a computer-based multichannel analyzer with eight ADCs, is used to acquire x-ray data. This manual introduces a user to the Nuclear Data system and contains the information necessary for the user to acquire, display, and record data. The manual also guides the programmer in the hardware and software maintenance of the system

  12. Computer - based modeling in extract sciences research -I ...

    African Journals Online (AJOL)

    Specifically, in the discipline of chemistry, it has been of great utility. Its use dates back to the 17th Century and includes such wide areas as computational chemistry, chemoinformatics, molecular mechanics, chemical dynamics, molecular dynamics, molecular graphics and algorithms. Modeling has been employed ...

  13. ND6600 computer in fusion-energy research

    Energy Technology Data Exchange (ETDEWEB)

    Young, K.G.

    1982-12-01

    The ND6600, a computer-based multichannel analyzer with eight ADCs, is used to acquire x-ray data. This manual introduces a user to the Nuclear Data system and contains the information necessary for the user to acquire, display, and record data. The manual also guides the programmer in the hardware and software maintenance of the system.

  14. Characteristic research on Hong Kong "I learned" series computer textbooks

    Science.gov (United States)

    Hu, Jinyan; Liu, Zhongxia; Li, Yuanyuan; Lu, Jianheng; Zhang, Lili

    2011-06-01

    Currently, the construction of information technology textbooks for primary and middle schools is an important part of the information technology curriculum reform. By analyzing and summarizing the characteristics of the Hong Kong quality textbook series "I Learn: Elementary School Computer Cognitive Curriculum", the article aims to provide inspiration and reference for the construction and development of information technology teaching materials in mainland China schools.

  15. Computer Generated Optical Illusions: A Teaching and Research Tool.

    Science.gov (United States)

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  16. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  17. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    Science.gov (United States)

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  18. The development of computer industry and applications of its relevant techniques in nuclear research laboratories

    International Nuclear Information System (INIS)

    Dai Guiliang

    1988-01-01

    The increasing needs for computers in the area of nuclear science and technology are described. The current status of commercially available computer products of different scales in the world market is briefly reviewed. A survey of some noticeable techniques is given from the viewpoint of computer applications in nuclear science research laboratories.

  19. Instructional Computer Use in the Community College: A Discussion of the Research and Its Implications.

    Science.gov (United States)

    Bower, Beverly L.

    1998-01-01

    Reviews research on the instructional benefits of computer technology. Discusses the computer readiness of students, faculty, and institutions, and suggests that despite mixed findings, political and organizational realities indicate computer-based instruction is a feasible alternative for community colleges. Therefore, educators should continue…

  20. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    Science.gov (United States)

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. Future teaching and research directions summarize the review.

  1. Enabling International Safeguards Research and Development in the United States

    International Nuclear Information System (INIS)

    Dwight, John E.; Schanfein, Mark J.; Bjornard, Trond A.

    2009-01-01

    Idaho National Laboratory (INL) is the lead laboratory in nuclear energy research and development within the U.S. Department of Energy national laboratory complex. INL is tasked with the advancement of nuclear energy research and development, and leadership in the renaissance of nuclear power globally. INL scientists have been central to the assessment of needs and the integration of technical programs aimed at the world-wide growth of nuclear power. One of the grand challenges of the nuclear energy resurgence is nuclear nonproliferation. Nonproliferation technology development is key to meeting this challenge. The needed advances in nonproliferation technologies are being made more difficult by the growing gap between increasing demands for nuclear materials to support technology development, and reduced availability of these materials. The gap is caused by the reduction, consolidation and more stringent lockdown of nuclear materials, made necessary by heightened and evolving security concerns, in the face of increased demand for materials to support technology development. Ironically, the increased demand for materials for technology development is made necessary by these same security concerns. The situation will continue to worsen if safeguards and security budgets remain limited for the International Atomic Energy Agency (IAEA) and many member states, while growth in global nuclear energy becomes a reality. Effective U.S. leadership in the closing of this gap is vital to homeland security and global stability. INL has taken positive steps, described in this paper, to close this gap by reestablishing a viable base for the development, testing and demonstration of safeguards and security technologies. Key attributes of this technology development base are (1) the availability of a wide variety of special nuclear materials in forms that allow for enhanced accessibility; (2) ease of access by U.S. government, national laboratory, industry and academic institution

  2. Extreme Scale Computing for First-Principles Plasma Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Choogn-Seock [Princeton University

    2011-10-12

    World superpowers are in the middle of the “Computnik” race. The US Department of Energy (and National Nuclear Security Administration) wishes to launch exascale computer systems into the scientific (and national security) world by 2018. The objective is to solve important scientific problems and to predict the outcomes using the most fundamental scientific laws, which would not be possible otherwise. Being chosen into the next “frontier” group can be of great benefit to a scientific discipline. An extreme scale computer system requires different types of algorithms and programming philosophy from those we have been accustomed to. Only a handful of scientific codes are blessed to be capable of scalable usage of today’s largest computers in operation at petascale (using more than 100,000 cores concurrently). Fortunately, a few magnetic fusion codes are competing well in this race using the “first principles” gyrokinetic equations. These codes are beginning to study the fusion plasma dynamics in full-scale realistic diverted device geometry in natural nonlinear multiscale, including the large-scale neoclassical and small-scale turbulence physics, but excluding some ultrafast dynamics. In this talk, most of the above-mentioned topics will be introduced at an executive level. Representative properties of the extreme scale computers, modern programming exercises to take advantage of them, and different philosophies in the data flows and analyses will be presented. Examples of the multi-scale multi-physics scientific discoveries made possible by solving the gyrokinetic equations on extreme scale computers will be described. Future directions into “virtual tokamak experiments” will also be discussed.

  3. An overview of enabling technology research in the United States

    International Nuclear Information System (INIS)

    Baker, Charles C.

    2002-01-01

    The mission of the US Fusion Energy Sciences Program is to advance plasma science, fusion science, and fusion technology--the knowledge base needed for an economically and environmentally attractive fusion energy source. In support of this overall mission, the Enabling Technology Program in the US incorporates both near and long term R and D, contributes to material and engineering sciences as well as technology development, contributes to spin-off applications, and performs global systems assessments and focused design studies. This work supports both magnetic and inertial fusion energy (IFE) concepts. The Enabling Technology research mission is to contribute to the national science and technology base by developing the enabling technology for existing and next-step experimental devices, by exploring and understanding key materials and technology feasibility issues for attractive fusion power sources, by conducting advanced design studies that integrate the wealth of our understanding to guide R and D priorities and by developing design solutions for next-step and future devices. The Enabling Technology Program Plan is organized around five elements: plasma technologies, fusion (chamber) technologies, materials sciences, advanced design, and IFE chamber and target technologies. The principal technical features and research objectives are described for each element

  4. Department of Energy research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-08-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models, the execution of which is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex, and consequently it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  5. An exploratory survey of design science research amongst South African computing scholars

    CSIR Research Space (South Africa)

    Naidoo, R

    2012-10-01

    Full Text Available The debate continues as to whether the traditional focus of computing research on theory development and verification has adequate immediate practical relevance. Despite increasing claims of the potential of design science research (DSR...

  6. The Rise of Computing Research in East Africa: The Relationship between Funding, Capacity and Research Community in a Nascent Field

    Science.gov (United States)

    Harsh, Matthew; Bal, Ravtosh; Wetmore, Jameson; Zachary, G. Pascal; Holden, Kerry

    2018-01-01

    The emergence of vibrant research communities of computer scientists in Kenya and Uganda has occurred in the context of neoliberal privatization, commercialization, and transnational capital flows from donors and corporations. We explore how this funding environment configures research culture and research practices, which are conceptualized as…

  7. An example of a United States Nuclear Research Center

    International Nuclear Information System (INIS)

    Bhattacharyya, S. K.

    1999-01-01

    Under the likely scenario in which public support for nuclear energy remains low and fossil fuels continue to be abundant and cheap, government supported nuclear research centers must adapt their missions to ensure that they tackle problems of current significance. It will be critical to be multidisciplinary, to generate economic value, and to apply nuclear competencies to current problems. Addressing problems in nuclear safety, D and D, nuclear waste management, nonproliferation, isotope production are a few examples of current needs in the nuclear arena. Argonne's original mission, to develop nuclear reactor technology, was a critical need for the U.S. in 1946. It would be wise to recognize that this mission was a special instance of a more general one--to apply unique human and physical capital to long term, high risk technology development in response to society's needs. International collaboration will enhance the collective chances for success as the world moves into the 21st century

  8. Public Spending on Health Service and Policy Research in Canada, the United Kingdom, and the United States: A Modest Proposal

    Directory of Open Access Journals (Sweden)

    Vidhi Thakkar

    2017-11-01

    Full Text Available Health services and policy research (HSPR) represents a multidisciplinary field which integrates knowledge from health economics, health policy, health technology assessment, epidemiology, and political science, among other fields, to evaluate decisions in health service delivery. Health service decisions are informed by evidence at the clinical, organizational, and policy levels, each with distinct managerial drivers. HSPR has an evolving discourse spanning knowledge translation, linkage and exchange between research and decision-maker partners and, more recently, implementation science and learning health systems. Local context is important for HSPR and for advancing health reform practice. The amounts and configuration of national investment in this field remain important considerations which reflect priority investment areas. The priorities set within this field of research may have greater or lesser effects and promise with respect to modernizing health services in pursuit of better value and better population outcomes. Within Canada, an asset map for HSPR was published by the national HSPR research institute. Having estimated publicly funded research spending in Canada, we sought to identify the best available comparable estimates from the United States and the United Kingdom. Investments from industry and charitable organizations were not included in these numbers. This commentary explores spending by the United States, Canada, and the United Kingdom on HSPR as a fraction of total public spending on health and the importance of these respective investments in advancing health service performance. Proposals are offered on the merits of common nomenclature and accounting for areas of investigation in pursuit of some comparable way of assessing priority HSPR investments, along with suggestions for earmarking such investments relative to total spending on health services.

  9. Computer programs for unit-cell determination in electron diffraction experiments

    International Nuclear Information System (INIS)

    Li, X.Z.

    2005-01-01

    A set of computer programs for unit-cell determination from an electron diffraction tilt series and pattern indexing has been developed on the basis of several well-established algorithms. In this approach, a reduced direct primitive cell is first determined from the experimental data; at the same time, the measurement errors of the tilt angles are checked and minimized. The derived primitive cell is then checked for possible higher lattice symmetry and transformed into a proper conventional cell. Finally, a least-squares refinement procedure is adopted to generate optimum lattice parameters on the basis of the lengths of basic reflections in each diffraction pattern and the indices of these reflections. Examples are given to show the usage of the programs
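
    As a rough illustration of the final least-squares step described above, the sketch below fits lattice parameters to indexed reflections, assuming an orthorhombic cell so that 1/d^2 is linear in 1/a^2, 1/b^2 and 1/c^2; the reflection data are invented and the published programs handle the general case.

      import numpy as np

      # Hypothetical indexed reflections: Miller indices and measured d-spacings (angstrom).
      hkl = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [1, 0, 1], [0, 1, 1]])
      d_obs = np.array([4.99, 6.02, 7.01, 3.85, 4.06, 4.58])

      # Orthorhombic model: 1/d^2 = h^2/a^2 + k^2/b^2 + l^2/c^2 (linear in the reciprocal metric terms).
      A = hkl.astype(float) ** 2
      y = 1.0 / d_obs ** 2

      # Linear least-squares refinement of 1/a^2, 1/b^2, 1/c^2.
      x, *_ = np.linalg.lstsq(A, y, rcond=None)
      a, b, c = 1.0 / np.sqrt(x)
      print(f"refined lattice parameters: a={a:.3f} A, b={b:.3f} A, c={c:.3f} A")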

  10. Development of Thermal Performance Analysis Computer Program on Turbine Cycle of Yonggwang 3,4 Units

    Energy Technology Data Exchange (ETDEWEB)

    Hong, S.Y.; Choi, K.H.; Jee, M.H.; Chung, S.I. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    The objective of the study ''Development of Thermal Performance Analysis Computer Program on Turbine Cycle of Yonggwang 3,4 Units'' is to apply a computerized program to performance testing of the turbine cycle and to the analysis of the operational status of the thermal plants. In addition, the result is applicable to the analysis of thermal output under abnormal conditions and is a powerful tool for finding the main problems in such cases. As a result, the output of this study provides a way to confirm the technical capability to operate the plants efficiently and to obtain significant economic gains. (author). 27 refs., 73 figs., 6 tabs.

  11. Utero-fetal unit and pregnant woman modeling using a computer graphics approach for dosimetry studies.

    Science.gov (United States)

    Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle

    2009-01-01

    Potential health effects related to electromagnetic field exposure raise public concern, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians, for anatomical accuracy and representativeness.

  12. 3rd International Conference on "Emerging Research in Computing, Information, Communication and Applications"

    CERN Document Server

    Prasad, NH; Nalini, N

    2015-01-01

    This volume contains the proceedings of ERCICA 2015. ERCICA provides an interdisciplinary forum for researchers, professional engineers and scientists, educators, and technologists to discuss, debate and promote research and technology in the upcoming areas of Computing, Information, Communication and their Applications. The contents of this book cover emerging research areas in the fields of Computing, Information, Communication and Applications, and will prove useful to both researchers and practicing engineers.

  13. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing conveys syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
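
    The two stages described above can be pictured with a short sketch: a synthetic signal is sampled at a fixed rate, conditioned with a zero-phase low-pass filter (the syntactic stage), and reduced to an event count by threshold crossing (a toy semantic stage). The sampling rate, filter order and threshold are illustrative assumptions, not values from the article.

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 250.0                                  # assumed sampling rate, Hz
      t = np.arange(0, 10, 1.0 / fs)
      # Synthetic stand-in for a digitized bioelectrical recording.
      signal = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)

      # Conditioning: fourth-order zero-phase low-pass filter (noise reduction).
      b, a = butter(4, 10.0 / (fs / 2), btype="low")
      conditioned = filtfilt(b, a, signal)

      # Feature extraction: count upward threshold crossings as a crude event detector.
      threshold = 0.8
      events = np.flatnonzero((conditioned[:-1] < threshold) & (conditioned[1:] >= threshold))
      print(f"{events.size} events detected in {t[-1]:.0f} s")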

  14. Research program in computational physics: [Progress report for Task D

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1987-01-01

    Studies are reported of several aspects of the purely gluonic sector of QCD, including methods for efficiently generating gauge configurations, properties of the standard Wilson action and improved actions, and properties of the pure glue theory itself. Simulation of quantum chromodynamics in the ''quenched approximation'', in which the back reaction of quarks upon gauge fields is neglected, is studied with fermions introduced on the lattice via both Wilson and staggered formulations. Efforts are also reported to compute QCD matrix elements and to simulate QCD theory beyond the quenched approximation considering the effect of the quarks on the gauge fields. Work is in progress toward improving the algorithms used to generate the gauge field configurations and to compute the quark propagators. Implementation of lattice QCD on a hypercube is also reported

  15. Advanced Computation Dynamics Simulation of Protective Structures Research

    Science.gov (United States)

    2013-02-01

    The properties of the reinforcement are a unit weight of 490 lb/ft3, a yield strength of 60 ksi, a Young’s modulus of 29000 ksi, and a Poisson ratio ... distance between the centroid of the tension reinforcement and the extreme compression fiber, and a = the depth of the equivalent rectangular compressive ... modulus, pr is the Poisson ratio, tlimit is the tensile limit, slimit is the shear limit, ftough is the fracture toughness, sreten is the shear retention

  16. An Overview of Computer Network security and Research Technology

    OpenAIRE

    Rathore, Vandana

    2016-01-01

    The rapid development in the field of computer networks and systems brings both convenience and security threats for users. Security threats include network security and data security. Network security refers to the reliability, confidentiality, integrity and availability of the information in the system. The main objective of network security is to maintain the authenticity, integrity, confidentiality, availability of the network. This paper introduces the details of the technologies used in...

  17. Computer Vision Research and Its Applications to Automated Cartography

    Science.gov (United States)

    1984-09-01

    Imaging Geometry from a Camera Transformation Matrix. Many scene analysis algorithms require knowledge of the geometry of the image formation process as a ... to compute the imaging geometry directly from the constraints provided by the known data points. Partial information such as the camera’s focal length ... Artificial Intelligence 4, 1973, 121-137. 8. Kanade, T., A theory of origami world, Artificial Intelligence 13, 1980, 279-311. 9. Barnard, S. T

  18. International stem cell collaboration: how disparate policies between the United States and the United Kingdom impact research.

    Science.gov (United States)

    Luo, Jingyuan; Flynn, Jesse M; Solnick, Rachel E; Ecklund, Elaine Howard; Matthews, Kirstin R W

    2011-03-08

    As the scientific community globalizes, it is increasingly important to understand the effects of international collaboration on the quality and quantity of research produced. While it is generally assumed that international collaboration enhances the quality of research, this phenomenon is not well examined. Stem cell research is unique in that it is both politically charged and a research area that often generates international collaborations, making it an ideal case through which to examine international collaborations. Furthermore, with promising medical applications, the research area is dynamic and responsive to a globalizing science environment. Thus, studying international collaborations in stem cell research elucidates the role of existing international networks in promoting quality research, as well as the effects that disparate national policies might have on research. This study examined the impact of collaboration on publication significance in the United States and the United Kingdom, world leaders in stem cell research with disparate policies. We reviewed publications by US and UK authors from 2008, along with their citation rates and the political factors that may have contributed to the number of international collaborations. The data demonstrated that international collaborations significantly increased an article's impact for UK and US investigators. While this applied to UK authors whether they were corresponding or secondary, this effect was most significant for US authors who were corresponding authors. While the UK exhibited a higher proportion of international publications than the US, this difference was consistent with overall trends in international scientific collaboration. The findings suggested that national stem cell policy differences and regulatory mechanisms driving international stem cell research in the US and UK did not affect the frequency of international collaborations, or even the countries with which the US and UK most

  19. Sanford Underground Research Facility - The United State's Deep Underground Research Facility

    Science.gov (United States)

    Vardiman, D.

    2012-12-01

    The 2.5 km deep Sanford Underground Research Facility (SURF) is managed by the South Dakota Science and Technology Authority (SDSTA) at the former Homestake Mine site in Lead, South Dakota. The US Department of Energy currently supports the development of the facility using a phased approach for underground deployment of experiments as they reach an advanced design stage. The geology of the Sanford Laboratory site has been studied during the 125 years of operations at the Homestake Mine and more recently as part of the preliminary geotechnical site investigations for the NSF's Deep Underground Science and Engineering Laboratory project. The overall geology at DUSEL is a well-defined stratigraphic sequence of schist and phyllites. The three major Proterozoic units encountered in the underground consist of interbedded schist, metasediments, and amphibolite schist which are crosscut by Tertiary rhyolite dikes. Preliminary geotechnical site investigations included drift mapping, borehole drilling, borehole televiewing, in-situ stress analysis, laboratory analysis of core, mapping and laser scanning of new excavations, and modeling and analysis of all geotechnical information. The investigation was focused upon determining whether the proposed site rock mass could support the world's largest (66 meter diameter) deep underground excavation. While the DUSEL project has subsequently been significantly modified, these data are still available to provide a baseline of the ground conditions which may be judiciously extrapolated throughout the entire Proterozoic rock assemblage for future excavations. Recommendations for facility instrumentation and monitoring were included in the preliminary design of the DUSEL project and include: single and multiple point extensometers, tape extensometers and convergence measurements (pins), load cells and pressure cells, smart cables, inclinometers/tiltmeters, piezometers, thermistors, seismographs and accelerometers, scanners (laser

  20. What Does Research on Computer-Based Instruction Have to Say to the Reading Teacher?

    Science.gov (United States)

    Balajthy, Ernest

    1987-01-01

    Examines questions typically asked about the effectiveness of computer-based reading instruction, suggesting that these questions must be refined to provide meaningful insight into the issues involved. Describes several critical problems with existing research and presents overviews of research on the effects of computer-based instruction on…

  1. Unit physics performance of a mix model in Eulerian fluid computations

    Energy Technology Data Exchange (ETDEWEB)

    Vold, Erik [Los Alamos National Laboratory; Douglass, Rod [Los Alamos National Laboratory

    2011-01-25

    In this report, we evaluate the performance of a K-L drag-buoyancy mix model, described in a reference study by Dimonte-Tipton [1] hereafter denoted as [D-T]. The model was implemented in an Eulerian multi-material AMR code, and the results are discussed here for a series of unit physics tests. The tests were chosen to calibrate the model coefficients against empirical data, principally from RT (Rayleigh-Taylor) and RM (Richtmyer-Meshkov) experiments, and the present results are compared to experiments and to results reported in [D-T]. Results show the Eulerian implementation of the mix model agrees well with expectations for test problems in which there is no convective flow of the mass averaged fluid, i.e., in RT mix or in the decay of homogeneous isotropic turbulence (HIT). In RM shock-driven mix, the mix layer moves through the Eulerian computational grid, and there are differences with the previous results computed in a Lagrange frame [D-T]. The differences are attributed to the mass averaged fluid motion and examined in detail. Shock and re-shock mix are not well matched simultaneously. Results are also presented and discussed regarding model sensitivity to coefficient values and to initial conditions (IC), grid convergence, and the generation of atomically mixed volume fractions.

  2. Large Scale Computing and Storage Requirements for Basic Energy Sciences Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Wasserman, Harvey

    2011-03-31

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility supporting research within the Department of Energy's Office of Science. NERSC provides high-performance computing (HPC) resources to approximately 4,000 researchers working on about 400 projects. In addition to hosting large-scale computing facilities, NERSC provides the support and expertise scientists need to effectively and efficiently use HPC systems. In February 2010, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR) and DOE's Office of Basic Energy Sciences (BES) held a workshop to characterize HPC requirements for BES research through 2013. The workshop was part of NERSC's legacy of anticipating users' future needs and deploying the necessary resources to meet these demands. Workshop participants reached a consensus on several key findings, in addition to achieving the workshop's goal of collecting and characterizing computing requirements. The key requirements for scientists conducting research in BES are: (1) Larger allocations of computational resources; (2) Continued support for standard application software packages; (3) Adequate job turnaround time and throughput; and (4) Guidance and support for using future computer architectures. This report expands upon these key points and presents others. Several 'case studies' are included as significant representative samples of the needs of science teams within BES. Research teams' scientific goals, computational methods of solution, current and 2013 computing requirements, and special software and support needs are summarized in these case studies. Also included are researchers' strategies for computing in the highly parallel, 'multi-core' environment that is expected to dominate HPC architectures over the next few years. NERSC has strategic plans and initiatives already underway that address key workshop findings. This report includes a

  3. Developing a clinical trial unit to advance research in an academic institution.

    Science.gov (United States)

    Croghan, Ivana T; Viker, Steven D; Limper, Andrew H; Evans, Tamara K; Cornell, Alissa R; Ebbert, Jon O; Gertz, Morie A

    2015-11-01

    Research, clinical care, and education are the three cornerstones of academic health centers in the United States. The research climate has always been riddled with ebbs and flows, depending on funding availability. During a time of reduced funding, the number and scope of research studies have been reduced, and in some instances, a field of study has been eliminated. Recent reductions in the research funding landscape have led institutions to explore new ways to continue supporting research. Mayo Clinic in Rochester, MN has developed a clinical trial unit within the Department of Medicine, which provides shared resources for many researchers and serves as a solution for training and mentoring new investigators and study teams. By building on existing infrastructure and providing supplemental resources to existing research, the Department of Medicine clinical trial unit has evolved into an effective mechanism for conducting research. This article discusses the creation of a central unit to provide research support in clinical trials and presents the advantages, disadvantages, and required building blocks for such a unit. Copyright © 2015 Mayo Clinic. Published by Elsevier Inc. All rights reserved.

  4. Real-time computation of parameter fitting and image reconstruction using graphical processing units

    Science.gov (United States)

    Locans, Uldis; Adelmann, Andreas; Suter, Andreas; Fischer, Jannis; Lustermann, Werner; Dissertori, Günther; Wang, Qiulin

    2017-06-01

    In recent years graphical processing units (GPUs) have become a powerful tool in scientific computing. Their potential to speed up highly parallel applications brings the power of high performance computing to a wider range of users. However, programming these devices and integrating their use in existing applications is still a challenging task. In this paper we examined the potential of GPUs for two different applications. The first application, created at Paul Scherrer Institut (PSI), is used for parameter fitting during data analysis of μSR (muon spin rotation, relaxation and resonance) experiments. The second application, developed at ETH, is used for PET (Positron Emission Tomography) image reconstruction and analysis. Applications currently in use were examined to identify parts of the algorithms in need of optimization. Efficient GPU kernels were created in order to allow applications to use a GPU, to speed up the previously identified parts. Benchmarking tests were performed in order to measure the achieved speedup. During this work, we focused on single GPU systems to show that real time data analysis of these problems can be achieved without the need for large computing clusters. The results show that the currently used application for parameter fitting, which uses OpenMP to parallelize calculations over multiple CPU cores, can be accelerated around 40 times through the use of a GPU. The speedup may vary depending on the size and complexity of the problem. For PET image analysis, the GPU version achieved speedups of more than 40× compared to a single-core CPU implementation. The achieved results show that it is possible to improve the execution time by orders of magnitude.
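
    A minimal sketch of the pattern the abstract describes, moving an objective-function evaluation from the CPU to a single GPU; CuPy is used here as a stand-in for the custom GPU kernels, and the exponential model and data are invented.

      import numpy as np
      import cupy as cp  # assumes a CUDA-capable GPU with CuPy installed

      def chi2_cpu(params, t, y, sigma):
          a, tau = params
          model = a * np.exp(-t / tau)
          return float(np.sum(((y - model) / sigma) ** 2))

      def chi2_gpu(params, t, y, sigma):
          a, tau = params
          model = a * cp.exp(-t / tau)          # evaluated entirely on the GPU
          return float(cp.sum(((y - model) / sigma) ** 2))

      # Synthetic decay-like data (illustrative only).
      t = np.linspace(0, 10, 10_000_000)
      y = 2.0 * np.exp(-t / 2.2) + 0.01 * np.random.randn(t.size)
      sigma = np.full_like(t, 0.01)

      # One host-to-device transfer; repeated fit iterations then stay on the GPU.
      t_d, y_d, sigma_d = map(cp.asarray, (t, y, sigma))
      print(chi2_cpu((2.0, 2.2), t, y, sigma))
      print(chi2_gpu((2.0, 2.2), t_d, y_d, sigma_d))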

  5. Report by the AERES on the unit: Research unit on the environment under the supervision of establishments and bodies: IRSN

    International Nuclear Information System (INIS)

    2010-10-01

    This report is a kind of audit report on a research laboratory. The authors discuss an assessment of the whole unit activities in terms of strengths and opportunities, aspects to be improved and risks, productions and publications, scientific quality, influence and attractiveness (awards, recruitment capacity, capacity to obtain financing and to tender, participation to international programs), strategy and governance, and project. These same aspects are then discussed and commented for each research axis: study of the seismic hazard, study of risk management related to waste storage in deep geological layer, radionuclide transfer in the biosphere, study of the effects on ecosystems of a chronic exposure to radioactive materials

  6. Design of a microprocessor-based Control, Interface and Monitoring (CIM) unit for turbine engine controls research

    Science.gov (United States)

    Delaat, J. C.; Soeder, J. F.

    1983-01-01

    High speed minicomputers were used in the past to implement advanced digital control algorithms for turbine engines. These minicomputers are typically large and expensive. It is desirable for a number of reasons to use microprocessor-based systems for future controls research. They are relatively compact, inexpensive, and are representative of the hardware that would be used for actual engine-mounted controls. The Control, Interface, and Monitoring Unit (CIM) contains a microprocessor-based controls computer, the necessary interface hardware, and a system to monitor the controls computer while it is running an engine. It is presently being used to evaluate an advanced turbofan engine control algorithm.

  7. Coordinated Research Projects of the IAEA Atomic and Molecular Data Unit

    Science.gov (United States)

    Braams, B. J.; Chung, H.-K.

    2011-05-01

    The IAEA Atomic and Molecular Data Unit is dedicated to the provision of databases for atomic, molecular and plasma-material interaction (AM/PMI) data that are relevant for nuclear fusion research. IAEA Coordinated Research Projects (CRPs) are the principal mechanism by which the Unit encourages data evaluation and the production of new data. Ongoing and planned CRPs on AM/PMI data are briefly described here.

  8. Coordinated Research Projects of the IAEA Atomic and Molecular Data Unit

    International Nuclear Information System (INIS)

    Braams, B. J.; Chung, H.-K.

    2011-01-01

    The IAEA Atomic and Molecular Data Unit is dedicated to the provision of databases for atomic, molecular and plasma-material interaction (AM/PMI) data that are relevant for nuclear fusion research. IAEA Coordinated Research Projects (CRPs) are the principal mechanism by which the Unit encourages data evaluation and the production of new data. Ongoing and planned CRPs on AM/PMI data are briefly described here.

  9. Improvement of the reactivity computer for HANARO research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Min Jin; Park, S. J.; Jung, H. S.; Choi, Y. S.; Lee, K. H.; Seo, S. G

    2001-04-01

    The reactivity computer in HANARO has a dedicated neutron detection system for experiments and measurements of the reactor characteristics. This system consists of a personal computer and a multi-function I/O board, and collects the signals from the various neutron detectors. The existing hardware and software were developed under the DOS environment, so they are very inconvenient to use and replacement parts are difficult to find. Since the continuity of the signal is often lost when processing the wide-range signal, the need for improvement has been an issue. The purpose of this project is to upgrade the hardware and software for data collection and processing so that they are compatible with the Windows™ operating system and to solve the known issue. We have replaced the existing system with a new multi-function I/O board and a Pentium III class PC, and application programs for wide-range reactivity measurement and a multi-function signal counter have been developed. The newly installed multi-function I/O board has a seven times faster A/D conversion rate and collects a sufficient amount of data in a short time. The new application program is user-friendly and provides various useful information on its display screen, so that the capability for data processing and storage has been greatly enhanced.

  10. Research in Parallel Algorithms and Software for Computational Aerosciences

    Science.gov (United States)

    Domel, Neal D.

    1996-01-01

    Phase 1 is complete for the development of a computational fluid dynamics (CFD) parallel code with automatic grid generation and adaptation for the Euler analysis of flow over complex geometries. SPLITFLOW, an unstructured Cartesian grid code developed at Lockheed Martin Tactical Aircraft Systems, has been modified for a distributed memory/massively parallel computing environment. The parallel code is operational on an SGI network, Cray J90 and C90 vector machines, SGI Power Challenge, and Cray T3D and IBM SP2 massively parallel machines. Parallel Virtual Machine (PVM) is the message passing protocol for portability to various architectures. A domain decomposition technique was developed which enforces dynamic load balancing to improve solution speed and memory requirements. A host/node algorithm distributes the tasks. The solver parallelizes very well, and scales with the number of processors. Partially parallelized and non-parallelized tasks consume most of the wall clock time in a very fine grain environment. Timing comparisons on a Cray C90 demonstrate that Parallel SPLITFLOW runs 2.4 times faster on 8 processors than its non-parallel counterpart autotasked over 8 processors.
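
    The host/node domain decomposition with message passing can be sketched as follows; mpi4py is used instead of the PVM layer named above, and the 1-D domain, ghost-cell exchange and smoothing sweep are simplified illustrations, not SPLITFLOW code.

      import numpy as np
      from mpi4py import MPI  # message-passing stand-in for the PVM layer described above

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Each rank owns one slice of a 1-D domain, plus one ghost cell on each side.
      n_global = 1200
      n_local = n_global // size
      u = np.full(n_local + 2, float(rank))

      # Halo exchange with neighbouring ranks before each solver sweep.
      left = rank - 1 if rank > 0 else MPI.PROC_NULL
      right = rank + 1 if rank < size - 1 else MPI.PROC_NULL
      comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
      comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

      # One Jacobi-style smoothing sweep on the interior cells.
      u[1:-1] = 0.5 * (u[:-2] + u[2:])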

  11. Research Skills for Journalism Students: From Basics to Computer-Assisted Reporting.

    Science.gov (United States)

    Drueke, Jeanetta; Streckfuss, Richard

    1997-01-01

    Despite the availability of computer-assisted research, a survey of 300 newspapers found that many journalists still rely on paper sources or neglect research altogether. This article describes the development and implementation of a beginning reporting course that integrates research skills, demonstrates the value of research in reporting, and…

  12. Experiences using SciPy for computer vision research

    Energy Technology Data Exchange (ETDEWEB)

    Eads, Damian R [Los Alamos National Laboratory; Rosten, Edward J [Los Alamos National Laboratory

    2008-01-01

    SciPy is an effective tool suite for prototyping new algorithms. We share some of our experiences using it for the first time to support our research in object detection. SciPy makes it easy to integrate C code, which is essential when algorithms operating on large data sets cannot be vectorized. The universality of Python, the language in which SciPy was written, gives the researcher access to a broader set of non-numerical libraries to support GUI development, interface with databases, manipulate graph structures, render 3D graphics, unpack binary files, etc. Python's extensive support for operator overloading makes SciPy's syntax as succinct as its competitors, MATLAB, Octave, and R. More profoundly, we found it easy to rework research code written with SciPy into a production application, deployable on numerous platforms.
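
    A few lines in the spirit of the prototyping described above, using scipy.ndimage for a gradient-magnitude edge map on a synthetic image; the image, filter width and threshold are made up.

      import numpy as np
      from scipy import ndimage

      # Synthetic test image: a bright square on a dark, noisy background.
      image = np.zeros((128, 128), dtype=float)
      image[40:90, 30:80] = 1.0
      image += 0.05 * np.random.randn(*image.shape)

      # Smooth, then take Sobel gradients -- a typical few-line SciPy prototype.
      smoothed = ndimage.gaussian_filter(image, sigma=1.5)
      gx = ndimage.sobel(smoothed, axis=1)
      gy = ndimage.sobel(smoothed, axis=0)
      edges = np.hypot(gx, gy) > 0.5            # assumed threshold

      print(f"{edges.sum()} edge pixels found")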

  13. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been remarkable interest in making computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer science courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) the type of professional development organization and source of funding, (2) professional development structure and participants, (3) the goal of professional development and type of evaluation used, (4) the specific computer science concepts and training tools used, and (5) their effectiveness in improving teacher practice and student learning.

  14. Research Update: Computational materials discovery in soft matter

    Directory of Open Access Journals (Sweden)

    Tristan Bereau

    2016-05-01

    Full Text Available Soft matter embodies a wide range of materials, which all share the common characteristic of weak interaction energies determining their supramolecular structure. This complicates structure-property predictions and hampers the direct application of data-driven approaches to their modeling. We present several aspects in which these methods play a role in designing soft-matter materials: drug design as well as information-driven computer simulations, e.g., histogram reweighting. We also discuss recent examples of rational design of soft-matter materials fostered by physical insight and assisted by data-driven approaches. We foresee the combination of data-driven and physical approaches as a promising strategy to move the field forward.
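
    Histogram reweighting, mentioned above as an information-driven simulation technique, can be sketched in a few lines: samples generated at one inverse temperature are reweighted to estimate an observable at a nearby one. The data are synthetic and the single-histogram form shown is the simplest variant.

      import numpy as np

      # Single-histogram reweighting from beta0 to a nearby beta1 (synthetic data).
      beta0, beta1 = 1.00, 1.05
      energy = np.random.normal(loc=-100.0, scale=5.0, size=50_000)   # sampled at beta0
      observable = energy ** 2                                        # any per-sample observable

      w = np.exp(-(beta1 - beta0) * energy)
      reweighted_mean = np.sum(w * observable) / np.sum(w)
      print(f"<A> estimated at beta={beta1}: {reweighted_mean:.2f}")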

  15. Research initiatives for plug-and-play scientific computing

    International Nuclear Information System (INIS)

    McInnes, Lois Curfman; Dahlgren, Tamara; Nieplocha, Jarek; Bernholdt, David; Allan, Ben; Armstrong, Rob; Chavarria, Daniel; Elwasif, Wael; Gorton, Ian; Kenny, Joe; Krishan, Manoj; Malony, Allen; Norris, Boyana; Ray, Jaideep; Shende, Sameer

    2007-01-01

    This paper introduces three component technology initiatives within the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS) that address ever-increasing productivity challenges in creating, managing, and applying simulation software to scientific discovery. By leveraging the Common Component Architecture (CCA), a new component standard for high-performance scientific computing, these initiatives tackle difficulties at different but related levels in the development of component-based scientific software: (1) deploying applications on massively parallel and heterogeneous architectures, (2) investigating new approaches to the runtime enforcement of behavioral semantics, and (3) developing tools to facilitate dynamic composition, substitution, and reconfiguration of component implementations and parameters, so that application scientists can explore tradeoffs among factors such as accuracy, reliability, and performance

  16. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation (networks or circuits of biological neurons) and, relatedly, particle accelerators, scientific instruments which accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility of a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  17. International Conference on Emerging Research in Electronics, Computer Science and Technology

    CERN Document Server

    Sheshadri, Holalu; Padma, M

    2014-01-01

    PES College of Engineering is organizing an International Conference on Emerging Research in Electronics, Computer Science and Technology (ICERECT-12) in Mandya, merging the event with the Golden Jubilee of the Institute. The proceedings of the conference present high-quality, peer-reviewed articles from the fields of Electronics, Computer Science and Technology. The book is a compilation of research papers on cutting-edge technologies and is targeted towards the scientific community actively involved in research activities.

  18. Summaries of research and development activities by using JAERI computer system in FY2003. April 1, 2003 - March 31, 2004

    International Nuclear Information System (INIS)

    2005-03-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed a large computer system, including supercomputers, in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the major users' research and development activities carried out on the computer system in FY2003 (April 1, 2003 - March 31, 2004). (author)

  19. Summaries of research and development activities by using JAERI computer system in FY2004 (April 1, 2004 - March 31, 2005)

    International Nuclear Information System (INIS)

    2005-08-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed large computer systems, including supercomputers, in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the major users' research and development activities carried out on the computer system in FY2004 (April 1, 2004 - March 31, 2005). (author)

  20. High Performance Computing Assets for Ocean Acoustics Research

    Science.gov (United States)

    2016-11-18

    processors, effectively), and 512GB memory. The second has 24 CPU cores, dual-thread (48 processors, effectively), and 512GB memory. The third has 28 CPU cores, dual-thread (56 processors, effectively), and 256GB memory. Mr. Arthur Newhall of WHOI worked with the vendors to secure the best...

  1. Utilizing General Purpose Graphics Processing Units to Improve Performance of Computer Modelling and Visualization

    Science.gov (United States)

    Monk, J.; Zhu, Y.; Koons, P. O.; Segee, B. E.

    2009-12-01

    With the introduction of the G8X series of cards by nVidia, an architecture called CUDA was released, and virtually all subsequent video cards have had CUDA support. With this new architecture nVidia provided extensions for C/C++ that create an Application Programming Interface (API) allowing code to be executed on the GPU. Since then the concept of GPGPU (general purpose graphics processing unit) computing has been growing: the GPU is very good at algebra and at running things in parallel, so we should make use of that power for other applications. This is highly appealing in the area of geodynamic modeling, as multiple parallel solutions of the same differential equations at different points in space lead to a large speedup in simulation speed. Another benefit of CUDA is a programmatic method of transferring large amounts of data between the computer's main memory and the dedicated GPU memory located on the video card. In addition to being able to compute and render on the video card, the CUDA framework allows for a large speedup in situations, such as with a tiled display wall, where the rendered pixels are to be displayed in a different location than where they are rendered. A CUDA extension for VirtualGL was developed allowing for faster read-back at high resolutions. This paper examines several aspects of rendering OpenGL graphics on large displays using VirtualGL and VNC. It demonstrates how performance can be significantly improved when rendering on a tiled monitor wall. We present a CUDA-enhanced version of VirtualGL as well as the advantages of having multiple VNC servers. It will discuss restrictions caused by read-back and blitting rates and how they are affected by different sizes of virtual displays being rendered.
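
    The central idea above, compiling C-like kernel code that runs on the GPU and moving data between main memory and dedicated GPU memory, can be sketched with CuPy's runtime-compiled kernels; this is an illustration only, not the CUDA/VirtualGL code the authors describe.

      import cupy as cp  # assumes an nVidia GPU with CUDA support

      # A tiny CUDA C kernel compiled at runtime; each thread scales one element.
      scale = cp.RawKernel(r'''
      extern "C" __global__
      void scale(const float* x, float* y, float a, int n) {
          int i = blockDim.x * blockIdx.x + threadIdx.x;
          if (i < n) y[i] = a * x[i];
      }
      ''', 'scale')

      n = 1 << 20
      x = cp.random.rand(n, dtype=cp.float32)     # allocated in GPU memory
      y = cp.empty_like(x)

      threads = 256
      blocks = (n + threads - 1) // threads
      scale((blocks,), (threads,), (x, y, cp.float32(2.0), cp.int32(n)))  # runs on the GPU

      print(float(y[0]), float(2.0 * x[0]))       # copy two scalars back to host memory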

  2. The research of clinical application of computed tomographic virtual gastroscopy

    International Nuclear Information System (INIS)

    Zhang Lei; Pan Zhenyu; Zhai Xiaoli; Gu Hua; Wang Yajie; Ding Yi; Wang Li; Liang Ying; Zhai Renyou

    2000-01-01

    Objective: To investigate the value, methods and findings of computed tomographic virtual gastroscopy (CTVG). Methods: Sixty-nine patients underwent spiral CT examination after their stomachs were inflated with different volumes of air. The scan conditions were a collimation width of 3 mm, pitch 1.2 - 2.5 and scanning speed 0.8 s/360 degree, and the raw volume data were reconstructed with an overlap rate of 33% - 67%. The CTVG images were then built using navigator software (GE AG, USA). Results: The accuracy, sensitivity, and specificity of CTVG were 92.8%, 96.4%, and 78.6%, respectively. CTVG corresponded well with fiberoptic gastroscopy and specimens in demonstrating the gastric lesions. To some degree, CTVG was able to reveal the small lesions of chronic atrophic gastritis, chronic erosive gastritis, chronic proliferative gastritis, and acute hemorrhagic gastritis. High quality CTVG images could be obtained with a collimation width of 3 mm, pitch 1.2 - 1.5, 50% - 67% overlap, good breath-holding, a fully distended stomach and suitable scan positions. Conclusion: CTVG is an emerging means of gastric examination and has great value in clinical applications

  3. Dynamic computer simulations of electrophoresis: three decades of active research.

    Science.gov (United States)

    Thormann, Wolfgang; Caslavska, Jitka; Breadmore, Michael C; Mosher, Richard A

    2009-06-01

    Dynamic models for electrophoresis are based upon model equations derived from the transport concepts in solution together with user-inputted conditions. They are able to predict theoretically the movement of ions and are as such the most versatile tool to explore the fundamentals of electrokinetic separations. Since its inception three decades ago, the state of dynamic computer simulation software and its use has progressed significantly and Electrophoresis played a pivotal role in that endeavor as a large proportion of the fundamental and application papers were published in this periodical. Software is available that simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations under almost exactly the same conditions used in the laboratory. This has been employed to show the detailed mechanisms of many of the fundamental phenomena that occur in electrophoretic separations. Dynamic electrophoretic simulations are relevant for separations on any scale and instrumental format, including free-fluid preparative, gel, capillary and chip electrophoresis. This review includes a historical overview, a survey of current simulators, simulation examples and a discussion of the applications and achievements of dynamic simulation.
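
    The kind of transport model such simulators integrate can be caricatured in one dimension: advection of a sample plug at velocity mu*E plus diffusion, advanced with an explicit finite-difference step. The grid, mobility, field and diffusivity below are invented values, and real simulators solve coupled multi-component electrolyte systems.

      import numpy as np

      # 1-D transport: dc/dt = -v dc/dx + D d2c/dx2, with v = mu * E.
      nx, dx, dt = 400, 1e-5, 1e-4                # grid points, m, s (illustrative)
      mu, E, D = 3e-8, 1e4, 1e-9                  # mobility m^2/(V s), field V/m, diffusivity m^2/s
      v = mu * E

      c = np.zeros(nx)
      c[50:60] = 1.0                              # initial sample plug

      for _ in range(2000):
          adv = -v * (c - np.roll(c, 1)) / dx                          # upwind advection
          dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2   # central diffusion
          c = c + dt * (adv + dif)

      print(f"plug centre now near x = {np.argmax(c) * dx * 1e3:.3f} mm")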

  4. Magnetic fusion energy and computers. The role of computing in magnetic fusion energy research and development (second edition)

    International Nuclear Information System (INIS)

    1983-01-01

    This report documents the structure and uses of the MFE Network and presents a compilation of future computing requirements. Its primary emphasis is on the role of supercomputers in fusion research. One of its key findings is that with the introduction of each successive class of supercomputer, qualitatively improved understanding of fusion processes has been gained. At the same time, even the current Class VI machines severely limit the attainable realism of computer models. Many important problems will require the introduction of Class VII or even larger machines before they can be successfully attacked

  5. The nursing professorial unit: translating acute and critical care nursing research

    Directory of Open Access Journals (Sweden)

    Martin Christensen

    2017-11-01

    Full Text Available Background and context: Implementation of current research in practice is challenging for ward-based nursing staff. However, university-based nursing academics are seen as the research experts and are perhaps well placed to support clinical nursing research. The problem lies with the divide between practice and academia; universities often use the clinical environment as the place to conduct research but this is often not translated effectively into practice. The development of a nursing professorial unit for acute and critical care was undertaken to meet this challenge. The unit’s key aim is to develop, mentor and support a nursing research culture that is wholly situated within and driven by the requirements of the clinical environment. Aim: The aim of this article is to offer some insights as to how staff set about engaging with and developing the nursing professorial unit to support nursing research in our local hospital. Conclusions: The article highlights how an effective and coordinated approach to supporting clinical nursing research is possible. The nursing professorial unit has been successful in bridging the divide between academia and practice by using a non-university approach to supporting nursing research. Instead we have adopted the philosophy that practice is the sole driver for research and as academics our role is to support that position. Implications for practice: The adoption of the nursing professorial unit model for supporting clinical nursing research is beneficial in closing the divide between clinical practice and the university; the continual presence of the academics in the clinical environment has had a positive impact on research development and implementation in practice; and the nursing professorial unit has become an integral part of the nursing culture in the hospital environment

  6. Rock-Mechanics Research. A Survey of United States Research to 1965, with a Partial Survey of Canadian Universities.

    Science.gov (United States)

    National Academy of Sciences - National Research Council, Washington, DC.

    The results of a survey, conducted by the Committee on Rock Mechanics to determine the status of training and research in rock mechanics, are presented in this publication. In 1964 and 1965 information was gathered by questionnaires sent to industries, selected federal agencies, and universities in both the United States and Canada. Results are…

  7. Calculating the Unit Cost Factors for Decommissioning Cost Estimation of the Nuclear Research Reactor

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Dong Gyu; Jung, Chong Hun; Lee, Kune Woo

    2006-01-01

    The decommissioning cost of a nuclear research reactor is estimated by applying a unit cost factor-based engineering cost calculation method, which rests on a classification of decommissioning work activities fitted to the features and specifications of the decommissioning objects and on the establishment of composition factors. The decommissioning cost of a nuclear research reactor is composed of labor, equipment and materials costs. The labor portion of the decommissioning cost is calculated on the basis of the working time consumed on each decommissioning object. In this paper, the unit cost factors and work difficulty factors needed to calculate the labor cost in estimating the decommissioning cost of a nuclear research reactor are derived.
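
    One possible arrangement of such a calculation, with labor hours obtained as quantity x unit cost factor x work difficulty factor for each work item; every figure and item name below is invented for illustration and is not taken from the paper.

      # Hypothetical work breakdown: quantity, unit, unit cost factor (person-hours per unit),
      # and work difficulty factor. All figures are illustrative.
      work_items = {
          "segment reactor tank":    (12.0, "m^2", 4.0, 1.8),
          "remove concrete shield":  (35.0, "m^3", 2.5, 1.3),
          "decontaminate surfaces":  (400.0, "m^2", 0.3, 1.0),
      }
      labor_rate = 55.0  # assumed cost per person-hour

      total_hours = sum(q * ucf * wdf for q, _, ucf, wdf in work_items.values())
      labor_cost = total_hours * labor_rate
      print(f"estimated labor: {total_hours:.0f} person-hours, cost {labor_cost:,.0f}")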

  8. Astrophysical Computation in Research, the Classroom and Beyond

    Science.gov (United States)

    Frank, Adam

    2009-03-01

    In this talk I review progress in the use of simulations as a tool for astronomical research, for education and public outreach. The talk will include the basic elements of numerical simulations as well as advances in algorithms which have led to recent dramatic progress such as the use of Adaptive Mesh Refinement methods. The scientific focus of the talk will be star formation jets and outflows while the educational emphasis will be on the use of advanced platforms for simulation based learning in lecture and integrated homework. Learning modules for science outreach websites such as DISCOVER magazine will also be highlighted.

  9. The application of computer modeling to health effect research

    Energy Technology Data Exchange (ETDEWEB)

    Yang, R.S.H. [Colorado State Univ., Ft. Collins, CO (United States)

    1996-12-31

    In the United States, estimates show that more than 30,000 hazardous waste disposal sites exist, not including military installations, U.S. Department of Energy nuclear facilities, and hundreds of thousands of underground fuel storage tanks; these sites undoubtedly have their own respective hazardous waste chemical problems. When so many sites contain hazardous chemicals, how does one study the health effects of the chemicals at these sites? There could be many different answers, but none would be perfect. For an area as complex and difficult as the study of chemical mixtures associated with hazardous waste disposal sites, there are no perfect approaches and protocols. Human exposure to chemicals, be it environmental or occupational, is rarely, if ever, limited to a single chemical. Therefore, it is essential that we consider multiple chemical effects and interactions in our risk assessment process. Systematic toxicity testing of chemical mixtures in the environment or workplace that uses conventional toxicology methodologies is highly impractical because of the immense numbers of mixtures involved. For example, about 600,000 chemicals are being used in our society. Just considering binary chemical mixtures, this means that there could be 600,000 x 599,999/2 = 179,999,700,000 pairs of chemicals. Assuming that only one in a million of these pairs of chemicals acts synergistically or has other toxicologic interactions, there would still be about 180,000 binary chemical mixtures possessing toxicologic interactions. Moreover, toxicologic interactions undoubtedly exist among chemical mixtures with three or more component chemicals; the number of possible combinations for these latter mixtures is almost infinite. These are astronomically large numbers with respect to systematic toxicity testing. 22 refs., 5 figs., 1 tab.
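
    The pairwise and higher-order counts quoted above are easy to verify directly:

      from math import comb

      n = 600_000
      pairs = comb(n, 2)                       # 600,000 * 599,999 / 2 = 179,999,700,000
      print(f"{pairs:,} binary mixtures")
      print(f"{pairs // 10**6:,} pairs if one in a million interacts")
      print(f"{comb(n, 3):.3e} ternary mixtures")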

  10. Computational fluid dynamics simulation of wind-driven inter-unit dispersion around multi-storey buildings: Upstream building effect

    DEFF Research Database (Denmark)

    Ai, Zhengtao; Mak, C.M.; Dai, Y.W.

    2017-01-01

    of such changed airflow patterns on inter-unit dispersion characteristics around a multi-storey building due to wind effect. A computational fluid dynamics (CFD) method in the framework of Reynolds-averaged Navier-Stokes modelling was employed to predict the coupled outdoor and indoor airflow field, and the tracer gas technique was used to simulate the dispersion of infectious agents between units. Based on the predicted concentration field, a mass conservation based parameter, namely the re-entry ratio, was used to evaluate quantitatively the inter-unit dispersion possibilities and thus assess risks along
  11. Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.

  12. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus

    OpenAIRE

    Karp, P.D.; Berger, B.; Kovats, D.; Lengauer, T.; Linial, M.; Sabeti, P.; Hide, W.; Rost, B.

    2015-01-01

    Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computati...

  13. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-01-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently underutilized while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid it is now possible to utilize these resources by forming a lightweight campus grid. The campus grids framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper will highlight 18 months of operational experiences creating campus grids in the US, and the different campus configurations that have successfully utilized the campus grid infrastructure.

  14. On-line use of personal computers to monitor and evaluate important parameters in the research reactor DHRUVA

    International Nuclear Information System (INIS)

    Sharma, S.K.; Sengupta, S.N.; Darbhe, M.D.; Agarwal, S.K.

    1998-01-01

    The on-line use of personal computers in research reactors, with custom-made applications for aiding the operators in analysing plant conditions under normal and abnormal situations, has become extremely popular. A system has been developed to monitor and evaluate important parameters for DHRUVA, a 100 MW research reactor located at the Bhabha Atomic Research Centre, Trombay. The system was essentially designed for on-line computation of the following parameters: reactor thermal power, reactivity load due to xenon, core reactivity balance, and performance monitoring of shut-down devices. Apart from the on-line applications, the system has also been developed to cater for some off-line applications over the Local Area Network in the Dhruva complex. The microprocessor-based system is designed to function as an independent unit, dumping the acquired data in parallel to a PC for application programmes. The user interface on the personal computer is menu-driven application software written in the 'C' language. The main input parameters required for carrying out the options given in the above menu are: reactor power, moderator level, coolant inlet temperature to the core, secondary coolant flow rate, temperature rise of the secondary coolant across the heat exchangers, heavy water level in the dump tank, and drop time of individual shut-off rods. (author)
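
    The first of the on-line computations listed, reactor thermal power, is essentially a heat balance on the secondary coolant. The sketch below illustrates that calculation; the specific-heat constant, function name, and example numbers are assumptions for illustration and are not taken from the DHRUVA application software.

    ```python
    # Minimal heat-balance sketch for reactor thermal power (illustrative only;
    # the constant and names are assumptions, not the DHRUVA application software).

    CP_COOLANT_KJ_PER_KG_K = 4.18  # assumed specific heat of the secondary coolant

    def thermal_power_mw(flow_kg_per_s: float, delta_t_k: float) -> float:
        """Thermal power picked up by the secondary coolant, in MW."""
        power_kw = flow_kg_per_s * CP_COOLANT_KJ_PER_KG_K * delta_t_k
        return power_kw / 1000.0

    # e.g. 600 kg/s of secondary coolant heated by 40 K across the heat exchangers
    print(f"Estimated thermal power: {thermal_power_mw(600.0, 40.0):.1f} MW")  # ~100 MW
    ```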

  15. 2K09 and thereafter : the coming era of integrative bioinformatics, systems biology and intelligent computing for functional genomics and personalized medicine research

    Science.gov (United States)

    2010-01-01

    Significant interest exists in establishing synergistic research in bioinformatics, systems biology and intelligent computing. Supported by the United States National Science Foundation (NSF), International Society of Intelligent Biological Medicine (http://www.ISIBM.org), International Journal of Computational Biology and Drug Design (IJCBDD) and International Journal of Functional Informatics and Personalized Medicine, the ISIBM International Joint Conferences on Bioinformatics, Systems Biology and Intelligent Computing (ISIBM IJCBS 2009) attracted more than 300 papers and 400 researchers and medical doctors world-wide. It was the only inter/multidisciplinary conference aimed to promote synergistic research and education in bioinformatics, systems biology and intelligent computing. The conference committee was very grateful for the valuable advice and suggestions from honorary chairs, steering committee members and scientific leaders including Dr. Michael S. Waterman (USC, Member of United States National Academy of Sciences), Dr. Chih-Ming Ho (UCLA, Member of United States National Academy of Engineering and Academician of Academia Sinica), Dr. Wing H. Wong (Stanford, Member of United States National Academy of Sciences), Dr. Ruzena Bajcsy (UC Berkeley, Member of United States National Academy of Engineering and Member of United States Institute of Medicine of the National Academies), Dr. Mary Qu Yang (United States National Institutes of Health and Oak Ridge, DOE), Dr. Andrzej Niemierko (Harvard), Dr. A. Keith Dunker (Indiana), Dr. Brian D. Athey (Michigan), Dr. Weida Tong (FDA, United States Department of Health and Human Services), Dr. Cathy H. Wu (Georgetown), Dr. Dong Xu (Missouri), Drs. Arif Ghafoor and Okan K Ersoy (Purdue), Dr. Mark Borodovsky (Georgia Tech, President of ISIBM), Dr. Hamid R. Arabnia (UGA, Vice-President of ISIBM), and other scientific leaders. The committee presented the 2009 ISIBM Outstanding Achievement Awards to Dr. Joydeep Ghosh (UT

  16. Usefulness of computed tomography Hounsfield unit measurement for diagnosis of congenital cholesteatoma

    International Nuclear Information System (INIS)

    Ahn, Sang Hyuk; Kim, Yong Woo; Baik, Seung Kug; Hwang, Jae Yeon; Lee, Il Woo

    2014-01-01

    To evaluate the usefulness of Hounsfield unit (HU) measurements for diagnosing congenital cholesteatoma. A total of 43 patients who underwent surgery due to middle ear cavity lesions were enrolled. Twenty-one patients were confirmed to have congenital cholesteatoma by histopathological results and the other 22 patients were confirmed to have otitis media (OM) by operation. Their computed tomography images were retrospectively reviewed. We measured the HU of the soft tissue mass in the middle ear cavity. In addition, we evaluated the largest diameter and location of the mass, the presence of bony erosion in the ear ossicle, and the status of the tympanic membrane in the cholesteatoma group. The mean HU was 37.36 ± 6.11 (range, 27.5-52.5) in the congenital cholesteatoma group and 76.09 ± 8.74 (range, 58.5-96) in the OM group (p < 0.001). The cut-off value was 55.5. The most common location for congenital cholesteatoma was the mesotympanum, and ear ossicle erosion was present in 24%. All patients had an intact tympanic membrane. HU measurement may be useful as an additional indicator to diagnose congenital cholesteatoma.
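
    As a simple illustration of how the reported cut-off of 55.5 HU would be applied, the snippet below classifies a measured mean HU value; the function is hypothetical, is not part of the study, and clinical use would of course require the full work-up described above.

    ```python
    # Illustrative use of the reported mean-HU cut-off (55.5) from this study;
    # the function and threshold handling are assumptions for demonstration only.

    HU_CUTOFF = 55.5

    def classify_middle_ear_mass(mean_hu: float) -> str:
        """Suggest a diagnosis from the mean Hounsfield unit of the soft-tissue mass."""
        return "congenital cholesteatoma" if mean_hu < HU_CUTOFF else "otitis media"

    print(classify_middle_ear_mass(37.4))  # -> congenital cholesteatoma (group mean 37.36)
    print(classify_middle_ear_mass(76.1))  # -> otitis media (group mean 76.09)
    ```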

  17. Usefulness of computed tomography Hounsfield unit measurement for diagnosis of congenital cholesteatoma

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sang Hyuk; Kim, Yong Woo; Baik, Seung Kug; Hwang, Jae Yeon; Lee, Il Woo [Medical Research Institute, Pusan National University Yangsan Hospital, College of Medicine, Pusan National University, Yangsan (Korea, Republic of)

    2014-02-15

    To evaluate the usefulness of Hounsfield unit (HU) measurements for diagnosing congenital cholesteatoma. A total of 43 patients who underwent surgery due to middle ear cavity lesions were enrolled. Twenty-one patients were confirmed to have congenital cholesteatoma by histopathological results and the other 22 patients were confirmed to have otitis media (OM) by operation. Their computed tomography images were retrospectively reviewed. We measured the HU of the soft tissue mass in the middle ear cavity. In addition, we evaluated the largest diameter and location of the mass, the presence of bony erosion in the ear ossicle, and the status of the tympanic membrane in the cholesteatoma group. The mean HU was 37.36 ± 6.11 (range, 27.5-52.5) in the congenital cholesteatoma group and 76.09 ± 8.74 (range, 58.5-96) in the OM group (p < 0.001). The cut-off value was 55.5. The most common location for congenital cholesteatoma was the mesotympanum, and ear ossicle erosion was present in 24%. All patients had an intact tympanic membrane. HU measurement may be useful as an additional indicator to diagnose congenital cholesteatoma.

  18. Some research advances in computer graphics that will enhance applications to engineering design

    Science.gov (United States)

    Allan, J. J., III

    1975-01-01

    Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design was described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done were identified.

  19. Development of computational small animal models and their applications in preclinical imaging and therapy research

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Geneva Neuroscience Center, Geneva University, Geneva CH-1205 (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, Groningen 9700 RB (Netherlands)

    2016-01-15

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  20. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    Science.gov (United States)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  1. Development of computational small animal models and their applications in preclinical imaging and therapy research

    International Nuclear Information System (INIS)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future

  2. FFUSION yearbook 1997. Annual report of the Finnish fusion research unit. Association EURATOM-TEKES

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, S; Paettikangas, T [eds.; VTT Energy, Espoo (Finland)

    1998-02-01

    The Finnish fusion programme (FFUSION) is one of the eleven national energy research programmes funded by the Technological Development Centre of Finland (TEKES). The FFUSION programme was fully integrated into the European Fusion Programme just after Finland joined the European Union. The contract between the Association Euratom and Tekes was signed in 1995 and extends to the end of 1999. Finland became a member of the JET Joint Undertaking in 1996; other contracts with Euratom include the NET agreement and the Staff Mobility Agreement. The FFUSION programme, with its participating research institutes and universities, forms the Fusion Research Unit of the Association Euratom-Tekes. This annual report summarises the research activities of the Finnish Research Unit in 1997. The programme consists of two parts: physics and technology. The physics research areas are fusion plasma engineering, and radio-frequency heating and plasma diagnostics. The technology part is focused on three areas: fusion reactor materials (first-wall components and joining techniques), remote handling and viewing systems, and superconductors.

  3. Computer technologies of future teachers of fine art training as an object of scientific educational research

    Directory of Open Access Journals (Sweden)

    Bohdan Cherniavskyi

    2017-03-01

    The article deals with computer technology training, highlights the current state of computerization of the educational process in teacher training colleges, and reveals the specific techniques of professional training of teachers of fine arts to use computer technology in teaching careers. Key words: methods of professional training, professional activities, computer technology training of future teachers of fine arts, the subject of research.

  4. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    Science.gov (United States)

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  5. Computer Literacy for Life Sciences: Helping the Digital-Era Biology Undergraduates Face Today's Research

    Science.gov (United States)

    Smolinski, Tomasz G.

    2010-01-01

    Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…

  6. Service-oriented computing : State of the art and research challenges

    NARCIS (Netherlands)

    Papazoglou, Michael P.; Traverso, Paolo; Dustdar, Schahram; Leymann, Frank

    2007-01-01

    Service-oriented computing promotes the idea of assembling application components into a network of services that can be loosely coupled to create flexible, dynamic business processes and agile applications that span organizations and computing platforms. An SOC research road map provides a context

  7. Young Researchers Advancing Computational Science: Perspectives of the Young Scientists Conference 2015

    NARCIS (Netherlands)

    Boukhanovsky, A.V.; Ilyin, V.A; Krzhizhanovskaya, V.V.; Athanassoulis, G.A.; Klimentov, A.A.; Sloot, P.M.A.

    2015-01-01

    We present an annual international Young Scientists Conference (YSC) on computational science http://ysc.escience.ifmo.ru/, which brings together renowned experts and young researchers working in high-performance computing, data-driven modeling, and simulation of large-scale complex systems. The

  8. Job satisfaction and importance for intensive care unit research coordinators: results from binational survey.

    Science.gov (United States)

    Rickard, Claire M; Roberts, Brigit L; Foote, Jonathon; McGrail, Matthew R

    2007-09-01

    To measure Intensive Care Unit Research coordinator job satisfaction and importance and to identify priorities for role development. Research coordinator numbers are growing internationally in response to increasing clinical research activity. In Australia, 1% of registered nurses work principally in research, many as Research coordinators. Internationally, the Association of Clinical Research Professionals currently has 6536 certified Research coordinators in 13 countries, with likely additional large numbers practicing without the voluntary certification. Research coordinators are almost always nurses, but little is known about this emerging specialty. Design. Cross-sectional study using anonymous self-report questionnaire. After ethics approval, the McCloskey-Mueller Satisfaction Scale and McCloskey-Mueller Importance Scale were administered via the Internet. The sample was 49 (response rate 71%) Research coordinators from the Australia and New Zealand Intensive Care Unit Research coordinators' Interest Group. Research coordinators were satisfied with structural aspects of the position: working business hours; flexibility of working hours; high levels of responsibility and control over their work. Dissatisfaction was expressed regarding: remuneration and recognition; compensation for weekend work; salary package; career advancement opportunities; and childcare facilities. High priorities for role development are those rated highly important but with much lower satisfaction. These are: compensation for weekend call-out work; salary and remuneration package; recognition by management and clinicians; career advancement opportunities; departmental research processes; encouragement and feedback; and number of working hours. Increasing numbers of nurses have been attracted to this clinically based research position. These data contribute to the understanding and development of the role.

  9. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research

    Directory of Open Access Journals (Sweden)

    Victoria Stodden

    2014-07-01

    The goal of this article is to coalesce a discussion around best practices for scholarly research that utilizes computational methods, by providing a formalized set of best practice recommendations to guide computational scientists and other stakeholders wishing to disseminate reproducible research, facilitate innovation by enabling data and code re-use, and enable broader communication of the output of computational scientific research. Scholarly dissemination and communication standards are changing to reflect the increasingly computational nature of scholarly research, primarily to include the sharing of the data and code associated with published results. We also present these Best Practices as a living, evolving, and changing document at http://wiki.stodden.net/Best_Practices.

  10. Effect of Jigsaw II, Reading-Writing-Presentation, and Computer Animations on the Teaching of "Light" Unit

    Science.gov (United States)

    Koç, Yasemin; Yildiz, Emre; Çaliklar, Seyma; Simsek, Ümit

    2016-01-01

    The aim of this study is to determine the effect of Jigsaw II technique, reading-writing-presentation method, and computer animation on students' academic achievements, epistemological beliefs, attitudes towards science lesson, and the retention of knowledge in the "Light" unit covered in the 7th grade. The sample of the study consists…

  11. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer based resource units were developed in the areas of set theory, relations and function, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  12. Making Research Matter: Comment on "Public Spending on Health Service and Policy Research in Canada, the United Kingdom, and the United States: A Modest Proposal".

    Science.gov (United States)

    Hunter, David J; Frank, John

    2017-08-13

    We offer a UK-based commentary on the recent "Perspective" published in IJHPM by Thakkar and Sullivan. We are sympathetic to the authors' call for increased funding for health service and policy research (HSPR). However, we point out that increasing that investment - in any of the three countries they compare: Canada, the United States and the United Kingdom - will ipso facto not necessarily lead to any better use of research by health system decision-makers in these settings. We cite previous authors' descriptions of the many factors that tend to make the worlds of researchers and decision-makers into "two solitudes." And we call for changes in the structure and funding of HSPR, particularly the incentives now in place for purely academic publishing, to tackle a widespread reality: most published research in HSPR, as in other applied fields of science, is never read or used by the vast majority of decision-makers, working out in the "real world". © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  13. Cooperative Fish and Wildlife Research Units Program—2017 year in review

    Science.gov (United States)

    Organ, John F.; Thompson, John D.; Dennerline, Donald E.; Childs, Dawn E.

    2018-02-08

    The Cooperative Fish and Wildlife Research Units Program was involved in a number of notable events during 2017, many concerning our personnel. Dr. Barry Grand left his position as Leader of the Alabama Cooperative Fish and Wildlife Research Unit to become the Cooperative Units Program Supervisor for the South, replacing Dr. Kevin Whalen who took over as Supervisor for the West. We welcomed Dr. Sarah Converse who left the Patuxent Wildlife Research Center to become Leader of the Washington Cooperative Fish and Wildlife Research Unit. Dr. Amanda Rosenberger joined the Tennessee Cooperative Fishery Research Unit as Assistant Leader, transferring from the Missouri Cooperative Unit. Dr. Scott Carleton left his position as Assistant Unit Leader in New Mexico to become Chief of the Region 2 Migratory Bird Program of the U.S. Fish and Wildlife Service.We said farewell to many colleagues who retired. Their departure is bittersweet as we wish them health, happiness, and wellness in retirement. We will miss their companionship and the extraordinary contributions they have made to the Cooperative Fish and Wildlife Research Units Program and conservation.The Cooperative Fish and Wildlife Research Units Program has a record high number of vacant scientist positions due to a combination of retirements and base funding short-falls. These issues are affecting our ability to meet cooperator needs. Yet, we remain highly productive. For example, this year we released a report (https://doi.org/10.3133/cir1427) containing abstracts of nearly 600 of our research projects, covering thematic areas ranging from advanced technologies to wildlife diseases. We provided highly competent, trained scientists and natural resource managers for our cooperators’ workforce. We delivered technical training and guidance to professional practitioners. We provided critical information to cooperators for decisions on species status assessments and management of species of greatest conservation need

  14. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: an assumption for the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.
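
    A case study of this kind reduces to back-of-envelope arithmetic over events, CPU time per event, and the number of analyses sharing the centre. The sketch below shows that style of estimate with entirely invented placeholder numbers; it does not reproduce the paper's input parameters or results.

    ```python
    # Back-of-envelope resource estimate in the spirit of the case study.
    # All numbers below are invented placeholders, not the paper's inputs.

    events_per_analysis = 1e8      # events each physics analysis must process per pass
    cpu_seconds_per_event = 0.5    # reconstruction + analysis time per event
    passes_per_year = 10           # how often each analysis is re-run
    analyses_at_centre = 20        # concurrent analyses hosted at the regional centre
    event_size_mb = 0.1            # analysis-format event size on disk

    cpu_seconds = events_per_analysis * cpu_seconds_per_event * passes_per_year * analyses_at_centre
    cpu_cores = cpu_seconds / (365 * 24 * 3600 * 0.7)  # assume 70% average utilisation
    storage_tb = events_per_analysis * event_size_mb / 1e6 * analyses_at_centre

    print(f"~{cpu_cores:.0f} cores and ~{storage_tb:.0f} TB for this invented scenario")
    ```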

  15. Summaries of research and development activities by using JAEA computer system in FY2005. April 1, 2005 - March, 31, 2006

    International Nuclear Information System (INIS)

    2006-10-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the big users' research and development activities by using the computer system in FY2005 (April 1, 2005 - March 31, 2006). (author)

  16. Summaries of research and development activities by using JAEA computer system in FY2006. April 1, 2006 - March 31, 2007

    International Nuclear Information System (INIS)

    2008-02-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2006 (April 1, 2006 - March 31, 2007). (author)

  17. Qualitative Computing and Qualitative Research: Addressing the Challenges of Technology and Globalization

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2012-05-01

    Qualitative computing has been part of our lives for thirty years. Today, we urgently call for an evaluation of its international impact on qualitative research. Evaluating the international impact of qualitative research and qualitative computing requires a consideration of the vast amount of qualitative research over the last decades, as well as thoughtfulness about the uneven and unequal way in which qualitative research and qualitative computing are present in different fields of study and geographical regions. To understand the international impact of qualitative computing requires evaluation of the digital divide and the huge differences between center and peripheries. The international impact of qualitative research, and, in particular qualitative computing, is the question at the heart of this array of selected papers from the "Qualitative Computing: Diverse Worlds and Research Practices Conference." In this article, we introduce the reader to the goals, motivation, and atmosphere at the conference, taking place in Istanbul, Turkey, in 2011. The dialogue generated there is still in the air, and this introduction is a call to spread that voice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1202285

  18. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study on the acceptance of computer systems in physicians' practices. 11,000 returned questionnaires from physicians--users and nonusers--were analysed. We found that most of the physicians used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, West and East, and young and old. In this study we also analysed the computer-use behaviour of gynaecologists. As a result, two thirds of all nonusers do not intend to utilise a computer in the future.

  19. High Performance Computing and Storage Requirements for Biological and Environmental Research Target 2017

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Wasserman, Harvey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)

    2013-05-01

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,500 users working on some 650 projects that involve nearly 600 codes in a wide variety of scientific disciplines. In addition to large-scale computing and storage resources, NERSC provides support and expertise that help scientists make efficient use of its systems. The latest review revealed several key requirements, in addition to achieving its goal of characterizing BER computing and storage needs.

  20. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  1. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  2. United States Department of Agriculture-Agricultural Research Service research in application technology for pest management.

    Science.gov (United States)

    Smith, L A; Thomson, S J

    2003-01-01

    A research summary is presented that emphasizes ARS achievements in application technology over the past 2-3 years. Research focused on the improvement of agricultural pesticide application is important from the standpoint of crop protection as well as environmental safety. Application technology research is being actively pursued within the ARS, with a primary focus on application system development, drift management, efficacy enhancement and remote sensing. Research on application systems has included sensor-controlled hooded sprayers, new approaches to direct chemical injection, and aerial electrostatic sprayers. For aerial application, great improvements in on-board flow controllers permit accurate field application of chemicals. Aircraft parameters such as boom position and spray release height are being altered to determine their effect on drift. Other drift management research has focused on testing of low-drift nozzles, evaluation of pulsed spray technologies and evaluation of drift control adjuvants. Research on the use of air curtain sprayers in orchards, air-assist sprayers for row crops and vegetables, and air deflectors on aircraft has documented improvements in application efficacy. Research has shown that the fate of applied chemicals is influenced by soil properties, and this has implications for herbicide efficacy and dissipation in the environment. Remote sensing systems are being used to target areas in the field where pests are present so that spray can be directed to only those areas. Soil and crop conditions influence propensity for weeds and insects to proliferate in any given field area. Research has indicated distinct field patterns favorable for weed growth and insect concentration, which can provide further assistance for targeted spraying.

  3. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    Science.gov (United States)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  4. Performance characterization of megavoltage computed tomography imaging on a helical tomotherapy unit

    International Nuclear Information System (INIS)

    Meeks, Sanford L.; Harmon, Joseph F. Jr.; Langen, Katja M.; Willoughby, Twyla R.; Wagner, Thomas H.; Kupelian, Patrick A.

    2005-01-01

    Helical tomotherapy is an innovative means of delivering IGRT and IMRT using a device that combines features of a linear accelerator and a helical computed tomography (CT) scanner. The HI-ART II can generate CT images from the same megavoltage x-ray beam it uses for treatment. These megavoltage CT (MVCT) images offer verification of the patient position prior to and potentially during radiation therapy. Since the unit uses the actual treatment beam as the x-ray source for image acquisition, no surrogate telemetry systems are required to register image space to treatment space. The disadvantage to using the treatment beam for imaging, however, is that the physics of radiation interactions in the megavoltage energy range may force compromises between the dose delivered and the image quality in comparison to diagnostic CT scanners. The performance of the system is therefore characterized in terms of objective measures of noise, uniformity, contrast, and spatial resolution as a function of the dose delivered by the MVCT beam. The uniformity and spatial resolutions of MVCT images generated by the HI-ART II are comparable to that of diagnostic CT images. Furthermore, the MVCT scan contrast is linear with respect to the electron density of the material imaged. MVCT images do not have the same performance characteristics as state-of-the-art diagnostic CT scanners when one objectively examines noise and low-contrast resolution. These inferior results may be explained, at least partially, by the low doses delivered by our unit; the dose is 1.1 cGy in a 20 cm diameter cylindrical phantom. In spite of the poorer low-contrast resolution, these relatively low-dose MVCT scans provide sufficient contrast to delineate many soft-tissue structures. Hence, these images are useful not only for verifying the patient's position at the time of therapy, but they are also sufficient for delineating many anatomic structures. In conjunction with the ability to recalculate radiotherapy doses on
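
    For readers unfamiliar with the objective measures mentioned (noise, uniformity, contrast), the sketch below shows common ROI-based definitions applied to a phantom image; these generic formulations are assumptions for illustration and are not necessarily the exact metrics used in the paper.

    ```python
    import numpy as np

    # Generic ROI-based image-quality measures often used for CT/MVCT phantom scans.
    # These definitions are illustrative assumptions, not the paper's exact methods.

    def roi_stats(image: np.ndarray, center: tuple, radius: int):
        """Mean and standard deviation inside a circular region of interest."""
        yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
        mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
        roi = image[mask]
        return roi.mean(), roi.std()

    def noise(image, center, radius):
        return roi_stats(image, center, radius)[1]           # std dev in a uniform region

    def uniformity(image, center_roi, edge_roi, radius):
        return abs(roi_stats(image, center_roi, radius)[0]
                   - roi_stats(image, edge_roi, radius)[0])  # centre-to-edge difference

    def contrast(image, insert_roi, background_roi, radius):
        m_i, _ = roi_stats(image, insert_roi, radius)
        m_b, sd_b = roi_stats(image, background_roi, radius)
        return (m_i - m_b) / sd_b                            # contrast-to-noise ratio

    phantom = np.full((256, 256), 40.0) + np.random.normal(0, 5, (256, 256))
    print(noise(phantom, (128, 128), 30))                    # ~5, the simulated noise level
    ```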

  5. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  6. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    DOE Office of Science, Biological and Environmental Research Program Office (BER),

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever-increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  7. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  8. Current researches on safety assessment of radioactive waste disposal in the United States

    International Nuclear Information System (INIS)

    Tasaka, Hiroshi; Kiyose, Ryohei

    1980-01-01

    Recently, the problem of safe disposal of radioactive waste generated from nuclear fuel cycle becomes more important in Japan. On the other hand, many researches on shallow land burial of low-level wastes and geologic isolation of high-level wastes have been carried out in the United States of America. In this report, the researches on the safety assessment of radioactive waste disposal in the United States of America were briefly introduced with emphasis on the studies on behavior and migration of radionuclide from disposed waste in geosphere. (author)

  9. The United States Culture Collection Network (USCCN): Enhancing Microbial Genomics Research through Living Microbe Culture Collections

    Science.gov (United States)

    Boundy-Mills, Kyria; Hess, Matthias; Bennett, A. Rick; Ryan, Matthew; Kang, Seogchan; Nobles, David; Eisen, Jonathan A.; Inderbitzin, Patrik; Sitepu, Irnayuli R.; Torok, Tamas; Brown, Daniel R.; Cho, Juliana; Wertz, John E.; Mukherjee, Supratim; Cady, Sherry L.

    2015-01-01

    The mission of the United States Culture Collection Network (USCCN; http://usccn.org) is “to facilitate the safe and responsible utilization of microbial resources for research, education, industry, medicine, and agriculture for the betterment of human kind.” Microbial culture collections are a key component of life science research, biotechnology, and emerging global biobased economies. Representatives and users of several microbial culture collections from the United States and Europe gathered at the University of California, Davis, to discuss how collections of microorganisms can better serve users and stakeholders and to showcase existing resources available in public culture collections. PMID:26092453

  10. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    … area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA … Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center … atomic and molecular level, he said. He noted that “every general would like to have” a Star Trek-like holodeck, where holographic avatars could …

  11. USING RESEARCH METHODS IN HUMAN COMPUTER INTERACTION TO DESIGN TECHNOLOGY FOR RESILIENCE

    OpenAIRE

    Lopes, Arminda Guerra

    2016-01-01

    Research in human-computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, ...

  12. Applications of computer-graphics animation for motion-perception research

    Science.gov (United States)

    Proffitt, D. R.; Kaiser, M. K.

    1986-01-01

    The advantages and limitations of using computer animated stimuli in studying motion perception are presented and discussed. Most current programs of motion perception research could not be pursued without the use of computer graphics animation. Computer generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, the differences are assumed to be irrelevant to the questions under study, and that findings with computer generated stimuli will generalize to natural events.

  13. High Performance Computing and Visualization Infrastructure for Simultaneous Parallel Computing and Parallel Visualization Research

    Science.gov (United States)

    2016-11-09

    Only report-form fragments are available in place of an abstract. The recoverable hardware details describe compute nodes with two Intel Xeon E5-2680 v3 processors (2.5 GHz, 30M cache, 9.60 GT/s QPI, Turbo, HT, 12C/24T, 120 W) and a Broadcom 5720 quad-port 1 Gb network daughter card; the form also lists subcontractors (DD882) and personnel receiving masters and PhD degrees.

  14. Suitable exposure conditions for CB Throne: New model cone beam computed tomography unit for dental use

    International Nuclear Information System (INIS)

    Tanabe, Kouji; Nishikawa, Keiichi; Yajima, Aya; Mizuta, Shigeru; Sano, Tsukasa; Yajima, Yasutomo; Nakagawa, Kanichi; Kousuge, Yuuji

    2008-01-01

    The CB Throne is a cone beam computed tomography unit for dental use and a smaller version of the CB MercuRay developed by Hitachi Medico Co. We investigated which exposure conditions are suitable in clinical use. Suitable exposure conditions were determined by simple subjective comparisons. The right temporomandibular joint of a head phantom was scanned at all possible combinations of tube voltage (60, 80, 100, 120 kV) and tube current (10, 15 mA). Oblique-sagittal images of the same position were obtained using the multiplanar reconstruction (MPR) function. Images obtained at 120 kV and 15 mA, the highest exposure conditions and therefore certain to produce images of the best quality, were used to establish the standard. Eight oral radiologists observed each image and the standard image on an LCD monitor. They subjectively compared spatial resolution and noise between each image and the standard image using a 10 cm scale. Evaluation points were obtained from the marked positions on the scales. The Steel method was used to determine significant differences. The images at 60 kV/10 mA and 80 kV/15 mA showed significantly lower evaluation points for spatial resolution. The images at 60 kV/10 mA, 60 kV/15 mA and 80 kV/10 mA showed significantly lower evaluation points for noise. In conclusion, even if exposure conditions are reduced to 100 kV/10 mA, 100 kV/15 mA or 120 kV/10 mA, the CB Throne will produce images of the best quality. (author)

  15. Absolute Hounsfield unit measurement on noncontrast computed tomography cannot accurately predict struvite stone composition.

    Science.gov (United States)

    Marchini, Giovanni Scala; Gebreselassie, Surafel; Liu, Xiaobo; Pynadath, Cindy; Snyder, Grace; Monga, Manoj

    2013-02-01

    The purpose of our study was to determine, in vivo, whether single-energy noncontrast computed tomography (NCCT) can accurately predict the presence/percentage of struvite stone composition. We retrospectively searched for all patients with struvite components on stone composition analysis between January 2008 and March 2012. Inclusion criteria were NCCT prior to stone analysis and stone size ≥4 mm. A single urologist, blinded to stone composition, reviewed all NCCT to acquire stone location, dimensions, and Hounsfield unit (HU). HU density (HUD) was calculated by dividing mean HU by the stone's largest transverse diameter. Stone analysis was performed via Fourier transform infrared spectrometry. Independent sample Student's t-test and analysis of variance (ANOVA) were used to compare HU/HUD among groups. Spearman's correlation test was used to determine the correlation between HU and stone size and also HU/HUD to % of each component within the stone. Significance was considered if p < 0.05. … the correlation was positive with HU (R=0.017; p=0.912) and negative with HUD (R=-0.20; p=0.898). Overall, 3 (6.8%) had … stones. Comparing … (n=5) with other miscellaneous stones (n=39), no difference was found for HU (p=0.09) but HUD was significantly lower for pure stones (27.9±23.6 v 72.5±55.9, respectively; p=0.006). Again, significant overlaps were seen. Pure struvite stones have significantly lower HUD than mixed struvite stones, but overlap exists. A low HUD may increase the suspicion for a pure struvite calculus.
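
    The HU density (HUD) used above is simply the mean HU divided by the stone's largest transverse diameter. The snippet below is a minimal sketch of that calculation; the function name and the example values are invented for illustration.

    ```python
    # Minimal sketch of the HU density (HUD) described in the abstract:
    # HUD = mean HU / largest transverse stone diameter. Example values are invented.

    def hu_density(mean_hu: float, largest_diameter_mm: float) -> float:
        return mean_hu / largest_diameter_mm

    # Reported group means were ~27.9 for pure struvite vs ~72.5 for mixed stones,
    # but with substantial overlap, so HUD alone cannot confirm composition.
    print(hu_density(420.0, 15.0))  # -> 28.0, closer to the pure-struvite group mean
    ```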

  16. Salamander chytrid fungus (Batrachochytrium salamandrivorans) in the United States—Developing research, monitoring, and management strategies

    Science.gov (United States)

    Grant, Evan H. Campbell; Muths, Erin L.; Katz, Rachel A.; Canessa, Stefano; Adams, Michael J.; Ballard, Jennifer R.; Berger, Lee; Briggs, Cheryl J.; Coleman, Jeremy; Gray, Matthew J.; Harris, M. Camille; Harris, Reid N.; Hossack, Blake R.; Huyvaert, Kathryn P.; Kolby, Jonathan E.; Lips, Karen R.; Lovich, Robert E.; McCallum, Hamish I.; Mendelson, Joseph R.; Nanjappa, Priya; Olson, Deanna H.; Powers, Jenny G.; Richgels, Katherine L. D.; Russell, Robin E.; Schmidt, Benedikt R.; Spitzen-van der Sluijs, Annemarieke; Watry, Mary Kay; Woodhams, Douglas C.; White, C. LeAnn

    2016-01-20

    The recently (2013) identified pathogenic chytrid fungus, Batrachochytrium salamandrivorans (Bsal), poses a severe threat to the distribution and abundance of salamanders within the United States and Europe. Development of a response strategy for the potential, and likely, invasion of Bsal into the United States is crucial to protect global salamander biodiversity. A formal working group, led by Amphibian Research and Monitoring Initiative (ARMI) scientists from the U.S. Geological Survey (USGS) Patuxent Wildlife Research Center, Fort Collins Science Center, and Forest and Rangeland Ecosystem Science Center, was held at the USGS Powell Center for Analysis and Synthesis in Fort Collins, Colorado, United States from June 23 to June 25, 2015, to identify crucial Bsal research and monitoring needs that could inform conservation and management strategies for salamanders in the United States. Key findings of the workshop included the following: (1) the introduction of Bsal into the United States is highly probable, if not inevitable, thus requiring development of immediate short-term and long-term intervention strategies to prevent Bsal establishment and biodiversity decline; (2) management actions targeted towards pathogen containment may be ineffective in reducing the long-term spread of Bsal throughout the United States; and (3) early detection of Bsal through surveillance at key amphibian import locations, among high-risk wild populations, and through analysis of archived samples is necessary for developing management responses. Top research priorities during the preinvasion stage included the following: (1) deployment of qualified diagnostic methods for Bsal and establishment of standardized laboratory practices, (2) assessment of susceptibility for amphibian hosts (including anurans), and (3) development and evaluation of short- and long-term pathogen intervention and management strategies. Several outcomes were achieved during the workshop, including development

  17. Estimating resting motor thresholds in transcranial magnetic stimulation research and practice: a computer simulation evaluation of best methods.

    Science.gov (United States)

    Borckardt, Jeffrey J; Nahas, Ziad; Koola, Jejo; George, Mark S

    2006-09-01

    Resting motor threshold is the basic unit of dosing in transcranial magnetic stimulation (TMS) research and practice. There is little consensus on how best to estimate resting motor threshold with TMS, and only a few tools and resources are readily available to TMS researchers. The current study investigates the accuracy and efficiency of 5 different approaches to motor threshold assessment for TMS research and practice applications. Computer simulation models are used to test the efficiency and accuracy of 5 different adaptive parameter estimation by sequential testing (PEST) procedures. For each approach, data are presented with respect to the mean number of TMS trials necessary to reach the motor threshold estimate as well as the mean accuracy of the estimates. A simple nonparametric PEST procedure appears to provide the most accurate motor threshold estimates, but takes slightly longer (on average, 3.48 trials) to complete than a popular parametric alternative (maximum likelihood PEST). Recommendations are made for the best starting values for each of the approaches to maximize both efficiency and accuracy. In light of the computer simulation data provided in this article, the authors review and suggest which techniques might best fit different TMS research and clinical situations. Lastly, a free user-friendly software package is described and made available on the world wide web that allows users to run all of the motor threshold estimation procedures discussed in this article for clinical and research applications.
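
    To make the "simple nonparametric PEST" idea concrete, here is a minimal staircase-style sketch of adaptive threshold hunting against a simulated subject; the starting intensity, step-halving-at-reversal rule, and stopping criteria are assumptions for illustration, and this is not the authors' published procedure or software package.

    ```python
    import random

    # Generic nonparametric staircase sketch of adaptive motor-threshold hunting.
    # Starting value, halving rule, and stopping criteria are assumptions only.

    def observe_mep(intensity: float, true_threshold: float = 47.0) -> bool:
        """Simulated subject: True if the pulse evokes a motor potential."""
        return intensity + random.gauss(0.0, 2.0) >= true_threshold

    def estimate_threshold(start: float = 60.0, step: float = 8.0,
                           min_step: float = 1.0, max_trials: int = 40) -> float:
        intensity, last_direction = start, 0
        for _ in range(max_trials):
            if step < min_step:
                break
            direction = -1 if observe_mep(intensity) else +1  # down on response, up otherwise
            if last_direction and direction != last_direction:
                step /= 2.0                                   # halve the step at each reversal
            intensity += direction * step
            last_direction = direction
        return intensity

    print(f"Estimated resting motor threshold: {estimate_threshold():.1f}% stimulator output")
    ```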

  18. Review of the research proposal for the steam generator retired from Kori unit 1

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joung Soo; Han, Joung Ho; Kim, Hong Pyo; Lim, Yun Soo; Lee, Deok Hyun; Hwang, Seong Sik; Hur, Do Haeng [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    The tubes of the steam generator retired from Kori unit 1 have many different kinds of failures, such as denting, pitting, wastage, ODSCC, and PWSCC. The Korea Electric Power Research Institute (KEPRI) submitted a research proposal for the steam generator to the Korea Institute of Science and Technology Evaluation and Planning (KISTEP). KISTEP requested the Korea Atomic Energy Research Institute to review the proposal by organizing a committee composed of specialists from the related domestic research institutes. The committee's opinions on the objectives, research fields, economic benefit, and validity of the research proposal are presented, along with suggestions for the optimal research fields to be pursued for the retired steam generator. Also, the roles of the participants in the research work were allocated, which is critical for carrying out the project effectively. 6 figs., 5 tabs. (Author)

  19. Potsdam Institute for Climate Impact Research: Computer simulation -climate impact research. Final report

    International Nuclear Information System (INIS)

    1993-07-01

    Climate impact assessment is a new field of research which, owing to international and national efforts to understand and cope with the impending global climate changes on a global, regional and local level, has rapidly become a central field of research of the Federal Ministry of Research and Technology. In contrast to other countries, Germany had no research facilities and infrastructure that could be used. The Potsdam Institute for Climate Impact Research was to provide the infrastructure basis for climate impact research in Germany. The Institute was founded by the BMFT and the Land of Brandenburg. (orig.) [de

  20. Breaking the Boundaries: Academic Applications of Multidisciplinary Research in Computer Science and Dentistry

    Directory of Open Access Journals (Sweden)

    Patricia Witt

    2016-12-01

    Full Text Available Undergraduate students are trained on a specific set of skills matching their corresponding careers, as modern sciences tend toward specialization; however, this has created a virtual boundary among different professions. In this regard, state-of-the-art dental research involves the application of ever more complex computational solutions and thus requires multidisciplinary research teams. Multidisciplinarity is often achieved in higher research contexts (e.g., postgraduate study) but involves a high degree of difficulty for both factions. The aim of this work is to present a novel application of multidisciplinary research to the learning process of undergraduate students in the computer science and dentistry careers. To do so, we leveraged previous research on computational intelligence and image processing techniques applied to dental diagnosis, and integrated it with the clinical assessment and software engineering subjects of the dental and computer engineering careers, respectively. With this, we explored the possibility of enhancing the diagnostic skills of dental students while improving the software engineering skills of computer science students; furthermore, we intended to introduce the concepts of applied computational intelligence, multidisciplinarity, and collaboration on both sides.

  1. Establishing the Research Agenda for Increasing the Representation of Women in Engineering and Computing.

    Science.gov (United States)

    Buse, Kathleen; Hill, Catherine; Benson, Kathleen

    2017-01-01

    While there is an extensive body of research on gender equity in engineering and computing, there have been few efforts to glean insight from a dialog among experts. To encourage collaboration and to develop a shared vision of the future research agenda, a 2-day workshop of 50 scholars who work on the topic of gender in engineering and computing was held at a rural conference center. The structure of the conference and the location allowed time to reflect, to dialog, and to craft an innovative research agenda aimed at increasing the representation of women in engineering and computing. This paper has been written by the conference organizers and details the ideas and recommendations from the scholars. The result is an innovative, collaborative approach to future research that focuses on identifying effective interventions. The new approach includes the creation of partnerships with stakeholders including businesses, government agencies, non-profits and academic institutions to allow a broader voice in setting research priorities. Researchers recommend incorporating multiple disciplines and methodologies, while expanding the use of data analytics, merging and mining existing databases and creating new datasets. The future research agenda is detailed and includes studies focused on socio-cultural interventions, particularly on career choice, within undergraduate and graduate programs, and for women in professional careers. The outcome is a vision for future research that can be shared with researchers, practitioners and other stakeholders and that will lead to gender equity in the engineering and computing professions.

  2. The NIHR Public Health Research Programme: responding to local authority research needs in the United Kingdom.

    Science.gov (United States)

    Dorling, Hannah; Cook, Andrew; Ollerhead, Liz; Westmore, Matt

    2015-12-11

    The remit of the National Institute for Health Research Public Health Research (PHR) Programme is to evaluate public health interventions, providing new knowledge on the benefits, costs, acceptability and wider impacts of interventions, set outside of the National Health Service, intended to improve the health of the public and reduce inequalities. This paper illustrates how the PHR Programme is providing new knowledge for public health decision makers, based on the nine key areas for local authority public health action, described by the King's Fund. Many funded PHR projects are evaluating interventions, applied in a range of settings, across the identified key areas for local authority influence. For example, research has been funded on children and young people, and for some of the wider determinants of health, such as housing and travel. Other factors, such as spatial planning, or open and green spaces and leisure, are less represented in the PHR Programme. Further opportunities in research include interventions to improve the health of adolescents, adults in workplaces, and communities. Building evidence for public health interventions at local authority level is important to prioritise and implement effective changes to improve population health.

  3. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  4. Using Computers in Educational and Psychological Research: Using Information Technologies to Support the Research Process

    Science.gov (United States)

    Willis, Jerry; Kim, Seung H.

    2006-01-01

    This book has been designed to assist researchers in the social sciences and education fields who are interested in learning how information technologies can help them successfully navigate the research process. Most researchers are familiar with the use of programs like SPSS to analyze data, but many are not aware of other ways information…

  5. The Cerebral Palsy Research Registry: Development and Progress Toward National Collaboration in the United States

    Science.gov (United States)

    Hurley, Donna S.; Sukal-Moulton, Theresa; Msall, Michael E.; Gaebler-Spira, Deborah; Krosschell, Kristin J.; Dewald, Julius P.

    2011-01-01

    Cerebral palsy is the most common neurodevelopmental motor disability in children. The condition requires medical, educational, social, and rehabilitative resources throughout the life span. Several countries have developed population-based registries that serve the purpose of prospective longitudinal collection of etiologic, demographic, and functional severity data. The United States has not created a comprehensive program to develop such a registry. Barriers have included large population size, poor interinstitution collaboration, and decentralized medical and social systems. The Cerebral Palsy Research Registry was created to fill the gap between population-based and clinical-based cerebral palsy registries and promote research in the field. This is accomplished by connecting persons with cerebral palsy, as well as their families, to a network of regional researchers. This article describes the development of an expandable cerebral palsy research registry, its current status, and the potential it has to affect families and persons with cerebral palsy in the United States and abroad. PMID:21677201

  6. Cooperative Fish and Wildlife Research Units Program—2017 year in review postcard

    Science.gov (United States)

    Organ, John F.; Thompson, John D.; Dennerline, Donald E.; Childs, Dawn E.

    2018-02-08

    This postcard provides details about the Cooperative Fish and Wildlife Research Units Program—2017 Year in Review, U.S. Geological Survey Circular 1438, now available at https://doi.org/10.3133/cir1438. In this report, you will find details about the Cooperative Fish and Wildlife Research Units (CRU) Program relating to its background, fish and wildlife science, students, staffing, vacancies, research funding, outreach and training, science themes, accolades, and professional services. You will see snapshots of CRU projects with information on how results have been or are being applied by cooperators. This is the essence of what we do: science that matters.Throughout the year, keep up with CRU research projects at http://www.coopunits.org.

  7. Mechanical properties of regular porous biomaterials made from truncated cube repeating unit cells: Analytical solutions and computational models.

    Science.gov (United States)

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-03-01

    Additive manufacturing (AM) has enabled fabrication of open-cell porous biomaterials based on repeating unit cells. The micro-architecture of the porous biomaterials and, thus, their physical properties could then be precisely controlled. Due to their many favorable properties, porous biomaterials manufactured using AM are considered as promising candidates for bone substitution as well as for several other applications in orthopedic surgery. The mechanical properties of such porous structures including static and fatigue properties are shown to be strongly dependent on the type of the repeating unit cell based on which the porous biomaterial is built. In this paper, we study the mechanical properties of porous biomaterials made from a relatively new unit cell, namely truncated cube. We present analytical solutions that relate the dimensions of the repeating unit cell to the elastic modulus, Poisson's ratio, yield stress, and buckling load of those porous structures. We also performed finite element modeling to predict the mechanical properties of the porous structures. The analytical solution and computational results were found to be in agreement with each other. The mechanical properties estimated using both the analytical and computational techniques were somewhat higher than the experimental data reported in one of our recent studies on selective laser melted Ti-6Al-4V porous biomaterials. In addition to porosity, the elastic modulus and Poisson's ratio of the porous structures were found to be strongly dependent on the ratio of the length of the inclined struts to that of the uninclined (i.e. vertical or horizontal) struts, α, in the truncated cube unit cell. The geometry of the truncated cube unit cell approaches the octahedral and cube unit cells when α respectively approaches zero and infinity. Consistent with those geometrical observations, the analytical solutions presented in this study approached those of the octahedral and cube unit cells when

  8. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Science.gov (United States)

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  9. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2017-12-01

    Full Text Available Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  10. The impact of changing computing technology on EPRI [Electric Power Research Institute] nuclear analysis codes

    International Nuclear Information System (INIS)

    Breen, R.J.

    1988-01-01

    The Nuclear Reload Management Program of the Nuclear Power Division (NPD) of the Electric Power Research Institute (EPRI) has the responsibility for initiating and managing applied research in selected nuclear engineering analysis functions for nuclear utilities. The computer systems that result from the research projects consist of large FORTRAN programs containing elaborate computational algorithms used to address such areas as core physics, fuel performance, thermal hydraulics, and transient analysis. This paper summarizes a study of computing technology trends sponsored by the NPD. The approach taken was to interview hardware and software vendors, industry observers, and utility personnel, focusing on expected changes that will occur in the computing industry over the next 3 to 5 yr. Particular emphasis was placed on how these changes will impact engineering/scientific computer code development, maintenance, and use. In addition to the interviews, a workshop was held with attendees from EPRI, Power Computing Company, industry, and utilities. The workshop provided a forum for discussing issues and providing input into EPRI's long-term computer code planning process.

  11. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  12. Educational Technology Research Journals: "International Journal of Computer-Supported Collaborative Learning," 2006-2014

    Science.gov (United States)

    Howland, Shiloh M. J.; Martin, M. Troy; Bodily, Robert; Faulconer, Christian; West, Richard E.

    2015-01-01

    The authors analyzed all research articles from the first issue of the "International Journal of Computer-Supported Collaborative Learning" in 2006 until the second issue of 2014. They determined the research methodologies, most frequently used author-supplied keywords as well as two- and three-word phrases, and most frequently published…

  13. Computer-based measurement and automatization application research in nuclear technology fields

    International Nuclear Information System (INIS)

    Jiang Hongfei; Zhang Xiangyang

    2003-01-01

    This paper introduces computer-based measurement and automatization application research in nuclear technology fields. The emphasis of the discussion is the role of software in the development of such systems, and a network-based measurement and control software model that has a promising outlook for application. Application examples from the research and development are also presented. (authors)

  14. Computerized Games and Simulations in Computer-Assisted Language Learning: A Meta-Analysis of Research

    Science.gov (United States)

    Peterson, Mark

    2010-01-01

    This article explores research on the use of computerized games and simulations in language education. The author examined the psycholinguistic and sociocultural constructs proposed as a basis for the use of games and simulations in computer-assisted language learning. Research in this area is expanding rapidly. However, to date, few studies have…

  15. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  16. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Peisert, Sean [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Davis, CA (United States); Potok, Thomas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jones, Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-03

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long term (10 to +20 year) cybersecurity fundamental basic research and development challenges, strategies and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the

  17. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    Science.gov (United States)

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a Cloud Computing based platform for sleep behavior and chronic disease collaborative research. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record patient's sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of collected data. Also, we describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nurse and other health professional researchers located in differing geographic locations with a cost effective, flexible, secure and privacy-preserved research environment.

  18. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer Aided Science (CASC) to promote the Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in the R and D of grid computing technology. (T. Tanaka)

  19. The Need for Comparative Education Research to Concentrate on the Cultural Revolution within the United States.

    Science.gov (United States)

    Petit, M. Loretta

    Comparative education research and courses are needed to identify real revolutionary movements in the current cultural revolution in the United States. The presence of cultural revolution is indicated by, among other things, the development of microcultures. Intranational instead of cross-national studies are of importance in the next few years to…

  20. Research Directions: Multimodal Books in Science-Literacy Units: Language and Visual Images for Meaning Making

    Science.gov (United States)

    Pappas, Christine C.; Varelas, Maria

    2009-01-01

    This article presents a review of the authors' long-term research in urban classrooms. The authors explore six illustrated information books created by children as culminating activities of integrated science-literacy units, Forest and Matter, that they developed, implemented, and studied in several 1st-3rd grade classrooms in Chicago Public…

  1. ISCB Ebola Award for Important Future Research on the Computational Biology of Ebola Virus.

    Directory of Open Access Journals (Sweden)

    Peter D. Karp

    2015-01-01

    Full Text Available Speed is of the essence in combating Ebola; thus, computational approaches should form a significant component of Ebola research. As for the development of any modern drug, computational biology is uniquely positioned to contribute through comparative analysis of the genome sequences of Ebola strains as well as 3-D protein modeling. Other computational approaches to Ebola may include large-scale docking studies of Ebola proteins with human proteins and with small-molecule libraries, computational modeling of the spread of the virus, computational mining of the Ebola literature, and creation of a curated Ebola database. Taken together, such computational efforts could significantly accelerate traditional scientific approaches. In recognition of the need for important and immediate solutions from the field of computational biology against Ebola, the International Society for Computational Biology (ISCB) announces a prize for an important computational advance in fighting the Ebola virus. ISCB will confer the ISCB Fight against Ebola Award, along with a prize of US$2,000, at its July 2016 annual meeting (ISCB Intelligent Systems for Molecular Biology [ISMB] 2016, Orlando, Florida).

  2. Porting of the transfer-matrix method for multilayer thin-film computations on graphics processing units

    Science.gov (United States)

    Limmer, Steffen; Fey, Dietmar

    2013-07-01

    Thin-film computations are often a time-consuming task during optical design. An efficient way to accelerate these computations with the help of graphics processing units (GPUs) is described. It turned out that significant speed-ups can be achieved. We investigate the circumstances under which the best speed-up values can be expected. Therefore we compare different GPUs among themselves and with a modern CPU. Furthermore, the effect of thickness modulation on the speed-up and the runtime behavior depending on the input data is examined.
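
    As background for the method being accelerated, the sketch below shows a minimal CPU-only implementation of the characteristic (transfer) matrix calculation for the normal-incidence reflectance of a layer stack. It follows the standard textbook formulation rather than the authors' GPU code, and the example stack (a quarter-wave MgF2 film on glass) is chosen only for illustration.

```python
import cmath

def reflectance(wavelength_nm, layers, n_incident=1.0, n_substrate=1.52):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic (transfer) matrix method.

    layers: list of (refractive_index, thickness_nm) tuples; values here are
    hypothetical examples, not data from the paper.
    """
    # Start from the identity matrix and multiply the layer matrices in order,
    # beginning with the layer next to the incident medium.
    m11, m12, m21, m22 = 1.0, 0.0, 0.0, 1.0
    for n, d in layers:
        delta = 2.0 * cmath.pi * n * d / wavelength_nm       # phase thickness
        c, s = cmath.cos(delta), cmath.sin(delta)
        a11, a12, a21, a22 = c, 1j * s / n, 1j * n * s, c
        m11, m12, m21, m22 = (m11 * a11 + m12 * a21, m11 * a12 + m12 * a22,
                              m21 * a11 + m22 * a21, m21 * a12 + m22 * a22)
    # Combine with the substrate admittance to obtain the amplitude reflectance.
    b = m11 + m12 * n_substrate
    c_ = m21 + m22 * n_substrate
    r = (n_incident * b - c_) / (n_incident * b + c_)
    return abs(r) ** 2

# Hypothetical quarter-wave MgF2 layer (n = 1.38) on glass at 550 nm.
print(reflectance(550.0, [(1.38, 550.0 / (4 * 1.38))]))
```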

  3. The African Research Cloud - A Friendly Entry Point to Research Computing

    OpenAIRE

    Walt, Anelda Van der; Pretorius, Boeta

    2016-01-01

    The slides were presented at the first African Research Cloud Workshop in Pretoria, South Africa on 27 - 28 October 2016 [1].The slides formed part of an introduction for a session about using cloud infrastructure, and specifically the African Research Cloud, for training purposes.The session was co-facilitated with Dr Bradley Frank and Dr Michelle Cluver.[1] "African Research Cloud Workshop." The Institute for Data Intensive Astronomy (IDIA). IDIA, n.d. Web. 3 Nov. 2016.  

  4. Improving brain computer interface research through user involvement - The transformative potential of integrating civil society organisations in research projects

    Science.gov (United States)

    Wakunuma, Kutoma; Rainey, Stephen; Hansen, Christian

    2017-01-01

    Research on Brain Computer Interfaces (BCI) often aims to provide solutions for vulnerable populations, such as individuals with diseases, conditions or disabilities that keep them from using traditional interfaces. Such research thereby contributes to the public good. This contribution to the public good corresponds to a broader drive of research and funding policy that focuses on promoting beneficial societal impact. One way of achieving this is to engage with the public. In practical terms this can be done by integrating civil society organisations (CSOs) in research. The open question at the heart of this paper is whether and how such CSO integration can transform the research and contribute to the public good. To answer this question the paper describes five detailed qualitative case studies of research projects including CSOs. The paper finds that transformative impact of CSO integration is possible but by no means assured. It provides recommendations on how transformative impact can be promoted. PMID:28207882

  5. Improving brain computer interface research through user involvement - The transformative potential of integrating civil society organisations in research projects.

    Science.gov (United States)

    Stahl, Bernd Carsten; Wakunuma, Kutoma; Rainey, Stephen; Hansen, Christian

    2017-01-01

    Research on Brain Computer Interfaces (BCI) often aims to provide solutions for vulnerable populations, such as individuals with diseases, conditions or disabilities that keep them from using traditional interfaces. Such research thereby contributes to the public good. This contribution to the public good corresponds to a broader drive of research and funding policy that focuses on promoting beneficial societal impact. One way of achieving this is to engage with the public. In practical terms this can be done by integrating civil society organisations (CSOs) in research. The open question at the heart of this paper is whether and how such CSO integration can transform the research and contribute to the public good. To answer this question the paper describes five detailed qualitative case studies of research projects including CSOs. The paper finds that transformative impact of CSO integration is possible but by no means assured. It provides recommendations on how transformative impact can be promoted.

  6. United States Air Force Research Initiation Program. 1984 Research Reports. Volume 3.

    Science.gov (United States)

    1986-05-01

    right terminal of lamp 13. Position the second switch below switch A such that the handle may be pulled toward you or pushed away from you. The second...position. 42. Pull the handle of switch B toward you to light lamp B. 47. Push the handle of switch B all the way forward to light both lamps A and B... goal was to obtain information that could lead to the stabilization of a Nd:YAG laser. III. APPROACH At the beginning of this research, some of the

  7. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing

    Science.gov (United States)

    2017-08-01

    used for its GPU computing capability during the experiment. It has Nvidia Tesla K40 GPU accelerators containing 32 GPU nodes consisting of 1024...cores. CUDA is a parallel computing platform and application programming interface (API) model that was created and designed by Nvidia to give direct...

  8. Summaries of research and development activities by using JAEA computer system in FY2007. April 1, 2007 - March 31, 2008

    International Nuclear Information System (INIS)

    2008-11-01

    Center for Computational Science and e-Systems (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2007 (April 1, 2007 - March 31, 2008). (author)

  9. Summaries of research and development activities by using JAEA computer system in FY2009. April 1, 2009 - March 31, 2010

    International Nuclear Information System (INIS)

    2010-11-01

    Center for Computational Science and e-Systems (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2009 (April 1, 2009 - March 31, 2010). (author)

  10. Young Researchers Advancing Computational Science: Perspectives of the Young Scientists Conference 2015

    CERN Document Server

    Boukhanovsky, Alexander V; Krzhizhanovskaya, Valeria V; Athanassoulis, Gerassimos A; Klimentov, Alexei A; Sloot, Peter M A

    2015-01-01

    We present an annual international Young Scientists Conference (YSC) on computational science http://ysc.escience.ifmo.ru/, which brings together renowned experts and young researchers working in high-performance computing, data-driven modeling, and simulation of large-scale complex systems. The first YSC event was organized in 2012 by the University of Amsterdam, the Netherlands and ITMO University, Russia with the goal of opening a dialogue on the present and the future of computational science and its applications. We believe that the YSC conferences will strengthen the ties between young scientists in different countries, thus promoting future collaboration. In this paper we briefly introduce the challenges the millennial generation is facing; describe the YSC conference history and topics; and list the keynote speakers and program committee members. This volume of Procedia Computer Science presents selected papers from the 4th International Young Scientists Conference on Computational Science held on 25 ...

  11. UNI C - A True Internet Pioneer, the Danish Computing Centre for Research and Education

    DEFF Research Database (Denmark)

    Olesen, Dorte

    2015-01-01

    that small computers could now be purchased for local use by the university departments whereas the need for high performance computing could only be satisfied by a joint national purchase and advanced network access to this central computer facility. The new center was named UNI-C and succeeded in helping...... Danish frontline research to use innovative computing techniques and have major breakthroughs using the first massively parallel computer architectures, but the greatest impact of UNI-C on Danish society was the successful early roll-out of the Internet to universities with a follow-up of establishing...... the first Danish Internet service to ordinary PC users. This very first Internet service became a great success and helped to put Denmark on the international map as one of the very early Internet adopters. It also meant that UNI-C was tasked by the Ministry of Education with delivering a number

  12. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in computers, computing in Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated further thinking on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  13. Use of micro X-ray computed tomography for development and research into waterjets

    OpenAIRE

    Souček, K. (Kamil); Sitek, L. (Libor); Gurková, L. (Lucie); Georgiovská, L. (Lucie)

    2015-01-01

    Non-destructive methods for analysis of various types of materials have been increasingly applied recently. One of these methods is the industrial micro X-ray computed tomography (CT). This paper presents an overview of experience in using the industrial micro X-ray computed tomography during research activities at the Institute of Geonics of the CAS. It discusses possibilities of the nondestructive visualization of the inner structures of a wide range of materials and objects, includin...

  14. State-of-the-art computer technologies used to train nuclear specialists and to conduct research

    International Nuclear Information System (INIS)

    Korovin, Yu.A.; Tikhonenko, A.V.

    2011-01-01

    The paper discusses innovative methods used in the process of training nuclear specialists and conducting research which are based on state-of-the-art computer technologies. The approach proposed makes wide use of mathematical modeling and state-of-the-art programming techniques. It is based on the development, improvement and application of problem-oriented computer codes to support the teaching process and to solve fundamental and applied problems of nuclear physics and nuclear engineering.

  15. Challenges and strategies for quantitative and qualitative field research in the United Arab Emirates.

    Science.gov (United States)

    Aw, Tar-Ching; Zoubeidi, Taoufik; Al-Maskari, Fatma; Blair, Iain

    2011-01-01

    Clinical and public health research depends on factors including national systems, socio-cultural influences, and access to organisations and individuals. As a 'new' country, the United Arab Emirates (UAE) has yet to develop strong support for population research. However, there is interest in research. The challenges for quantitative and qualitative research include the varied composition and mobility of the UAE population, with limited health records and disease registries. Long-term follow-up of patients, and tracing foreign workers who may only be in the UAE for a few years, are two major obstacles for longitudinal studies. There can also be a reluctance shown by parts of the population to participate in studies, especially those that require responding to what is perceived as sensitive questions. Successful execution of population research in the UAE requires an understanding of socio-cultural aspects of the study population, and good communication between researchers and participants.

  16. Research on unit commitment with large-scale wind power connected power system

    Science.gov (United States)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch because of the stochastic volatility of wind. Unit commitment including wind farms is analyzed here in terms of both modeling and solution methods. The structures and characteristics of existing formulations are summarized after classifying them according to their objective functions and constraints. Finally, the issues still to be solved and possible directions of future research and development are discussed, so that the work can adapt to the requirements of the electricity market, energy-saving generation dispatch and the smart grid, and provide a reference for researchers and practitioners in this field.
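
    To make the modeling problem concrete, the sketch below enumerates commitment decisions for a toy three-unit system against hourly demand net of a given wind infeed. The unit data, costs and wind profile are invented for illustration, and exhaustive enumeration stands in for the mixed-integer or heuristic solution methods surveyed in the literature.

```python
from itertools import product

# Hypothetical thermal units: (min_MW, max_MW, fuel_cost_per_MWh, no_load_cost)
UNITS = [(50, 200, 20.0, 400.0), (30, 120, 35.0, 200.0), (10, 60, 60.0, 80.0)]
DEMAND = [250.0, 310.0, 180.0]   # hourly demand in MW (illustrative)
WIND = [40.0, 10.0, 70.0]        # hourly wind infeed in MW, treated as given

def dispatch_cost(on_units, net_load):
    """Cheapest feasible dispatch of the committed units, or None if infeasible."""
    if not on_units:
        return None
    lo = sum(u[0] for u in on_units)
    hi = sum(u[1] for u in on_units)
    if not (lo <= net_load <= hi):
        return None
    cost = sum(u[3] for u in on_units)              # no-load costs
    cost += sum(u[0] * u[2] for u in on_units)      # every committed unit at its minimum
    remaining = net_load - lo
    for u in sorted(on_units, key=lambda u: u[2]):  # merit order for the rest
        take = min(remaining, u[1] - u[0])
        cost += take * u[2]
        remaining -= take
    return cost

total = 0.0
for demand, wind in zip(DEMAND, WIND):
    net_load = max(demand - wind, 0.0)
    costs = []
    for pattern in product([0, 1], repeat=len(UNITS)):
        committed = [u for u, flag in zip(UNITS, pattern) if flag]
        c = dispatch_cost(committed, net_load)
        if c is not None:
            costs.append(c)
    best = min(costs)
    total += best
    print(f"net load {net_load:6.1f} MW -> cheapest hourly cost {best:8.1f}")
print(f"total cost over the horizon: {total:.1f}")
```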

  17. [Ethics of research in psychiatry. Comparison of France and the United States].

    Science.gov (United States)

    Lemoine, P; Pacault-Legendre, V

    1983-01-01

    This article presents a comparison of research ethics in psychopharmacology in France and the United States. The authors present some elements of definition, etymology and history. In addition, they study how this very specific kind of research is actually done. Many questions are discussed, including the rights of the patient and the problem of normal volunteers. Other aspects are more technical, i.e. remuneration, protocol, and the product under study. Finally, the role of the ethics committees is investigated; these committees form the third component of the trial, alongside the research and the subject.

  18. Incorporation of personal computers in a research reactor instrumentation system for data monitoring and analysis

    International Nuclear Information System (INIS)

    Leopando, L.S.

    1998-01-01

    The research contract was implemented by obtaining off-the-shelf personal computer hardware and data acquisition cards, designing the interconnection with the instrumentation system, writing and debugging the software, and assembling and testing the set-up. The hardware was designed to allow all variables monitored by the instrumentation system to be accessible to the computers, without requiring any major modification of the instrumentation system and without compromising reactor safety in any way. The computer hardware addition was also designed to have no effect on any existing function of the instrumentation system. The software was designed to implement only graphical display and automated logging of reactor variables. Additional functionality could easily be added in the future through software revision because all the reactor variables are already available in the computer. It would even be possible to 'close the loop' and control the reactor through software. It was found that most of the effort in an undertaking of this sort goes into software development, but the job can be done even by reactor staff without computer specialization, working with programming languages they are already familiar with. It was also found that the continuing rapid advance of personal computer technology makes it essential that such a project be undertaken with the inevitability of future hardware upgrading in mind. The hardware techniques and the software developed may find applicability in other research reactors, especially those with a generic analog research reactor TRIGA console. (author)
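
    A rough idea of the display-and-logging role played by the added personal computers is given by the sketch below. The channel names, sampling interval and CSV format are hypothetical stand-ins; the actual instrumentation interface and software described in the report are not reproduced here.

```python
import csv
import random
import time
from datetime import datetime

CHANNELS = ["power_percent", "fuel_temp_C", "coolant_temp_C"]  # hypothetical variables

def read_channels():
    """Stand-in for a data acquisition card read; returns one sample per channel."""
    return [round(random.uniform(0, 100), 2) for _ in CHANNELS]

def log_reactor_variables(path="reactor_log.csv", samples=10, interval_s=1.0):
    """Write time-stamped samples of the monitored variables to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp"] + CHANNELS)
        for _ in range(samples):
            writer.writerow([datetime.now().isoformat(timespec="seconds")] + read_channels())
            f.flush()                      # keep the log current even if the PC stops
            time.sleep(interval_s)

if __name__ == "__main__":
    log_reactor_variables(samples=5, interval_s=0.5)
```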

  19. From computer-assisted intervention research to clinical impact: The need for a holistic approach.

    Science.gov (United States)

    Ourselin, Sébastien; Emberton, Mark; Vercauteren, Tom

    2016-10-01

    The early days of the field of medical image computing (MIC) and computer-assisted intervention (CAI), when publishing a strong self-contained methodological algorithm was enough to produce impact, are over. As a community, we now have substantial responsibility to translate our scientific progress into improved patient care. In the field of computer-assisted interventions, the emphasis is also shifting from the mere use of well-known established imaging modalities and position trackers to the design and combination of innovative sensing, elaborate computational models and fine-grained clinical workflow analysis to create devices with unprecedented capabilities. The barriers to translating such devices in the complex and understandably heavily regulated surgical and interventional environment can seem daunting. Whether we leave the translation task mostly to our industrial partners or welcome, as researchers, an important share of it is up to us. We argue that embracing the complexity of surgical and interventional sciences is mandatory to the evolution of the field. Being able to do so requires large-scale infrastructure and a critical mass of expertise that very few research centres have. In this paper, we emphasise the need for a holistic approach to computer-assisted interventions where clinical, scientific, engineering and regulatory expertise are combined as a means of moving towards clinical impact. To ensure that the breadth of infrastructure and expertise required for translational computer-assisted intervention research does not lead to a situation where the field advances only thanks to a handful of exceptionally large research centres, we also advocate that solutions need to be designed to lower the barriers to entry. Inspired by fields such as particle physics and astronomy, we claim that centralised very large innovation centres with state of the art technology and health technology assessment capabilities backed by core support staff and open

  20. Application of Computer Technology to Educational Administration in the United States.

    Science.gov (United States)

    Bozeman, William C.; And Others

    1991-01-01

    Description of evolution of computer applications in U.S. educational administration is followed by an overview of the structure and governance of public education and Visscher's developmental framework. Typical administrative computer applications in education are discussed, including student records, personnel management, budgeting, library…

  1. Product- and Process Units in the CRITT Translation Process Research Database

    DEFF Research Database (Denmark)

    Carl, Michael

    The first version of the "Translation Process Research Database" (TPR DB v1.0) was released in August 2012, containing logging data of more than 400 translation and text production sessions. The current version of the TPR DB (v1.4) contains data from more than 940 sessions, which represents more than 300 hours of text production. The database provides the raw logging data, as well as tables of pre-processed product and processing units. The TPR-DB includes various types of simple and composed product and process units that are intended to support the analysis and modelling of human text......

  2. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  3. Developing nursing research in the United Arab Emirates: a narrative review.

    Science.gov (United States)

    McCreaddie, M; Kuzemski, D; Griffiths, J; Sojka, E M; Fielding, M; Al Yateem, N; Williams, J J

    2018-03-01

    This article identified, critically analysed and synthesized the literature on international nursing and midwifery research capacity building and standards. The United Arab Emirates is heavily dependent upon expatriate nurses. Only 4% of nurses working within the country are Emirati. The nation is therefore committed to developing nurses and nursing as a profession. The United Arab Emirates' Nursing and Midwifery Council was formed in 2009 and initially focused on regulation, education and specialization. This review was undertaken to inform the work of the Council's newly established Scientific Research Sub-Committee. A rapid narrative review was conducted using the Cumulative Index of Nursing and Allied Health Literature database, key words, Boolean operators, parameters and a journal-specific search. Inclusion/exclusion criteria were identified. The search provided 332 articles, with 45 included in the final review. The literature on nursing research 'standards' and 'capacity building' is diverse and inconsistent across continents and in approaches. Nursing research has evolved to varying degrees across the globe. Nevertheless, irrespective of the locale, there are similar problems encountered in growing research, for example nursing faculty shortages, lack of collaborative research and funding. There are also specific challenges in the Middle East and North Africa region. The review was constrained by time and access. There are specific challenges for the United Arab Emirates. However, the country is well placed to learn from the experiences of colleagues elsewhere. Time and commitment are required to build the solid foundations necessary to ensure robust, sustained growth. Identifying research capacity as both a process and outcome at the outset may also assist. Further, it may be prudent to consider initiating a Gulf Coast Countries' collaborative approach to building research capacity to harness scarce resources and create a larger critical mass. © 2017

  4. Computer Graphic Design Using Auto-CAD and Plug Nozzle Research

    Science.gov (United States)

    Rogers, Rayna C.

    2004-01-01

    The purpose of creating computer-generated images varies widely. They can be used for computational fluid dynamics (CFD) or as blueprints for designing parts. The schematic that I will be working on this summer will be used to create nozzles that are part of a larger system. At this phase in the project, the nozzles needed for the systems have been fabricated. One part of my mission is to create both three-dimensional and two-dimensional models of the nozzles in Auto-CAD 2002. The research on plug nozzles will allow me to have a better understanding of how they provide the thrust needed for a missile to take off. NASA and the United States military are working together to develop a new design concept. On most missiles a convergent-divergent nozzle is used to create thrust. However, the two are looking into different concepts for the nozzle. The standard convergent-divergent nozzle forces a mixture of combustible fluids and air through an area that is smaller than the one where the combination was mixed. Once it passes through this smaller area, known as A8, the flow exits through the end of the nozzle, which is larger than the first; this exit area is known as A9. This creates enough thrust for the vehicle, whether it is an F-18 fighter jet or a missile. The A9 section of the convergent-divergent nozzle has a mechanism that controls how large A9 can be. This is needed because the pressure of the air coming out of the nozzle must be equal to the ambient pressure; otherwise there will be a loss of performance in the machine. The plug nozzle, however, does not need a variable A9. When the air flow comes out, it automatically adjusts to the ambient pressure. The objective of this design is to create a plug nozzle that is not as mechanically complicated as its counterpart, the convergent-divergent nozzle.
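
    The pressure-matching argument above can be made quantitative with the standard isentropic flow relations. The sketch below uses textbook formulas (not NASA's design tools) to recover the exit Mach number and the exit-to-chamber pressure ratio from an assumed A9/A8 expansion ratio; the area ratio and gas properties are illustrative values only.

```python
import math

GAMMA = 1.4  # ratio of specific heats for air (assumption)

def area_ratio(mach, gamma=GAMMA):
    """A/A* for isentropic flow at a given Mach number (A* = throat area A8)."""
    term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
    return term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / mach

def supersonic_mach_from_area_ratio(a_ratio, gamma=GAMMA):
    """Invert A/A* on the supersonic branch by simple bisection."""
    lo, hi = 1.0 + 1e-9, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid, gamma) < a_ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def exit_pressure_ratio(mach, gamma=GAMMA):
    """Static exit pressure over chamber (total) pressure, p9/p0."""
    return (1.0 + 0.5 * (gamma - 1.0) * mach ** 2) ** (-gamma / (gamma - 1.0))

a9_over_a8 = 4.0   # hypothetical expansion ratio
m_exit = supersonic_mach_from_area_ratio(a9_over_a8)
print(f"exit Mach ~ {m_exit:.2f}, p_exit/p_chamber ~ {exit_pressure_ratio(m_exit):.4f}")
```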

  5. Measurement of parameters for the quality control of X-ray units by using PIN diodes and a personal computer

    International Nuclear Information System (INIS)

    Ramirez, F.; Gaytan, E.; Mercado, I.; Estrada, M.; Cerdeira, A.

    2000-01-01

    The design of a new system for the measurement of the main parameters of X-ray units used in medicine is presented. The system automatically measures the exposure time, the applied high voltage, the waveform of the detected signal, the exposure ratio and the total exposure (dose). The X-ray detectors employed are PIN diodes developed at CINVESTAV; the measurements are made in a single shot, without invading the X-ray unit. The results are shown on the computer screen and can be saved in a file for later analysis. The proposed system is intended for use in the quality control of X-ray units for clinical radiodiagnosis. It is simple and inexpensive compared with available commercial equipment, which uses ionization chambers and accurate electrometers that small facilities and hospitals cannot afford.
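
    To illustrate the kind of post-processing such a PIN-diode-and-PC system performs, the sketch below derives an exposure time and a relative dose figure from a sampled detector waveform. The threshold rule, sampling rate and signal shape are assumptions made for the example and are not taken from the system described in the paper.

```python
def exposure_metrics(samples, dt_s, threshold_fraction=0.5):
    """Estimate exposure time and a relative dose figure from a sampled
    PIN-diode signal (illustrative post-processing only, not the paper's method).

    samples: list of detector readings; dt_s: sampling interval in seconds.
    """
    peak = max(samples)
    threshold = threshold_fraction * peak
    above = [s > threshold for s in samples]
    exposure_time = sum(above) * dt_s          # time the beam was effectively on
    relative_dose = sum(samples) * dt_s        # area under the detected signal
    mean_rate = relative_dose / exposure_time if exposure_time else 0.0
    return exposure_time, relative_dose, mean_rate

# Hypothetical 100 ms flat-topped pulse sampled at 1 kHz.
signal = [0.0] * 20 + [1.0] * 100 + [0.0] * 20
print(exposure_metrics(signal, dt_s=1e-3))
```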

  6. FINAL INTERIM REPORT VERIFICATION SURVEY ACTIVITIES IN FINAL STATUS SURVEY UNITS 7, 8, 9, 10, 11, 13 and 14 AT THE SEPARATIONS PROCESS RESEARCH UNIT, NISKAYUNA, NEW YORK

    International Nuclear Information System (INIS)

    Jadick, M.G.

    2010-01-01

    The Separations Process Research Unit (SPRU) facilities were constructed in the late 1940s to research the chemical separation of plutonium and uranium. SPRU operated between February 1950 and October 1953. The research activities ceased following the successful development of the reduction/oxidation and plutonium/uranium extraction processes that were subsequently used by the Hanford and the Savannah River sites.

  7. A continuing success - The United States Foreign Research Reactor Spent Nuclear Fuel Acceptance Program

    International Nuclear Information System (INIS)

    Mustin, Tracy P.; Clapper, Maureen; Reilly, Jill E.

    2000-01-01

    The United States Department of Energy, in consultation with the Department of State, adopted the Nuclear Weapons Nonproliferation Policy Concerning Foreign Research Reactor Spent Nuclear Fuel in May 1996. To date, the Foreign Research Reactor (FRR) Spent Nuclear Fuel (SNF) Acceptance Program, established under this policy, has completed 16 spent fuel shipments. 2,651 material test reactor (MTR) assemblies, one Slowpoke core containing less than 1 kilogram of U.S.-origin enriched uranium, 824 Training, Research, Isotope, General Atomic (TRIGA) rods, and 267 TRIGA pins from research reactors around the world have been shipped to the United States so far under this program. As the FRR SNF Acceptance Program progresses into the fifth year of implementation, a second U.S. cross country shipment has been completed, as well as a second overland truck shipment from Canada. Both the cross country shipment and the Canadian shipment were safely and successfully completed, increasing our knowledge and experience in these types of shipments. In addition, two other shipments were completed since last year's RERTR meeting. Other program activities since the last meeting included: taking pre-emptive steps to avoid license amendment pitfalls/showstoppers for spent fuel casks, publication of a revision to the Record of Decision allowing up to 16 casks per ocean going vessel, and the issuance of a cable to 16 of the 41 eligible countries reminding their governments and the reactor operators that the U.S.-origin uranium in their research reactors may be eligible for return to the United States under the Acceptance Program and urging them to begin discussions on shipping schedules. The FRR SNF program has also supported the Department's implementation of the competitive pricing policy for uranium and resumption of shipments of fresh uranium for fabrication into assemblies for research reactors. The United States Foreign Research Reactor Spent Nuclear Fuel Acceptance Program continues

  8. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year's motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  9. U.S. Geological Survey Cooperative Fish and Wildlife Research Units Program—2016–2017 Research Abstracts

    Science.gov (United States)

    Dennerline, Donald E.; Childs, Dawn E.

    2017-04-20

    The U.S. Geological Survey (USGS) has several strategic goals that focus its efforts on serving the American people. The USGS Ecosystems Mission Area has responsibility for the following objectives under the strategic goal of “Science to Manage and Sustain Resources for Thriving Economies and Healthy Ecosystems”:
    - Understand, model, and predict change in natural systems
    - Conserve and protect wildlife and fish species and their habitats
    - Reduce or eliminate the threat of invasive species and wildlife disease
    This report provides abstracts of the majority of ongoing research investigations of the USGS Cooperative Fish and Wildlife Research Units program and is intended to complement the 2016 Cooperative Research Units Program Year in Review Circular 1424 (https://doi.org/10.3133/cir1424). The report is organized by the following major science themes that contribute to the objectives of the USGS:
    - Advanced Technologies
    - Climate Science
    - Decision Science
    - Ecological Flows
    - Ecosystem Services
    - Endangered Species Conservation, Recovery, and Proactive Strategies
    - Energy
    - Human Dimensions
    - Invasive Species
    - Landscape Ecology
    - Species of Greatest Conservation Need
    - Species Population, Habitat, and Harvest Management
    - Wildlife Health and Disease

  10. Ethnographic research into nursing in acute adult mental health units: a review.

    Science.gov (United States)

    Cleary, Michelle; Hunt, Glenn E; Horsfall, Jan; Deacon, Maureen

    2011-01-01

    Acute inpatient mental health units are busy and sometimes chaotic settings, with high bed occupancy rates. These settings include acutely unwell patients, busy staff, and a milieu characterised by unpredictable interactions and events. This paper is a report of a literature review conducted to identify, analyse, and synthesize ethnographic research in adult acute inpatient mental health units. Several electronic databases were searched using relevant keywords to identify studies published from 1990-present. Additional searches were conducted using reference lists. Ethnographic studies published in English were included if they investigated acute inpatient care in adult settings. Papers were excluded if the unit under study was not exclusively for patients in the acute phase of their mental illness, or where the original study was not fully ethnographic. Ten research studies meeting our criteria were found (21 papers). Findings were grouped into the following overarching categories: (1) Micro-skills; (2) Collectivity; (3) Pragmatism; and (4) Reframing of nursing activities. The results of this ethnographic review reveal the complexity, patient-orientation, and productivity of some nursing interventions that may not have been observed or understood without the use of this research method. Additional quality research should focus on redefining clinical priorities and philosophies to ensure everyday care is aligned constructively with the expectations of stakeholders and is consistent with policy and the realities of the organisational setting. We have more to learn from each other with regard to the effective nursing care of inpatients who are acutely disturbed.

  11. United States private-sector physicians and pharmaceutical contract research: a qualitative study.

    Science.gov (United States)

    Fisher, Jill A; Kalbaugh, Corey A

    2012-01-01

    There have been dramatic increases over the past 20 years in the number of nonacademic, private-sector physicians who serve as principal investigators on US clinical trials sponsored by the pharmaceutical industry. However, there has been little research on the implications of these investigators' role in clinical investigation. Our objective was to study private-sector clinics involved in US pharmaceutical clinical trials to understand the contract research arrangements supporting drug development, and specifically how private-sector physicians engaged in contract research describe their professional identities. We conducted a qualitative study in 2003-2004 combining observation at 25 private-sector research organizations in the southwestern United States and 63 semi-structured interviews with physicians, research staff, and research participants at those clinics. We used grounded theory to analyze and interpret our data. The 11 private-sector physicians who participated in our study reported becoming principal investigators on industry clinical trials primarily because contract research provides an additional revenue stream. The physicians reported that they saw themselves as trial practitioners and as businesspeople rather than as scientists or researchers. Our findings suggest that in addition to having financial motivation to participate in contract research, these US private-sector physicians have a professional identity aligned with an industry-based approach to research ethics. The generalizability of these findings and whether they have changed in the intervening years should be addressed in future studies. Please see later in the article for the Editors' Summary.

  12. United States private-sector physicians and pharmaceutical contract research: a qualitative study.

    Directory of Open Access Journals (Sweden)

    Jill A Fisher

    Full Text Available There have been dramatic increases over the past 20 years in the number of nonacademic, private-sector physicians who serve as principal investigators on US clinical trials sponsored by the pharmaceutical industry. However, there has been little research on the implications of these investigators' role in clinical investigation. Our objective was to study private-sector clinics involved in US pharmaceutical clinical trials to understand the contract research arrangements supporting drug development, and specifically how private-sector physicians engaged in contract research describe their professional identities. We conducted a qualitative study in 2003-2004 combining observation at 25 private-sector research organizations in the southwestern United States and 63 semi-structured interviews with physicians, research staff, and research participants at those clinics. We used grounded theory to analyze and interpret our data. The 11 private-sector physicians who participated in our study reported becoming principal investigators on industry clinical trials primarily because contract research provides an additional revenue stream. The physicians reported that they saw themselves as trial practitioners and as businesspeople rather than as scientists or researchers. Our findings suggest that in addition to having financial motivation to participate in contract research, these US private-sector physicians have a professional identity aligned with an industry-based approach to research ethics. The generalizability of these findings and whether they have changed in the intervening years should be addressed in future studies. Please see later in the article for the Editors' Summary.

  13. FFUSION yearbook 1996. Annual report of the Finnish research unit. Association EURATOM-TEKES

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, S; Paettikangas, T [eds.; VTT Energy, Espoo (Finland)

    1997-05-01

    The Finnish fusion programme (FFUSION) is one of the eleven national energy research programmes funded by the Technological Development Centre of Finland (TEKES). The FFUSION programme was fully integrated into the European Fusion Programme just after Finland joined the European Union. The Association Euratom-Tekes contract was signed in 1995 and extends to the end of 1999. Finland became a member of the JET Joint Undertaking in 1996; other contracts with Euratom include the NET agreement and the Staff Mobility Agreement. The FFUSION programme, with the participating research institutes and universities, forms the Fusion Research Unit of the Association Euratom-Tekes. This annual report summarises the research activities of the Finnish Research Unit in 1996. The programme consists of two parts: Physics and Technology. The research areas of the physics part are: fusion plasma engineering; radio-frequency heating and plasma diagnostics; and plasma-wall interactions (ion-beam studies). The technology part is focused on three areas: fusion reactor materials (first wall components and joining techniques); remote handling and viewing systems; and superconductors.

  14. The role of Clinical Trial Units in investigator- and industry-initiated research projects.

    Science.gov (United States)

    von Niederhäusern, Belinda; Fabbro, Thomas; Pauli-Magnus, Christiane

    2015-01-01

    Six multidisciplinary competence centres (Clinical Trial Units, CTUs) in Basel, Berne, Geneva, Lausanne, St. Gallen and Zurich provide professional support to clinical researchers in the planning, implementation, conduct and evaluation of clinical studies. Through their coordinated network, these units promote high-quality, nationally harmonised and internationally standardised clinical research conduct in Switzerland. We describe why this network has been established, how it has succeeded in meeting the growing need for clinical research support, which training and education opportunities it offers, and how it created national awareness of the still-existing hurdles to clinical research excellence in Switzerland. Taking the CTU Basel as an example, we show that a considerable number (25%) of the studies submitted for regulatory approval in 2013 were supported by the CTU, decreasing the number of findings in ethics reviews by about one-third. We conclude that these achievements, together with a Swiss national funding model for clinical research and improved national coordination, will be critical factors in successfully positioning Swiss clinical research at the international forefront.

  15. Moving into the 21st century - The United States' Research Reactor Spent Nuclear Fuel Acceptance Program

    International Nuclear Information System (INIS)

    Huizenga, David G.; Mustin, Tracy P.; Saris, Elizabeth C.; Reilly, Jill E.

    1999-01-01

    Since 1996, when the United States Department of Energy and the Department of State jointly adopted the Nuclear Weapons Nonproliferation Policy Concerning Foreign Research Reactor Spent Nuclear Fuel, twelve shipments totaling 2,985 MTR and TRIGA spent nuclear fuel assemblies from research reactors around the world have been accepted into the United States. These shipments have contained approximately 1.7 metric tons of HEU and 0.6 metric tons of LEU. Foreign research reactor operators played a significant role in this success. A new milestone in the acceptance program occurred during the summer of 1999 with the arrival of TRIGA spent nuclear fuel from Europe through the Charleston Naval Weapons Station via the Savannah River Site to the Idaho National Engineering and Environmental Laboratory. This shipment consisted of five casks of TRIGA spent nuclear fuel from research reactors in Germany, Italy, Slovenia, and Romania. These casks were transported by truck approximately 2,400 miles across the United States (one cask packaged in an ISO container per truck). Drawing upon lessons learned in previous shipments, significant technical, legal, and political challenges were addressed to complete this cross-country shipment. Other program activities since the last RERTR meeting have included: formulation of a methodology to determine the quantity of spent nuclear fuel in a damaged condition that may be transported in a particular cask (containment analysis for transportation casks); publication of clarification of the fee policy; and continued planning for the outyears of the acceptance policy including review of reactors and eligible material quantities. The United States Foreign Research Reactor Spent Nuclear Fuel Acceptance Program continues to demonstrate success due to the continuing commitment between the United States and the research reactor community to make this program work. We strongly encourage all eligible research reactors to decide as soon as possible to

  16. Computer-aided modeling of aluminophosphate zeolites as packings of building units

    KAUST Repository

    Peskov, Maxim; Blatov, Vladislav A.; Ilyushin, Gregory D.; Schwingenschlögl, Udo

    2012-01-01

    New building schemes of aluminophosphate molecular sieves from packing units (PUs) are proposed. We have investigated 61 framework types discovered in zeolite-like aluminophosphates and have identified important PU combinations using a recently

  17. Critical Vulnerability: Defending the Decisive Point of United States Computer Networked Information Systems

    National Research Council Canada - National Science Library

    Virden, Roy

    2003-01-01

    .... The military's use of computer networked information systems is thus a critical strength. These systems are then critical vulnerabilities because they may lack adequate protection and are open to enemy attack...

  18. Research on Key Technologies of Unit-Based CNC Machine Tool Assembly Design

    Directory of Open Access Journals (Sweden)

    Zhongqi Sheng

    2014-01-01

    Full Text Available Assembly is the part of the product design and manufacturing process that produces the greatest workload and consumes the most time. The CNC machine tool is a key piece of basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, the semantic information of transmission, and the geometric constraint information were quantified into an assembly connection strength that depicts the level of assembling difficulty. Transmissibility based on trust relationships was applied to the assembly connection strength. Assembly unit partition based on assembly connection strength was carried out, and interfering assembly units were identified and revised. Assembly sequence planning and optimization of the parts within each assembly unit and between assembly units was conducted using a genetic algorithm. Taking a certain type of high-speed CNC turning center as an example, this paper explores assembly modeling, assembly unit partition, and assembly sequence planning and optimization, and obtains an optimized assembly sequence for the headstock of a CNC machine tool.
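    The genetic-algorithm step can be illustrated with a small sketch. The cost matrix below is a hypothetical stand-in for the paper's assembly connection strength: it scores the difficulty of assembling one part immediately after another, and the GA searches for the permutation of parts that minimizes the total difficulty. The operators (order crossover, swap mutation, tournament selection, elitism) are standard choices assumed for the example, not necessarily those used by the authors.

        import random

        def sequence_cost(seq, cost):
            """Total difficulty of an assembly sequence under a pairwise cost matrix."""
            return sum(cost[seq[i]][seq[i + 1]] for i in range(len(seq) - 1))

        def order_crossover(p1, p2, rng):
            """OX crossover: keep a slice of parent 1, fill the rest in parent-2 order."""
            n = len(p1)
            a, b = sorted(rng.sample(range(n), 2))
            child = [None] * n
            child[a:b + 1] = p1[a:b + 1]
            fill = [g for g in p2 if g not in child[a:b + 1]]
            j = 0
            for i in range(n):
                if child[i] is None:
                    child[i] = fill[j]
                    j += 1
            return child

        def ga_assembly_sequence(cost, pop_size=60, generations=300, p_mut=0.2, seed=0):
            rng = random.Random(seed)
            n = len(cost)
            pop = [rng.sample(range(n), n) for _ in range(pop_size)]
            best = min(pop, key=lambda s: sequence_cost(s, cost))
            for _ in range(generations):
                new_pop = [best[:]]                       # elitism: keep the best sequence
                while len(new_pop) < pop_size:
                    p1 = min(rng.sample(pop, 3), key=lambda s: sequence_cost(s, cost))
                    p2 = min(rng.sample(pop, 3), key=lambda s: sequence_cost(s, cost))
                    child = order_crossover(p1, p2, rng)
                    if rng.random() < p_mut:              # swap mutation
                        i, j = rng.sample(range(n), 2)
                        child[i], child[j] = child[j], child[i]
                    new_pop.append(child)
                pop = new_pop
                best = min(pop, key=lambda s: sequence_cost(s, cost))
            return best, sequence_cost(best, cost)

        if __name__ == "__main__":
            rng = random.Random(1)
            n_parts = 8                                   # toy problem size
            cost = [[0 if i == j else rng.uniform(1, 10) for j in range(n_parts)]
                    for i in range(n_parts)]
            seq, c = ga_assembly_sequence(cost)
            print("best sequence:", seq, "cost:", round(c, 2))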

  19. Investigational research on the design of computational materials; Keisanki zairyo sekkei no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    An investigational study of computer chemistry was carried out. The advance of theoretical chemistry is indispensable to the design of materials, and theories and high-speed computational methods that can simulate real systems more accurately are expected. Simulating the structures and physical properties of the structural molecules and their aggregates is fundamental, but the meso region, the intermediate region between the structural molecules and the aggregate, has come to be regarded as important. Coarse visualization models for polymer materials, and progress in the computational software and hardware of quantum chemistry and molecular dynamics, for example for catalysts, become necessary. Seamless zooming is proposed as a software concept that simulates materials from micro, meso and macro viewpoints. Moreover, to make the most of computer chemistry, an integrated system is necessary that handles computational software, databases, and so on in a unified way. For the development of the software, demonstrative verification by a combination of experiments and researchers is indispensable. Under a commission from NEDO, this investigational research was conducted as a leading study during fiscal 1996 and 1997 to chart the course of the research. 17 refs., 37 figs., 5 tabs.

  20. The United States foreign research reactor spent nuclear fuel acceptance program: Proposal to modify the program

    International Nuclear Information System (INIS)

    Messick, C.E.

    2005-01-01

    The United States Department of Energy (DOE), in consultation with the Department of State (DOS), adopted the Nuclear Weapons Nonproliferation Policy Concerning Foreign Research Reactor Spent Nuclear Fuel in May 1996. The policy was slated to expire in May 2009. However, in October 2003, a petition requesting a program extension was delivered to the United States Secretary of Energy from a group of research reactor operators from foreign countries. In April 2004, the Secretary directed DOE to undertake an analysis, as required by the National Environmental Policy Act (NEPA), to consider a potential extension of the Program. On December 1, 2004, a Federal Register Notice was issued approving the program extension. This paper discusses the findings from the NEPA analysis and the potential changes in the program that may result from implementation of the proposed changes. (author)

  1. The Applied Meteorology Unit: Nineteen Years Successfully Transitioning Research Into Operations for America's Space Program

    Science.gov (United States)

    Madura, John T.; Bauman, William H., III; Merceret, Francis J.; Roeder, William P.; Brody, Frank C.; Hagemeyer, Bartlett C.

    2011-01-01

    The Applied Meteorology Unit (AMU) provides technology development and transition services to improve operational weather support to America's space program. The AMU was founded in 1991 and operates under a tri-agency Memorandum of Understanding (MOU) between the National Aeronautics and Space Administration (NASA), the United States Air Force (USAF) and the National Weather Service (NWS) (Ernst and Merceret, 1995). It is co-located with the 45th Weather Squadron (45WS) at Cape Canaveral Air Force Station (CCAFS) and funded by the Space Shuttle Program. Its primary customers are the 45WS, the Spaceflight Meteorology Group (SMG) operated for NASA by the NWS at the Johnson Space Center (JSC) in Houston, TX, and the NWS forecast office in Melbourne, FL (MLB). The gap between research and operations is well known. All too frequently, the process of transitioning research to operations fails for various reasons. The mission of the AMU is in essence to bridge this gap for America's space program.

  2. Research on the application in disaster reduction for using cloud computing technology

    Science.gov (United States)

    Tao, Liang; Fan, Yida; Wang, Xingling

    Cloud computing technology has recently been applied rapidly in different domains, promoting the progress of informatization in those domains. Based on an analysis of the application requirements in disaster reduction, and combining them with the characteristics of cloud computing technology, we present research on the application of cloud computing technology to disaster reduction. First, we give the architecture of the disaster reduction cloud, which consists of disaster reduction infrastructure as a service (IaaS), a disaster reduction cloud application platform as a service (PaaS) and disaster reduction software as a service (SaaS). Second, we discuss the standard system for disaster reduction in five aspects. Third, we describe the security system of the disaster reduction cloud. Finally, we conclude that the use of cloud computing technology will help solve problems in disaster reduction and promote the development of disaster reduction.

  3. Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

    Directory of Open Access Journals (Sweden)

    Ilyena Hirskyj-Douglas

    2018-06-01

    Full Text Available As technologies diversify and become embedded in everyday life, the technologies we expose to animals, and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI), are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human Computer Interaction and Computer Science, this thematic literature review looks at the technologies developed for (non-human) animals. Technologies that are analysed include tangible and physical, haptic and wearable, olfactory, screen technology and tracking systems. The discussion explores what exactly ACI is, whilst questioning what it means to be animal by considering the impact of and feedback loop between machine and animal interactivity. The findings of this review are expected to form the first grounding foundation of ACI technologies, informing future research in animal computing as well as suggesting future areas for exploration.

  4. A review of small canned computer programs for survey research and demographic analysis.

    Science.gov (United States)

    Sinquefield, J C

    1976-12-01

    A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, CENTS-AIE II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. For each program, a description and evaluation of its uses, instruction manuals, computer requirements, and procedures for obtaining the manuals and programs are provided. Such information is intended to facilitate and encourage the use of the computer by data processors in developing countries.

  5. Research priorities for specialized nursing practice in the United Arab Emirates.

    Science.gov (United States)

    Al-Yateem, N; Al-Tamimi, M; Brenner, M; Altawil, H; Ahmad, A; Brownie, S

    2017-08-25

    Globally, nurses are undertaking expanded and more specialized roles in healthcare planning and service delivery in response to changing patterns and levels of health service demand. This means the nursing profession is increasingly considered as leaders in health service policy, research and practice. The United Arab Emirates has strengthened nursing governance and practice by establishing a Nursing and Midwifery Council and increasing the activity of nursing specialization, service leadership and research. This study aimed to identify clinically relevant research priorities to facilitate nursing contributions to evidence-based care and strengthening health services in the country. A two-stage Delphi study design was used. The first round involved 783 participants. The second round involved 1116 participants, as more clinical settings were accessed. In total, 58 research priorities across a variety of nursing specialties (paediatrics, emergency care, intensive care, labour and maternity care, operating theatre and long-term care) were identified as highly important. These identified priorities will guide a more informed programme of research in each nursing specialty, with the aim of strengthening the evidence base to improving outcomes for patients and their families in the United Arab Emirates. The findings provide guidance on key areas for nurses to focus research contributions to enhance evidence-based care and strengthen health systems. The identified priorities may also guide researchers in academic institutions to conduct research informed by current, clinically relevant issues. The findings may help inform funders and policymakers to support allocation of funding to research that has potential to contribute to enhancing nursing care in specialist areas. © 2017 International Council of Nurses.

  6. Aerodynamic research of a racing car based on wind tunnel test and computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Wang Jianfeng

    2018-01-01

    Full Text Available Wind tunnel testing and computational fluid dynamics (CFD) simulation are the two main methods for studying automotive aerodynamics. CFD simulation software computes its results using the basic theory of aerodynamics; the calculation inevitably introduces some bias, whereas a wind tunnel test can effectively reproduce real driving conditions and is the most effective aerodynamics research method. This paper studies the aerodynamic characteristics of the wing of a racing car. An aerodynamic model of the racing car is established, a wind tunnel test is carried out, and the results are compared with the CFD simulation. The deviation between the two methods is small, which verifies the accuracy of the CFD simulation. By means of the CFD simulation, the coefficients of the six aerodynamic forces are fitted and the aerodynamic equations are obtained. Finally, the aerodynamic forces and torques on the racing car travelling through a bend are calculated.
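    The coefficient-fitting step can be illustrated with a minimal sketch. The data below are synthetic, and a single constant force coefficient is fitted by least squares from force-versus-speed samples; fitting all six force coefficients, as in the paper, would repeat the same regression for each component. The air density, reference area and sample values are hypothetical.

        import numpy as np

        RHO = 1.225   # air density [kg/m^3] (assumed)
        AREA = 1.2    # reference frontal area [m^2] (assumed)

        def fit_force_coefficient(speeds_ms, forces_n):
            """Least-squares fit of C in F = 0.5 * rho * A * C * v^2 from sampled data."""
            q = 0.5 * RHO * AREA * np.asarray(speeds_ms) ** 2   # dynamic pressure * area
            f = np.asarray(forces_n)
            # One-parameter linear least squares: C = (q . f) / (q . q)
            return float(np.dot(q, f) / np.dot(q, q))

        if __name__ == "__main__":
            # Synthetic "CFD" samples generated with C = 0.9 plus a little noise
            v = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
            true_c = 0.9
            rng = np.random.default_rng(0)
            force = 0.5 * RHO * AREA * true_c * v ** 2 + rng.normal(0.0, 5.0, v.shape)
            c_fit = fit_force_coefficient(v, force)
            print(f"fitted force coefficient: {c_fit:.3f} (true value used: {true_c})")
            # The fitted coefficient then gives the aerodynamic equation F(v) = 0.5*rho*A*C*v^2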

  7. NATO Advanced Research Workshop on Exploiting Mental Imagery with Computers in Mathematics Education

    CERN Document Server

    Mason, John

    1995-01-01

    The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.

  8. Re-visioning the doctoral research degree in nursing in the United Kingdom.

    Science.gov (United States)

    Burton, Christopher R; Duxbury, Joy; French, Beverley; Monks, Rob; Carter, Bernie

    2009-05-01

    In the light of concerns about the wider social and economic value of the PhD training programme, this article discusses the challenges being directed primarily at the traditional doctoral programme of study. While the PhD is primarily concerned with the student making an original contribution to knowledge, the value-added component of the doctoral research degree needs to respond to the needs of a wider market of purchasers, and to meet practice and policy requirements for research leadership. The United Kingdom Research Councils (UK GRAD, 2001, Joint Skills Statement of Skills Training Requirements, available at http://www.grad.ac.uk/downloads/documents/general/Joint%20Skills%20Statementpdf, last accessed 1 April 2008) suggest a range of seven skill domains over and above research design and management that should be offered to students. The seven domains are research skills and techniques, participation in the research environment, research management, personal effectiveness, communication, networking and team working, and career management. This article develops and extends these skill domains for the current healthcare context and considers how these should guide the development and evaluation of the value-added components of doctoral research degree programmes in nursing. The challenges that these issues present to academic departments are also discussed. Our conclusion is that PhD research training needs re-visioning and broadening so that the students' experience includes these value-added components.

  9. Performance evaluation for volumetric segmentation of multiple sclerosis lesions using MATLAB and computing engine in the graphical processing unit (GPU)

    Science.gov (United States)

    Le, Anh H.; Park, Young W.; Ma, Kevin; Jacobs, Colin; Liu, Brent J.

    2010-03-01

    Multiple Sclerosis (MS) is a progressive neurological disease affecting myelin pathways in the brain. Multiple lesions in the white matter can cause paralysis and severe motor disabilities in the affected patient. To solve the issue of inconsistency and user-dependency in manual lesion measurement on MRI, we have proposed a 3-D automated lesion quantification algorithm to enable objective and efficient lesion volume tracking. The computer-aided detection (CAD) of MS, written in MATLAB, utilizes the K-Nearest Neighbors (KNN) method to compute the probability of lesions on a per-voxel basis. Despite the highly optimized image-processing algorithms used in CAD development, MS CAD integration and evaluation in the clinical workflow is technically challenging due to the high computation rates and memory bandwidth required by the recursive nature of the algorithm. In this paper, we present the development and evaluation of a computing engine on the graphical processing unit (GPU) with MATLAB for segmentation of MS lesions. The paper investigates the utilization of a high-end GPU for parallel computing of KNN in the MATLAB environment to improve algorithm performance. The integration is accomplished using NVIDIA's CUDA developmental toolkit for MATLAB. The results of this study will validate the practicality and effectiveness of the prototype MS CAD in a clinical setting. The GPU method may allow MS CAD to integrate rapidly into an electronic patient record or any disease-centric health care system.
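    A per-voxel KNN classification of the kind described can be sketched as follows. This is a CPU-only Python illustration, not the authors' MATLAB/CUDA implementation: the choice of k, the use of raw voxel intensity as the only feature, and the scikit-learn classifier are assumptions made for the example (a real CAD system would typically combine several MRI sequences per voxel).

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        def voxel_features(volume):
            """One feature vector per voxel; here just the voxel intensity."""
            return volume.reshape(-1, 1)

        def lesion_probability_map(train_volume, train_labels, test_volume, k=5):
            """Per-voxel lesion probability for test_volume from a labelled training volume."""
            knn = KNeighborsClassifier(n_neighbors=k)
            knn.fit(voxel_features(train_volume), train_labels.ravel())
            probs = knn.predict_proba(voxel_features(test_volume))[:, 1]   # P(lesion)
            return probs.reshape(test_volume.shape)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            # Tiny synthetic "MRI" volumes: lesion voxels are brighter than background.
            train = rng.normal(0.3, 0.05, size=(16, 16, 16))
            labels = np.zeros(train.shape, dtype=int)
            labels[6:10, 6:10, 6:10] = 1
            train[labels == 1] += 0.5
            test = train + rng.normal(0.0, 0.02, size=train.shape)
            prob_map = lesion_probability_map(train, labels, test, k=5)
            print("estimated lesion volume (voxels):", int((prob_map > 0.5).sum()))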

  10. Current research activities: Applied and numerical mathematics, fluid mechanics, experiments in transition and turbulence and aerodynamics, and computer science

    Science.gov (United States)

    1992-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.

  11. Using Mental Imagery Processes for Teaching and Research in Mathematics and Computer Science

    Science.gov (United States)

    Arnoux, Pierre; Finkel, Alain

    2010-01-01

    The role of mental representations in mathematics and computer science (for teaching or research) is often downplayed or even completely ignored. Using an ongoing work on the subject, we argue for a more systematic study and use of mental representations, to get an intuition of mathematical concepts, and also to understand and build proofs. We…

  12. A Multidisciplinary Research Team Approach to Computer-Aided Drafting (CAD) System Selection. Final Report.

    Science.gov (United States)

    Franken, Ken; And Others

    A multidisciplinary research team was assembled to review existing computer-aided drafting (CAD) systems for the purpose of enabling staff in the Design Drafting Department at Linn Technical College (Missouri) to select the best system out of the many CAD systems in existence. During the initial stage of the evaluation project, researchers…

  13. Identity and collective action via computer-mediated communication: A review and agenda for future research

    NARCIS (Netherlands)

    Priante, Anna; Ehrenhard, Michel L; van den Broek, Tijs; Need, Ariana

    2017-01-01

    Since the start of large-scale waves of mobilisation in 2011, the importance of identity in the study of collective action via computer-mediated communication (CMC) has been a source of contention. Hence, our research sets out to systematically review and synthesise empirical findings on identity

  14. The Use of a Relational Database in Qualitative Research on Educational Computing.

    Science.gov (United States)

    Winer, Laura R.; Carriere, Mario

    1990-01-01

    Discusses the use of a relational database as a data management and analysis tool for nonexperimental qualitative research, and describes the use of the Reflex Plus database in the Vitrine 2001 project in Quebec to study computer-based learning environments. Information systems are also discussed, and the use of a conceptual model is explained.…

  15. Using Computer Simulations of Negotiation for Educational and Research Purposes in Business Schools.

    Science.gov (United States)

    Conlon, Donald E.

    1989-01-01

    Discussion of educational and research advantages of using computer-based experimental simulations for the study of negotiation and dispute resolution in business schools focuses on two studies of undergraduates that used simulation exercises. The influence of time pressure on mediation is examined, and differences in student behavior are…

  16. Research Summary 3-D Computational Fluid Dynamics (CFD) Model Of The Human Respiratory System

    Science.gov (United States)

    The U.S. EPA’s Office of Research and Development (ORD) has developed a 3-D computational fluid dynamics (CFD) model of the human respiratory system that allows for the simulation of particulate based contaminant deposition and clearance, while being adaptable for age, ethnicity,...

  17. Ark of Inquiry: Responsible Research and Innovation through Computer-Based Inquiry Learning

    NARCIS (Netherlands)

    Margus Pedaste; Leo Siiman; Bregje de Vries; Mirjam Burget; Tomi Jaakkola; Emanuele Bardone; Meelis Brikker; Mario Mäeots; Marianne Lind; Koen Veermans

    2015-01-01

    Ark of Inquiry is a learning platform that uses a computer-based inquiry learning approach to raise youth awareness to Responsible Research and Innovation (RRI). It is developed in the context of a large-scale European project (http://www.arkofinquiry.eu) and provides young European citizens

  18. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    Science.gov (United States)

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  19. Computer codes for tasks in the fields of isotope and radiation research

    International Nuclear Information System (INIS)

    Friedrich, K.; Gebhardt, O.

    1978-11-01

    Concise descriptions of computer codes developed for solving problems in the fields of isotope and radiation research at the Zentralinstitut fuer Isotopen- und Strahlenforschung (ZfI) are compiled. In part two the structure of the ZfI program library MABIF is outlined and a complete list of all codes available is given

  20. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    Science.gov (United States)

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…