WorldWideScience

Sample records for computing center stuttgart

  1. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  2. High performance computing in science and engineering '09: transactions of the High Performance Computing Center, Stuttgart (HLRS) 2009

    National Research Council Canada - National Science Library

    Nagel, Wolfgang E; Kröner, Dietmar; Resch, Michael

    2010-01-01

    ...), NIC/JSC (Jülich), and LRZ (Munich). As part of that strategic initiative, NIC/JSC installed the first phase of the GCS HPC Tier-0 resources in May 2009: an IBM Blue Gene/P with roughly 300,000 cores, this time in Jülich. With that, the GCS provides the most powerful high-performance computing infrastructure in Europe already...

  3. Visit to MPA Stuttgart - Universitaet Stuttgart

    International Nuclear Information System (INIS)

    1980-01-01

    The booklet contains the introductory lectures to the following demonstration tests of the MPA Stuttgart: large-scale specimen tensile testing; full-size vessel; high-speed tensile machine; explosion high-speed tensile machine; heat-affected zone simulation. (RW)

  4. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  5. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  6. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery enabled by modern computers.

  7. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery enabled by modern computers.

  8. Suspended footbridge in Stuttgart (Pasarela suspendida en Stuttgart)

    Directory of Open Access Journals (Sweden)

    Leonhardt, Fritz

    1963-06-01

    Following the gardening exhibition held in Stuttgart in 1961, the parks and gardens along the river Neckar were greatly improved and modified, and as there is considerable pedestrian traffic in this zone, it became necessary to build a footbridge across one of the main roadways. This footbridge leads on one side towards the station and on the other in the direction of the Theatre Palace. A public competition was organised to find the best design for this structure, and various firms submitted projects. The chosen one consists of a suspended, metal, flattened arch structure and a thin walking deck. The arch has a 90 m span, and approach ramps, since the large number of pedestrians precluded the use of steps. The arched box girder is 0.5 m deep and 5.50 m wide. The beam is continuous and hangs from cables which are attached at five points of the bridge, spaced 18, 17, 17, 17 and 18 m apart. These cables run over a metal pillar. To give the pillar greater stability, a number of piles were driven into the ground, and the foundation block for the pillar was placed on these piles. Although the soil is not sufficiently stable to avoid small settlements of the foundations, this is not too important, since the structure is suspended, and small settlements, even of a few centimetres, would not affect the strength and stability of the structure.

  9. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color-graphics capabilities. Describes the documentation, presentation, and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  10. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  11. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  12. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  13. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  14. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  15. Center for Advanced Computational Technology

    Science.gov (United States)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping and operation of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  16. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation, ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows readers to compare the performance levels and usability of a variety of supercomputer architectures. It is therefore an indispensable guidebook for assessing the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  17. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  18. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  19. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the broad field of accelerator science based on high-energy accelerators. Within the new organization, the Applied Research Laboratory comprises four centers that support research activities common to the whole organization and carry out related research and development (R and D), integrating the four existing centers and their related sections in Tanashi. This support covers not only general assistance but also the preparation and R and D of systems required for promoting the research and its future plans. Computer technology is essential to the development of this research and can be shared across the organization. In response, the new Computing Research Center is expected to carry out its duties in cooperation with researchers in areas ranging from R and D on data analysis for various experiments to computational physics driven by powerful computing capacity such as supercomputers. The report describes the work and present state of the Data Processing Center of KEK in the first chapter and of the computer room of INS in the second chapter, together with future issues for the Computing Research Center. (G.K.)

  20. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  21. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    ...computing. Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications net... and reasoning, assistive technologies. FRIEDRICH (FRITZ) PRINZ, Finmeccanica Professor of Engineering, Robert Bosch Chair, Department of Engineering... High Performance Computing Research Center, www.ahpcrc.org. BARBARA BRYAN, AHPCRC Research and Outreach Manager, HPTi, (650) 604-3732, bbryan@hpti.com. Ms...

  22. Television tower, Stuttgart (Torre de televisión, Stuttgart)

    Directory of Open Access Journals (Sweden)

    Leonhardt, F.

    1957-11-01

    General description and structural analysis of the television tower built in Stuttgart, approximately 310 m in total height including the steel antenna. Between 138 and 150 m above ground, the tower carries a body shaped like an inverted truncated cone, of larger mean diameter than the tower shaft itself, subdivided into four floors housing service rooms and a restaurant. Above this truncated-cone body there is a terrace offering splendid panoramic views. The tower proper has a diameter of 10.8 m at the base and 5.04 m at the top, with a height of 138 m between these two end sections.

  23. A Computer Learning Center for Environmental Sciences

    Science.gov (United States)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines that have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  24. The Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs
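
    Since Fokker-Planck codes recur throughout these records, it may help to recall the generic equation such codes solve. The form below is the standard textbook Fokker-Planck equation for a velocity distribution function; it is shown for orientation only and is not claimed to be the specific formulation used by the MFE group.

```latex
% Standard Fokker-Planck equation for a distribution function f(v, t).
% A_i is the drag (dynamical friction) vector and B_ij the velocity-space
% diffusion tensor; repeated indices are summed. Textbook form only,
% not necessarily the MFE group's exact formulation.
\frac{\partial f}{\partial t}
  = -\frac{\partial}{\partial v_i}\bigl(A_i f\bigr)
    + \frac{1}{2}\,\frac{\partial^{2}}{\partial v_i\,\partial v_j}\bigl(B_{ij} f\bigr)
```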

  25. A research update for the Stuttgart National Aquaculture Research Center

    Science.gov (United States)

    Aquaculture (fish farming) has played an ever-increasing role in providing people with fish, shrimp, and shellfish. Aquaculture is currently the fastest growing sector of global food production and in 2016 totaled 90 million tons valued at $180 billion. The production of food-fish from aquaculture...

  26. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects Pierre Auger Observatory (PAO) and Cherenkov Telescope Array (CTA). The OSG stack is installed for the NOvA experiment. Other user groups use the local batch system directly. Storage capacity is distributed over several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated at the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can access these resources using the standard ATLAS tools in the same way as the local storage, without noticing the geographical distribution. The computing clusters LUNA and EXMAG, dedicated to users mostly from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on Torque with a custom scheduler. The clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and higher priorities on the rest (1,500 cores in total), which can also be used by any user of the MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a capacity of more than 12,000 cores in total.
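
    To make the batch-oriented access pattern concrete, below is a minimal sketch of submitting a job to a Torque-style batch system such as the MetaCentrum setup described above. The job name, queue, resource request, and program name are invented placeholders, not values from the record.

```python
import subprocess

# Hypothetical Torque/PBS job script. The job name, queue, node count,
# walltime, and the MPI program are illustrative assumptions only.
job_script = """#!/bin/bash
#PBS -N example_job
#PBS -q default
#PBS -l nodes=2:ppn=16,walltime=04:00:00
cd $PBS_O_WORKDIR
mpirun ./parallel_simulation
"""

with open("example.pbs", "w") as f:
    f.write(job_script)

# qsub prints the identifier of the newly queued job on success.
result = subprocess.run(["qsub", "example.pbs"],
                        capture_output=True, text=True)
print("submitted:", result.stdout.strip())
```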

  27. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  28. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each year, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).

  29. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITYEvaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  30. Building the Teraflops/Petabytes Production Computing Center

    International Nuclear Information System (INIS)

    Kramer, William T.C.; Lucas, Don; Simon, Horst D.

    1999-01-01

    In just one decade, the 1990s, supercomputer centers have undergone two fundamental transitions which require rethinking their operation and their role in high performance computing. The first transition in the early to mid-1990s resulted from a technology change in high performance computing architecture. Highly parallel distributed memory machines built from commodity parts increased the operational complexity of the supercomputer center, and required the introduction of intellectual services as equally important components of the center. The second transition is happening in the late 1990s as centers are introducing loosely coupled clusters of SMPs as their premier high performance computing platforms, while dealing with an ever-increasing volume of data. In addition, increasing network bandwidth enables new modes of use of a supercomputer center, in particular, computational grid applications. In this paper we describe what steps NERSC is taking to address these issues and stay at the leading edge of supercomputing centers.

  31. The solar-assisted district heating concept in Stuttgart-Burgholzhof; Das solarunterstuetzte Nahwaermekonzept Stuttgart-Burgholzhof

    Energy Technology Data Exchange (ETDEWEB)

    Grosshans, D. [Neckarwerke Stuttgart AG (Germany)]

    1998-12-31

    The housing area Burgholzhof is located on a hill above the Stuttgart valley basin in the city district of Bad Cannstatt. After a boundary adjustment carried out in 1997, the building area covers 13.4 hectares and will eventually contain about 1,360 flats with a total heated area of about 86,000 square metres, housing roughly 2,800 people. Heat is supplied by a heating plant, a plant for the thermal use of solar energy, and a heat distribution system built by Neckarwerke Stuttgart (NWS, the Stuttgart utility company). The NWS paid the construction costs of the solar-thermal plant; these costs are not included in the energy price charged to consumers. (orig.)

  32. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  33. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2001-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  34. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  35. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  36. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  37. Senior Computational Scientist | Center for Cancer Research

    Science.gov (United States)

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  38. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck modeling, and efficient numerical and programming algorithms. References are included.

  39. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of the KEDR detector experiment, which is being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  40. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out.

  41. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. You will also learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  42. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. A second major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

  43. Landgericht Stuttgart condemns power price boycotters to payment

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    In its judgment of 18 December 1980 - 10 0 164/80 - the LG Stuttgart ordered an electricity customer who, as an opponent of nuclear energy, had been withholding ten per cent of the power price and transferring this amount to a trust account, to pay the arrears. The claim had been brought by a public utility company that cooperates with a nuclear power plant and supplies its customers with electric power from this NPP. (orig./HP)

  44. Making Space Cool - Successful Outreach at Yuri's Night Stuttgart

    Science.gov (United States)

    Hill, Christine; Bretschneider, Jens; Nathanson, Emil; Grossmann, Agnes

    Yuri’s Night - also known as the “World Space Party” - is the annual celebration commemorating Gagarin’s historic flight on April 12, 1961, and the maiden voyage of the American space shuttle on April 12, 1981. It was created by young space enthusiasts in 2000 at the annual Space Generation Congress and was first celebrated in 2001, registering more than 60 events around the world from the start. Since then the interest in celebrating human spaceflight has grown constantly, to over 350 events across all seven continents in 2013. The honoring of Yuri Gagarin’s first spaceflight in Stuttgart started in 2007 and resulted in one of the largest events outside the US, with five parties following in 2008, 2009, 2010, 2012 and 2013. The Stuttgart event was originally organized as a space party for an audience aged 20 and above, with informative sessions in the afternoon followed by a party lasting far into the night. Since 2010 the focus of Yuri’s Night Stuttgart has been to bring awareness of space exploration to people of all ages, including many participatory hands-on space activities for kids and families that attract hundreds of visitors every year. As much as Yuri’s Night is a worldwide party, the events in Stuttgart successfully concentrate on educational aspects that help to inspire new generations of space enthusiasts who will ultimately shape the future of space exploration. It is therefore not only a look back at one of the greatest achievements of the 20th century, but also a look into the future: from multinational cooperation on the International Space Station and the benefits of spaceflight to the introduction of the next generation of space technology. This paper introduces the celebrations of Yuri’s Night in Stuttgart over the past four years and compares them to the early events. It provides a summary of the development of Yuri’s Night, including educational aspects, public relations and media attention, and gives...

  45. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.]

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  46. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  47. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

    The discussions and plans on all scientific, advisory, and political levels to realize an even larger “European Supercomputer” in Germany, where the hardware costs alone will be hundreds of millions of Euro - much more than in the past - are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructure...

  48. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific...
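
    As a quick sanity check on the figures quoted above, the snippet below derives the per-node performance implied by a 350-node cluster delivering roughly 10^12 floating-point operations per second in aggregate. The calculation is illustrative only.

```python
# Back-of-the-envelope check of the Jazz figures quoted in the record.
nodes = 350                # cluster size from the record
aggregate_flops = 1.0e12   # "over a teraflop" = 1e12 flop/s

per_node = aggregate_flops / nodes
print(f"implied per-node performance: {per_node / 1e9:.1f} Gflop/s")
# Prints roughly 2.9 Gflop/s per node, a plausible figure for
# early-2000s commodity cluster hardware.
```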

  49. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

  50. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  51. Computational geometry lectures at the morningside center of mathematics

    CERN Document Server

    Wang, Ren-Hong

    2003-01-01

    Computational geometry is a borderline subject related to pure and applied mathematics, computer science, and engineering. The book contains articles on various topics in computational geometry, which are based on invited lectures and some contributed papers presented by researchers working during the program on Computational Geometry at the Morningside Center of Mathematics of the Chinese Academy of Science. The opening article by R.-H. Wang gives a nice survey of various aspects of computational geometry, many of which are discussed in more detail in other papers in the volume. The topics include problems of optimal triangulation, splines, data interpolation, problems of curve and surface design, problems of shape control, quantum teleportation, and others.

  52. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær

    2014-01-01

    In order to design and operate a wind farm optimally it is necessary to know in detail how the wind behaves and interacts with the turbines in a farm. This not only requires knowledge about meteorology, turbulence and aerodynamics, but it also requires access to powerful computers and efficient software. The Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence was established in 2010 in order to create a world-leading cross-disciplinary flow center that covers all relevant disciplines within wind farm meteorology and aerodynamics.

  53. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of undergraduate and graduate program in the biological sciences, one that emphasized biological concepts and treated biology as an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternative strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs...

  14. Integrity of power plant components. Research performed by Professor K. Kussmaul at MPA Stuttgart

    International Nuclear Information System (INIS)

    Roos, E.; Maile, K.

    1998-01-01

    The history of the Staatliche Materialpruefungsanstalt (MPA) Stuttgart - the State Materials Testing Institute - is characterized by the scientific qualifications of its directors. During his 40 years of work at the MPA, Professor Karl Kussmaul has lived up to this tradition and, with his work, has firmly established the renowned 'Stuttgarter Schule' - the School of Stuttgart - founded by Bach. (orig.) [de]

  15. Stuttgart's city archive believes in ice bank technology from Isocal. Fresh-keeping treatment for the history of Stuttgart; Stuttgarter Stadtarchiv setzt auf Eisspeicher-Technologie von Isocal. Frischhaltekur fuer Stuttgarts Geschichte

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, Uwe

    2011-03-15

    Ancient scrolls, historical construction plans and handwritten fragments of well-known literati from the region - the treasures of Stuttgart's city archive are irreplaceable. Since the reopening of the archive on 24 January 2011 in a completely retrofitted historical warehouse ensemble in Bad Cannstatt (Federal Republic of Germany), these valuable documents have been stored in the new repository. The move brings significant improvements: substantially more available space, shorter distances, and mobile shelving. Furthermore, the climatic storage conditions for the sensitive archive materials have been raised to the highest standard: henceforth, the old papers, drawings, photos and paintings can be stored at a constant temperature of 18 Celsius and an air humidity of 50%.

  16. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Full Text Available Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of these aspects is performed based on experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied development, and the remote education of specialists, postgraduates, and students.

  17. The NIRA computer program package (photonuclear data center). Final report

    International Nuclear Information System (INIS)

    Vander Molen, H.J.; Gerstenberg, H.M.

    1976-02-01

    The Photonuclear Data Center's NIRA library of programs, executable from mass storage on the National Bureau of Standards' central computer facility, is described. Detailed instructions are given (with examples) for the use of the library to analyze, evaluate, synthesize, and produce for publication camera-ready tabular and graphical presentations of digital photonuclear reaction cross-section data. NIRA is the acronym for Nuclear Information Research Associate.

  18. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
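
    The record above hinges on a cost comparison between a local data center and Amazon EC2. As a purely illustrative aid, the toy break-even calculation below shows the shape of such a comparison; every number in it (capex, opex, utilization, cloud price) is an invented placeholder, not a figure from the BNL analysis.

```python
# Toy break-even sketch: amortized local cost per core-hour vs. an on-demand
# cloud price. All numbers are made-up placeholders, not BNL's figures.
local_capex = 2_000_000.0      # hardware purchase cost, USD (assumption)
opex_per_year = 400_000.0      # power, cooling, staff per year (assumption)
lifetime_years = 4             # hardware amortization period (assumption)
cores = 10_000                 # cores in the local facility (assumption)
utilization = 0.85             # fraction of core-hours actually consumed

core_hours = cores * 24 * 365 * lifetime_years * utilization
local_cost = (local_capex + opex_per_year * lifetime_years) / core_hours
cloud_price = 0.05             # USD per on-demand core-hour (assumption)

print(f"local: ${local_cost:.4f}/core-hour, cloud: ${cloud_price:.4f}/core-hour")
# With these placeholders the local center wins at high utilization; at low
# utilization the amortized local price rises and the cloud becomes cheaper.
```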

  19. Secure data exchange between intelligent devices and computing centers

    Science.gov (United States)

    Naqvi, Syed; Riguidel, Michel

    2005-03-01

    The advent of reliable spontaneous networking technologies (commonly known as wireless ad-hoc networks) has ostensibly raised the stakes for the conception of computing-intensive environments using intelligent devices as their interface with the external world. These smart devices are used as data gateways for the computing units. They are employed in highly volatile environments where the secure exchange of data between the devices and their computing centers is of paramount importance. Moreover, their mission-critical applications require dependable measures against attacks such as denial of service (DoS), eavesdropping, and masquerading. In this paper, we propose a mechanism to assure reliable data exchange between an intelligent environment composed of smart devices and distributed computing units collectively called a 'computational grid'. The notion of an infosphere is used to define a digital space made up of persistent and volatile assets in an often indefinite geographical space. We study different infospheres and present general evolutions and issues in the security of such technology-rich and intelligent environments. It is beyond any doubt that these environments will face a proliferation of users, applications, networked devices, and their interactions on a scale never experienced before, so it is better to build in the ability to deal with these systems uniformly. As a solution, we propose the concept of virtualization of security services. We try to solve the difficult problems of implementation and maintenance of trust on the one hand, and of security management in heterogeneous infrastructure on the other.
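
    The paper's own mechanism is described only at a high level, but one standard ingredient of protection against masquerading and tampering on such device-to-center links is a keyed message authentication code. The sketch below is an illustration using Python's standard library, not the authors' protocol; the out-of-band provisioning of the 32-byte key is assumed.

```python
# Minimal shared-key HMAC sketch: the computing center rejects messages that
# were forged or altered in transit. Illustration only, not the paper's design.
import hashlib
import hmac
import os

KEY = os.urandom(32)                 # assumed to be provisioned out of band

def device_send(payload: bytes) -> bytes:
    tag = hmac.new(KEY, payload, hashlib.sha256).digest()
    return tag + payload             # 32-byte tag travels with the data

def center_receive(message: bytes) -> bytes:
    tag, payload = message[:32], message[32:]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("rejected: authenticity/integrity check failed")
    return payload

print(center_receive(device_send(b"sensor reading: 42")))
```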

  20. New computer system for the Japan Tier-2 center

    CERN Multimedia

    Hiroyuki Matsunaga

    2007-01-01

    The ICEPP (International Center for Elementary Particle Physics) of the University of Tokyo has been operating an LCG Tier-2 center dedicated to the ATLAS experiment, and is going to switch over to the new production system which has recently been installed. The system will be of great help to the exciting physics analyses of the coming years. The new computer system includes brand-new blade servers, RAID disks, a tape library system and Ethernet switches. The blade server is the DELL PowerEdge 1955, which contains two dual-core Intel Xeon (Woodcrest) CPUs running at 3 GHz; a total of 650 servers will be used as compute nodes. Each of the RAID disks is configured as RAID-6 with 16 Serial ATA HDDs. The equipment as well as the cooling system is placed in a new large computer room, and both are hooked up to UPS (uninterruptible power supply) units for stable operation. As a whole, the system has been built with a redundant configuration in a cost-effective way. The next major upgrade will take place in thre...

  1. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development, and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts centered on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This involved working with the teams that provide the infrastructure CAF relies on, implementing new language and runtime features, producing an open-source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  2. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC collider. While “Tier-2” computing centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not an expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of user processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  3. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC collider. While “Tier-2” computing centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not an expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of user processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  4. Activities of the MPA Stuttgart in connection with QA of nuclear power plants

    International Nuclear Information System (INIS)

    Maier, H.J.

    1980-01-01

    The MPA Stuttgart (Materials Testing Institute, University of Stuttgart) is concerned with the quality assurance of German nuclear power plants, in addition to the technical inspection organizations and the plant owners. Commissions come from state authorities, federal authorities and the RSK (Reactor Safety Commission). Some examples of this work, concerning materials and the testing of materials, are presented. (orig./RW)

  5. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During 1974-1975 a modern industrial computer system was set up. This system, equipped with T 2000/20 hardware from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The connection of these two CPUs (hardware and software) enables automatic switching of the system to either the first CPU or the second one. The system covers, at present, data processing, threshold monitoring, alarm systems, display devices, periodical listing, and specific calculations concerning the process (balances etc.), and at a later stage, automatic control of certain units of the process [fr]
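
    The dual-CPU arrangement described above is essentially an automatic failover scheme. The sketch below is a present-day illustration of that idea, not the 1970s implementation; the heartbeat timeout and the class and unit names are invented.

```python
# Toy heartbeat failover sketch: route processing to the standby unit when
# the primary misses its heartbeat deadline. Illustration only.
import time

class Cpu:
    def __init__(self, name: str):
        self.name = name
        self.last_beat = time.monotonic()

    def heartbeat(self) -> None:
        self.last_beat = time.monotonic()   # called periodically when healthy

    def alive(self, timeout: float = 1.0) -> bool:
        return time.monotonic() - self.last_beat < timeout

def active_unit(primary: Cpu, standby: Cpu) -> Cpu:
    """Serve from the primary while it beats; otherwise switch to standby."""
    return primary if primary.alive() else standby

primary, standby = Cpu("CPU-1"), Cpu("CPU-2")
print(active_unit(primary, standby).name)   # CPU-1
time.sleep(1.1)                             # primary stops beating
print(active_unit(primary, standby).name)   # CPU-2
```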

  6. M-center growth in alkali halides: computer simulation

    International Nuclear Information System (INIS)

    Aguilar, M.; Jaque, F.; Agullo-Lopez, F.

    1983-01-01

    The heterogeneous interstitial nucleation model previously proposed to explain F-center growth curves in irradiated alkali halides has been extended to account for M-center kinetics. The interstitials produced during the primary irradiation event are assumed to be trapped at impurities and interstitial clusters or to recombine with F and M centers. For M-center formation two cases have been considered: (a) diffusion and aggregation of F centers, and (b) statistical generation and pairing of F centers. Process (b) is the only one consistent with the quadratic relationship between M- and F-center concentrations. However, to account for the F/M ratios experimentally observed as well as for the role of dose rate, a modified statistical model involving random creation and association of F⁺-F pairs has been shown to be adequate. (author)
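
    To see informally why purely statistical creation and pairing of F centers (process (b)) produces an M-center concentration quadratic in the F-center concentration, consider the toy Monte Carlo sketch below: F centers are dropped on random sites of a square lattice and nearest-neighbour F-F pairs are counted as a crude proxy for M centers. The lattice size and dose per step are arbitrary assumptions, not parameters from the paper.

```python
# Toy Monte Carlo: random F-center creation on a square lattice; adjacent
# F-F pairs stand in for M centers. Illustration only; sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
L = 500                                    # lattice edge length (assumption)
occupied = np.zeros((L, L), dtype=bool)

for step in range(1, 201):
    xs = rng.integers(0, L, size=500)      # one "dose increment" per step
    ys = rng.integers(0, L, size=500)
    occupied[xs, ys] = True
    if step % 50 == 0:
        f = occupied.sum()                 # F-center count
        m = (np.count_nonzero(occupied[:-1, :] & occupied[1:, :])
             + np.count_nonzero(occupied[:, :-1] & occupied[:, 1:]))
        # For random placement, m should stay close to 2 * f**2 / L**2,
        # i.e. M grows quadratically with F.
        print(f"F = {f}, M = {m}, M / (F^2/N) = {m / (f**2 / L**2):.2f}")
```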

  7. History and practice of material research on the examples of Material Testing and Materialpruefungsanstalt (MPA) Stuttgart, liquid crystals and screen technology as well as superconductivity. An interdisciplinary teaching project of the University of Stuttgart; Geschichte und Praxis der Materialforschung an den Beispielen Materialpruefung und Materialpruefungsanstalt (MPA) Stuttgart, Fluessigkristalle und Bildschirmtechnik sowie Supraleitung. Ein interdisziplinaeres Lehrprojekt der Universitaet Stuttgart

    Energy Technology Data Exchange (ETDEWEB)

    Hentschel, Klaus; Webel, Josef (eds.)

    2016-07-01

    Knowledge of materials research and its history is not widespread, even among scientists and engineers. Within the scope of an interdisciplinary teaching project, carried out for the first time in the summer semester of 2014 and since then every summer semester at the University of Stuttgart, an attempt is made to approach materials research from scientific, technical, and historical perspectives. Material testing and the Materials Testing Institute (MPA) Stuttgart, liquid crystals and screen technology, and superconductivity were selected as topics, all of which have a long tradition in research and teaching in Stuttgart. This anthology collects the materials of the teaching project.

  8. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  9. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three- (and higher-) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  10. TAPP - Stuttgart technique and result of a large single center series

    Directory of Open Access Journals (Sweden)

    Bittner R

    2006-01-01

    Full Text Available Laparoscopic hernioplasty is considered a difficult operation. The operative technique determines the frequency of complications, the time to recovery and the rate of recurrence. A proper technique is absolutely necessary to achieve results superior to open hernia surgery. Technique: The key points of our technique are (1) use of non-disposable instruments; (2) use of blunt trocars with expanding, non-incisive cone-shaped tips; (3) a spacious, curved opening of the peritoneum, high above all possible hernia openings; (4) meticulous dissection of the entire pelvic floor; (5) complete reduction of the hernial sac; (6) wide parietalization of the peritoneal sac, at least down to the middle of the psoas muscle; (7) implantation of a large mesh, at least 10 cm x 15 cm; (8) fixation of the mesh by clips to Cooper's ligament, to the rectus muscle and lateral to the epigastric vessels, high above the iliopubic tract; (9) the use of glue, which also allows fixation in the latero-caudal region; and (10) closure of the peritoneum by running suture. Results: With this technique, in 12,678 hernia repairs the following results were achieved: operating time - 40 min; morbidity - 2.9%; recurrence rate - 0.7%; disability of work - 14 days. Similar results were achieved in all types of hernias (recurrence after previous open surgery, recurrence after previous preperitoneal operation, scrotal hernia, hernia in patients after transabdominal prostate resection). Summary: Laparoscopic hernia repair can be performed successfully in clinical practice, even by surgeons in training. The precondition for success is a strictly standardized operative technique and a well-structured educational program.

  11. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer-assisted virtual environment, or CAVE™, that allows scientists and engineers to literally walk into their data...

  12. Diamond NV centers for quantum computing and quantum networks

    NARCIS (Netherlands)

    Childress, L.; Hanson, R.

    2013-01-01

    The exotic features of quantum mechanics have the potential to revolutionize information technologies. Using superposition and entanglement, a quantum processor could efficiently tackle problems inaccessible to current-day computers. Nonlocal correlations may be exploited for intrinsically secure

  13. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes
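
    A minimal simulation can illustrate the kind of multi-class model the record describes: a shared pool of servers with two priority classes, measuring the waiting time each class experiences. The sketch below uses Python's simpy package; the arrival rates, service rate, priority mix, and capacity are all invented, and this is not the authors' analytical model.

```python
# Toy two-priority-class cloud center with a shared server pool (simpy).
# Illustration with invented parameters, not the paper's model.
import random
import simpy

random.seed(1)
WAITS = {0: [], 1: []}          # waiting times per class (0 = high priority)

def customer(env, servers, prio):
    arrived = env.now
    with servers.request(priority=prio) as req:
        yield req                                    # queue for a server
        WAITS[prio].append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0))   # service time, mean 1

def source(env, servers):
    while True:
        yield env.timeout(random.expovariate(8.0))   # arrivals, rate 8
        prio = 0 if random.random() < 0.3 else 1     # 30% high priority
        env.process(customer(env, servers, prio))

env = simpy.Environment()
servers = simpy.PriorityResource(env, capacity=10)   # 10 virtual machines
env.process(source(env, servers))
env.run(until=10_000)

for p in (0, 1):
    print(f"class {p}: mean wait = {sum(WAITS[p]) / len(WAITS[p]):.3f}")
```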

  14. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services, as well as activities of NERSC staff.

  15. Conception of a computer for the nuclear medical department of the Augsburg hospital center

    International Nuclear Information System (INIS)

    Graf, G.; Heidenreich, P.

    1984-01-01

    A computer system based on the Siemens R30 process computer has been employed at the Institute of Nuclear Medicine of the Augsburg Hospital Center since early 1981. This system, including the development and testing of organ-specific evaluation programs, served as the basis for the conception of the new computer system for the department of nuclear medicine of the Augsburg Hospital Center. The computer system was extended and installed according to this conception when the new 1400-bed hospital was opened in the third phase of construction in autumn 1982. (orig.) [de]

  16. Center for Computer Security newsletter. Volume 2, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-05-01

    The Fifth Computer Security Group Conference was held November 16 to 18, 1982, at the Knoxville Hilton in Knoxville, Tennessee. Attending were 183 people, representing the Department of Energy, DOE contractors, other government agencies, and vendor organizations. These papers are abridgements of most of the papers presented in Knoxville. Fewer than half a dozen speakers failed to furnish either abstracts or full-text papers of their Knoxville presentations.

  17. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  18. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  19. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  20. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  1. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    Science.gov (United States)

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) has been adopted in almost all health care settings, including the primary health centers in East Java Province, Indonesia. Some of the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  2. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

    Full Text Available The progressive upsurge in demand for processing and computing power has led to a subsequent upsurge in data center carbon emissions, cost incurred, unethical waste management, depletion of natural resources, and high energy utilization. This raises the issue of sustainability attainment in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to facilitate sustainability attainment, as IT-based industries utilize data centers to provide services to staff, practitioners, and end users. But it is a known fact that enterprise servers utilize huge quantities of energy and incur other expenditures in cooling operations, and it is difficult to address the needs of accuracy and efficiency in data centers while encouraging greener application practices alongside cost reduction. Thus, this research study focuses on the practical application of Green computing in data centers which house servers, and presents the Green computing life cycle strategies and best practices for better management of data centers in IT-based industries. Data was collected through a questionnaire from 133 respondents in industries that currently operate their in-house data centers. The analyzed data was used to verify the Green computing life cycle strategies presented in this study. Findings from the data show that each of the life cycle strategies is significant in assisting IT-based industries to apply Green computing practices in their data centers. This study should be of interest to knowledge and data management practitioners as well as environmental managers and academics deploying Green data centers in their organizations.

  3. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined, and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  4. High-voltage shared-service line in the Stuttgart area

    Energy Technology Data Exchange (ETDEWEB)

    Goerler, W; Benz, A [Technische Werke der Stadt Stuttgart A.G. (F.R. Germany)]

    1976-01-01

    In congested areas the line construction engineer has to cope with a great variety of difficulties - amenity problems, line crossings, and road crossings. The authors describe the prerequisites for and the construction of an HV shared-service line of approx. 25 km in the congested area of Stuttgart, where several three-phase and single-phase a.c. systems are run on one set of pylons.

  5. The effective use of virtualization for selection of data centers in a cloud computing environment

    Science.gov (United States)

    Kumar, B. Santhosh; Parthiban, Latha

    2018-04-01

    Data centers are facilities consisting of networks of remote servers used to store, access and process data. Cloud computing is a technology where users worldwide submit tasks and service providers direct the requests to the data centers responsible for executing them. The servers in the data centers need to employ the virtualization concept so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of the service provider.
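
    The selection rule the abstract describes can be written down compactly: sum the per-server virtualization energy, then route the next task to the data center with the lowest total. The sketch below is a toy data model with invented names and numbers, not the paper's implementation.

```python
# Toy sketch of the described selection rule: per-server VM energies are
# summed per data center, and tasks go to the minimum-energy center.
from dataclasses import dataclass, field

@dataclass
class Server:
    vm_energies: list                  # energy estimate per VM (assumption)
    def energy(self) -> float:
        return sum(self.vm_energies)

@dataclass
class DataCenter:
    name: str
    servers: list = field(default_factory=list)
    def total_energy(self) -> float:
        return sum(s.energy() for s in self.servers)

def select_data_center(centers):
    """Route the next task to the center with the least summed energy."""
    return min(centers, key=lambda c: c.total_energy())

centers = [
    DataCenter("dc-east", [Server([120.0, 90.5]), Server([60.0])]),   # 270.5
    DataCenter("dc-west", [Server([80.0]), Server([70.0, 30.0])]),    # 180.0
]
print(select_data_center(centers).name)   # -> dc-west
```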

  6. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefit is an efficient use of computing and personnel resources.

  7. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    … area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA … Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center at … atomic and molecular level, he said. He noted that “every general would like to have” a Star Trek-like holodeck, where holographic avatars could …

  8. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article the problem of supporting scientific projects throughout their lifecycle in the computer center is considered in all of its aspects. The Configuration Management system plays a connecting role in processes related to the provision and support of the services of a computer center. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed, and the development of the corresponding elements of the system is described in the present paper.

  9. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article the problem of supporting scientific projects throughout their lifecycle in the computer center is considered in all of its aspects. The Configuration Management system plays a connecting role in processes related to the provision and support of the services of a computer center. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed, and the development of the corresponding elements of the system is described in the present paper.

  10. Technical Data Management Center: a focal point for meteorological and other environmental transport computing technology

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Trubey, D.K.

    1981-01-01

    The Technical Data Management Center (TDMC), which collects, packages, analyzes, and distributes information, computing technology, and data covering meteorological and other environmental transport work, is located at the Oak Ridge National Laboratory within the Engineering Physics Division. Major activities include maintaining a collection of computing technology and associated literature citations to provide capabilities for meteorological and environmental work. Details of the activities on behalf of TDMC's sponsoring agency, the US Nuclear Regulatory Commission, are described.

  11. History and practice of material research on the examples of Material Testing and Materialpruefungsanstalt (MPA) Stuttgart, liquid crystals and screen technology as well as superconductivity. An interdisciplinary teaching project of the University of Stuttgart

    International Nuclear Information System (INIS)

    Hentschel, Klaus; Webel, Josef

    2016-01-01

    Knowledge of materials research and its history is not widespread, even among scientists and engineers. Within the scope of an interdisciplinary teaching project, carried out for the first time in the summer semester of 2014 and since then every summer semester at the University of Stuttgart, an attempt is made to approach materials research from scientific, technical, and historical perspectives. Material testing and the Materials Testing Institute (MPA) Stuttgart, liquid crystals and screen technology, and superconductivity were selected as topics, all of which have a long tradition in research and teaching in Stuttgart. This anthology collects the materials of the teaching project. [de]

  12. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    Science.gov (United States)

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra, we obtain an upper bound for the cyclicity of a family of cubic systems. We overcome the problem of non-radicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.
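
    The move from the polynomial ring to a coordinate ring is specific to the paper, but the underlying computational-algebra step, working with the ideal generated by focus quantities via a Groebner basis, can be illustrated with a toy example in sympy. The polynomials g1 and g2 below are invented stand-ins, not the focus quantities of the cubic family studied.

```python
# Toy sympy sketch: Groebner basis and ideal-membership test for a pair of
# invented "focus quantities". Not the paper's actual computation.
import sympy as sp

a, b = sp.symbols('a b')
g1 = a * b - a            # hypothetical focus quantity
g2 = a * (b**2 - 1)       # hypothetical focus quantity

G = sp.groebner([g1, g2], a, b, order='lex')
print(G)                  # basis of the ideal <g1, g2>; here [a*b - a]

# Ideal membership: a*(b - 1) reduces to zero modulo the basis.
print(G.contains(a * (b - 1)))   # True
```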

  13. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    Science.gov (United States)

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  14. 6. GESA symposium on experimental stress analysis, May 6 and 7, 1982 Stuttgart

    Energy Technology Data Exchange (ETDEWEB)

    1982-04-01

    Under the scientific chairmanship of Dr. H. Wolf, KWU Muehlheim/Ruhr, the 6th Symposium of the Gemeinschaft Experimentelle Spannungsanalyse (GESA = Experimental Stress Analysis Association) takes place in the Schwabenlandhalle at Fellbach near Stuttgart. The meeting will be organized by the VDI/VDE-Gesellschaft Mess- und Regelungstechnik (GMR = VDI/VDE Society for Instrumentation and Control Engineering), located in Duesseldorf. It will be associated with an exposition of firms working in the field of experimental mechanics and presenting, among other things, developments in the fields of measuring transmitters, data acquisition and processing.

  15. 6. GESA symposium on experimental stress analysis, May 6 and 7, 1982 Stuttgart

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    Under the scientific chairmanship of Dr. H. Wolf, KWU Muehlheim/Ruhr, the 6th Symposium of the Gemeinschaft Experimentelle Spannungsanalyse (GESA = Experimental Stress Analysis Association) takes place in the Schwabenlandhalle at Fellbach near Stuttgart. The meeting will be organized by the VDI/VDE-Gesellschaft Mess- und Regelungstechnik (GMR = VDI/VDE Society for Instrumentation and Control Engineering), located in Duesseldorf. It will be associated with an exposition of firms working in the field of experimental mechanics and presenting, among other things, developments in the fields of measuring transmitters, data acquisition and processing. (orig./RW) [de]

  16. Accurate Computation of Periodic Regions' Centers in the General M-Set with Integer Index Number

    Directory of Open Access Journals (Sweden)

    Wang Xingyuan

    2010-01-01

    Full Text Available This paper presents two methods for accurately computing the centers of periodic regions. One method fits the general M-sets with integer index number; the other fits the general M-sets with negative integer index number. Both methods improve the precision of the computation by transforming the polynomial equations which determine the centers of the periodic regions. We primarily discuss the general M-sets with negative integer index, and analyze the relationship between the number of periodic regions' centers on the principal symmetric axis and in the principal symmetric interior. We can obtain the centers' coordinates with at least 48 significant digits after the decimal point in both real and imaginary parts by applying Newton's method to the transformed polynomial equations. In this paper, we list some centers' coordinates of the general M-sets' k-periodic regions (k=3,4,5,6) for the index numbers α=−25,−24,…,−1, all of which have high numerical accuracy.
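
    As a concrete miniature of the Newton step the abstract describes, the mpmath sketch below locates the center of a period-3 component of the classical Mandelbrot set to roughly 50 digits by solving the polynomial equation satisfied by such centers. The seed value and the choice of the classical M-set are assumptions made for illustration; this is not the paper's code.

```python
# High-precision root finding with mpmath: center of a period-3 component of
# the classical Mandelbrot set. Illustration only.
from mpmath import mp, mpc, findroot

mp.dps = 60                       # work with 60 decimal digits

def p3(c):
    # Third iterate of z -> z^2 + c starting at z = 0; its roots include
    # the centers of the period-3 components.
    z = mpc(0)
    for _ in range(3):
        z = z * z + c
    return z

center = findroot(p3, mpc(-0.12, 0.74))   # rough seed near the period-3 bulb
print(center)                              # ~ -0.12256... + 0.74486...i
```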

  17. The University of Stuttgart IKE/University of Arizona student research program

    International Nuclear Information System (INIS)

    Seale, R.L.

    1988-01-01

    The University of Stuttgart's Institut fuer Kernenergetik und Energiesysteme (IKE) and the University of Arizona have had a joint program in which graduate students from the IKE spend 1 yr on the University of Arizona campus. This program started in 1982, largely as the result of an initiative begun by K.H. Hoecker, then director of the IKE. Since 1985, Alfred Voss has been director, and the program has continued without interruption. Under the program, the Deutscher Akademischer Austauschdienst, a government agency of the Federal Republic of Germany, has funded scholarships for students from the IKE, which provide support for 1 yr during which they attend the University of Arizona as visiting student scholars and engage in a research project under the direction of one of our faculty; the project satisfies part of the requirements for the Ingenieur-Diplom, Fachrichtung Maschinenbau. The students receive credit for their research from the University of Stuttgart. The topics have a broad range and include software development, artificial intelligence, radiation transport, and energy management studies.

  18. Working Group 'Air pollution abatement' of the University of Stuttgart -ALS. Annual report 1990

    International Nuclear Information System (INIS)

    1991-01-01

    Despite considerable efforts for air pollution abatement - examples are desulphurization and nitrogen removal in power and large combustion plants as well as catalytic converters for automobiles - there are still many problems to solve. Many small and medium-size companies still have to reduce production-related pollutant emissions, and traffic remains a major source of pollutants. Air pollution abatement in the new Federal states and other Eastern European countries is a particularly urgent task, and the reduction of CO 2 emissions from energy production processes based on fossil fuels is not least a great challenge. Apart from industry, legislation, and administration, science especially is called upon to find solutions to these problems. The University of Stuttgart takes up this challenge. Numerous institutes - 17 from 8 faculties - have united in the working group 'Air Pollution Abatement' of the University of Stuttgart, which carries out research in this area in interdisciplinary cooperation. This annual report presents the activities of the individual member institutes in the area of air pollution abatement (fields of study, current research projects, cooperations and publications in 1991) as well as joint projects. (orig./KW) [de]

  19. Use of computers and Internet among people with severe mental illnesses at peer support centers.

    Science.gov (United States)

    Brunette, Mary F; Aschbrenner, Kelly A; Ferron, Joelle C; Ustinich, Lee; Kelly, Michael; Grinley, Thomas

    2017-12-01

    Peer support centers are an ideal setting where people with severe mental illnesses can access the Internet via computers for online health education, peer support, and behavioral treatments. The purpose of this study was to assess computer use and Internet access in peer support agencies. A peer-assisted survey assessed the frequency with which consumers in all 13 New Hampshire peer support centers (n = 702) used computers to access Internet resources. During the 30-day survey period, 200 of the 702 peer support consumers (28%) responded to the survey. More than three-quarters (78.5%) of respondents had gone online to seek information in the past year. About half (49%) of respondents were interested in learning about online forums that would provide information and peer support for mental health issues. Peer support centers may be a useful venue for Web-based approaches to education, peer support, and intervention. Future research should assess facilitators and barriers to the use of Web-based resources among people with severe mental illness in peer support centers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
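
    For the OpenStack-managed part of such a setup, the general pattern of booting worker VMs on demand can be sketched with the openstacksdk client. The cloud name, image, flavor and network IDs below are placeholders, and this is only an illustration of the pattern, not the IEKP provisioning code.

```python
# Hedged openstacksdk sketch: boot n worker VMs on demand. All identifiers
# are placeholders (assumed clouds.yaml entry "nemo", assumed IDs).
import openstack

def boot_workers(n, image_id, flavor_id, network_id):
    conn = openstack.connect(cloud="nemo")        # assumed cloud config name
    servers = []
    for i in range(n):
        s = conn.compute.create_server(
            name=f"hep-worker-{i}",
            image_id=image_id,                    # pre-uploaded worker image
            flavor_id=flavor_id,
            networks=[{"uuid": network_id}],
        )
        servers.append(conn.compute.wait_for_server(s))
    return servers
```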

  1. The Role of the Radiation Safety Information Computational Center (RSICC) in Knowledge Management

    International Nuclear Information System (INIS)

    Valentine, T.

    2016-01-01

    Full text: The Radiation Safety Information Computational Center (RSICC) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 packages that have been provided by contributors from various agencies. RSICC's customers obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to help ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programmes both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. RSICC also supports and promotes workshops and seminars in nuclear science and technology to further the use and/or development of computational tools and data. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field. (author)

  2. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. The improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high-performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high-performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  3. Radiation Shielding Information Center: a source of computer codes and data for fusion neutronics studies

    International Nuclear Information System (INIS)

    McGill, B.L.; Roussin, R.W.; Trubey, D.K.; Maskewitz, B.F.

    1980-01-01

    The Radiation Shielding Information Center (RSIC), established in 1962 to collect, package, analyze, and disseminate information, computer codes, and data in the area of radiation transport related to fission, is now being utilized to support fusion neutronics technology. The major activities include: (1) answering technical inquiries on radiation transport problems, (2) collecting, packaging, testing, and disseminating computing technology and data libraries, and (3) reviewing literature and operating a computer-based information retrieval system containing material pertinent to radiation transport analysis. The computer codes emphasize methods for solving the Boltzmann equation such as the discrete ordinates and Monte Carlo techniques, both of which are widely used in fusion neutronics. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  4. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  5. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Jazz hosts a wealth of software tools and applications and achieves high availability year after year, so researchers can count on it to reach project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  6. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  7. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    Full Text Available The aim of this study is to present an approach to introducing pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so an introduction to pipeline and parallel computing is an essential topic to be included. At the same time, the topic is among the most motivating due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform for a constructivist learning process, thus enabling learners' experimentation with the provided programming models, building learners' competences in modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C was chosen for developing the programming models, with the message passing interface (MPI) and OpenMP as parallelization tools.
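
    The paper's models are written in C with MPI and OpenMP; as a compact stand-in, the Python sketch below shows the same pipeline idea as a multiphase queueing system: each phase is a worker process consuming from one queue and feeding the next. The stage functions and the two-phase layout are invented for illustration.

```python
# Toy software pipeline as a multiphase queueing system: two stages connected
# by queues, each running in its own process. Illustration only.
from multiprocessing import Process, Queue

def square(x): return x * x      # phase 1 (invented workload)
def incr(x):   return x + 1      # phase 2 (invented workload)

def stage(fn, q_in, q_out):
    while True:
        item = q_in.get()
        if item is None:         # poison pill: shut the stage down
            q_out.put(None)
            break
        q_out.put(fn(item))

if __name__ == "__main__":
    q0, q1, q2 = Queue(), Queue(), Queue()    # items flow q0 -> q1 -> q2
    stages = [Process(target=stage, args=(square, q0, q1)),
              Process(target=stage, args=(incr, q1, q2))]
    for p in stages:
        p.start()
    for i in range(5):
        q0.put(i)
    q0.put(None)
    results = []
    while (r := q2.get()) is not None:
        results.append(r)
    for p in stages:
        p.join()
    print(results)               # [1, 2, 5, 10, 17]
```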

  8. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  9. The psychology of computer displays in the modern mission control center

    Science.gov (United States)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  10. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [ORNL

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries.

  11. Neural Networks in R Using the Stuttgart Neural Network Simulator: RSNNS

    Directory of Open Access Journals (Sweden)

    Christopher Bergmeir

    2012-01-01

    Full Text Available Neural networks are important standard machine learning procedures for classification and regression. We describe the R package RSNNS that provides a convenient interface to the popular Stuttgart Neural Network Simulator SNNS. The main features are (a) encapsulation of the relevant SNNS parts in a C++ class, for sequential and parallel usage of different networks, (b) accessibility of all of the SNNS algorithmic functionality from R using a low-level interface, and (c) a high-level interface for convenient, R-style usage of many standard neural network procedures. The package also includes functions for visualization and analysis of the models and the training procedures, as well as functions for data input/output from/to the original SNNS file formats.

  12. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  13. Mitigation of urban heat stress – a modelling case study for the area of Stuttgart

    Directory of Open Access Journals (Sweden)

    Fallmann, Joachim

    2014-04-01

    Full Text Available By 2050 the urban fraction of the global population will have increased to over 69%, meaning that around 6.3 billion people are expected to live in urban areas (UN 2011). Cities are the predominant places of human habitation and are vulnerable to extreme weather events that aggravate phenomena like heat stress. Finding mitigation strategies to sustain future development is of great importance, given the expected influences on human health. In this study, the mesoscale numerical model WRF is used on a regional scale for the urban area of Stuttgart, to simulate the effect of urban planning strategies on the dynamical processes affecting urban climate. After comparing two urban parameterisation schemes, a sensitivity study for different scenarios is performed; it shows that a change of the reflective properties of surfaces has the highest impact on near-surface temperatures, compared to an increase of urban green areas or a decrease of building density. The Urban Heat Island (UHI) describes the temperature difference between urban and rural areas; it characterises regional urban climate and is responsible for urban-rural circulation patterns. Applying urban planning measures may decrease the intensity of the UHI in the study area by up to 2 °C by using heat-reflective roof paints, or by 1 °C through replacing impervious surfaces with natural vegetation in the urban vicinity - compared to a value of 2.5 °C for the base case. Because of its topographical location in a valley and the overall high temperatures in this region, the area of Stuttgart suffers from heat stress to a comparatively large extent.
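
    For reference, the UHI intensity quoted above is simply the urban-rural temperature difference; in our notation (the abstract states this in words), the scenario results amount to

      \Delta T_{\mathrm{UHI}} = T_{\mathrm{urban}} - T_{\mathrm{rural}}, \qquad \Delta T_{\mathrm{base}} = 2.5\,^{\circ}\mathrm{C},

      \Delta T_{\mathrm{reflective\ roofs}} \approx 2.5 - 2.0 = 0.5\,^{\circ}\mathrm{C}, \qquad \Delta T_{\mathrm{greening}} \approx 2.5 - 1.0 = 1.5\,^{\circ}\mathrm{C}.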

  14. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been used extensively by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, with the associated maturity that would come from doing so. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and the Earth's Radiant Energy System (CERES) project.

  15. Spectrum of tablet computer use by medical students and residents at an academic medical center

    Directory of Open Access Journals (Sweden)

    Robert Robinson

    2015-07-01

    Full Text Available Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on

  16. Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces.

    Science.gov (United States)

    Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L

    2016-10-01

    The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology (AT) for individuals with severe speech and physical impairments (SSPI). In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. Resulting text data were coded in an iterative analysis. Most items (79%) were mapped to the ICF environmental domain; over half (53%) were mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from qualitative data: quality of life (QOL) and AT. Component domains and themes were identified for each. Preliminary constructs, domains and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful, and support the inclusion of these individuals in PCO research. Implications for Rehabilitation: Adapted interview methods allow people with severe speech and physical impairments to participate in patient-centered outcomes research. Patient-centered outcome measures are needed to evaluate the clinical implementation of brain-computer interface as an assistive technology.

  17. Computer Vision Syndrome among Call Center Employees at Telecommunication Company in Bandung

    Directory of Open Access Journals (Sweden)

    Ghea Nursyifa

    2016-06-01

    Full Text Available Background: The occurrence of Computer Vision Syndrome (CVS) at the workplace has increased over recent decades due to the prolonged use of computers. Knowledge of CVS is necessary in order to develop an awareness of how to prevent and alleviate its prevalence. The objective of this study was to assess the knowledge of CVS among call center employees and to explore the most frequent CVS symptom experienced by the workers. Methods: A descriptive cross-sectional study was conducted during the period of September to November 2014 at a telecommunication company in Bandung, using a questionnaire consisting of 30 questions. Of the 30 questions/statements, 15 statements were about knowledge of CVS and the other 15 questions were about the occurrence of CVS and its symptoms. In this study 125 call center employees participated as respondents, selected by consecutive sampling. The level of knowledge was divided into 3 categories: good (76–100%), fair (56–75%), and poor (<56%). The collected data were presented in frequency tabulations. Results: 74.4% of the respondents had poor knowledge of CVS. The most common symptom experienced by the respondents was asthenopia. Conclusions: CVS occurs in call center employees with various symptoms and signs. This situation is not supported by good knowledge of the syndrome, which can hamper prevention programs.

  18. Modeling Remote I/O versus Staging Tradeoff in Multi-Data Center Computing

    International Nuclear Information System (INIS)

    Suslu, Ibrahim H

    2014-01-01

    In multi-data center computing, data to be processed is not always local to the computation. This is a major challenge especially for data-intensive Cloud computing applications, since large amounts of data would need to be either moved to the local sites (staging) or accessed remotely over the network (remote I/O). Cloud application developers generally choose between staging and remote I/O intuitively, without making any scientific comparison specific to their application's data access patterns, since there is no generic model available that they can use. In this paper, we propose a generic model for Cloud application developers which helps them to choose the most appropriate data access mechanism for their specific application workloads. We define the parameters that potentially affect the end-to-end performance of multi-data center Cloud applications which need to access large datasets over the network. To test and validate our models, we implemented a series of synthetic benchmark applications to simulate the most common data access patterns encountered in Cloud applications. We show that our model provides promising results in different settings with different parameters, such as network bandwidth, server and client capabilities, and data access ratio.
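
    The abstract states the model only qualitatively. A toy version of the trade-off (our own simplification with hypothetical parameter names, not the paper's actual model) already shows its shape in C:

      /* Toy staging-vs-remote-I/O comparison; all numbers are assumed. */
      #include <stdio.h>

      /* Stage the whole dataset once over the network, then make every
       * pass over the needed fraction at local-disk speed. */
      static double staging_time(double data_gb, double net_gbps,
                                 double disk_gbps, double ratio, int passes)
      {
          return data_gb / net_gbps + passes * data_gb * ratio / disk_gbps;
      }

      /* Read only the needed fraction, on every pass, over the network. */
      static double remote_io_time(double data_gb, double net_gbps,
                                   double ratio, int passes)
      {
          return passes * data_gb * ratio / net_gbps;
      }

      int main(void)
      {
          const double data_gb = 100.0; /* dataset size in GB, assumed */
          const double net  = 1.0;      /* network bandwidth in GB/s, assumed */
          const double disk = 5.0;      /* local disk bandwidth in GB/s, assumed */
          const double ratio = 0.5;     /* fraction of the dataset accessed */

          for (int passes = 1; passes <= 4; passes++)
              printf("%d pass(es): staging %6.1f s, remote I/O %6.1f s\n",
                     passes,
                     staging_time(data_gb, net, disk, ratio, passes),
                     remote_io_time(data_gb, net, ratio, passes));
          return 0;
      }

    With these assumed numbers, remote I/O wins for one or two passes over the data and staging wins from the third pass on; the paper's contribution is a general, validated model of exactly this trade-off, parameterised by network bandwidth, end-point capabilities, and data access ratio.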

  19. Computed tomography-guided core-needle biopsy of lung lesions: an oncology center experience

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Marcos Duarte; Fonte, Alexandre Calabria da; Chojniak, Rubens, E-mail: marcosduarte@yahoo.com.b [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Radiology and Imaging Diagnosis; Andrade, Marcony Queiroz de [Hospital Alianca, Salvador, BA (Brazil); Gross, Jefferson Luiz [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Chest Surgery

    2011-03-15

    Objective: The present study is aimed at describing the experience of an oncology center with computed tomography-guided core-needle biopsy of pulmonary lesions. Materials and Methods: Retrospective analysis of 97 computed tomography-guided core-needle biopsies of pulmonary lesions performed between 1996 and 2004 in a Brazilian reference oncology center (Hospital do Cancer - A.C. Camargo). Information regarding specimen adequacy and the specific diagnoses was collected and analyzed. Results: Among the 97 lung biopsies, 94 (96.9%) supplied specimens adequate for histological analysis, with 71 (73.2%) cases diagnosed as malignant lesions and 23 (23.7%) diagnosed as benign lesions. Specimens were inadequate for analysis in three cases. A specific diagnosis was obtained in 83 (85.6%) cases, with high rates for both malignant lesions (63 cases; 88.7%) and benign lesions (20 cases; 86.7%). As regards complications, a total of 12 cases were observed, as follows: 7 (7.2%) cases of hematoma, 3 (3.1%) cases of pneumothorax, and 2 (2.1%) cases of hemoptysis. Conclusion: In the present study, computed tomography-guided core-needle biopsy of lung lesions demonstrated high rates of specimen adequacy and diagnostic specificity, and low rates of complications. (author)

  20. Rational use of energy at the University of Stuttgart building environment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, F.; Freihofer, H. [Forschungsinst. fuer Kerntechnik und Energiewandlung e.V., Stuttgart (Germany). Abt. Wissensverarbeitung und Numerik; Stergiaropoulos, K.; Claus, G. [Forschungsgesellschaft Heizung-Lueftung-Klimatechnik Stuttgart mbH (Germany); Harter, J.; Ast, H. [IFB, Dr. R. Braschel GmbH, Stuttgart (Germany); Will, M.; Haerther, H.; Franke, H. [Sulzer-Infra Deutschland GmbH, Stuttgart (Germany)

    1999-07-01

    We have demonstrated with the project REUSE that it is possible to optimise complex building ensembles (properties) energetically by applying the contracting model. However, some basic requirements have to be fulfilled to reach such a goal. They include: 1. a basic consensus among all those dealing with specific aspects of the energy use of the buildings considered; 2. transparent and up-to-date measured energy consumption data; 3. a unified and reliable system for evaluating the measures taken to save energy (baseline); 4. partners who are able to define the measures to be undertaken to save energy in a specific building and who are able to implement these measures effectively and in a user-friendly way. This report describes how we fulfilled these requirements at the campus 'Pfaffenwald' of the University of Stuttgart. Numerous everyday difficulties had to be overcome before the project became a success, and we were able to derive valuable lessons. These went far beyond the original goal of saving energy worth about 3 to 4 million Deutsche Marks, and finally resulted in a new way of thinking about energy use at the campus. We therefore regard the project REUSE as extremely successful and hope it will encourage similar projects and provide valuable hints for them. (orig.)

  1. Threat and vulnerability analysis and conceptual design of countermeasures for a computer center under construction

    International Nuclear Information System (INIS)

    Rozen, A.; Musacchio, J.M.

    1988-01-01

    This project involved the assessment of a new computer center to be used as the main national data processing facility of a large European bank. This building serves as the principal facility in the country, with all other branches utilizing the data processing center. As such, the building is a crucial target which may attract terrorist attacks. Threat and vulnerability assessments were performed as a basis for defining an overall, fully integrated security system of passive and active countermeasures for the facility. After separately assessing the range of threats and vulnerabilities, a combined matrix of threats and vulnerabilities was used to identify the crucial combinations. A set of architectural-structural passive measures was added to the active components of the security system.
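
    The combined matrix the authors mention lends itself to a very small numerical illustration (entirely hypothetical categories and weights of our own; the project's actual assessment is not published in this abstract):

      /* Hypothetical threat x vulnerability matrix: each cell scores the
       * product of a threat likelihood and a vulnerability severity; the
       * highest-scoring cell marks a "crucial combination". */
      #include <stdio.h>

      #define N_THREATS 3
      #define N_VULNS   3

      int main(void)
      {
          const char *threats[N_THREATS] = { "vehicle bomb", "intrusion", "insider" };
          const char *vulns[N_VULNS]     = { "perimeter", "power supply", "data hall" };
          const double likelihood[N_THREATS] = { 0.2, 0.5, 0.3 }; /* assumed */
          const double severity[N_VULNS]     = { 0.9, 0.6, 1.0 }; /* assumed */

          double best = 0.0;
          int bt = 0, bv = 0;

          for (int t = 0; t < N_THREATS; t++)
              for (int v = 0; v < N_VULNS; v++) {
                  double risk = likelihood[t] * severity[v];
                  printf("%-12s x %-12s -> %.2f\n", threats[t], vulns[v], risk);
                  if (risk > best) { best = risk; bt = t; bv = v; }
              }

          printf("crucial combination: %s against %s (%.2f)\n",
                 threats[bt], vulns[bv], best);
          return 0;
      }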

  2. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state of the art.

  3. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    Science.gov (United States)

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  4. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state of the art.

  5. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  6. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    International Nuclear Information System (INIS)

    Kirk, Bernadette Lugue

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries. An important activity of RSICC is its participation in international efforts on computational and experimental benchmarks. An example is the Shielding Integral Benchmarks Archival Database (SINBAD), which includes shielding benchmarks for fission, fusion and accelerators. RSICC is funded by the United States Department of Energy, Department of Homeland Security and Nuclear Regulatory Commission.

  7. HYPERSPECTRAL REMOTE SENSING WITH THE UAS "STUTTGARTER ADLER" – CHALLENGES, EXPERIENCES AND FIRST RESULTS

    Directory of Open Access Journals (Sweden)

    A. Buettner

    2013-08-01

    Full Text Available The UAS "Stuttgarter Adler" was designed as a flexible and cost-effective remote-sensing platform for the acquisition of high-quality environmental data. Different missions for precision agriculture applications and BRDF research have been successfully performed with a multispectral camera system and a spectrometer as the main payloads. Currently, an imaging spectrometer is being integrated into the UAS as a new payload, which enables the recording of hyperspectral data in more than 200 spectral bands in the visible and near-infrared spectrum. The recording principle of the hyperspectral instrument is based on a line scanner. Each line is stored as a matrix image with spectral information on one axis and spatial information on the other axis of the image. Besides a detailed specification of the system concept and instrument design, the calibration procedure of the hyperspectral sensor system is discussed and results of the laboratory calibration are presented. The complete processing chain of measurement data is described and first results of measurement flights over agricultural test sites are presented.
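
    The storage scheme described above (one matrix image per scan line, spectral axis by spatial axis) maps directly onto a three-dimensional data cube. The fragment below (our illustration with invented dimensions, not the project's software) shows the reordering in C:

      /* Assemble a hyperspectral cube from pushbroom line-scanner frames.
       * Each frame is a BANDS x PIXELS matrix (spectral x spatial);
       * stacking successive lines yields a LINES x PIXELS x BANDS cube.
       * All dimensions are invented for illustration. */
      #define LINES  100   /* along-track: one frame per scan line */
      #define PIXELS 320   /* across-track spatial samples */
      #define BANDS  200   /* spectral channels */

      static float frames[LINES][BANDS][PIXELS]; /* raw sensor output */
      static float cube[LINES][PIXELS][BANDS];   /* band-interleaved cube */

      void assemble_cube(void)
      {
          for (int l = 0; l < LINES; l++)
              for (int b = 0; b < BANDS; b++)
                  for (int p = 0; p < PIXELS; p++)
                      cube[l][p][b] = frames[l][b][p];
      }

      int main(void)
      {
          assemble_cube(); /* in practice, frames come from the scanner */
          return 0;
      }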

  8. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  9. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop

  10. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  11. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  12. Changing the batch system in a Tier 1 computing center: why and how

    Science.gov (United States)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier 1 center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly by the search for a more flexible licensing model, as well as by the wish to avoid vendor lock-in. We performed a technology-tracking exercise, and among many possible solutions we chose to evaluate Grid Engine as an alternative, because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: much production software (accounting and monitoring above all) relies on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too hard to bear. We will provide guidelines to help understand how difficult this operation can be and how long the change may take.

  13. An Audit on the Appropriateness of Coronary Computed Tomography Angiography Referrals in a Tertiary Cardiac Center.

    Science.gov (United States)

    Alderazi, Ahmed Ali; Lynch, Mary

    2017-01-01

    In response to growing concerns regarding the overuse of coronary computed tomography angiography (CCTA) in the clinical setting, multiple societies, including the American College of Cardiology Foundation, have jointly published revised criteria regarding the appropriate use of this imaging modality. However, previous research indicates significant discrepancies in the rate of adherence to these guidelines. To assess the appropriateness of CCTA referrals in a tertiary cardiac center in Bahrain. This retrospective clinical audit examined the records of patients referred to CCTA between April 1, 2015 and December 31, 2015 in the Mohammed bin Khalifa Cardiac Center. Using information from medical records, each case was meticulously audited against the guidelines to categorize it as appropriate, inappropriate, or uncertain. Of the 234 records examined, 176 (75.2%) were appropriate, 47 (20.1%) were uncertain, and 11 (4.7%) were inappropriate. About 74.4% of all referrals were to investigate coronary artery disease (CAD). The most common indication deemed appropriate was the detection of CAD in the setting of a suspected ischemic equivalent in patients with an intermediate pretest probability of CAD (65.9%). Most referrals deemed inappropriate were requested to detect CAD in asymptomatic patients at low or intermediate risk of CAD (63.6%). This audit demonstrates a relatively low rate of inappropriate CCTA referrals, indicating the appropriate and efficient use of this resource at the Mohammed bin Khalifa Cardiac Center. Agreement on and reclassification of "uncertain" cases by guideline authorities would facilitate a deeper understanding of referral appropriateness.

  14. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    Science.gov (United States)

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the largely unexplored fears around embracing cloud computing, spanning dimensions such as leaders' fear of change and the complexity of the technology, in 9-1-1 dispatch centers in the USA. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  15. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Energy Technology Data Exchange (ETDEWEB)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson, E-mail: ysousa@unaerp.br [Universidade de Ribeirao Preto (UNAERP), SP (Brazil). Fac. de Odontologia; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia

    2015-03-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)

  16. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Directory of Open Access Journals (Sweden)

    André PAGLIOSA

    2015-01-01

    Full Text Available Abstract: The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape.

  17. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    International Nuclear Information System (INIS)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio

    2015-01-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)
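
    The three preceding records report signed transportation and centering values without spelling out the formulas. A commonly used definition (Gambill and Glickman's, which we assume these studies follow) computes both from the mesial (m) and distal (d) dentin distances measured before (subscript 1) and after (subscript 2) preparation:

      \text{transportation} = (m_{1} - m_{2}) - (d_{1} - d_{2}), \qquad
      \text{centering ratio} = \frac{\min(m_{1} - m_{2},\; d_{1} - d_{2})}{\max(m_{1} - m_{2},\; d_{1} - d_{2})}

    A transportation of 0 mm and a centering ratio of 1 would indicate a preparation that stays perfectly centered in the original canal.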

  18. Walkability is Only Part of the Story: Walking for Transportation in Stuttgart, Germany

    Directory of Open Access Journals (Sweden)

    Maren Reyer

    2014-05-01

    Full Text Available In modern Western societies people often lead inactive and sedentary lifestyles, even though there is no doubt that physical activity and health are related. From an urban planning point of view it would be highly desirable to develop built environments in a way that supports people in leading more active and healthy lifestyles. Within this context there are several methods, predominantly used in the US, to measure the suitability of built environments for walking and cycling. Empirical studies show that people living in highly walkable areas are more physically active (for example, walk more or cycle more). The question is, however, whether these results are also valid for European cities, given their different urban planning characteristics and infrastructure standards. To answer this question we used the Walkability-Index and the Walk Score to empirically investigate the associations between walkability and active transportation in the city of Stuttgart, Germany. In a sample of household survey data (n = 1,871) we found a noticeable relationship between walkability and active transportation: the more walkable an area was, the more active residents were. Although the statistical effect is small, the health impact might be of relevance. Being physically active is multi-determined and not only affected by the walkability of an area. We highlight these points with an excursion into the research that the health and exercise sciences contribute to the topic. We propose to strengthen interdisciplinary research between the disciplines and to specifically collect data that captures the influence of the environment on physical activity in the future.

  19. Walkability is only part of the story: walking for transportation in Stuttgart, Germany.

    Science.gov (United States)

    Reyer, Maren; Fina, Stefan; Siedentop, Stefan; Schlicht, Wolfgang

    2014-05-30

    In modern Western societies people often lead inactive and sedentary lifestyles, even though there is no doubt that physical activity and health are related. From an urban planning point of view it would be highly desirable to develop built environments in a way that supports people in leading more active and healthy lifestyles. Within this context there are several methods, predominantly used in the US, to measure the suitability of built environments for walking and cycling. Empirical studies show that people living in highly walkable areas are more physically active (for example, walk more or cycle more). The question is, however, whether these results are also valid for European cities, given their different urban planning characteristics and infrastructure standards. To answer this question we used the Walkability-Index and the Walk Score to empirically investigate the associations between walkability and active transportation in the city of Stuttgart, Germany. In a sample of household survey data (n = 1,871) we found a noticeable relationship between walkability and active transportation: the more walkable an area was, the more active residents were. Although the statistical effect is small, the health impact might be of relevance. Being physically active is multi-determined and not only affected by the walkability of an area. We highlight these points with an excursion into the research that the health and exercise sciences contribute to the topic. We propose to strengthen interdisciplinary research between the disciplines and to specifically collect data that captures the influence of the environment on physical activity in the future.

  20. Pain, Work-related Characteristics, and Psychosocial Factors among Computer Workers at a University Center.

    Science.gov (United States)

    Mainenti, Míriam Raquel Meira; Felicio, Lilian Ramiro; Rodrigues, Erika de Carvalho; Ribeiro da Silva, Dalila Terrinha; Vigário Dos Santos, Patrícia

    2014-04-01

    [Purpose] Complaint of pain is common in computer workers, encouraging the investigation of pain-related workplace factors. This study investigated the relationship among work-related characteristics, psychosocial factors, and pain among computer workers from a university center. [Subjects and Methods] Fifteen subjects (median age, 32.0 years; interquartile range, 26.8-34.5 years) were subjected to measurement of bioelectrical impedance; photogrammetry; workplace measurements; and pain complaint, quality of life, and motivation questionnaires. [Results] The low back was the most prevalent region of complaint (76.9%). The number of body regions for which subjects complained of pain was greater in the no-rest-breaks group, which also presented higher prevalences of neck (62.5%) and low back (100%) pain. Associations were also observed between neck complaint and quality of life; neck complaint and head protrusion; wrist complaint and shoulder angle; and use of a chair back and thoracic pain. [Conclusion] Complaint of pain was associated with no short rest breaks, no use of a chair back, poor quality of life, high head protrusion, and shoulder angle while using the computer mouse.

  1. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new, fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare the resulting CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortably paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right-foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R2 = 0.99), in close agreement across both computation methods. Results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
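
    CPEI itself is a geometric quantity: the lateral deviation of the center-of-pressure trajectory from the chord joining its first and last points, evaluated in the anterior portion of the foot and normalized by foot width. The sketch below is our reconstruction of that published definition in C (it is neither of the two programs compared in the paper):

      /* Simplified CPEI: perpendicular deviation of the CoP path from the
       * chord between its endpoints, sampled crudely at the anterior third
       * of the trajectory and normalized by an assumed foot width. */
      #include <stdio.h>
      #include <math.h>

      double cpei(const double x[], const double y[], int n, double foot_width)
      {
          double dx = x[n - 1] - x[0], dy = y[n - 1] - y[0]; /* chord */
          double len = sqrt(dx * dx + dy * dy);
          int i = (2 * n) / 3;  /* crude anterior-third sample index */

          /* signed perpendicular distance of sample i from the chord */
          double dev = (dx * (y[i] - y[0]) - dy * (x[i] - x[0])) / len;
          return 100.0 * dev / foot_width;  /* expressed as a percentage */
      }

      int main(void)
      {
          /* toy heel-to-toe CoP trajectory in cm, invented for illustration */
          double x[] = { 0.0, 0.4, 0.9, 1.2, 1.0, 0.6 };
          double y[] = { 0.0, 4.0, 8.0, 12.0, 16.0, 20.0 };
          printf("CPEI = %.1f%%\n", cpei(x, y, 6, 8.0));
          return 0;
      }

    The exceptions the abstract alludes to (for example, degenerate chords or atypical trajectories) are exactly where such a naive version needs the manual or automated corrections in which the two programs differ.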

  2. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this research...

  3. 14th annual Results and Review Workshop on High Performance Computing in Science and Engineering

    CERN Document Server

    Nagel, Wolfgang E; Resch, Michael M

    2012-01-01

    This book presents the state-of-the-art in simulation on supercomputers. Leading researchers present results achieved on systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2011. The reports cover all fields of computational science and engineering, ranging from CFD to computational physics and chemistry, to computer science, with a special emphasis on industrially relevant applications. Presenting results for both vector systems and microprocessor-based systems, the book allows readers to compare the performance levels and usability of various architectures. As HLRS

  4. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D has been conducted, first to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer Aided Science (CASC) to promote the Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in R and D of grid computing technology. (T. Tanaka)

  5. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    Science.gov (United States)

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  7. Hydrogen `96: From vision to reality. 11th World Hydrogen Energy Conference in Stuttgart; Hydrogen `96: Von der Vision zur Realitaet. 11. Welt-Wasserstoffenergie-Konferenz in Stuttgart

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1996-08-01

    More than 700 attendees from 45 countries, 140 technical lectures, 6 plenary lectures, 220 poster presentations, and 18 exhibitors from sectors such as high-performance electrolysis for hydrogen generation, fuel cells, hydrogen-powered motor cars and hydrogen filling stations - such are, in a few figures, the statistics of the Hydrogen `96 conference. The 11th world conference on energy from hydrogen took place in Stuttgart from 23 to 28 June 1996. The event was opened by Dr. Angela Merkel, the federal German minister for environmental affairs, nature conservancy and reactor safety. The organization of the conference was handled by DECHEMA (Deutsche Gesellschaft fuer Chemisches Apparatewesen, Chemische Technik und Biotechnologie), Frankfurt, assisted by experienced partners, among them the Deutsche Forschungsgesellschaft fuer Luft- und Raumfahrt, the VDI Energietechnik Society, the Center for Solar Energy and Hydrogen Research, and the European Federation of Chemical Engineering. (orig.)

  8. Elephants produce their electricity themselves. The Wilhelma as an integrated energy efficiency concept of the city of Stuttgart (SEE); Elefanten machen ihren Strom selbst. Die Wilhelma als integriertes Energieeffizienzkonzept der Stadt Stuttgart (SEE)

    Energy Technology Data Exchange (ETDEWEB)

    Hilse, Annika; Leix, Carmen; Fischer, Klaus; Kranert, Martin [Stuttgart Univ. (Germany). Inst. fuer Siedlungswasserbau, Wasserguete- und Abfallwirtschaft

    2013-10-01

    As part of the overall project 'Stuttgart - City with Energy Efficiency' (SEE), an integrated bioenergy concept for the Wilhelma is being developed at the Institute for Sanitary Engineering, Water Quality and Waste Management (ISWA) in cooperation with the Institute of Energy Economics and the Rational Use of Energy (IER). The biomass potential analysis was recently completed; a differentiated analysis of the energy demand is still pending. The Stuttgart Zoological and Botanical Garden Wilhelma has a significant biomass potential: with about 340 acres of gardens and parks under its care and the residual biomass of around 9000 zoo animals, the Wilhelma, together with the municipal green spaces, offers a large biomass potential that is currently unused. 3900 t/a of biomass, or 87% of the total exploitable amount, are suitable for energy recovery through anaerobic digestion in a biogas plant; a further 600 t/a, the remaining 13%, are suitable for energy recovery by incineration. If the biomass is fully developed, a total energy potential of about 6219 MWh/year could be tapped, 64% from fermentation and 36% from combustion. The biomass potential thus determined could cover up to 16% of the electricity and heat demand (integrated bioenergy concept). To cover the energy demand fully, the further use of renewable energy sources (e.g. solar panels on the roofs) must be examined and evaluated. (orig.)

  9. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  10. A hypothesis on the formation of the primary ossification centers in the membranous neurocranium: a mathematical and computational model.

    Science.gov (United States)

    Garzón-Alvarado, Diego A

    2013-01-21

    This article develops a model of the appearance and location of the primary centers of ossification in the calvaria. The model uses a system of reaction-diffusion equations for two molecules (BMP and Noggin) whose behavior is of the activator-substrate type; its solution produces Turing patterns, which represent the primary ossification centers. Additionally, the model includes the level of cell maturation as a function of the location of mesenchymal cells, so that mature cells can become osteoblasts due to the action of BMP2. With this model we can obtain two frontal primary centers, two parietal centers, and one, two or more occipital centers. The location of these centers in the simplified computational model is highly consistent with those found at the embryonic level. Copyright © 2012 Elsevier Ltd. All rights reserved.
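    As a rough illustration of the activator-substrate mechanism invoked above, the sketch below iterates a classic two-species reaction-diffusion system (Gray-Scott form) on a periodic grid until Turing-like spots emerge. The parameters, grid size and seeding are generic textbook choices, not the paper's BMP/Noggin model.

      # Minimal 2-D activator-substrate reaction-diffusion sketch (Gray-Scott form).
      # Parameters are illustrative; they are not taken from the paper.
      import numpy as np

      n, steps = 128, 5000
      du, dv, f, k = 0.16, 0.08, 0.035, 0.065       # diffusion rates, feed, kill
      u = np.ones((n, n))                           # substrate concentration
      v = np.zeros((n, n))                          # activator concentration
      c = slice(n // 2 - 5, n // 2 + 5)
      u[c, c], v[c, c] = 0.50, 0.25                 # local perturbation seeds the pattern

      def lap(a):
          # 5-point Laplacian with periodic boundaries
          return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                  np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4.0 * a)

      for _ in range(steps):
          uvv = u * v * v
          u += du * lap(u) - uvv + f * (1.0 - u)
          v += dv * lap(v) + uvv - (f + k) * v

      # Maxima of v form quasi-periodic spots; in the paper's model, analogous
      # concentration maxima mark candidate ossification centers.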

  11. On the complaint of unconstitutionality of the Stuttgart Court decisions against non-payers and part payers of electricity bills

    International Nuclear Information System (INIS)

    Fischerhof, H.

    1980-01-01

    In a decision dated December 20, 1979, the Federal Constitutional Court refused to accept the complaint of unconstitutionality brought by the Technische Werke (Municipal Utilities) of the city of Stuttgart (TWS) against two decisions by the Stuttgart Municipal Court in favor of non-payers and part payers of electricity bills. The reasons given for the refusal to accept the complaint state that there was every indication of the Stuttgart judgements being faulty. On the basis of this finding, TWS can continue to demand payment in full of their electricity bills. The Federal Constitutional Court maintains that civil rights could not be applied to TWS as a corporation under private law, whose activities exclusively consisted in providing the public with means of existence and whose shares were held in full by an agency with rights of jurisdiction. In a footnote, the author argues that the refusal to grant protection of civil rights to TWS was in conflict with the equal rights principle. (HSCH) [de

  12. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    Full Text Available This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8 and 30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs, in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task, and (4) one more dynamic imagery task (auditory imagery, spatial navigation, imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely
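    The classification pipeline named here (common spatial patterns followed by Fisher's linear discriminant analysis) can be sketched in a few lines. The sketch below is a two-class simplification with assumed array shapes; the study's 4-class setup, band-pass filtering and online classifier updating are omitted.

      # Two-class CSP + LDA sketch; x1, x2 are band-pass filtered EEG epochs of
      # shape (trials, channels, samples), one array per mental task.
      import numpy as np
      from scipy.linalg import eigh
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      def csp_filters(x1, x2, n_pairs=3):
          c1 = np.mean([np.cov(t) for t in x1], axis=0)    # per-class channel covariances
          c2 = np.mean([np.cov(t) for t in x2], axis=0)
          vals, vecs = eigh(c1, c1 + c2)                   # generalized eigenproblem
          order = np.argsort(vals)
          keep = np.r_[order[:n_pairs], order[-n_pairs:]]  # most discriminative filters
          return vecs[:, keep].T

      def log_var_features(w, x):
          z = np.einsum('fc,tcs->tfs', w, x)               # spatially filtered epochs
          var = z.var(axis=2)
          return np.log(var / var.sum(axis=1, keepdims=True))

      # Usage, given training epochs x1 and x2:
      # w = csp_filters(x1, x2)
      # clf = LinearDiscriminantAnalysis().fit(
      #     np.vstack([log_var_features(w, x1), log_var_features(w, x2)]),
      #     np.r_[np.zeros(len(x1)), np.ones(len(x2))])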

  13. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    Science.gov (United States)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

    The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization and the requirements of both customers and employees. In this article we will: 1) Demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) Describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) Define and discuss the place of negotiated interactions in airline operations, identifying how these

  14. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    OpenAIRE

    Buyya, Rajkumar; Beloglazov, Anton; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints to the environment. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational cos...

  15. Astigmatic single photon emission computed tomography imaging with a displaced center of rotation

    International Nuclear Information System (INIS)

    Wang, H.; Smith, M.F.; Stone, C.D.; Jaszczak, R.J.

    1998-01-01

    A filtered backprojection algorithm is developed for single photon emission computed tomography (SPECT) imaging with an astigmatic collimator having a displaced center of rotation. The astigmatic collimator has two perpendicular focal lines, one that is parallel to the axis of rotation of the gamma camera and one that is perpendicular to this axis. Using SPECT simulations of projection data from a hot rod phantom and point source arrays, it is found that a lack of incorporation of the mechanical shift in the reconstruction algorithm causes errors and artifacts in reconstructed SPECT images. The collimator and acquisition parameters in the astigmatic reconstruction formula, which include focal lengths, radius of rotation, and mechanical shifts, are often partly unknown and can be determined using the projections of a point source at various projection angles. The accurate determination of these parameters by a least squares fitting technique using projection data from numerically simulated SPECT acquisitions is studied. These studies show that the accuracy of parameter determination is improved as the distance between the point source and the axis of rotation of the gamma camera is increased. The focal length to the focal line perpendicular to the axis of rotation is determined more accurately than the focal length to the focal line parallel to this axis. copyright 1998 American Association of Physicists in Medicine
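    The parameter estimation step described above can be illustrated with a deliberately simplified model: under a parallel-beam assumption, the transaxial detector coordinate of a point source at (x, y) traces u(theta) = x cos(theta) + y sin(theta) + s, where s is the mechanical shift, and a least-squares fit recovers the unknowns. The paper's astigmatic model additionally carries the two focal lengths and the radius of rotation, which are omitted here.

      # Hedged sketch: fit point-source acquisition geometry by least squares
      # under a simplified parallel-beam model (astigmatic terms omitted).
      import numpy as np
      from scipy.optimize import least_squares

      theta = np.deg2rad(np.arange(0, 360, 3))           # projection angles

      def detector_u(p, th):
          x, y, shift = p                                 # source position, mechanical shift
          return x * np.cos(th) + y * np.sin(th) + shift

      rng = np.random.default_rng(0)                      # synthetic "measured" centroids
      u_meas = detector_u([4.0, -2.5, 0.3], theta) + rng.normal(0.0, 0.05, theta.size)

      fit = least_squares(lambda p: detector_u(p, theta) - u_meas, x0=[0.0, 0.0, 0.0])
      print(fit.x)   # approximately [4.0, -2.5, 0.3]

    Consistent with the abstract's observation, the fit degrades as the source approaches the axis of rotation: the cosine and sine amplitudes shrink, and the geometric parameters become harder to separate from measurement noise.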

  16. Establishment of computed tomography reference dose levels in Onassis Cardiac Surgery Center

    International Nuclear Information System (INIS)

    Tsapaki, V.; Kyrozi, E.; Syrigou, T.; Mastorakou, I.; Kottou, S.

    2001-01-01

    The purpose of the study was to apply European Commission (EC) Reference Dose Levels (RDL) in Computed Tomography (CT) examinations at Onassis Cardiac Surgery Center (OCSC). These are the weighted CT Dose Index (CTDIw) for a single slice and the Dose-Length Product (DLP) for a complete examination. During the period 1998-1999, the total number of CT examinations, every type of CT examination, patient-related data and the technical parameters of the examinations were recorded. The most frequent examinations, head, chest, abdomen and pelvis, were chosen for investigation. CTDI measurements were performed and CTDIw and DLP were calculated. Third-quartile values of CTDIw were chosen to be 43 mGy for head, 8 mGy for chest, and 22 mGy for abdomen and pelvis examinations. Third-quartile values of DLP were chosen to be 740 mGy cm for head, 370 mGy cm for chest, 490 mGy cm for abdomen and 420 mGy cm for pelvis examinations. The results confirm that OCSC successfully follows the proposed RDL for head, chest, abdomen and pelvis examinations in terms of radiation dose. (author)
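    Reference dose levels of this kind are conventionally set at the third quartile of the surveyed dose distribution, and for a single-slice index the dose-length product follows DLP = CTDIw x irradiated length. A minimal sketch, with hypothetical survey values:

      # Third-quartile reference levels from survey data (values hypothetical).
      import numpy as np

      ctdi_w_head = [38.0, 41.5, 43.2, 39.8, 44.1, 42.7]   # mGy, surveyed head protocols
      dlp_head = [690.0, 715.0, 748.0, 702.0, 735.0]       # mGy cm, complete examinations

      print(np.percentile(ctdi_w_head, 75))   # candidate CTDIw RDL for head
      print(np.percentile(dlp_head, 75))      # candidate DLP RDL for head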

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  18. Annual report of R and D activities in Center for Promotion of Computational Science and Engineering and Center for Computational Science and e-Systems from April 1, 2005 to March 31, 2006

    International Nuclear Information System (INIS)

    2007-03-01

    This report provides an overview of research and development activities in the Center for Computational Science and Engineering (CCSE), JAERI, in the former half of the fiscal year 2005 (April 1, 2005 - September 30, 2005) and those in the Center for Computational Science and e-Systems (CCSE), JAEA, in the latter half of the fiscal year 2005 (October 1, 2005 - March 31, 2006). In the former half term, the activities were performed by 5 research groups: the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group in CCSE. At the beginning of the latter half term, at the moment of the unification of JNC (Japan Nuclear Cycle Development Institute) and JAERI (Japan Atomic Energy Research Institute), these 5 groups were integrated into two offices, the Simulation Technology Research and Development Office and the Computer Science Research and Development Office, which carried out the latter-half activities. A big project, the ITBL (Information Technology Based Laboratory) project, and fundamental computational research for atomic energy plants were performed mainly by two groups, the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy, in the former half term and by their integrated office, the Computer Science Research and Development Office, in the latter half. The main result was the demonstration that structural analysis of a real plant could be executed in the Grid environment, which received an Honorable Mention in the Analytics Challenge at the Supercomputing 2005 (SC05) conference. The materials science and bioinformatics work in the atomic energy research field was carried out by three groups, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics

  19. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken; that of a user-centered design. It is the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered an ischemic brain stem stroke, leading to a severe motor and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in the literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  20. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    Science.gov (United States)

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  1. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 code has been migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the Cray-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration and to summarize some of the acceptance test results, which have shown that the migrated code is functioning correctly in the new environment.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB of data. (Figure 3: Number of events per month, data.) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. The Internet and Computer User Profile: a questionnaire for determining intervention targets in occupational therapy at mental health vocational centers.

    Science.gov (United States)

    Regev, Sivan; Hadas-Lidor, Noami; Rosenberg, Limor

    2016-08-01

    In this study, the assessment tool "Internet and Computer User Profile" questionnaire (ICUP) is presented and validated. It was developed in order to gather information for setting intervention goals to meet current demands. Sixty-eight subjects aged 23-68 participated in the study. The study group (n = 28) was sampled from two vocational centers. The control group consisted of 40 participants from the general population who were sampled by convenience sampling based on the demographics of the study group. Subjects from both groups answered the ICUP questionnaire. Subjects of the study group answered the General Self-Efficacy (GSE) questionnaire and performed the Assessment of Computer Task Performance (ACTP) test in order to examine the convergent validity of the ICUP. Twenty subjects from both groups retook the ICUP questionnaire in order to obtain test-retest results. Differences between groups were tested using multiple analysis of variance (MANOVA) tests. Pearson and Spearman's tests were used for calculating correlations. Cronbach's alpha coefficient and the kappa equivalent were used to assess internal consistency. The results indicate that the questionnaire is valid and reliable. They emphasize that the layout of the ICUP items facilitates a comprehensive examination of the client's perception regarding his participation in computer and internet activities. Implications for Rehabilitation The assessment tool "Internet and Computer User Profile" (ICUP) questionnaire is a novel assessment tool that evaluates operative use and individual perception of computer activities. The questionnaire is valid and reliable for use with participants of vocational centers dealing with mental illness. It is essential to facilitate access to computers for people with mental illnesses, seeing that they express similar interest in computers and the internet as people from the general population of the same age. Early intervention will be particularly effective for young

  4. High Performance Computing in Science and Engineering '14

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2015-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS). The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe's leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  5. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    Science.gov (United States)

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.

  6. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    Science.gov (United States)

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Both groups commonly used tablet computers for medical references, e-Books, and to study for board exams. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on resident physicians. Further study is needed to better understand how tablet computers and other mobile devices may assist in medical education and patient care.

  7. The Stuttgart-Heidelberg Model of Active Feedback Driven Quality Management: Means for the Optimization of Psychotherapy Provision

    Directory of Open Access Journals (Sweden)

    Hans Kordy

    2003-01-01

    Full Text Available Quality management seeks to evaluate psychotherapeutic treatment. A central aspect concerns the development of suitable assessment batteries and evaluation criteria. The Stuttgart-Heidelberg (S-H) model is a system that provides concepts, psychometric instruments and a software program developed for quality management based on active feedback. The central information in the Stuttgart-Heidelberg model is the individual treatment outcome. The premise is that psychotherapy can be improved by providing information on therapeutic outcomes (especially negative ones), since the feedback received stimulates problem-solving processes. This paper presents an assessment inventory, the standardized evaluation of outcomes, and the various feedback tools of the S-H model. A systematic study including 1715 patients of a hospital specializing in psychosomatic disorders documents the validity of this approach. The empirical results support a strategy of transparency about what happens in clinical practice - for example, about the treatments administered, their outcomes and their costs. Implications for the further optimization of health services are discussed.

  8. Computer-Aided Diagnosis of Breast Cancer: A Multi-Center Demonstrator

    National Research Council Canada - National Science Library

    Floyd, Carey

    2000-01-01

    .... The focus has been to gather data from multiple sites in order to verify whether the artificial neural network computer aid to the diagnosis of breast cancer can be translated between locations...

  9. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    Energy Technology Data Exchange (ETDEWEB)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens [A. C. Camargo Cancer Center, Sao Paulo, SP (Brazil); Universidade Federal do Espirito Santo (HUCAM/UFES), Vitoria, ES (Brazil). Hospital Universitario Cassiano Antonio de Morais. Radiologia e Diagnostico por Imagem

    2017-03-15

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  10. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    International Nuclear Information System (INIS)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens; Universidade Federal do Espirito Santo

    2017-01-01

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  11. Measurements and predictions of the air distribution systems in high compute density (Internet) data centers

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jinkyun [HIMEC (Hanil Mechanical Electrical Consultants) Ltd., Seoul 150-103 (Korea); Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea); Lim, Taesub; Kim, Byungseon Sean [Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea)

    2009-10-15

    When equipment power density increases, a critical goal of a data center cooling system is to separate the equipment exhaust air from the equipment intake air in order to prevent the IT server from overheating. Cooling systems for data centers are primarily differentiated according to the way they distribute air. The six combinations of flooded and locally ducted air distribution make up the vast majority of all installations, except fully ducted air distribution methods. Once the air distribution system (ADS) is selected, there are other elements that must be integrated into the system design. In this research, the design parameters and IT environmental aspects of the cooling system were studied with a high heat density data center. CFD simulation analysis was carried out in order to compare the heat removal efficiencies of various air distribution systems. The IT environment of an actual operating data center is measured to validate a model for predicting the effect of different air distribution systems. A method for planning and design of the appropriate air distribution system is described. IT professionals versed in precision air distribution mechanisms, components, and configurations can work more effectively with mechanical engineers to ensure the specification and design of optimized cooling solutions. (author)

  12. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    Science.gov (United States)

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, to estimate the prevalence, and to describe risk factors associated with Computer Vision Syndrome among two call centers' operators in São Paulo (n = 476). The methods include a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms answered as always, often or sometimes. The multiple logistic regression models were created using the stepwise forward likelihood method, retaining the variables with significance levels below 5% (p < 0.05). Among the reported ocular complaints was blurred vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. Associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). Organizational and psychosocial factors at work should be included in prevention programs for visual syndrome among call center operators.

  13. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Priscila do Carmo; Oliveira, Paulo Marcio Campos de; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila, E-mail: pridili@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Silva, Teogenes Augusto da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-01-15

    Objective: to evaluate the level of ambient radiation in a PET/CT center. Materials and methods: previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results: in none of the points selected for measurements did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion: in the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. (author)

  14. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  15. Noise-Resilient Quantum Computing with a Nitrogen-Vacancy Center and Nuclear Spins.

    Science.gov (United States)

    Casanova, J; Wang, Z-Y; Plenio, M B

    2016-09-23

    Selective control of qubits in a quantum register for the purposes of quantum information processing represents a critical challenge for dense spin ensembles in solid-state systems. Here we present a protocol that achieves a complete set of selective electron-nuclear gates and single nuclear rotations in such an ensemble in diamond facilitated by a nearby nitrogen-vacancy (NV) center. The protocol suppresses internuclear interactions as well as unwanted coupling between the NV center and other spins of the ensemble to achieve quantum gate fidelities well exceeding 99%. Notably, our method can be applied to weakly coupled, distant spins representing a scalable procedure that exploits the exceptional properties of nuclear spins in diamond as robust quantum memories.

  16. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Science.gov (United States)

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results In none of the points selected for measurements did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  17. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single-pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional-looking B757/767 displays or next generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  18. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    International Nuclear Information System (INIS)

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  19. A canonical perturbation method for computing the guiding-center motion in magnetized axisymmetric plasma columns

    International Nuclear Information System (INIS)

    Gratreau, P.

    1987-01-01

    The motion of charged particles in a magnetized plasma column, such as that of a magnetic mirror trap or a tokamak, is determined in the framework of the canonical perturbation theory through a method of variation of constants which preserves the energy conservation and the symmetry invariance. The choice of a frame of coordinates close to that of the magnetic coordinates allows a relatively precise determination of the guiding-center motion with a low-ordered approximation in the adiabatic parameter. A Hamiltonian formulation of the motion equations is obtained
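    For orientation, the guiding-center picture referred to here separates the fast gyromotion from the slow drifts of the gyration center. In standard textbook form (not the paper's specific derivation), the magnetic moment is an adiabatic invariant and the lowest-order drift velocity reads

      \mu = \frac{m v_\perp^2}{2B}, \qquad
      \mathbf{v}_{gc} = v_\parallel \hat{\mathbf{b}}
        + \frac{\mathbf{E}\times\mathbf{B}}{B^2}
        + \frac{m v_\perp^2}{2qB}\,\frac{\mathbf{B}\times\nabla B}{B^2}
        + \frac{m v_\parallel^2}{qB^2}\,\mathbf{B}\times\boldsymbol{\kappa},
      \qquad \boldsymbol{\kappa} = (\hat{\mathbf{b}}\cdot\nabla)\hat{\mathbf{b}},

    with corrections entering at higher order in the adiabatic parameter (gyroradius over field-scale length), which is the small parameter of the canonical perturbation expansion mentioned in the abstract.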

  20. Eustatic control on epicontinental basins: The example of the Stuttgart Formation in the Central European Basin (Middle Keuper, Late Triassic)

    Science.gov (United States)

    Franz, M.; Nowak, K.; Berner, U.; Heunisch, C.; Bandel, K.; Röhling, H.-G.; Wolfgramm, M.

    2014-11-01

    The deposition of the Stuttgart Formation ('Schilfsandstein'), commonly considered a type example of the Carnian Pluvial Event, was controlled by high-frequency 4th-order sequences that resulted in pre-, intra- and post-Schilfsandstein transgressions from Tethyan waters into the epicontinental Central European Basin (CEB). The pre-Schilfsandstein transgression flooded the CEB through gates to the Southeast and resulted in a wide-spread inland sea that was characterised by increased biological productivity and predominantly oxic conditions and enabled the immigration of euryhaline marine fauna with plankton, ostracodes, fishes, bivalves and the gastropods Omphaloptychia suebica n. sp. and Settsassia stuttgartica n. sp. The rather short-term intra- and post-Schilfsandstein transgressions flooded the CEB from the Southwest and Southeast and established a shallow brackish inland sea that stretched up to North Germany. Both the 4th- and 3rd-order sequences derived from the succession in the CEB correlate well with those derived from successions of Tethyan shelves. Pronounced circum-Tethyan eustatic cycles are therefore evidenced and may have had considerable impact on prominent middle Carnian events: the Reingraben turnover, the Carnian Pluvial Event, the Carnian Crisis and the Mid Carnian Wet Intermezzo. The broad circum-Tethyan evidence of 10^6-year-scale cycles suggests glacioeustatic sea-level changes even in the Triassic greenhouse period.

  1. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    Science.gov (United States)

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at ACM SIGCHI 2013, the Conference on Human Factors in Computing Systems (also known as CHI), held April 27 - May 2, 2013 at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and the HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    International Nuclear Information System (INIS)

    Bhattacharjee, Amitava

    2016-01-01

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  3. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30th. Final BCTIC library listings were distributed to members of the active mailing list. Also included in the library listing are names and addresses of program authors and contributors in order that users may have continued support of their programs. The BCTIC library list is attached

  4. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, Amitava [Univ. of New Hampshire, Durham, NH (United States)

    2016-03-27

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  5. Computations on the massively parallel processor at the Goddard Space Flight Center

    Science.gov (United States)

    Strong, James P.

    1991-01-01

    Described are four significant algorithms implemented on the massively parallel processor (MPP) at the Goddard Space Flight Center. Two are in the area of image analysis. Of the other two, one is a mathematical simulation experiment and the other deals with the efficient transfer of data between distantly separated processors in the MPP array. The first algorithm presented is the automatic determination of elevations from stereo pairs. The second algorithm solves mathematical logistic equations capable of producing both ordered and chaotic (or random) solutions. This work can potentially lead to the simulation of artificial life processes. The third algorithm is the automatic segmentation of images into reasonable regions based on some similarity criterion, while the fourth is an implementation of a bitonic sort of data which significantly overcomes the nearest neighbor interconnection constraints on the MPP for transferring data between distant processors.
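    The logistic equations mentioned above have a compact scalar form, x_{n+1} = r x_n (1 - x_n), whose behavior switches from ordered to chaotic as r grows. A serial sketch follows; the MPP version iterated many such instances in parallel across the processor array, which is not reproduced here.

      # Logistic map: ordered vs. chaotic orbits depending on the parameter r.
      def logistic_orbit(r, x0=0.2, n=60):
          xs = [x0]
          for _ in range(n):
              xs.append(r * xs[-1] * (1.0 - xs[-1]))
          return xs

      print(logistic_orbit(2.8)[-5:])   # converges to a fixed point (ordered)
      print(logistic_orbit(3.9)[-5:])   # never settles (chaotic)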

  6. Computer simulations of low energy displacement cascades in a face centered cubic lattice

    International Nuclear Information System (INIS)

    Schiffgens, J.O.; Bourquin, R.D.

    1976-09-01

    Computer simulations of atomic motion in a copper lattice following the production of primary knock-on atoms (PKAs) with energies from 25 to 200 eV are discussed. In this study, a mixed Moliere-Englert pair potential is used to model the copper lattice. The computer code COMENT, which employs the dynamical method, is used to analyze the motion of up to 6000 atoms per time step during cascade evolution. The atoms are specified as initially at rest on the sites of an ideal lattice. A matrix of 12 PKA directions and 6 PKA energies is investigated. Displacement thresholds in the [110] and [100] directions are calculated to be approximately 17 and 20 eV, respectively. A table showing the stability of isolated Frenkel pairs with different vacancy and interstitial orientations and separations is presented. The numbers of Frenkel pairs and atomic replacements are tabulated as a function of PKA direction for each energy. For PKA energies of 25, 50, 75, 100, 150, and 200 eV, the average numbers of Frenkel pairs per PKA are 0.4, 0.6, 1.0, 1.2, 1.4, and 2.2, and the average numbers of replacements per PKA are 2.4, 4.0, 3.3, 4.9, 9.3, and 15.8

  7. Computation of Electromagnetic Fields Scattered From Dielectric Objects of Uncertain Shapes Using MLMC Center for Uncertainty

    KAUST Repository

    Litvinenko, Alexander

    2015-01-05

    Simulators capable of computing scattered fields from objects of uncertain shapes are highly useful in electromagnetics and photonics, where device designs are typically subject to fabrication tolerances. Knowledge of statistical variations in scattered fields is useful in ensuring error-free functioning of devices. Oftentimes such simulators use a Monte Carlo (MC) scheme to sample the random domain, where the variables parameterize the uncertainties in the geometry. At each sample, which corresponds to a realization of the geometry, a deterministic electromagnetic solver is executed to compute the scattered fields. However, to obtain accurate statistics of the scattered fields, the number of MC samples has to be large. This significantly increases the total execution time. In this work, to address this challenge, the Multilevel MC (MLMC) scheme is used together with a (deterministic) surface integral equation solver. The MLMC achieves a higher efficiency by “balancing” the statistical errors due to sampling of the random domain and the numerical errors due to discretization of the geometry at each of these samples. Error balancing results in a smaller number of samples requiring coarser discretizations. Consequently, total execution time is significantly shortened.
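    The balancing idea can be made concrete with the telescoping MLMC estimator E[Q_L] = E[Q_0] + sum over l of E[Q_l - Q_{l-1}]: many cheap samples on coarse meshes, few expensive samples on fine ones. The sketch below is generic; solve(level, w) stands in for one run of a deterministic solver on a level-l mesh for shape realization w and is replaced here by a toy function, not the paper's surface integral equation solver.

      # Generic multilevel Monte Carlo estimator (toy stand-in for the EM solver).
      import numpy as np

      def mlmc_estimate(solve, n_per_level, seed=1):
          rng = np.random.default_rng(seed)
          total = 0.0
          for level, n in enumerate(n_per_level):
              draws = rng.random(n)                    # random shape parameters
              if level == 0:
                  corr = [solve(0, w) for w in draws]
              else:
                  # Same realization on fine and coarse meshes: the difference
                  # isolates the discretization error added at this level.
                  corr = [solve(level, w) - solve(level - 1, w) for w in draws]
              total += float(np.mean(corr))
          return total

      toy = lambda level, w: np.sin(2 * np.pi * w) + 2.0 ** (-level) * w
      print(mlmc_estimate(toy, [4000, 400, 40]))       # approximates E[Q_2]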

  8. Brain-computer interface signal processing at the Wadsworth Center: mu and sensorimotor beta rhythms.

    Science.gov (United States)

    McFarland, Dennis J; Krusienski, Dean J; Wolpaw, Jonathan R

    2006-01-01

    The Wadsworth brain-computer interface (BCI), based on mu and beta sensorimotor rhythms, uses one- and two-dimensional cursor movement tasks and relies on user training. This is a real-time closed-loop system. Signal processing consists of channel selection, spatial filtering, and spectral analysis. Feature translation uses a regression approach and normalization. Adaptation occurs at several points in this process on the basis of different criteria and methods. It can use either feedforward (e.g., estimating the signal mean for normalization) or feedback control (e.g., estimating feature weights for the prediction equation). We view this process as the interaction between a dynamic user and a dynamic system that coadapt over time. Understanding the dynamics of this interaction and optimizing its performance represent a major challenge for BCI research.
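
    A minimal sketch of the spectral-analysis and feature-translation steps described above, with hypothetical parameters (the sampling rate, mu-band limits, regression weight, and normalization statistics are placeholders, not the Wadsworth system's actual values); channel selection and spatial filtering are assumed to have already produced the input signal.

        import numpy as np

        def mu_band_power(signal, fs=160.0, band=(8.0, 12.0)):
            """Spectral analysis: power in the mu band from a periodogram."""
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return psd[mask].sum()

        def translate(feature, weight, mean, std):
            """Regression plus normalization: the zero-mean, scaled output
            drives cursor movement; mean/std would be adapted online."""
            return weight * (feature - mean) / std

        # One second of an (already spatially filtered) channel, as toy input.
        rng = np.random.default_rng(0)
        x = rng.standard_normal(160)
        print(translate(mu_band_power(x), weight=-1.0, mean=10.0, std=3.0))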

  9. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term ''code package'' is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a ''code package'' consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data) and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not he wishes to request the code package.

  10. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center. [Radiation transport codes]

    Energy Technology Data Exchange (ETDEWEB)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term ''code package'' is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a ''code package'' consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data) and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not he wishes to request the code package. (RWR)

  11. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that graphical displays assume greater importance as a means of presenting this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before submitting graphics files to the system for processing into film, in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices, along with descriptions of how to use them. 3 figures, 5 tables

  12. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    Science.gov (United States)

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.
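
    The mean unsigned error quoted above is simply the average absolute deviation between PM6 and DFT bond lengths. A minimal sketch with hypothetical Ru-ligand bond lengths (the values below are illustrative, not taken from the paper):

        def mean_unsigned_error(pm6_lengths, dft_lengths):
            """Average absolute PM6-vs-DFT deviation, in angstroms."""
            diffs = [abs(a - b) for a, b in zip(pm6_lengths, dft_lengths)]
            return sum(diffs) / len(diffs)

        # Hypothetical Ru-N bond lengths (angstroms), for illustration only.
        pm6 = [2.10, 2.08, 2.11]
        dft = [2.06, 2.05, 2.07]
        print(round(mean_unsigned_error(pm6, dft), 3))   # 0.037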

  13. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  14. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2003 to March 31, 2004

    International Nuclear Information System (INIS)

    2005-08-01

    The major research and development activities of the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, have focused on the ITBL (IT-Based Laboratory) project, computational materials science, and quantum bioinformatics. This report provides an overview of the research and development activities of CCSE in the fiscal year 2003 (April 1, 2003 - March 31, 2004). (author)

  15. Computed Tomography Features of Benign and Malignant Calcified Thyroid Nodules: A Single-Center Study.

    Science.gov (United States)

    Kim, Donghyun; Kim, Dong Wook; Heo, Young Jin; Baek, Jin Wook; Lee, Yoo Jin; Park, Young Mi; Baek, Hye Jin; Jung, Soo Jin

    No previous studies have investigated thyroid calcification on computed tomography (CT) quantitatively by using Hounsfield unit (HU) values. This study aimed to analyze quantitative HU values of thyroid calcification on preoperative neck CT and to assess the characteristics of benign and malignant calcified thyroid nodules (CTNs). Two hundred twenty patients who underwent neck CT before thyroid surgery from January 2015 to June 2016 were included. On soft-tissue window CT images, CTNs with calcified components of 3 mm or larger in minimum diameter were included in this study. The HU values and types of CTNs were determined and analyzed. Of 61 CTNs in 49 patients, there were 42 malignant nodules and 19 benign nodules. The mean largest diameter of the calcified component was 5.3 (2.5) mm (range, 3.1-17.1 mm). A statistically significant difference was observed in the HU values of calcified portions between benign and malignant CTNs, whereas there was no significant difference in patient age or sex or in the size, location, or type of each CTN. Of the 8 CTNs with pure calcification, 3 exhibited a honeycomb pattern on bone window CT images, and these 3 CTNs were all diagnosed as papillary thyroid carcinoma on histopathological examination. Hounsfield unit values of CTNs may be helpful for differentiating malignancy from benignity.

  16. EVALUATION OF PROPTOSIS BY USING COMPUTED TOMOGRAPHY IN A TERTIARY CARE CENTER, BURLA, SAMBALPUR, ODISHA

    Directory of Open Access Journals (Sweden)

    Vikas Agrawal

    2017-07-01

    BACKGROUND Proptosis is defined as the abnormal anterior protrusion of the globe beyond the orbital margins. It is an important clinical manifestation of various orbital as well as systemic disorders, with aetiologies ranging from infection to malignant tumours, among which space-occupying lesions within the orbit are the most important. MATERIALS AND METHODS A total of 32 patients referred from various departments, mainly ophthalmology and medicine, with history and clinical features suggestive of proptosis were evaluated in our department; after proper history taking and clinical examination, a Computed Tomography (CT) scan was done. RESULTS The age of the patients ranged from 1-55 years. Associated chief complaints in cases of proptosis were, in decreasing order, pain/headache, restricted eye movement, diminished vision and diplopia. Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). Trauma, vascular lesions and congenital conditions were infrequent causes of proptosis. In children, common causes of proptosis were retinoblastoma (35.71%) and orbital cellulitis (28.57%), and in adults the common causes were thyroid ophthalmopathy (22.22%), trauma (16.66%) and pseudo-tumour (16.66%). CONCLUSION Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). CT scanning should be the chief investigation in the evaluation of lesions causing proptosis. It is most useful in detecting, characterising and determining the extent of the disease process. The overall accuracy of CT in the diagnosis of proptosis is 96.87%.

  17. Incorporating modern OpenGL into computer graphics education.

    Science.gov (United States)

    Reina, Guido; Muller, Thomas; Ertl, Thomas

    2014-01-01

    University of Stuttgart educators have updated three computer science courses to incorporate forward-compatible OpenGL. To help students, they developed an educational framework that abstracts some of modern OpenGL's difficult aspects.

  18. Computer-Based Training in Eating and Nutrition Facilitates Person-Centered Hospital Care: A Group Concept Mapping Study.

    Science.gov (United States)

    Westergren, Albert; Edfors, Ellinor; Norberg, Erika; Stubbendorff, Anna; Hedin, Gita; Wetterstrand, Martin; Rosas, Scott R; Hagell, Peter

    2018-04-01

    Studies have shown that computer-based training in eating and nutrition for hospital nursing staff increased the likelihood that patients at risk of undernutrition would receive nutritional interventions. This article seeks to provide understanding, from the perspective of nursing staff, of conceptually important areas for computer-based nutritional training and their relative importance to nutritional care following completion of the training. Group concept mapping, an integrated qualitative and quantitative methodology, was used to conceptualize important factors relating to the training experiences through four focus groups (n = 43), statement sorting (n = 38), and importance rating (n = 32), followed by multidimensional scaling and cluster analysis. Sorting of 38 statements yielded four clusters (number of statements in parentheses): personal competence and development (10), practice-close care development (10), patient safety (9), and awareness of the nutrition care process (9). The first and second clusters represented "the learning organization," and the third and fourth represented "quality improvement." These findings provide a conceptual basis for understanding the importance of training in eating and nutrition, which contributes to a learning organization and quality improvement, and can be linked to and facilitate person-centered nutritional care and patient safety.

  19. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  20. University bibliography with PUMA. A field report on the introduction of the university bibliography at the University Library of Stuttgart (Universitätsbibliografie mit PUMA. Praxisbericht aus der Einführung der Universitätsbibliografie an der Universitätsbibliothek Stuttgart)

    Directory of Open Access Journals (Sweden)

    Sibylle Hermann

    2017-12-01

    The added value of the Academic Publication Management system (PUMA) at the University of Stuttgart, compared to a conventional bibliographic cataloguing environment, lies in the direct entry of data by the scientists themselves, without an interim step via the library. For the university content management system, a plugin is available which dynamically integrates publication lists into employees' web pages. The metadata is loaded directly from PUMA, filtered, sorted, and can be output in the desired citation style. PUMA offers many interfaces and display options.

  1. SCELib2: the new revision of SCELib, the parallel computational library of molecular properties in the single center approach

    Science.gov (United States)

    Sanna, N.; Morelli, G.

    2004-09-01

    In this paper we present the new version of the SCELib program (CPC Catalogue identifier ADMG), a full numerical implementation of the Single Center Expansion (SCE) method. The physics involved is that of producing the SCE description of molecular electronic densities, of molecular electrostatic potentials, and of molecular perturbed potentials due to a negative or positive point charge. This new revision of the program has been optimized to run in serial as well as in parallel execution mode, to support a larger set of molecular symmetries, and to permit the restart of long-lasting calculations. To measure the performance of this new release, a comparative study has been carried out on the most powerful computing architectures in serial and parallel runs. The calculations reported in this paper refer to realistic medium-to-large molecular systems, and they are reported in full detail to benchmark the parallel architectures on which the new SCELib code will run.
    Program summary
    Title of program: SCELib2
    Catalogue identifier: ADGU
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADGU
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Reference to previous versions: Comput. Phys. Commun. 128 (2) (2000) 139 (CPC catalogue identifier: ADMG)
    Does the new version supersede the original program?: Yes
    Computers for which the program is designed and others on which it has been tested: HP ES45 and rx2600, SUN ES4500, IBM SP, and any single-CPU workstation based on Alpha, SPARC, POWER, Itanium2 and X86 processors
    Installations: CASPUR, local
    Operating systems under which the program has been tested: HP Tru64 V5.X, SUNOS V5.8, IBM AIX V5.X, Linux RedHat V8.0
    Programming language used: C
    Memory required to execute with typical data: 10 Mwords; up to 2000 Mwords depending on the molecular system and runtime parameters
    No. of bits in a word: 64
    No. of processors used: 1 to 32
    Has the code been vectorized or parallelized?: Yes
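
    For orientation, the generic form of a single center expansion is sketched below. This is the standard textbook form and is given here as an assumption; SCELib2's actual symmetry-adapted angular basis and normalization conventions are defined in the references above.

        F(r, \theta, \phi) = \sum_{l=0}^{l_{max}} \sum_{m=-l}^{+l} f_{lm}(r)\, Y_{lm}(\theta, \phi),
        \qquad
        f_{lm}(r) = \int_{0}^{2\pi} \int_{0}^{\pi} F(r, \theta, \phi)\, Y_{lm}^{*}(\theta, \phi)\, \sin\theta \, d\theta \, d\phi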

  2. [The importance of Jewish nursing in World War I as shown by the example of the Jewish nurses' home in Stuttgart].

    Science.gov (United States)

    Ruess, Susanne

    2010-01-01

    The history of Jewish nursing in World War I has so far not been central to medical history research. Rosa Bendit's war diary is still the only source available on the voluntary service Jewish nurses provided during World War I. Their number was small compared to that of nurses in general, and Jewish nursing in Germany has hardly been researched. Jewish nurses, like their Christian colleagues, took on wartime nursing tasks voluntarily. This paper focuses on the experiences of the nurses who were sent to various locations in the East and West by the Stuttgart Jewish Nurses' Home. Based on quotations from the war diary, their position within the medical service is described, compared and analyzed. The paper draws attention to special characteristics in the comparison of Jewish and Christian nurses and explores issues such as religious observance, religious discrimination, patriotism and differences in the evaluation of the nurses' work. A brief outline of the history of the Stuttgart Jewish Nurses' Home illustrates their working conditions. The Jewish nurses applied themselves with as much effort and devotion as their Christian counterparts. Although there were only few of them, the Jewish nurses managed to establish a recognized position for themselves within the medical service. The history of Jewish nursing in Stuttgart ended in 1941, when the Jewish Nurses' Home was dissolved by the Nazis and four nurses were murdered in concentration camps.

  3. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  4. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  5. Spectroscopic and computational study of a nonheme iron nitrosyl center in a biosynthetic model of nitric oxide reductase.

    Science.gov (United States)

    Chakraborty, Saumen; Reed, Julian; Ross, Matthew; Nilges, Mark J; Petrik, Igor D; Ghosh, Soumya; Hammes-Schiffer, Sharon; Sage, J Timothy; Zhang, Yong; Schulz, Charles E; Lu, Yi

    2014-02-24

    A major barrier to understanding the mechanism of nitric oxide reductases (NORs) is the lack of a selective probe of NO binding to the nonheme Fe_B center. By replacing the heme in a biosynthetic model of NORs, which structurally and functionally mimics NORs, with isostructural ZnPP, the electronic structure and functional properties of the Fe_B nitrosyl complex were probed. This approach allowed observation of the first S = 3/2 nonheme {FeNO}^7 complex in a protein-based model system of NOR. Detailed spectroscopic and computational studies show that the electronic state of the {FeNO}^7 complex is best described as a high-spin ferrous iron (S = 2) antiferromagnetically coupled to an NO radical (S = 1/2), [Fe^2+-NO•]. The radical nature of the Fe_B-bound NO would facilitate N-N bond formation by radical coupling with the heme-bound NO. This finding, therefore, supports the proposed trans mechanism of NO reduction by NORs. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around year 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolves in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  7. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2006 to March 31, 2007

    International Nuclear Information System (INIS)

    2008-03-01

    This report provides an overview of the research and development activities of the Center for Computational Science and e-Systems (CCSE), JAEA, in fiscal year 2006 (April 1, 2006 - March 31, 2007). These research and development activities have been performed by the Simulation Technology Research and Development Office and the Computer Science Research and Development Office. The primary results of the research and development activities are the development of simulation techniques for a virtual earthquake testbed, an intelligent infrastructure for atomic energy research, computational biology methods to predict the DNA-repair function of proteins, and material models for a neutron detection device, crack propagation, and gas bubble formation in nuclear fuel. (author)

  8. The combinatorics computation for Casimir operators of the symplectic Lie algebra and the application for determining the center of the enveloping algebra of a semidirect product

    International Nuclear Information System (INIS)

    Le Van Hop.

    1989-12-01

    A combinatorial computation is used to describe the Casimir operators of the symplectic Lie algebra. This result is applied to determine the center of the enveloping algebra of the semidirect product of the Heisenberg Lie algebra and the symplectic Lie algebra. (author). 10 refs

  9. Pleural effusion biomarkers and computed tomography findings in diagnosing malignant pleural mesothelioma: A retrospective study in a single center

    Science.gov (United States)

    Kataoka, Yuki; Ikegaki, Shunkichi; Saito, Emiko; Matsumoto, Hirotaka; Kaku, Sawako; Shimada, Masatoshi; Hirabayashi, Masataka

    2017-01-01

    In this study, we aimed to examine the clinical value of the pleural effusion (PE) biomarkers soluble mesothelin-related peptide (SMRP), cytokeratin 19 fragment (CYFRA 21-1) and carcinoembryonic antigen (CEA), and the utility of combining chest computed tomography (CT) findings with these biomarkers, in diagnosing malignant pleural mesothelioma (MPM). We conducted a retrospective cohort study in a single center. Consecutive patients with undiagnosed pleural effusions who underwent PE analysis between September 2014 and August 2016 were reviewed. This study included 240 patients (32 with MPM and 208 without). SMRP had a sensitivity and specificity for diagnosing MPM of 56.3% and 86.5%, respectively; the CYFRA 21-1/CEA ratio had a sensitivity and specificity of 87.5% and 74.0%. Using receiver operating characteristic (ROC) curve analysis of the ability of these markers to distinguish MPM from all other PE causes, the area under the ROC curve (AUC) for SMRP and the CYFRA 21-1/CEA ratio was 0.804 and 0.874, respectively. The sensitivity and specificity of SMRP combined with the CYFRA 21-1/CEA ratio were 93.8% and 64.9%, respectively. The sensitivity of the combination of SMRP, the CYFRA 21-1/CEA ratio, and the presence of Leung's criteria (a chest CT finding suggestive of malignant pleural disease) was 93.8%. In conclusion, the combined PE biomarkers had a high sensitivity for diagnosing MPM, although the addition of chest CT findings did not improve the sensitivity of SMRP combined with the CYFRA 21-1/CEA ratio. The combination of these biomarkers helped to rule out MPM effectively among patients at high risk of MPM, and would be valuable especially for frail elderly patients who have difficulty undergoing invasive procedures such as thoracoscopy. PMID:28968445
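
    A sketch of how the sensitivity and specificity of the combined rule can be computed from case-level results; the cutoff values below are hypothetical placeholders, not the study's thresholds. Note that an either-marker-positive combination raises sensitivity at the cost of specificity, consistent with the reported 93.8% and 64.9%.

        def sens_spec(results):
            """results: list of (has_mpm, test_positive) pairs."""
            tp = sum(1 for mpm, pos in results if mpm and pos)
            fn = sum(1 for mpm, pos in results if mpm and not pos)
            tn = sum(1 for mpm, pos in results if not mpm and not pos)
            fp = sum(1 for mpm, pos in results if not mpm and pos)
            return tp / (tp + fn), tn / (tn + fp)

        def combined_positive(smrp, cyfra_cea_ratio, smrp_cut=2.0, ratio_cut=5.0):
            """Positive if either marker exceeds its (hypothetical) cutoff."""
            return smrp > smrp_cut or cyfra_cea_ratio > ratio_cut

        # Toy data: (has_mpm, combined test result).
        cases = [(True, combined_positive(3.1, 1.2)),
                 (False, combined_positive(0.4, 0.9)),
                 (False, combined_positive(2.5, 0.8))]
        print(sens_spec(cases))   # (1.0, 0.5) on this toy data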

  10. 76 FR 21091 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Science.gov (United States)

    2011-04-14

    ...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...), as amended, (Pub. L. 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  12. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. Development of an Instrument to Measure Health Center (HC) Personnel's Computer Use, Knowledge and Functionality Demand for HC Computerized Information System in Thailand

    OpenAIRE

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thailand health administration anticipates using health IT to support the recently implemented national universal health care system. However, the national knowledge associated with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument measuring Thai health center (HC) personnel’s computer use, basic IT knowledge a...

  15. Experimental and Computational Instrumentation for Rotorcraft Noise and Vibration Control Research at the Penn State Rotorcraft Center

    National Research Council Canada - National Science Library

    Smith, Edward

    2001-01-01

    A team of faculty at the Penn State Rotorcraft Center of Excellence has integrated five new facilities into a broad range of research and educational programs focused on rotorcraft noise and vibration control...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  18. 77 FR 33547 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare and Medicaid...

    Science.gov (United States)

    2012-06-06

    ...: Social Security Administration (SSA). ACTION: Notice of a new computer matching program that will expire... protections for such persons. The Privacy Act, as amended, regulates the use of computer matching by Federal... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0015] Privacy Act of 1974, as Amended...

  19. 78 FR 69926 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Science.gov (United States)

    2013-11-21

    ...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L 100-503), amended the... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0059] Privacy Act of 1974, as Amended...

  20. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2004 to March 31, 2005

    International Nuclear Information System (INIS)

    2005-09-01

    This report provides an overview of research and development activities in the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, in the fiscal year 2004 (April 1, 2004 - March 31, 2005). The activities have been performed by the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group in CCSE. The ITBL (Information Technology Based Laboratory) project is performed mainly by the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy. In the mid-term evaluation of the ITBL project conducted by the MEXT, the achievement of the ITBL infrastructure software developed by JAERI was recognized as outstanding at the 13th Information Science and Technology Committee in the Subdivision on R and D Planning and Evaluation of the Council for Science and Technology on April 26th, 2004. (author)

  1. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2009 to March 31, 2010

    International Nuclear Information System (INIS)

    2011-10-01

    This report overviews the research and development (R and D) activities in the Center for Computational Science and e-Systems (CCSE) of the Japan Atomic Energy Agency (JAEA) during the fiscal year 2009 (April 1, 2009 - March 31, 2010). The work has been accomplished by the Simulation Technology R and D Office and the Computer Science R and D Office in CCSE. The activities include research on a secure computational infrastructure for use in atomic energy research, based on grid technology, a seismic response analysis for the structures of nuclear power plants, materials science, and quantum bioinformatics. The materials science research includes large-scale atomic and subatomic simulations of nuclear fuels and materials for safety assessment, and large-scale quantum simulations of superconductors for the design of new devices and a fundamental understanding of superconductivity. The quantum bioinformatics research focuses on the development of technology for large-scale atomic simulations of proteins. (author)

  2. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2007 to March 31, 2009

    International Nuclear Information System (INIS)

    2010-01-01

    This report provides an overview of the research and development activities in the Center for Computational Science and e-Systems (CCSE), JAEA, during the fiscal years 2007 and 2008 (April 1, 2007 - March 31, 2009). These research and development activities have been performed by the Simulation Technology R and D Office and the Computer Science R and D Office. The activities include development of a secure computational infrastructure for atomic energy research based on grid technology, large-scale seismic analysis of an entire nuclear reactor structure, large-scale fluid dynamics simulation of the J-PARC mercury target, large-scale plasma simulation for a nuclear fusion reactor, large-scale atomic and subatomic simulations of nuclear fuels and materials for safety assessment, large-scale quantum simulations of superconductors for the design of new devices and a fundamental understanding of superconductivity, development of a protein database for the identification of radiation-resistance genes, and large-scale atomic simulation of proteins. (author)

  3. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. Barrier-free public transport in the Stuttgart region. Results of a round table with disabled persons on entrance to S-Bahn carriages; Barrierefreiheit im oeffentlichen Personennahverkehr (OePNV) in der Region Stuttgart. Ergebnisse eines Runden Tisches mit Betroffenen zum Problembereich Fahrzeugzugang bei S-Bahnen

    Energy Technology Data Exchange (ETDEWEB)

    Pauls, K. (ed.)

    2001-06-01

    On July 18th, 2000, the Centre of Technology Assessment in Baden-Wuerttemberg invited representatives of organisations for the disabled to take part in a round table in order to discuss barrier-free public transport in the region of Stuttgart with Dr. Witgar Weber, vice regional director of the Verband Region Stuttgart, and Mr. Nils Himmelmann, representative of Switch Transit Consult. In the first lecture, Dr. Witgar Weber explained the tasks of the Verband Region Stuttgart concerning public transport, especially the S-Bahn. Mr. Nils Himmelmann explained the work he is doing, on commission for the Verband Region Stuttgart, on finding solutions to reduce the problem of the vertical and horizontal gaps at the entrance of S-Bahn carriages. As a result of the round table, a profile of requirements was compiled by the participants, including preferences for barrier-free use of the S-Bahn. For the S-Bahn system they demand a partial raising of platforms, and where a static raise is not possible there should be a dynamic partial raise. This measure should be combined with an extending step at the entrances of the S-Bahn to bridge both the horizontal and the vertical gap. Further, the integration of disabled and handicapped people in planning is essential at an early stage, not only for new projects but also for modifications of tracks or carriages. In this way, concrete problems can be identified immediately, which leads to a reduction of costs by building in the requirements for barrier-free public transport from the start instead of making very expensive modifications afterwards. (orig.)

  11. Autonomous Micro-Modular Mobile Data Center Cloud Computing Study for Modeling, Simulation, Information Processing and Cyber-Security Viability

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing environments offer opportunities for malicious users to penetrate security layers and damage, destroy or steal data. This ability can be exploited to...

  12. Finding the Cell Center by a Balance of Dynein and Myosin Pulling and Microtubule Pushing: A Computational Study

    Science.gov (United States)

    Zhu, Jie; Burakov, Anton; Rodionov, Vladimir

    2010-01-01

    The centrosome position in many types of interphase cells is actively maintained in the cell center. Our previous work indicated that the centrosome is kept at the center by a pulling force generated by dynein and by actin flow produced by myosin contraction, and that an unidentified factor that depends on microtubule dynamics destabilizes the position of the centrosome. Here, we use modeling to simulate centrosome positioning based on the idea that the balance of three forces (dynein pulling along the microtubule length, myosin-powered centripetal drag, and microtubules pushing on organelles) is responsible for the centrosome displacement. By comparing numerical predictions with centrosome behavior in wild-type and perturbed interphase cells, we rule out several plausible hypotheses about the nature of the microtubule-based force. We conclude that strong dynein- and weaker myosin-generated forces pull the microtubules inward, competing with microtubule plus-ends pushing the microtubule aster outward, and that the balance of these forces positions the centrosome at the cell center. The model also predicts that kinesin action could be another outward-pushing force. Simulations demonstrate that the force-balance centering mechanism is robust yet versatile. We use the experimental observations to reverse engineer the characteristic forces and centrosome mobility. PMID:20980619
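
    A hedged one-dimensional toy of the force-balance idea follows. The linear force laws and coefficients are illustrative assumptions, not the paper's model: dynein pulling scales with microtubule length (so the longer side wins and the net force is centering), myosin drag is centripetal, and microtubule pushing acts outward (decentering).

        def net_force(x, k_dyn=1.0, k_myo=0.5, k_push=0.8):
            """x = centrosome offset from the cell center (illustrative units)."""
            dynein = -2.0 * k_dyn * x    # pull along MT length: centering
            myosin = -k_myo * x          # centripetal actin-flow drag: centering
            pushing = k_push * x         # MT pushing on organelles: decentering
            return dynein + myosin + pushing

        # Relax the position: converges to x = 0 when pulling outweighs pushing.
        x, dt = 4.0, 0.1
        for _ in range(200):
            x += dt * net_force(x)
        print(round(x, 6))   # ~0.0 (centered)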

  13. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    The use of rotary nickel-titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, and these instruments have undergone dramatic modifications in order to achieve improved shaping ability. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second, BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed with NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using the t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in any section. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.
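
    The transportation and centering-ratio measures named above are commonly computed from matched pre- and post-instrumentation CBCT slices using the definitions of Gambill et al.; that this study uses exactly these conventions is an assumption. With a1 and a2 the shortest mesial wall-to-canal distances before and after preparation, and b1 and b2 the corresponding distal distances:

        \text{Transportation} = (a_1 - a_2) - (b_1 - b_2),
        \qquad
        \text{Centering ratio} = \frac{\min(a_1 - a_2,\; b_1 - b_2)}{\max(a_1 - a_2,\; b_1 - b_2)}

    A transportation of zero and a centering ratio of 1 indicate a perfectly centered preparation.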

  14. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  15. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  17. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  18. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been running at a lower level as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and in the flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing were more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   Tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS installation and its components are now deployed at CERN, adding to the GlideInWMS factory located in the US. There is a new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  2. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable.
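    For context, the transportation and centering metrics used in CBCT studies of this kind are simple functions of the shortest mesial and distal dentin thicknesses measured on matched pre- and post-instrumentation slices. Below is a minimal Python sketch of those standard formulas; the measurement values are hypothetical and this is not the authors' code.

```python
def canal_transportation(m1, m2, d1, d2):
    """Transportation = (m1 - m2) - (d1 - d2), where m1/d1 are the shortest
    mesial/distal dentin thicknesses before preparation and m2/d2 the same
    distances after preparation. Zero means no transportation; the sign
    indicates the direction of deviation."""
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, m2, d1, d2):
    """Ratio of the smaller to the larger wall reduction; 1.0 = perfectly centered."""
    dm, dd = m1 - m2, d1 - d2
    if max(dm, dd) <= 0:        # no dentin removed on either wall
        return 1.0
    return min(dm, dd) / max(dm, dd)

# Hypothetical measurements (mm) at the 6 mm level of one specimen:
print(round(canal_transportation(1.20, 1.05, 0.90, 0.80), 3))  # 0.05 mm
print(round(centering_ratio(1.20, 1.05, 0.90, 0.80), 3))       # 0.667
```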

  3. Pricing the Services of the Computer Center at the Catholic University of Louvain. Program on Institutional Management in Higher Education.

    Science.gov (United States)

    Hecquet, Ignace; And Others

    Principles are outlined that are used as a basis for the system of pricing the services of the Computer Centre. The system illustrates the use of a management method to secure better utilization of university resources. Departments decide how to use the appropriations granted to them and establish a system of internal prices that reflect the cost…

  4. Development of an instrument to measure health center (HC) personnel's computer use, knowledge and functionality demand for HC computerized information system in Thailand.

    Science.gov (United States)

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about the socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thailand health administration anticipates using health IT to support the recently implemented national universal health care system. However, national knowledge associated with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument measuring Thai health center (HC) personnel's computer use, basic IT knowledge and HC computerized information system functionality needs was developed. The instrument showed acceptable test-retest reliability and reasonable internal consistency of the measures. The future nation-wide demonstration study will benefit from this study.
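    Test-retest reliability and internal consistency of the kind reported here are typically quantified with a correlation between two administrations and Cronbach's alpha, respectively. A minimal sketch with made-up scores (not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

def cronbach_alpha(items):
    """Internal consistency; `items` is a respondents x items score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-respondent, 4-item IT-knowledge scale:
scores = [[3, 4, 3, 4], [2, 2, 3, 2], [4, 4, 4, 5], [3, 3, 2, 3], [5, 4, 5, 4]]
print(round(cronbach_alpha(scores), 2))

# Test-retest reliability: correlate total scores from two administrations.
t1, t2 = np.array([14, 9, 17, 11, 18]), np.array([13, 10, 16, 12, 18])
print(round(pearsonr(t1, t2)[0], 2))
```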

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. Site Availability Monitoring (SAM) and Job Robot submission have been instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  6. Design reality gap issues within an ICT4D project:an assessment of Jigawa State Community Computer Center

    OpenAIRE

    Kanya, Rislana Abdulazeez; Good, Alice

    2013-01-01

    This paper evaluates the Jigawa State Government Community Computer Centre project using the design-reality gap framework. The purpose was to analyse the shortfall between design expectations and implementation realities in order to establish the current situation of the project, and furthermore to analyse whether it would meet the key stakeholders' expectations. The majority of government ICT projects are classified as either failures or partial failures. Our research will underpin a case st...

  7. Peer interaction in mixed age groups: a study in the computer area of an early childhood education center in Portugal

    OpenAIRE

    Figueiredo, Maria Pacheco; Figueiredo, Ana Cláudia Nogueira de; Rego, Belmiro

    2015-01-01

    The study was developed as a teacher-research project during initial teacher education (Masters Degree of Early Childhood and Primary Education) in Portugal. It analysed the interactions between children aged 3 to 6 years during use of the computer as a free-choice activity, comparing situations between peers of the same age with situations between peers of different ages. The focus of the analysis was collaborative interactions. This was a qualitative study. Child...

  8. Molecular modeling and computational simulation of the photosystem-II reaction center to address isoproturon resistance in Phalaris minor.

    Science.gov (United States)

    Singh, Durg Vijay; Agarwal, Shikha; Kesharwani, Rajesh Kumar; Misra, Krishna

    2012-08-01

    Isoproturon is the only herbicide that can control Phalaris minor, a competitive weed of wheat that developed resistance in 1992. Resistance against isoproturon was reported to be due to a mutation in the psbA gene that encodes the isoproturon-binding D1 protein. Previously in our laboratory, a triazole derivative of isoproturon (TDI) was synthesized and found to be active against both susceptible and resistant biotypes at 0.5 kg/ha, but showed poor specificity. In the present study, the susceptible D1(S), resistant D1(R) and D2 proteins of the PS-II reaction center of P. minor were modeled and simulated, selecting the crystal structure of PS-II from Thermosynechococcus elongatus (2AXT.pdb) as template. Loop regions were refined, and the complete reaction center D1/D2 was simulated with GROMACS in a lipid (1-palmitoyl-2-oleoylglycero-3-phosphoglycerol, POPG) environment along with ligands and cofactor. Both the S and R models were energy minimized using steepest descent, equilibrated with isotropic pressure coupling and temperature coupling using a Berendsen protocol, and subjected to 1,000 ps of MD simulation. As a result of the MD simulation, the best model obtained in the lipid environment had five chlorophylls, two plastoquinones, two pheophytins and a bicarbonate ion, along with the cofactor Fe and the oxygen-evolving center (OEC). The triazole derivative of isoproturon was used as the lead molecule for docking. The best resulting conformation of TDI was chosen for receptor-based de novo ligand design. In silico designed molecules were screened and, as a result, only those molecules that showed higher docking and binding energies in comparison to isoproturon and its triazole derivative were proposed for synthesis, in order to obtain more potent, non-resistant and more selective TDI analogs.
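    As an aside on the minimization protocol mentioned above: steepest descent simply steps the coordinates along the force until the energy stops decreasing. A toy illustration for a single Lennard-Jones pair (illustrative only; the study used GROMACS on the full protein/lipid system, and these unit parameters are hypothetical):

```python
# Toy steepest-descent relaxation of one Lennard-Jones pair distance.
def lj_force(r, eps=1.0, sigma=1.0):
    """Force F = -dE/dr for E(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    return 4 * eps * (12 * sigma**12 / r**13 - 6 * sigma**6 / r**7)

r, gamma = 1.6, 0.01          # starting separation and descent step size
for _ in range(2000):
    r += gamma * lj_force(r)  # step along the force = downhill in energy
print(round(r, 3))            # converges to 2**(1/6) ~= 1.122, the LJ minimum
```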

  9. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, E.J.; McNeilly, G.S.

    1994-03-01

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in size from more than 50,000 lines in the existing code to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg models uses legacy NCAR graphics (including even emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  10. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared wit...

  11. Root Canal Transportation and Centering Ability of Nickel-Titanium Rotary Instruments in Mandibular Premolars Assessed Using Cone-Beam Computed Tomography.

    Science.gov (United States)

    Mamede-Neto, Iussif; Borges, Alvaro Henrique; Guedes, Orlando Aguirre; de Oliveira, Durvalino; Pedro, Fábio Luis Miranda; Estrela, Carlos

    2017-01-01

    The aim of this study was to evaluate, using cone-beam computed tomography (CBCT), the transportation and centralization of different nickel-titanium (NiTi) rotary instruments. One hundred and twenty-eight mandibular premolars were selected and instrumented using the following brands of NiTi files: WaveOne, WaveOne Gold, Reciproc, ProTaper Next, ProTaper Gold, Mtwo, BioRaCe and RaCe. CBCT imaging was performed before and after root canal preparation to obtain measurements of the mesial and distal dentin walls and calculations of root canal transportation and centralization. Normality and homogeneity of variances were confirmed by the Kolmogorov-Smirnov and Levene tests, and results were assessed using the Kruskal-Wallis test. Statistical significance was set at 5%. ProTaper Gold produced the lowest canal transportation values, and RaCe the highest. ProTaper Gold files also showed the highest values for centering ability, whereas BioRaCe showed the lowest. No significant differences were found across the different instruments in terms of canal transportation and centering ability (P > 0.05). Based on the methodology employed, all instruments used for root canal preparation of mandibular premolars performed similarly with regard to canal transportation and centering ability.
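    A minimal sketch of this analysis pipeline in Python with synthetic transportation values (the group names come from the abstract, the numbers are made up): check the distributional assumptions, then compare the groups nonparametrically.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic apical transportation values (mm) for three of the file brands:
protaper_gold = rng.normal(0.10, 0.03, 16)
biorace       = rng.normal(0.12, 0.03, 16)
race          = rng.normal(0.15, 0.04, 16)

# Approximate normality check (KS test on z-scored data) and Levene's test:
print(stats.kstest(stats.zscore(protaper_gold), "norm").pvalue)
print(stats.levene(protaper_gold, biorace, race).pvalue)

# Kruskal-Wallis comparison across groups, significance at 5%:
h, p = stats.kruskal(protaper_gold, biorace, race)
print(f"H = {h:.2f}, p = {p:.4f}")
```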

  12. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections, from the apical, mid-root, and coronal levels of the canal, were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and the Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation, followed by the PP system. The PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
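    The ANOVA-plus-Tukey comparison reported here is straightforward to reproduce in Python; this is a hedged sketch with invented transportation values, not the paper's data.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Invented transportation values (mm) for the three instrument groups:
tf = np.array([0.08, 0.07, 0.09, 0.06, 0.08])
pp = np.array([0.14, 0.13, 0.15, 0.12, 0.16])
kf = np.array([0.22, 0.20, 0.24, 0.21, 0.23])

print(f_oneway(tf, pp, kf))                  # overall one-way ANOVA F-test

values = np.concatenate([tf, pp, kf])
groups = ["TF"] * 5 + ["PP"] * 5 + ["K-file"] * 5
print(pairwise_tukeyhsd(values, groups))     # pairwise Tukey HSD comparisons
```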

  13. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, they might help to initiate contact with modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done in 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in a VR environment. A characterization questionnaire, the short version of the Short Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase, 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated with the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT

  14. Cool computers in a bunker. 10 000 kW of cold demand for 160 000 internet computers; Coole Rechner im Bunker. 10 000 kW Kaeltebedarf fuer 160 000 Internetrechner

    Energy Technology Data Exchange (ETDEWEB)

    Klein, S. [Combitherm GmbH, Stuttgart-Fellbach (Germany)

    2007-06-15

    In 2005, Combitherm GmbH of Stuttgart-Fellbach, a producer of refrigerators and heat pumps specializing in customized solutions, was given an unusual order: 1&1 Internet AG, one of the world's biggest internet providers, was looking for a cooling concept for their new central computer system near Baden-Baden, which was to become a central node in international data transmission. Combitherm already had experience with cold water units and free cooling elements in the 5000 kW range for a big computer center. The tasks were defined in close cooperation with the customer and with a Karlsruhe bureau of engineering consultants, and a refrigeration concept was developed. (orig.)

  15. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive, in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximately four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use were measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post-intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working ≥20 hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.

  16. Design and analysis of a tendon-based computed tomography-compatible robot with remote center of motion for lung biopsy.

    Science.gov (United States)

    Yang, Yunpeng; Jiang, Shan; Yang, Zhiyong; Yuan, Wei; Dou, Huaisu; Wang, Wei; Zhang, Daguang; Bian, Yuan

    2017-04-01

    Nowadays, biopsy is a decisive method of lung cancer diagnosis, but lung biopsy is time-consuming, complex and inaccurate. A computed tomography-compatible robot for rapid and precise lung biopsy is therefore developed in this article. According to the actual operation process, the robot is divided into two modules: a 4-degree-of-freedom position module for locating the puncture point, appropriate for almost all patient positions, and a 3-degree-of-freedom tendon-based orientation module with remote center of motion, compact and computed tomography-compatible, to orient and insert the needle automatically inside the computed tomography bore. The workspace of the robot surrounds the patient's thorax, and the needle tip forms a cone under the patient's skin. A new error model of the robot based on screw theory is proposed in view of structure error and actuation error, which are regarded as screw motions. Simulation is carried out to verify the precision of the error model, contrasted with compensation via inverse kinematics. The results of an insertion experiment on a specific phantom prove the feasibility of the robot, with a mean error of 1.373 mm in a laboratory environment, which is accurate enough to replace manual operation.
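    In a screw-theory error model, each small structural or actuation error is a twist, and the corresponding rigid transform comes from the se(3) exponential map. A minimal sketch of that map (illustrative only; the paper composes such transforms along the robot's kinematic chain, and the axis/angle below are hypothetical):

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix such that skew(w) @ x == cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_se3(w, v, theta):
    """Rigid transform exp([xi] * theta) for a unit screw axis (w, v)."""
    W = skew(w)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W  # Rodrigues
    G = np.eye(3) * theta + (1 - np.cos(theta)) * W + (theta - np.sin(theta)) * W @ W
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, G @ v
    return T

# Hypothetical 0.5-degree actuation error about the z axis of one joint:
T_err = exp_se3(np.array([0.0, 0.0, 1.0]), np.zeros(3), np.deg2rad(0.5))
print(T_err.round(6))
```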

  17. Comparison of canal transportation and centering ability of rotary protaper, one shape system and wave one system using cone beam computed tomography: An in vitro study

    Science.gov (United States)

    Tambe, Varsha Harshal; Nagmode, Pradnya Sunil; Abraham, Sathish; Patait, Mahendra; Lahoti, Pratik Vinod; Jaju, Neha

    2014-01-01

    Aim: The aim of the present study was to compare the canal transportation and centering ability of the Rotary ProTaper, One Shape and Wave One systems using cone beam computed tomography (CBCT) in curved root canals, to find the better instrumentation technique for maintaining root canal geometry. Materials and Methods: A total of 30 freshly extracted premolars having curved root canals with at least 10 degrees of curvature were divided into three groups of 10 teeth each. All teeth were scanned by CBCT to determine the root canal shape before instrumentation. In Group 1 the canals were prepared with Rotary ProTaper files, in Group 2 with One Shape files and in Group 3 with Wave One files. After preparation, a post-instrumentation scan was performed. Pre-instrumentation and post-instrumentation images obtained at three levels (3 mm apical, 3 mm coronal, and 8 mm above the apical foramen) were compared using CBCT software. Amount of transportation and centering ability were assessed. The three groups were statistically compared with analysis of variance and the Tukey honestly significant difference test. Results: All instruments maintained the original canal curvature, with significant differences between the different files. Data suggested that Wave One files presented the best outcomes for both variables evaluated. Wave One files caused less transportation and remained better centered in the canal than One Shape and Rotary ProTaper files. Conclusion: Canal preparation with Wave One files showed less transportation and better centering ability than One Shape and ProTaper. PMID:25506145

  18. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single file continuous rotation; and WaveOne WO (group III), single file reciprocal motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. Results It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated average canal

  19. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single file continuous rotation; and WaveOne WO (group III), single file reciprocal motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence

  20. Environmental Modeling Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...

  1. Hock, Beáta. 2013. Gendered Artistic Positions and Social Voices - Politics, Cinema and the Visual Arts in State-Socialist and Post-Socialist Hungary. Stuttgart: Franz Steiner Verlag. 284 pp. illus.

    Directory of Open Access Journals (Sweden)

    Lilla Tőke

    2016-01-01

    Full Text Available Hock, Beáta. 2013. Gendered Artistic Positions and Social Voices - Politics, Cinema and the Visual Arts in State-Socialist and Post-Socialist Hungary. Stuttgart: Franz Steiner Verlag. 284 pp. illus. Reviewed by Lilla Tőke, Assistant Professor, City University of New York, LaGuardia Community College

  2. A User-Centered Mobile Cloud Computing Platform for Improving Knowledge Management in Small-to-Medium Enterprises in the Chilean Construction Industry

    Directory of Open Access Journals (Sweden)

    Daniela Núñez

    2018-03-01

    Full Text Available Knowledge management (KM is a key element for the development of small-to-medium enterprises (SMEs in the construction industry. This is particularly relevant in Chile, where this industry is composed almost entirely of SMEs. Although various KM system proposals can be found in the literature, they are not suitable for SMEs, due to usability problems, budget constraints, and time and connectivity issues. Mobile Cloud Computing (MCC systems offer several advantages to construction SMEs, but they have not yet been exploited to address KM needs. Therefore, this research is aimed at the development of a MCC-based KM platform to manage lessons learned in different construction projects of SMEs, through an iterative and user-centered methodology. Usability and quality evaluations of the proposed platform show that MCC is a feasible and attractive option to address the KM issues in SMEs of the Chilean construction industry, since it is possible to consider both technical and usability requirements.

  3. Spinopelvic dissociation: multidetector computed tomographic evaluation of fracture patterns and associated injuries at a single level 1 trauma center.

    Science.gov (United States)

    Gupta, Pushpender; Barnwell, Jonathan C; Lenchik, Leon; Wuertzer, Scott D; Miller, Anna N

    2016-06-01

    The objective of the present study is to evaluate multidetector computed tomographic (MDCT) fracture patterns and associated injuries in patients with spinopelvic dissociation (SPD). Our institutional trauma registry database was reviewed from Jan. 1, 2006, to Sept. 30, 2012, specifically evaluating patients with sacral fractures. MDCT scans of patients with sacral fractures were reviewed to determine the presence of SPD. SPD cases were characterized into the following fracture patterns: U-shaped, Y-shaped, T-shaped, H-shaped, and burst. The following MDCT features were recorded: level of the horizontal fracture, location of vertical fracture, kyphosis between major fracture fragments, displacement of fracture fragment, narrowing of central spinal canal, narrowing of neural foramina, and extension into sacroiliac joints. Quantitative evaluation of the sacral fractures was performed in accordance with the consensus statement by the Spine Trauma Study Group. Medical records were reviewed to determine associated pelvic and non-pelvic fractures, bladder and bowel injuries, nerve injuries, and type of surgical intervention. Twenty-one patients had SPD, of whom 13 were men and eight were women. Mean age was 41.8 years (range 18.8 to 87.7). Five fractures (24 %) were U-shaped, six (29 %) H-shaped, four (19 %) Y-shaped, and six (29 %) burst. Nine patients (43 %) had central canal narrowing, and 19 (90 %) had neural foramina narrowing. Eleven patients (52 %) had kyphotic angulation between major fracture fragments, and seven patients (33 %) had either anterior (24 %) or posterior (10 %) displacement of the proximal fracture fragment. Fourteen patients (67 %) had associated pelvic fractures, and 20 (95 %) had associated non-pelvic fractures. Two patients (10 %) had associated urethral injuries, and one (5 %) had an associated colon injury. Seven patients (33 %) had associated nerve injuries. Six patients (29 %) had surgical fixation while 15 (71 %) were

  4. Computed tomography-guided needle aspiration and biopsy of pulmonary lesions - A single-center experience in 1000 patients

    Energy Technology Data Exchange (ETDEWEB)

    Poulou, Loukia S.; Tsagouli, Paraskevi; Thanos, Loukas [Dept. of Medical Imaging and Interventional Radiology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece)], e-mail: ploukia@hotmail.com; Ziakas, Panayiotis D. [Program of Outcomes Research, Div. of Infectious Diseases, Warren Alpert Medical School, Brown Univ., RI, and Div. of Infectious Diseases, Rhode Island Hospital, Rhode Island (United States); Politi, Dimitra [Dept. of Cytopathology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece); Trigidou, Rodoula [Dept. of Pathology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece)

    2013-07-15

    Background: Computed tomography (CT)-guided fine needle aspiration (FNA) and biopsies are well-established, minimally invasive diagnostic tools for pulmonary lesions. Purpose: To analyze retrospectively the results of 1000 consecutive lung CT-guided FNA and/or core needle biopsies (CNB), the main outcome measures being diagnostic yield, and complication rates. Material and Methods: Patients considered eligible were those referred to our department for lung lesions. The choice of FNA, CNB, or both was based upon the radiologist's judgment. Diagnostic yield was defined as the probability of having a definite result by cytology/histology. Results: The study included 733 male patients and 267 female patients, with a mean (SD) age of 66.4 (11.4) years. The mean (SD) lesion size was 3.7 (2.4) cm in maximal diameter. Six hundred and forty-one (64%) patients underwent an FNA procedure, 245 (25%) a CNB, and 114 (11%) had been subjected to both. The diagnostic yield was 960/994 (96.6%); this decreased significantly with the use of CNB only (odds ratio [OR] 0.32; 95% CI 0.12 - 0.88; P = 0.03), while it increased with lesion size (OR 1.35; 95% CI 1.03 - 1.79; P = 0.03 per cm increase). In 506 patients (52.7%), a malignant process was diagnosed by cytopathology/histology. The complication rate reached 97/1000 (9.7%); complications included: hemorrhage, 62 (6.2%); pneumothorax, 28 (2.8%); hemorrhage and pneumothorax, 5 (0.5%); and hemoptysis, 2 (0.2%). It was not significantly affected by the type of procedure or localization of the lesion. The overall risk for complications was three times higher for lesions <4 cm (OR 3.26; 95% CI 1.96 - 5.42; P < 0.001). Conclusion: CT-guided lung biopsy has a high diagnostic yield using FNA, CNB, or both. The CNB procedure alone will not suffice. Complication rates were acceptable and correlated inversely with lesion size, not localization or type of procedure.

  5. Projected Applications of a ``Climate in a Box'' Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed

  6. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo
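    For reference, NDVI itself is a one-line computation per pixel, and a composite is then a reduction over several days of values. A minimal sketch with hypothetical reflectances (not the SPoRT/LIS product):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Two hypothetical MODIS pixels: dense vegetation vs. bare soil.
print(ndvi([0.45, 0.30], [0.05, 0.25]))       # ~[0.80, 0.09]

# A simple maximum-value composite over several days suppresses cloud-
# contaminated (low-NDVI) observations:
daily = np.array([ndvi([0.45, 0.30], [0.05, 0.25]),
                  ndvi([0.20, 0.28], [0.15, 0.24])])
print(daily.max(axis=0))
```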

  7. Nuclear energy research in Germany 2008. Research centers and universities

    International Nuclear Information System (INIS)

    Tromm, Walter

    2009-01-01

    This summary report presents nuclear energy research at research centers and universities in Germany in 2008. Activities are explained on the basis of examples of research projects and a description of the situation of research and teaching in general. Participants are the - Karlsruhe Research Center, - Juelich Research Center (FZJ), - Dresden-Rossendorf Research Center (FZD), - Verein fuer Kernverfahrenstechnik und Analytik Rossendorf e.V. (VKTA), - Technical University of Dresden, - University of Applied Sciences, Zittau/Goerlitz, - Institute for Nuclear Energy and Energy Systems (IKE) at the University of Stuttgart, - Reactor Simulation and Reactor Safety Working Group at the Bochum Ruhr University. (orig.)

  8. A qualitative study adopting a user-centered approach to design and validate a brain computer interface for cognitive rehabilitation for people with brain injury.

    Science.gov (United States)

    Martin, Suzanne; Armstrong, Elaine; Thomson, Eileen; Vargiu, Eloisa; Solà, Marc; Dauwalder, Stefan; Miralles, Felip; Daly Lynn, Jean

    2017-07-14

    Cognitive rehabilitation is established as a core intervention within rehabilitation programs following a traumatic brain injury (TBI). Digitally enabled assistive technologies offer opportunities for clinicians to increase remote access to rehabilitation, supporting the transition into the home. Brain Computer Interface (BCI) systems can harness the residual abilities of individuals with limited function to gain control over computers through their brain waves. This paper presents an online cognitive rehabilitation application, developed with therapists, for working remotely with people who have TBI and who will use BCI at home to engage in the therapy. A qualitative research study was completed with community-dwelling people post brain injury (end users) and a cohort of therapists involved in cognitive rehabilitation. A user-centered approach was taken over three phases of the development, design and feasibility testing of this cognitive rehabilitation application, which included two tasks (Find-a-Category and a Memory Card task). The therapist could remotely prescribe activity with different levels of difficulty. The service user had a home interface which would present the therapy activities. This novel work was achieved by an international consortium of academics, business partners and service users.

  9. SCELib3.0: The new revision of SCELib, the parallel computational library of molecular properties in the Single Center Approach

    Science.gov (United States)

    Sanna, N.; Baccarelli, I.; Morelli, G.

    2009-12-01

    SCELib is a computer program which implements the Single Center Expansion (SCE) method to describe molecular electronic densities and the interaction potentials between a charged projectile (electron or positron) and a target molecular system. The first version (CPC Catalog identifier ADMG_v1_0) was submitted to the CPC Program Library in 2000, and version 2.0 (ADMG_v2_0) was submitted in 2004. We here announce the new release 3.0, which presents additional features with respect to the previous versions, aiming at a significant enhancement of its capability to deal with larger molecular systems. SCELib 3.0 allows for ab initio effective core potential (ECP) calculations of the molecular wavefunctions to be used in the SCE method, in addition to the standard all-electron description of the molecule. The list of supported architectures has been updated, and the code has been ported to platforms based on accelerating coprocessors, such as NVIDIA GPGPUs; the new parallel model adopted is able to run efficiently on mixed many-core computing systems. Program summary. Program title: SCELib3.0 Catalogue identifier: ADMG_v3_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADMG_v3_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2 018 862 No. of bytes in distributed program, including test data, etc.: 4 955 014 Distribution format: tar.gz Programming language: C Compilers used: xlc V8.x, Intel C V10.x, Portland Group V7.x, nvcc V2.x Computer: All SMP platforms based on AIX, Linux and SUNOS operating systems over SPARC, POWER, Intel Itanium2, X86, em64t and Opteron processors Operating system: SUNOS, IBM AIX, Linux RedHat (Enterprise), Linux SuSE (SLES) Has the code been vectorized or parallelized?: Yes. 1 to 32 (CPU or GPU) used RAM: Up to 32 GB depending on the molecular
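    The idea behind a single-center expansion can be illustrated numerically: project a function on the sphere onto spherical harmonics about one chosen origin and recover the coefficients by quadrature. A minimal sketch (not SCELib, which handles full 3D molecular densities in C; the test function and grid sizes here are arbitrary):

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import sph_harm

# Quadrature grid over the sphere: azimuth t in [0, 2*pi], polar p in [0, pi].
t = np.linspace(0.0, 2.0 * np.pi, 201)
p = np.linspace(0.0, np.pi, 201)
T, P = np.meshgrid(t, p)
f = np.cos(P)                        # test function, proportional to Y_1^0

def coefficient(l, m):
    """c_lm = integral of conj(Y_lm) * f over the sphere (trapezoid rule)."""
    Y = sph_harm(m, l, T, P)         # SciPy convention: (m, l, azimuth, polar)
    integrand = np.conj(Y) * f * np.sin(P)
    return trapezoid(trapezoid(integrand, t, axis=1), p)

print(abs(coefficient(1, 0)))        # ~sqrt(4*pi/3) ~= 2.047
print(abs(coefficient(2, 0)))        # ~0 by orthogonality
```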

  10. Declining trend in the use of repeat computed tomography for trauma patients admitted to a level I trauma center for traffic-related injuries

    Energy Technology Data Exchange (ETDEWEB)

    Psoter, Kevin J., E-mail: kevinp2@u.washington.edu [Department of Epidemiology, University of Washington, Box 357236, Seattle, WA 98195 (United States); Roudsari, Bahman S., E-mail: roudsari@u.washington.edu [Department of Radiology, Comparative Effectiveness, Cost and Outcomes Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Graves, Janessa M., E-mail: janessa@u.washington.edu [Department of Pediatrics, Harborview Injury Prevention and Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Mack, Christopher, E-mail: cdmack@uw.edu [Harborview Injury Prevention and Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Jarvik, Jeffrey G., E-mail: jarvikj@u.washington.edu [Department of Radiology and Department of Neurological Surgery, Comparative Effectiveness, Cost and Outcomes Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States)

    2013-06-15

    Objective: To evaluate the trend in utilization of repeat (i.e. ≥2) computed tomography (CT) and to compare utilization patterns across body regions for trauma patients admitted to a level I trauma center for traffic-related injuries (TRI). Materials and Methods: We linked the Harborview Medical Center trauma registry (1996–2010) to the billing department data. We extracted the following variables: type and frequency of CTs performed, age, gender, race/ethnicity, insurance status, injury mechanism and severity, length of hospitalization, intensive care unit (ICU) admission and final disposition. TRIs were defined as motor vehicle collisions, motorcycle, bicycle and pedestrian-related injuries. Logistic regression was used to evaluate the association between utilization of different body region repeat (i.e. ≥2) CTs and year of admission, adjusting for patient and injury-related characteristics that could influence utilization patterns. Results: A total of 28,431 patients were admitted for TRIs over the study period and 9499 (33%) received repeat CTs. From 1996 to 2010, the proportion of patients receiving repeat CTs decreased by 33%. Relative to 2000 and adjusting for other covariates, patients with TRIs admitted in 2010 had significantly lower odds of undergoing repeat head (OR = 0.61; 95% CI: 0.49–0.76), pelvis (OR = 0.37; 95% CI: 0.27–0.52), cervical spine (OR = 0.23; 95% CI: 0.12–0.43), and maxillofacial CTs (OR = 0.24; 95% CI: 0.10–0.57). However, they had higher odds of receiving repeat thoracic CTs (OR = 1.86; 95% CI: 1.02–3.38). Conclusion: A significant decrease in the utilization of repeat CTs was observed in trauma patients presenting with traffic-related injuries over a 15-year period.
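    The registry analysis described here is, at its core, a logistic regression of repeat-CT use on admission year with covariate adjustment, and the reported odds ratios come from exponentiating the fitted coefficients. A minimal sketch on synthetic data (statsmodels; none of these numbers come from the study):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
year = rng.integers(1996, 2011, n)                 # admission year
age = rng.normal(40, 15, n)                        # patient age

# Synthetic outcome with a built-in declining trend over calendar time:
logit = 0.5 - 0.08 * (year - 2000) - 0.002 * (age - 40)
repeat_ct = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([year - 2000, age]))
fit = sm.Logit(repeat_ct, X).fit(disp=0)
print(np.exp(fit.params[1:]))                      # ORs per year and per year of age
```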

  11. Temporal trends in compliance with appropriateness criteria for stress single-photon emission computed tomography sestamibi studies in an academic medical center.

    Science.gov (United States)

    Gibbons, Raymond J; Askew, J Wells; Hodge, David; Miller, Todd D

    2010-03-01

    The purpose of this study was to apply published appropriateness criteria for single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) in a single academic medical center to determine if the percentage of inappropriate studies was changing over time. In a previous study, we applied the American College of Cardiology Foundation/American Society of Nuclear Cardiology (ASNC) appropriateness criteria for stress SPECT MPI and reported that 14% of stress SPECT studies were performed for inappropriate reasons. Using similar methodology, we retrospectively examined 284 patients who underwent stress SPECT MPI in October 2006 and compared the findings to the previous cohort of 284 patients who underwent stress SPECT MPI in May 2005. The indications for testing in the 2 cohorts were very similar. The overall level of agreement in characterizing categories of appropriateness between 2 experienced cardiovascular nurse abstractors was good (kappa = 0.68), which represented an improvement from our previous study (kappa = 0.56). There was a significant change between May 2005 and October 2006 in the overall classification of categories for appropriateness (P = .024 by the chi-square statistic). There were modest, but insignificant, increases in the number of patients who were unclassified (15% in the current study vs 11% previously), appropriate (66% vs 64%), and uncertain (12% vs 11%). Only 7% of the studies in the current cohort were inappropriate, which represented a significant (P = .004) decrease from the 14% reported in the 2005 cohort. In the absence of any specific intervention, there was a significant change in the overall classification of SPECT appropriateness in an academic medical center over 17 months. The only significant difference in individual categories was a decrease in inappropriate studies. Additional measurements over time will be required to determine if this trend is sustainable or generalizable.
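    Inter-abstractor agreement of the kind quoted above (kappa = 0.68) is Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with hypothetical appropriateness labels for ten studies:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical category assignments by two independent abstractors:
nurse_a = ["appropriate"] * 6 + ["uncertain", "inappropriate", "appropriate", "uncertain"]
nurse_b = ["appropriate"] * 6 + ["uncertain", "inappropriate", "uncertain", "appropriate"]

# Chance-corrected agreement (1.0 = perfect, 0.0 = chance level):
print(round(cohen_kappa_score(nurse_a, nurse_b), 2))   # 0.57 for these labels
```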

  12. Transformation of topologically close-packed β-W to body-centered cubic α-W: Comparison of experiments and computations.

    Science.gov (United States)

    Barmak, Katayun; Liu, Jiaxing; Harlan, Liam; Xiao, Penghao; Duncan, Juliana; Henkelman, Graeme

    2017-10-21

    The enthalpy and activation energy for the transformation of the metastable form of tungsten, β-W, which has the topologically close-packed A15 structure (space group Pm-3n), to equilibrium α-W, which is body-centered cubic (A2, space group Im-3m), was measured using differential scanning calorimetry. The β-W films were 1 μm thick and were prepared by sputter deposition in argon with a small amount of nitrogen. The transformation enthalpy was measured as -8.3 ± 0.4 kJ/mol (-86 ± 4 meV/atom) and the transformation activation energy as 2.2 ± 0.1 eV. The measured enthalpy was found to agree well with the difference in energies of α and β tungsten computed using density functional theory, which gave a value of -82 meV/atom for the transformation enthalpy. A calculated concerted transformation mechanism with a barrier of 0.4 eV/atom, in which all the atoms in an A15 unit cell transform into A2, was found to be inconsistent with the experimentally measured activation energy for any critical nucleus larger than two A2 unit cells. Larger calculations of eight A15 unit cells spontaneously relax to a mechanism in which part of the supercell first transforms from A15 to A2, creating a phase boundary, before the remaining A15 transforms into the A2 phase. Both calculations indicate that a nucleation and growth mechanism is favored over a concerted transformation. More consistent with the experimental activation energy was a calculated local transformation mechanism at the A15-A2 phase boundary, with a barrier of 1.7 eV computed using molecular dynamics simulations.
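    To get a feel for what a 2.2 eV barrier means, an Arrhenius estimate shows how sharply the transformation rate rises with temperature (the temperatures below are illustrative, and the unknown prefactor cancels in the ratio):

```python
import numpy as np

K_B = 8.617e-5          # Boltzmann constant in eV/K
E_A = 2.2               # measured activation energy in eV

def boltzmann_factor(T):
    """Relative Arrhenius rate exp(-E_A / (k_B * T)); prefactor omitted."""
    return np.exp(-E_A / (K_B * T))

# Rate relative to 800 K at a few hypothetical annealing temperatures:
for T in (800.0, 900.0, 1000.0):
    print(f"{T:.0f} K: x{boltzmann_factor(T) / boltzmann_factor(800.0):.1f}")
```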

  13. The Place of Computed Tomography as a Guidance Modality in Percutaneous Nephrostomy: Analysis of a 10-Year Single-Center Experience

    International Nuclear Information System (INIS)

    Egilmez, H.; Oztoprak, I.; Atalar, M.; Cetin, A.; Gumus, C.; Gultekin, Y.; Bulut, S.; Arslan, M.; Solak, O.

    2007-01-01

    Background: Percutaneous nephrostomy (PCN) has been established as an effective technique for urinary decompression or diversion. This procedure may be performed with the guidance of fluoroscopy, ultrasonography, a combination of fluoroscopy and ultrasonography, computed tomography (CT), or magnetic resonance imaging. Purpose: To retrospectively review experience with CT-guided PCN over a 10-year period in a single center. Material and Methods: All CT-guided PCN procedures performed in adults at our institution between 1995 and 2005 were evaluated. In 882 patients, 1113 nephrostomy catheters were inserted. Interventional radiologists or radiology residents under direct attending supervision inserted all catheters. During the PCN procedure, bleeding, sepsis, and injuries to adjacent organs were regarded as major complications. Clinical events requiring nominal therapy with no sequelae were regarded as minor complications. Results: PCN procedures were performed via 1-3 punctures in patients with grades 0-1 and 2 hydronephrosis, and via 1-2 punctures in patients with grade 3 hydronephrosis. They were carried out with a procedure time ranging from 9 to 26 min. All PCNs were considered as technically successful, and no major complications were observed. There were minor complications including transient macroscopic hematuria (28.6%, 19.9%, and 4.9% in patients with hydronephrosis grades 0-1, 2, and 3, respectively) and perirenal hematomas in a total of eight patients. No patient required additional intervention secondary to complications of the PCN procedure. Conclusion: CT-guided PCN is an efficient and safe procedure with major and minor complication rates below the accepted thresholds. It can be used for the management of patients requiring nephrostomy insertion in inpatient settings, and might be a preferable procedure in patients with minimal or no dilatation of the renal pelvis. Keywords: Computed tomography; percutaneous nephrostomy; urinary obstruction

  14. The Place of Computed Tomography as a Guidance Modality in Percutaneous Nephrostomy: Analysis of a 10-Year Single-Center Experience

    Energy Technology Data Exchange (ETDEWEB)

    Egilmez, H.; Oztoprak, I.; Atalar, M.; Cetin, A.; Gumus, C.; Gultekin, Y.; Bulut, S.; Arslan, M.; Solak, O. [Depts. of Radiology, Obstetrics and Gynecology, and Urology, Cumhuriyet Univ. School of Medicine, Sivas (Turkey)

    2007-09-15

    Background: Percutaneous nephrostomy (PCN) has been established as an effective technique for urinary decompression or diversion. This procedure may be performed with the guidance of fluoroscopy, ultrasonography, a combination of fluoroscopy and ultrasonography, computed tomography (CT), or magnetic resonance imaging. Purpose: To retrospectively review experience with CT-guided PCN over a 10-year period in a single center. Material and Methods: All CT-guided PCN procedures performed in adults at our institution between 1995 and 2005 were evaluated. In 882 patients, 1113 nephrostomy catheters were inserted. Interventional radiologists or radiology residents under direct attending supervision inserted all catheters. During the PCN procedure, bleeding, sepsis, and injuries to adjacent organs were regarded as major complications. Clinical events requiring nominal therapy with no sequelae were regarded as minor complications. Results: PCN procedures were performed via 1-3 punctures in patients with grades 0-1 and 2 hydronephrosis, and via 1-2 punctures in patients with grade 3 hydronephrosis. They were carried out with a procedure time ranging from 9 to 26 min. All PCNs were considered as technically successful, and no major complications were observed. There were minor complications including transient macroscopic hematuria (28.6%, 19.9%, and 4.9% in patients with hydronephrosis grades 0-1, 2, and 3, respectively) and perirenal hematomas in a total of eight patients. No patient required additional intervention secondary to complications of the PCN procedure. Conclusion: CT-guided PCN is an efficient and safe procedure with major and minor complication rates below the accepted thresholds. It can be used for the management of patients requiring nephrostomy insertion in inpatient settings, and might be a preferable procedure in patients with minimal or no dilatation of the renal pelvis. Keywords: Computed tomography; percutaneous nephrostomy; urinary obstruction.

  15. Spelling is Just a Click Away - A User-Centered Brain-Computer Interface Including Auto-Calibration and Predictive Text Entry.

    Science.gov (United States)

    Kaufmann, Tobias; Völker, Stefan; Gunesch, Laura; Kübler, Andrea

    2012-01-01

    Brain-computer interfaces (BCI) based on event-related potentials (ERP) allow for selection of characters from a visually presented character-matrix and thus provide a communication channel for users with neurodegenerative disease. Although they have been a topic of research for more than 20 years and have repeatedly been proven to be a reliable communication method, BCIs are almost exclusively used in experimental settings, handled by qualified experts. This study investigates whether ERP-BCIs can be handled independently by laymen without expert support, which is essential for establishing BCIs in end-users' daily-life situations. Furthermore, we compared the classic character-by-character text entry against a predictive text entry (PTE) that directly incorporates predictive text into the character-matrix. N = 19 BCI novices handled a user-centered ERP-BCI application on their own without expert support. The software individually adjusted classifier weights and control parameters in the background, invisible to the user (auto-calibration). All participants were able to operate the software on their own and to twice correctly spell a sentence with the auto-calibrated classifier (once with PTE, once without). Our PTE increased spelling speed and, importantly, did not reduce accuracy. In sum, this study demonstrates the feasibility of auto-calibrated ERP-BCI use by laymen without expert support, and the strong benefit of integrating predictive text directly into the character-matrix.
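
    As a toy illustration of the row/column matrix speller that this kind of ERP-BCI builds on (the matrix layout, function, and score names below are invented for the example; the study's auto-calibration and predictive text entry are not shown):

```python
# Toy sketch of character selection in an ERP matrix speller: rows and
# columns of the character-matrix are flashed repeatedly, a classifier
# scores the EEG response to each flash, and the selected character is
# the cell at the intersection of the best-scoring row and column.
import numpy as np

MATRIX = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                   list("STUVWX"), list("YZ1234"), list("56789_")])

def select_character(row_scores, col_scores):
    """row_scores/col_scores: mean classifier output per row/column flash."""
    r = int(np.argmax(row_scores))
    c = int(np.argmax(col_scores))
    return MATRIX[r][c]

# e.g. if row 0 and column 4 score highest, the speller outputs 'E'
print(select_character([0.9, 0.1, 0.2, 0.1, 0.0, 0.3],
                       [0.1, 0.2, 0.1, 0.3, 0.8, 0.2]))
```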

  16. Grid will help physicists' global hunt for particles. Researchers have begun running experiments with the MidWest Tier 2 Center, one of five regional computing centers in the US.

    CERN Multimedia

    Ames, Ben

    2006-01-01

    "When physicists at Switzerland's CERN laboratory turn on their newsest particle collider in 2007, they will rely on computer scientists in Chicago and Indianapolis to help sift through the results using a worldwide supercomputing grid." (1/2 page)

  17. Research center Juelich to install Germany's most powerful supercomputer: new IBM system for science and research will achieve 5.8 trillion computations per second

    CERN Multimedia

    2002-01-01

    "The Research Center Juelich, Germany, and IBM today announced that they have signed a contract for the delivery and installation of a new IBM supercomputer at the Central Institute for Applied Mathematics" (1/2 page).

  18. Test of the Center for Automated Processing of Hardwoods' Auto-Image Detection and Computer-Based Grading and Cutup System

    Science.gov (United States)

    Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...

  19. Computer-Based Training at a Military Medical Center: Understanding Decreased Participation in Training among Staff and Ways to Improve Completion Rates

    Science.gov (United States)

    Lavender, Julie

    2013-01-01

    Military health care facilities make extensive use of computer-based training (CBT) for both clinical and non-clinical staff. Despite evidence identifying various factors that may impact CBT, it remains unclear which factors specifically influence employee participation in computer-based training. The purpose of this mixed method case…

  20. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  1. Feasibility and safety of augmented-reality glass for computed tomography-assisted percutaneous revascularization of coronary chronic total occlusion: A single center prospective pilot study.

    Science.gov (United States)

    Opolski, Maksymilian P; Debski, Artur; Borucki, Bartosz A; Staruch, Adam D; Kepka, Cezary; Rokicki, Jakub K; Sieradzki, Bartosz; Witkowski, Adam

    2017-11-01

    Percutaneous coronary intervention (PCI) of chronic total occlusion (CTO) may be facilitated by projection of coronary computed tomography angiography (CTA) datasets in the catheterization laboratory. There are no data on the feasibility and safety outcomes of CTA-assisted CTO PCI using a wearable augmented-reality glass. A total of 15 patients scheduled for elective antegrade CTO intervention were prospectively enrolled and underwent preprocedural coronary CTA. Three-dimensional and curved multiplanar CT reconstructions were transmitted to a head-mounted hands-free computer worn by interventional cardiologists during CTO PCI to provide additional information on CTO tortuosity and calcification. The results of CTO PCI using a wearable computer were compared with a time-matched prospective angiographic registry of 59 patients undergoing antegrade CTO PCI without a wearable computer. Operators' satisfaction was assessed by a 5-point Likert scale. Mean age was 64 ± 8 years and the mean J-CTO score was 2.1 ± 0.9 in the CTA-assisted group. The voice-activated co-registration and review of CTA images in a wearable computer during CTO PCI were feasible and highly rated by PCI operators (4.7/5 points). There were no major adverse cardiovascular events. Compared with standard CTO PCI, CTA-assisted recanalization of CTO using a wearable computer showed more frequent selection of the first-choice stiff wire (0% vs 40%). CTA-assisted CTO PCI using a wearable augmented-reality glass is feasible and safe, and might reduce the resources required for the interventional treatment of CTO. Copyright © 2017 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  2. Handbook on data centers

    CERN Document Server

    Khan, Samee Ullah

    2015-01-01

    This handbook offers a comprehensive review of the state-of-the-art research achievements in the field of data centers. Contributions from international, leading researchers and scholars offer topics in cloud computing, virtualization in data centers, energy efficient data centers, and next generation data center architecture.  It also comprises current research trends in emerging areas, such as data security, data protection management, and network resource management in data centers. Specific attention is devoted to industry needs associated with the challenges faced by data centers, such as various power, cooling, floor space, and associated environmental health and safety issues, while still working to support growth without disrupting quality of service. The contributions cut across various IT data technology domains as a single source to discuss the interdependencies that need to be supported to enable a virtualized, next-generation, energy efficient, economical, and environmentally friendly data cente...

  3. Algorithmic trends in computational fluid dynamics; The Institute for Computer Applications in Science and Engineering (ICASE)/LaRC Workshop, NASA Langley Research Center, Hampton, VA, US, Sep. 15-17, 1991

    Science.gov (United States)

    Hussaini, M. Y. (Editor); Kumar, A. (Editor); Salas, M. D. (Editor)

    1993-01-01

    The purpose here is to assess the state of the art in the areas of numerical analysis that are particularly relevant to computational fluid dynamics (CFD), to identify promising new developments in various areas of numerical analysis that will impact CFD, and to establish a long-term perspective focusing on opportunities and needs. Overviews are given of discretization schemes, computational fluid dynamics, algorithmic trends in CFD for aerospace flow field calculations, simulation of compressible viscous flow, and massively parallel computation. Also discussed are acceleration methods, spectral and high-order methods, multi-resolution and subcell resolution schemes, and inherently multidimensional schemes.

  4. An accelerated line-by-line option for MODTRAN combining on-the-fly generation of line center absorption within 0.1 cm-1 bins and pre-computed line tails

    Science.gov (United States)

    Berk, Alexander; Conforti, Patrick; Hawes, Fred

    2015-05-01

    A Line-By-Line (LBL) option is being developed for MODTRAN6. The motivation for this development is two-fold. Firstly, when MODTRAN is validated against an independent LBL model, it is difficult to isolate the source of discrepancies. One must verify consistency between pressure, temperature and density profiles, between column density calculations, between continuum and particulate data, between spectral convolution methods, and more. Introducing an LBL option directly within MODTRAN will ensure common elements for all calculations other than those used to compute molecular transmittances. The second motivation for the LBL upgrade is that it will enable users to compute high spectral resolution transmittances and radiances for the full range of current MODTRAN applications. In particular, introducing the LBL feature into MODTRAN will enable first-principle calculations of scattered radiances, an option that is often not readily available with LBL models. MODTRAN will compute LBL transmittances within one 0.1 cm-1 spectral bin at a time, marching through the full requested bandpass. The LBL algorithm will use the highly accurate, pressure- and temperature-dependent MODTRAN Padé approximant fits of the contribution from line tails to define the absorption from all molecular transitions centered more than 0.05 cm-1 from each 0.1 cm-1 spectral bin. The beauty of this approach is that the on-the-fly computations for each 0.1 cm-1 bin will only require explicit LBL summing of transitions centered within a 0.2 cm-1 spectral region. That is, the contribution from the more distant lines will be pre-computed via the Padé approximants. The status of the LBL effort will be presented. This will include initial thermal and solar radiance calculations, validation calculations, and self-validations of the MODTRAN band model against its own LBL calculations.
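
    A minimal sketch of the bin-marching scheme just described: explicit line-by-line summation only for transitions centered within 0.2 cm-1 of each 0.1 cm-1 bin, plus a pre-computed tail term for all more distant lines. The Lorentzian profile and the Padé tail parameterization below are simplifying assumptions for illustration; MODTRAN's actual Voigt profiles and Padé fits are more elaborate.

```python
# Illustrative sketch (not MODTRAN code) of the bin-marching LBL scheme.
import numpy as np

BIN = 0.1  # spectral bin width, cm-1

def lorentz(nu, nu0, strength, gamma):
    """Simplified Lorentzian line shape (real LBL codes use Voigt)."""
    return strength * gamma / (np.pi * ((nu - nu0) ** 2 + gamma ** 2))

def pade_tail(nu, a0, a1, b1):
    """Stand-in for the pre-computed Pade approximant of distant line tails."""
    return (a0 + a1 * nu) / (1.0 + b1 * nu)

def absorption(nu_grid, lines, tail):
    """lines: (nu0, strength, gamma) tuples; tail: (a0, a1, b1) coefficients."""
    centers = np.array([ln[0] for ln in lines])
    k = np.zeros_like(nu_grid)
    for i, nu in enumerate(nu_grid):
        bin_mid = (np.floor(nu / BIN) + 0.5) * BIN
        # explicit LBL sum over the 0.2 cm-1 region centered on the bin
        for j in np.where(np.abs(centers - bin_mid) <= BIN)[0]:
            nu0, s, g = lines[j]
            k[i] += lorentz(nu, nu0, s, g)
        # all more distant lines enter through the pre-computed tail term
        k[i] += pade_tail(bin_mid, *tail)
    return k
```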

  5. Surface phenomena in thermionic research. Oberflaechenphysikalische Probleme der Thermionik. Vortraege aus der Round-table-Konferenz im Institut fuer Energiewandlung und Elektrische Antriebe der DFVLR in Stuttgart 1972 [Nine papers, 4 summaries]

    Energy Technology Data Exchange (ETDEWEB)

    Henne, R. (ed.)

    1973-07-15

    This report compiles papers, given at a round-table conference on surface phenomena in thermionic research arranged by the 'Deutsche Forschungs- und Versuchsanstalt fuer Luft- und Raumfahrt' in Stuttgart on Nov. 20 and Dec. 1, 1972, concerning in particular recent developments in work function theory and work function measurements. Nine papers are printed in full, four others in summary form. Two of them concern the work function of uncovered surfaces; two others show the influence of adsorbed electropositive elements (Cs, Sr) on the work function. In four papers, the coadsorption of electropositive (Cs, Sr, resp. Ba) and electronegative (O{sub 2}) elements and their influence on the work function of different surfaces are discussed. Finally, a paper is added describing the development of Sr-Cs alloys, which are of interest for generating the atmosphere of a Sr-Cs converter by means of a single reservoir. (auth)

  7. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  8. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  9. Prospective randomized comparison of rotational angiography with three-dimensional reconstruction and computed tomography merged with electro-anatomical mapping: a two center atrial fibrillation ablation study.

    Science.gov (United States)

    Anand, Rishi; Gorev, Maxim V; Poghosyan, Hermine; Pothier, Lindsay; Matkins, John; Kotler, Gregory; Moroz, Sarah; Armstrong, James; Nemtsov, Sergei V; Orlov, Michael V

    2016-08-01

    To compare the efficacy and accuracy of rotational angiography with three-dimensional reconstruction (3DATG) image merged with electro-anatomical mapping (EAM) vs. CT-EAM. A prospective, randomized, parallel, two-center study was conducted in 36 patients (25 men; age 65 ± 10 years) undergoing atrial fibrillation (AF) ablation (33% paroxysmal, 67% persistent) guided by 3DATG (group 1) vs. CT (group 2) image fusion with EAM. 3DATG was performed on the Philips Allura Xper FD 10 system. Procedural characteristics including time, radiation exposure, outcome, and navigation accuracy were compared between the two groups. There was no significant difference between the groups in total procedure duration or time spent on the various procedural steps. Minor differences in procedural characteristics were present between the two centers. Segmentation and fusion time for 3DATG- or CT-EAM was short and similar at both centers. Accuracy of navigation guided by either method was high and did not depend on left atrial size. Maintenance of sinus rhythm did not differ between the two groups up to 24 months of follow-up. This study did not find superiority of 3DATG-EAM image merge to guide AF ablation when compared with CT-EAM fusion. Both merging techniques result in similar navigation accuracy.

  10. Rib Radiography versus Chest Computed Tomography in the Diagnosis of Rib Fractures.

    Science.gov (United States)

    Sano, Atsushi

    2018-05-01

    The accurate diagnosis of rib fractures is important in chest trauma. Diagnostic images following chest trauma are usually obtained via chest X-ray, chest computed tomography, or rib radiography. This study evaluated the diagnostic characteristics of rib radiography and chest computed tomography. Seventy-five rib fracture patients who underwent both chest computed tomography and rib radiography between April 2008 and December 2013 were included. Rib radiographs, centered on the site of pain, were taken from two directions. Chest computed tomography was performed using a 16-row multidetector scanner with 5-mm slice pitch without overlap, and axial images were visualized in a bone window. In total, 217 rib fractures were diagnosed in 75 patients. Rib radiography missed 43 rib fractures in 24 patients. The causes were overlap with organs in 15 cases, trivial fractures in 21 cases, and injury outside the imaging range in 7 cases. Left lower rib fractures were often missed due to overlap with the heart, while middle and lower rib fractures were frequently not diagnosed due to overlap with abdominal organs. Computed tomography missed 21 rib fractures in 17 patients. The causes were horizontal fractures in 10 cases, trivial fractures in 9 cases, and insufficient breath holding in 1 case. In rib radiography, overlap with organs and fractures outside the imaging range were characteristic reasons for missed diagnoses. In chest computed tomography, horizontal rib fractures and insufficient breath holding were often responsible. We should take these challenges into account when diagnosing rib fractures. Georg Thieme Verlag KG Stuttgart · New York.

  11. Structural analysis of peptides that fill sites near the active center of the two different enzyme molecules by artificial intelligence and computer simulations

    Science.gov (United States)

    Nishiyama, Katsuhiko

    2018-05-01

    Using artificial intelligence, the binding styles of 167 tetrapeptides were predicted in the active site of papain and cathepsin K. Five tetrapeptides (Asn-Leu-Lys-Trp, Asp-Gln-Trp-Gly, Cys-Gln-Leu-Arg, Gln-Leu-Trp-Thr and Arg-Ser-Glu-Arg) were found to bind sites near the active center of both papain and cathepsin K. These five tetrapeptides have the potential to also bind sites of other cysteine proteases, and structural characteristics of these tetrapeptides should aid the design of a common inhibitor of cysteine proteases. Smart application of artificial intelligence should accelerate data mining of important complex systems.

  12. Relative Lyapunov Center Bifurcations

    DEFF Research Database (Denmark)

    Wulff, Claudia; Schilder, Frank

    2014-01-01

    Relative equilibria (REs) and relative periodic orbits (RPOs) are ubiquitous in symmetric Hamiltonian systems and occur, for example, in celestial mechanics, molecular dynamics, and rigid body motion. REs are equilibria, and RPOs are periodic orbits of the symmetry reduced system. Relative Lyapunov center bifurcations are bifurcations of RPOs from REs corresponding to Lyapunov center bifurcations of the symmetry reduced dynamics. In this paper we first prove a relative Lyapunov center theorem by combining recent results on the persistence of RPOs in Hamiltonian systems with a symmetric Lyapunov center theorem of Montaldi, Roberts, and Stewart. We then develop numerical methods for the detection of relative Lyapunov center bifurcations along branches of RPOs and for their computation. We apply our methods to Lagrangian REs of the N-body problem.
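
    For orientation, the classical (non-relative) Lyapunov center theorem that the relative version generalizes can be stated as follows; this is the standard textbook form, not a quotation from the paper:

```latex
% Classical Lyapunov center theorem (standard form, for orientation).
% Let x_0 be an equilibrium of a Hamiltonian vector field f with
% linearization Df(x_0) possessing a simple purely imaginary eigenvalue
% pair +-i*omega, omega > 0, satisfying the nonresonance condition below.
\[
  f(x_0) = 0, \qquad
  \pm i\omega \in \operatorname{spec}\bigl(Df(x_0)\bigr), \qquad
  ik\omega \notin \operatorname{spec}\bigl(Df(x_0)\bigr)
  \ \text{for all } k \in \mathbb{Z} \setminus \{\pm 1\}.
\]
% Then a one-parameter family of periodic orbits emanates from x_0,
% with periods tending to
\[
  T \longrightarrow \frac{2\pi}{\omega}.
\]
```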

  13. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    …with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production…, for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e…

  14. Energy Performance Testing of Asetek's RackCDU System at NREL's High Performance Computing Data Center

    Energy Technology Data Exchange (ETDEWEB)

    Sickinger, D.; Van Geet, O.; Ravenscroft, C.

    2014-11-01

    In this study, we report on the first tests of Asetek's RackCDU direct-to-chip liquid cooling system for servers at NREL's ESIF data center. The system was simple to install on the existing servers and integrated directly into the data center's existing hydronics system. The focus of this study was to explore the total cooling energy savings and potential for waste-heat recovery of this warm-water liquid cooling system. RackCDU captured up to 64% of server heat into the liquid stream at an outlet temperature of 89 degrees F, and 48% at outlet temperatures approaching 100 degrees F. This system was designed to capture heat from the CPUs only, indicating a potential for increased heat capture if memory cooling were included. Reduced temperatures inside the servers caused all fans to reduce power to the lowest possible BIOS setting, indicating further energy savings potential if additional fan control is included. Preliminary studies manually reducing fan speed (and even removing fans) validated this potential saving but could not be optimized for these working servers. The Asetek direct-to-chip liquid cooling system has been in operation with users for 16 months with no maintenance required and no leaks.

  15. Casa de la juventud, en Stuttgart, Alemania

    Directory of Open Access Journals (Sweden)

    Ellsässer, Karl

    1966-06-01

    Full Text Available The project consists of three buildings joined together. The main building houses woodwork and metalwork shops, photographic laboratories, a ceramics workshop, and various secondary installations in the basement. The ground floor has a number of games rooms, a room for the youth leader, and the manager's flat. The top floor is occupied by the library, a hall for craftwork, and services. The second building houses, on its lower level, a cafeteria-bar, dining room, and drawing room; on the first floor there is a multi-purpose hall, used for film showings, lectures, and similar events, seating 180 people. The third building contains the table-tennis hall and a bicycle store. The construction is light and simple, executed with careful, high-quality workmanship following modern design principles and materials.

  16. Evaluation of Orthopedic Metal Artifact Reduction Application in Three-Dimensional Computed Tomography Reconstruction of Spinal Instrumentation: A Single Saudi Center Experience.

    Science.gov (United States)

    Ali, Amir Monir

    2018-01-01

    The aim of the study was to evaluate the commercially available orthopedic metal artifact reduction (OMAR) technique in postoperative three-dimensional computed tomography (3DCT) reconstruction studies after spinal instrumentation and to investigate its clinical application. One hundred and twenty (120) patients with spinal metallic implants were included in the study. All had 3DCT reconstruction examinations using the OMAR software after informed consent was obtained and the institutional ethics committee had approved the study. The degree of the artifacts, the related muscular density, the clearness of intermuscular fat planes, and definition of the adjacent vertebrae were qualitatively evaluated. The diagnostic satisfaction and quality of the 3D reconstruction images were thoroughly assessed. The majority (96.7%) of the 3DCT reconstruction images were considered satisfactory to excellent for diagnosis. Only 3.3% of the reconstructed images had unacceptable diagnostic quality. OMAR can effectively reduce metallic artifacts in patients with spinal instrumentation, yielding highly diagnostic 3DCT reconstruction images.

  17. Computed tomography-guided percutaneous trephine removal of the nidus in osteoid osteoma patients: experience of a single center in Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Petrilli, Marcelo; Senerchia, Andreza Almeida; Petrilli, Antonio Sergio; Lederman, Henrique Manoel; Garcia Filho, Reynaldo Jesus, E-mail: andrezasenerchia@hotmail.com [Universidade Federal de Sao Paulo (UNIFESP), Sao Paulo, SP (Brazil). Instituto de Oncologia Pediatrica

    2015-07-15

    Objective: To report the results of computed tomography (CT)-guided percutaneous resection of the nidus in 18 cases of osteoid osteoma. Materials and Methods: The medical records of 18 cases of osteoid osteoma in children, adolescents and young adults, who underwent CT-guided removal of the nidus between November 2004 and March 2009, were reviewed retrospectively for demographic data, lesion site, clinical outcome and complications after the procedure. Results: Clinical follow-up was available for all cases at a median of 29 months (range 6-60 months). No persistence of pre-procedural pain was noted in 17 patients. Only one patient experienced recurrence of symptoms 12 months after percutaneous resection, and was successfully retreated by the same technique, resulting in a secondary success rate of 18/18 (100%). Conclusion: CT-guided removal or destruction of the nidus is a safe and effective alternative to surgical resection of the osteoid osteoma nidus. (author)

  18. A 3-Month Randomized Controlled Pilot Trial of a Patient-Centered, Computer-Based Self-Monitoring System for the Care of Type 2 Diabetes Mellitus and Hypertension.

    Science.gov (United States)

    Or, Calvin; Tao, Da

    2016-04-01

    This study was performed to evaluate the effects of a patient-centered, tablet computer-based self-monitoring system for chronic disease care. A 3-month randomized controlled pilot trial was conducted to compare the use of a computer-based self-monitoring system in disease self-care (intervention group; n = 33) with a conventional self-monitoring method (control group; n = 30) in patients with type 2 diabetes mellitus and/or hypertension. The system was equipped with a 2-in-1 blood glucose and blood pressure monitor, a reminder feature, and video-based educational materials for the care of the two chronic diseases. The control patients were given only the 2-in-1 monitor for self-monitoring. The outcomes reported here included the glycated hemoglobin (HbA1c) level, fasting blood glucose level, systolic blood pressure, diastolic blood pressure, chronic disease knowledge, and frequency of self-monitoring. The data were collected at baseline and at 1-, 2-, and 3-month follow-up visits. The patients in the intervention group had a significant decrease in mean systolic blood pressure from baseline to 1 month. Both computer-assisted and conventional disease self-monitoring appear to be useful to support/maintain blood pressure and diabetes control. The beneficial effects of the use of electronic self-care resources and support provided via mobile technologies require further confirmation in longer-term, larger trials.

  19. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  20. Evidence of seasonal variation in longitudinal growth of height in a sample of boys from Stuttgart Carlsschule, 1771-1793, using combined principal component analysis and maximum likelihood principle.

    Science.gov (United States)

    Lehmann, A; Scheffler, Ch; Hermanussen, M

    2010-02-01

    Recent progress in modelling individual growth has been achieved by combining the principal component analysis and the maximum likelihood principle. This combination models growth even in incomplete sets of data and in data obtained at irregular intervals. We re-analysed late 18th century longitudinal growth of German boys from the boarding school Carlsschule in Stuttgart. The boys, aged 6-23 years, were measured at irregular 3-12 monthly intervals during the period 1771-1793. At the age of 18 years, mean height was 1652 mm, but height variation was large. The shortest boy reached 1474 mm, the tallest 1826 mm. Measured height closely paralleled modelled height, with a mean difference of 4 mm (SD 7 mm). Seasonal height variation was found. Low growth rates occurred in spring and high growth rates in summer and autumn. The present study demonstrates that combining the principal component analysis and the maximum likelihood principle also enables growth modelling in historic height data. Copyright (c) 2009 Elsevier GmbH. All rights reserved.
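
    A minimal sketch of the modelling idea, under the assumption that maximum likelihood estimation with Gaussian measurement noise reduces to least squares for the individual component scores (the function and variable names are illustrative, not the authors' code):

```python
# Model an individual growth curve as the population mean curve plus a few
# principal components; fit the component scores by maximum likelihood
# (least squares under Gaussian noise). Because only K scores have to be
# estimated, incomplete records measured at irregular ages can be handled.
import numpy as np

def fit_individual(ages_obs, heights_obs, age_grid, mean_curve, pcs):
    """Estimate PC scores from irregularly spaced measurements.

    age_grid   : (G,) reference ages for mean_curve and pcs
    mean_curve : (G,) population mean height at each reference age
    pcs        : (K, G) principal component curves
    """
    # interpolate the model curves to the ages actually observed
    mu = np.interp(ages_obs, age_grid, mean_curve)
    X = np.vstack([np.interp(ages_obs, age_grid, pc) for pc in pcs]).T
    # ML / least-squares estimate of the individual's component scores
    scores, *_ = np.linalg.lstsq(X, heights_obs - mu, rcond=None)
    # modelled height over the full age range
    return mean_curve + scores @ pcs

# e.g. a boy measured only at 7.2, 8.0 and 9.5 years can still be
# modelled over the full 6-23 year range.
```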

  1. Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505

    Energy Technology Data Exchange (ETDEWEB)

    Robert W. Numrich

    2008-04-22

    The major accomplishment of this project is the production of CafLib, an 'object-oriented' parallel numerical library written in Co-Array Fortran. CafLib contains distributed objects such as block vectors and block matrices along with procedures, attached to each object, that perform basic linear algebra operations such as matrix multiplication, matrix transpose and LU decomposition. It also contains constructors and destructors for each object that hide the details of data decomposition from the programmer, and it contains collective operations that allow the programmer to calculate global reductions, such as global sums, global minima and global maxima, as well as vector and matrix norms of several kinds. CafLib is designed to be extensible in such a way that programmers can define distributed grid and field objects, based on vector and matrix objects from the library, for finite difference algorithms to solve partial differential equations. A very important extra benefit that resulted from the project is the inclusion of the co-array programming model in the next Fortran standard called Fortran 2008. It is the first parallel programming model ever included as a standard part of the language. Co-arrays will be a supported feature in all Fortran compilers, and the portability provided by standardization will encourage a large number of programmers to adopt it for new parallel application development. The combination of object-oriented programming in Fortran 2003 with co-arrays in Fortran 2008 provides a very powerful programming model for high-performance scientific computing. Additional benefits from the project, beyond the original goal, include a program to provide access to the co-array model through the Cray compiler as a resource for teaching and research. Several academics, for the first time, included the co-array model as a topic in their courses on parallel computing. A separate collaborative project with LANL and PNNL showed how to
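
    CafLib itself is Co-Array Fortran and is not reproduced here, but the collective-reduction pattern it provides (each image contributes a local partial result and every image receives the global value) can be loosely illustrated with MPI from Python. This is an analogy only, not CafLib's interface:

```python
# Analogy in Python/MPI for CafLib-style collective operations on a
# distributed block vector: a global sum and an infinity norm.
# Run with, e.g., `mpiexec -n 4 python blocksum.py` (mpi4py required).
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# each rank owns one block of the distributed vector (made-up data)
local_block = np.arange(rank * 4, rank * 4 + 4, dtype=float)

# every rank contributes its partial result and receives the global one
global_sum = comm.allreduce(local_block.sum(), op=MPI.SUM)
global_inf_norm = comm.allreduce(np.abs(local_block).max(), op=MPI.MAX)

if rank == 0:
    print(global_sum, global_inf_norm)
```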

  2. Is Whole-Body Computed Tomography the Standard Work-up for Severely-Injured Children? Results of a Survey among German Trauma Centers.

    Science.gov (United States)

    Bayer, J; Reising, K; Kuminack, K; Südkamp, N P; Strohm, P C

    2015-01-01

    Whole-body computed tomography is accepted as the standard procedure in the primary diagnostic work-up of polytraumatized adults in the emergency room. Up to now, there is still controversial discussion about the same algorithm in the primary diagnostic work-up of children. The aim of this study was to survey the participation of German trauma centers in the care of polytraumatized children and the hospital-dependent use of whole-body computed tomography for initial patient work-up. A questionnaire was mailed to every Department of Traumatology registered in the DGU (German Trauma Society) databank. We received 60.32% of the questionnaires sent, and after applying exclusion criteria 269 (53.91%) were applicable to statistical analysis. In the three-tiered German hospital system, no statistical difference was seen in the general participation in children's polytrauma care between hospitals of different tiers (p = 0.315). Even at the lowest hospital level, 69.47% of hospitals stated that they participate in polytrauma care for children; at the intermediate and highest level hospitals, 91.89% and 95.24%, respectively, stated that they are involved in children's polytrauma care. Children with suspected multiple injuries or polytrauma received significantly fewer primary whole-body CTs in lowest level compared with intermediate level hospitals (36.07% vs. 56.57%; p = 0.015) and in lowest level compared with highest level hospitals (36.07% vs. 68.42%; p = 0.001). Comparing the use of whole-body CT in intermediate and highest level hospitals, a nonsignificant increase in its use was seen in highest level hospitals (56.57% vs. 68.42%; p = 0.174). According to our survey, taking care of polytraumatized children in Germany is not limited to specialized hospitals or a defined hospital level of care. Additionally, there is no established radiologic standard in the work-up of the polytraumatized child. However, in higher hospital care levels a higher percentage of hospitals employs whole-body CTs for primary

  3. Electricity Infrastructure Operations Center (EIOC)

    Data.gov (United States)

    Federal Laboratory Consortium — The Electricity Infrastructure Operations Center (EIOC) at PNNL brings together industry-leading software, real-time grid data, and advanced computation into a fully...

  4. Usage Center

    DEFF Research Database (Denmark)

    Kleinaltenkamp, Michael; Plewa, Carolin; Gudergan, Siegfried

    2017-01-01

    Purpose: The purpose of this paper is to advance extant theorizing around resource integration by conceptualizing and delineating the notion of a usage center. A usage center consists of a combination of interdependent actors that draw on resources across their individual usage processes to create value…

  5. Spelling is just a click away – a user-centered brain-computer interface including auto-calibration and predictive text entry

    Directory of Open Access Journals (Sweden)

    Tobias eKaufmann

    2012-05-01

    Full Text Available Brain-Computer Interfaces (BCIs) based on event-related potentials (ERPs) allow for selection of characters from a visually presented character-matrix and thus provide a communication channel for users with neurodegenerative disease. Although they have been a topic of research for more than 20 years and have repeatedly been proven to be a reliable communication method, BCIs are almost exclusively used in experimental settings, handled by qualified experts. This study investigates whether ERP-BCIs can be handled independently by laymen without expert support, which is essential for establishing BCIs in end-users' daily-life situations. Furthermore we compared the classic character-by-character text entry against a predictive text entry (PTE) that directly incorporates predictive text into the character matrix. N = 19 BCI novices handled a user-centred ERP-BCI application on their own without expert support. The software individually adjusted classifier weights and control parameters in the background, invisible to the user (auto-calibration). All participants were able to operate the software on their own and to twice correctly spell a sentence with the auto-calibrated classifier (once with PTE, once without). Our PTE increased spelling speed and, importantly, did not reduce accuracy. In sum, this study demonstrates the feasibility of auto-calibrated ERP-BCI use by laymen without expert support, and the strong benefit of integrating predictive text directly into the character matrix.

  6. Short-term outcomes and safety of computed tomography-guided percutaneous microwave ablation of solitary adrenal metastasis from lung cancer: A multi-center retrospective study

    Energy Technology Data Exchange (ETDEWEB)

    Men, Min; Ye, Xin; Yang, Xia; Zheng, Aimin; Huang, Guang Hui; Wei, Zhigang [Dept. of Oncology, Shandong Provincial Hospital Affiliated with Shandong University, Jinan (China); Fan, Wei Jun [Imaging and Interventional Center, Sun Yat-sen University Cancer Center, Guangzhou (China); Zhang, Kaixian [Dept. of Oncology, Teng Zhou Central People' s Hospital Affiliated with Jining Medical College, Tengzhou (China); Bi, Jing Wang [Dept. of Oncology, Jinan Military General Hospital of Chinese People' s Liberation Army, Jinan (China)

    2016-11-15

    To retrospectively evaluate the short-term outcomes and safety of computed tomography (CT)-guided percutaneous microwave ablation (MWA) of solitary adrenal metastasis from lung cancer. From May 2010 to April 2014, 31 patients with unilateral adrenal metastasis from lung cancer who were treated with CT-guided percutaneous MWA were enrolled. This study was conducted with approval from the local institutional review board. Clinical outcomes and complications of MWA were assessed. The tumors ranged from 1.5 to 5.4 cm in diameter. After a median follow-up period of 11.1 months, the primary efficacy rate was 90.3% (28/31). Local tumor progression was detected in 7 (22.6%) of 31 cases. The median overall survival time was 12 months. The 1-year overall survival rate was 44.3%. Median local tumor progression-free survival time was 9 months. The local tumor progression-free survival rate was 77.4%. Of 36 MWA sessions, two (5.6%) had major complications (hypertensive crisis). CT-guided percutaneous MWA may be fairly safe and effective for treating solitary adrenal metastasis from lung cancer.

  7. Relap4/SAS/Mod5 - A version of Relap4/Mod 5 adapted to IPEN/CNEN - SP computer center

    International Nuclear Information System (INIS)

    Sabundjian, G.

    1988-04-01

    In order to improve the safety of nuclear power plants, several computer codes have been developed in the area of thermal-hydraulic accident analysis. Among the publicly available codes, RELAP4, developed by Aerojet Nuclear Company, has been the most popular one. RELAP4 has produced satisfactory results when compared with most of the available experimental data. The purposes of the present work are: optimization of RELAP4 output and messages, by writing this information to temporary records; and display of RELAP4 results in graphical form through the printer. The sample problem consists of a simplified model of a 150 MW(e) PWR whose primary circuit is simulated by 6 volumes, 8 junctions and 1 heat slab. This new version of RELAP4 (named RELAP4/SAS/MOD5) has produced results which show that the above-mentioned purposes have been reached. Obviously, the graphical output by RELAP4/SAS/MOD5 favors the interpretation of results by the user. (author) [pt

  8. Sensitivity of endoscopic ultrasound, multidetector computed tomography, and magnetic resonance cholangiopancreatography in the diagnosis of pancreas divisum: a tertiary center experience.

    Science.gov (United States)

    Kushnir, Vladimir M; Wani, Sachin B; Fowler, Kathryn; Menias, Christine; Varma, Rakesh; Narra, Vamsi; Hovis, Christine; Murad, Faris M; Mullady, Daniel K; Jonnalagadda, Sreenivasa S; Early, Dayna S; Edmundowicz, Steven A; Azar, Riad R

    2013-04-01

    There are limited data comparing imaging modalities in the diagnosis of pancreas divisum. We aimed to: (1) evaluate the sensitivity of endoscopic ultrasound (EUS), magnetic resonance cholangiopancreatography (MRCP), and multidetector computed tomography (MDCT) for pancreas divisum; and (2) assess interobserver agreement (IOA) among expert radiologists for detecting pancreas divisum on MDCT and MRCP. For this retrospective cohort study, we identified 45 consecutive patients with pancreaticobiliary symptoms and pancreas divisum established by endoscopic retrograde pancreatography who underwent EUS and cross-sectional imaging. The control group was composed of patients without pancreas divisum who underwent endoscopic retrograde pancreatography and cross-sectional imaging. The sensitivity of EUS for pancreas divisum was 86.7%, significantly higher than the sensitivity reported in the medical records for MDCT (15.5%) or MRCP (60%). On review by expert radiologists, the sensitivity of MDCT for pancreas divisum improved; IOA was moderate (κ = 0.43). Endoscopic ultrasound is a sensitive test for diagnosing pancreas divisum and is superior to MDCT and MRCP. Review of MDCT studies by expert radiologists substantially raises its sensitivity for pancreas divisum.

  9. Sensitivity of Endoscopic Ultrasound, Multidetector Computed Tomography and Magnetic Resonance Cholangiopancreatography in the Diagnosis of Pancreas Divisum: A Tertiary Center Experience

    Science.gov (United States)

    Kushnir, Vladimir M.; Wani, Sachin B.; Fowler, Kathryn; Menias, Christine; Varma, Rakesh; Narra, Vamsi; Hovis, Christine; Murad, Faris; Mullady, Daniel; Jonnalagadda, Sreenivasa S.; Early, Dayna S.; Edmundowicz, Steven A.; Azar, Riad R.

    2014-01-01

    OBJECTIVES There are limited data comparing imaging modalities in the diagnosis of pancreas divisum. We aimed to: 1. Evaluate the sensitivity of endoscopic ultrasound (EUS), magnetic resonance cholangiopancreatography (MRCP) and multidetector computed tomography (MDCT) for pancreas divisum. 2. Assess interobserver agreement (IOA) among expert radiologists for detecting pancreas divisum on MDCT and MRCP. METHODS For this retrospective cohort study, we identified 45 consecutive patients with pancreaticobiliary symptoms and pancreas divisum established by endoscopic retrograde pancreatography (ERP) who underwent EUS and cross-sectional imaging. The control group was composed of patients without pancreas divisum who underwent ERP and cross-sectional imaging. RESULTS The sensitivity of EUS for pancreas divisum was 86.7%, significantly higher than the sensitivity reported in the medical records for MDCT (15.5%) or MRCP (60%). On review by expert radiologists, the sensitivity of MDCT for pancreas divisum improved; IOA was moderate (κ = 0.43). CONCLUSIONS EUS is a sensitive test for diagnosing pancreas divisum and is superior to MDCT and MRCP. Review of MDCT studies by expert radiologists substantially raises its sensitivity for pancreas divisum. PMID:23211370

  10. Computing Cost Price by Using the Activity-Based Costing (ABC) Method in the Dialysis Ward of Shahid Rajaei Medical & Education Center, Alborz University of Medical Sciences, Karaj, in 2015

    Directory of Open Access Journals (Sweden)

    H. Derafshi

    2016-08-01

    Full Text Available Background: Analysis of hospital costs is one of the key subjects for resource allocation. Activity-based costing is an applicable tool for recognizing accurate costs, and this technique helps to determine them. The aim of this study is to use the activity-based costing method to estimate the cost of the dialysis unit of Shahid Rajaei hospital in 2015. Methods: This is an applied, cross-sectional descriptive study. The required data were collected from the dialysis unit, the accounting unit, discharge records, and the medical equipment inventory of Shahid Rajaei hospital for the first six months of 2015, and costs were calculated with Excel software. Results and Conclusion: Each month, an average of 1,238 patients were admitted to receive dialysis services in Shahid Rajaei hospital. Consumable materials accounted for 47.6% of costs, the largest share of allocated costs. The lowest cost related to insurance deductions, at about 2.27%. After calculating the various costs of dialysis services, we found that personnel costs cover only 32% of the total cost, and other ongoing overhead costs are about 11.94% of the total. Each dialysis session therefore costs 2,017,131 rials, whereas the tariff for a dialysis session is 1,838,871 rials, so the center loses 178,260 rials per session. The results show that the cost of providing a dialysis service exceeds its revenue in Shahid Rajaei hospital. It seems that reforming consumable procurement processes, revising chronic dialysis tariffs (especially for the filter and consumable materials), and controlling human resource costs could decrease the unit's costs; in view of these results, drawing on the capacity of the private sector is also recommended.
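
    A quick arithmetic check of the per-session figures quoted above (amounts in rials; treating each monthly admission as a single session is an assumption made here to obtain a lower bound, not a figure from the abstract):

```python
# Verify the reported per-session shortfall and project a monthly lower bound.
cost_per_session = 2_017_131    # rials, reported unit cost per session
tariff_per_session = 1_838_871  # rials, reported tariff per session
loss_per_session = cost_per_session - tariff_per_session
assert loss_per_session == 178_260  # matches the reported loss

patients_per_month = 1238  # average monthly admissions (from the abstract)
# lower-bound monthly loss, assuming one session per admitted patient
print(f"monthly loss >= {loss_per_session * patients_per_month:,} rials")
# -> monthly loss >= 220,685,880 rials
```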

  11. Center for Coastline Security Technology, Year-2

    National Research Council Canada - National Science Library

    Glegg, Stewart; Glenn, William; Furht, Borko; Beaujean, P. P; Frisk, G; Schock, S; VonEllenrieder, K; Ananthakrishnan, P; An, E; Granata, R

    2007-01-01

    ...), the Imaging Technology Center, the Department of Computer Science and Engineering, and the University Consortium for Intermodal Transportation Safety and Security at Florida Atlantic University...

  12. Energy efficient data centers

    Energy Technology Data Exchange (ETDEWEB)

    Tschudi, William; Xu, Tengfang; Sartor, Dale; Koomey, Jon; Nordman, Bruce; Sezgen, Osman

    2004-03-30

    …through extensive participation with data center professionals, examination of case study findings, and participation in data center industry meetings and workshops. Industry partners enthusiastically provided valuable insight into current practice, and helped to identify areas where additional public interest research could lead to significant efficiency improvement. This helped to define and prioritize the research agenda. The interaction involved industry representatives with expertise in all aspects of data center facilities, including specialized facility infrastructure systems and computing equipment. In addition to the input obtained through industry workshops, LBNL's participation in a three-day, comprehensive design 'charrette' hosted by the Rocky Mountain Institute (RMI) yielded a number of innovative ideas for future research.

  13. Computer Center CDC Libraries/NSRDC (Subprograms).

    Science.gov (United States)

    1981-02-01

    NEXT PARAMETER FROM USER-SUPPLIED PARAMETER STRING FUNCTIONAL CATEGORIES: M4 USAGE CALL EXTPRM (IAREA, LAREA, IPARM, ISEP) CALL EXTPRM (IAREA, LAREA...INTEGER.) IPARM - OUT - NEXT PARAMETER, LEFT-JUSTIFIED, ZERO-FILLED ISEP - OUT - IF PRESENT, CODE INDICATING TYPE OF SEPARATOR FOUND FOLLOWING THE...CATEGORIES: M4 USAGE CALL PARGET (IAREA, LAREA, IPARAM, NPARAM, ISEP, RSEP, LSEP) CALL PARGET (IAREA, LAREA, IPARAM, NPARAM, ISEP, RSEP) CALL PARGET (IAREA

  14. Computer Center CDC Libraries/NSRD (Subprograms).

    Science.gov (United States)

    1984-06-01

    SUBROUTINE MUST BE RE-INITIALIZED USING EITHER THE THIRD OR FOURTH FORM OF THE CALL. USAGE CALL EXTPRM (IAREA, LAREA, IPARM, ISEP) CALL EXTPRM (IAREA, LAREA...INTEGER.) IPARM - OUT - NEXT PARAMETER, LEFT-JUSTIFIED, ZERO-FILLED ISEP - OUT - IF PRESENT, CODE INDICATING TYPE OF SEPARATOR FOUND FOLLOWING THE...SYSTEMS) CDC 6000/CYBER 170 (NOS/BE) REMARKS NONE USAGE CALL PARGET (IAREA, LAREA, IPARAM, NPARAM, ISEP, RSEP, LSEP) CALL PARGET (IAREA, LAREA, IPARAM

  15. Optics in computers, servers and data centers

    NARCIS (Netherlands)

    Dorren, H.J.S.; Duan, P.; Raz, O.; Luijten, R.; Glebov, A.L.; Chen, R.T.

    2012-01-01

    Based on well-known laws of physics, a lower bound on the energy-per-bit required for transmitting information using a photonic channel is established. The analysis includes the energy required to convert information from the electronic to the photonic domain and back. We investigate links that

  16. High-End Scientific Computing

    Science.gov (United States)

    EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.

  17. Center for Adaptive Optics

    Science.gov (United States)

    UCSC's CfAO and ISEE, together with Maui Community College, run education and internship programs.

  18. Computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1985-12-01

    The development of numerical models for plasma phenomena and magnetic confinement devices is discussed. The multidimensional Fokker-Planck and transport codes are applied to toroidal, mirror, and compact toroid devices. Linear and nonlinear resistive magnetohydrodynamics in two and three dimensions are used in the investigation of various fusion devices. 362 refs., 4 tabs

  19. Outline of computer application in PNC

    International Nuclear Information System (INIS)

    Aoki, Minoru

    1990-01-01

    Computer application systems are an important resource for research and development (R and D) in PNC. Various types of computer systems are widely used in the R and D of experiments, evaluation and analysis, plant operation, and other jobs in PNC. Currently, computer centers have been established in PNC at the Oarai Engineering Center and the Tokai Works. The former uses a large-scale digital computer and supercomputer systems; the latter uses only a large-scale digital computer system. These computer systems are joined in the PNC Information Network, which connects the Head Office and the branches (Oarai, Tokai, Ningyotoge and Fugen) by means of a super digital circuit. In the near future, the computer centers will be brought together in order to raise the efficiency of operation of the computer systems. A new computer center called the 'Information Center' is under construction at the Oarai Engineering Center. (author)

  20. Computer Operating System Maintenance.

    Science.gov (United States)

    1982-06-01

    The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on...computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  1. Relationship between lung function and quantitative computed tomographic parameters of airway remodeling, air trapping, and emphysema in patients with asthma and chronic obstructive pulmonary disease: A single-center study.

    Science.gov (United States)

    Hartley, Ruth A; Barker, Bethan L; Newby, Chris; Pakkal, Mini; Baldi, Simonetta; Kajekar, Radhika; Kay, Richard; Laurencin, Marie; Marshall, Richard P; Sousa, Ana R; Parmar, Harsukh; Siddiqui, Salman; Gupta, Sumit; Brightling, Chris E

    2016-05-01

    There is a paucity of studies comparing asthma and chronic obstructive pulmonary disease (COPD) based on thoracic quantitative computed tomographic (QCT) parameters. We sought to compare QCT parameters of airway remodeling, air trapping, and emphysema between asthmatic patients and patients with COPD and explore their relationship with airflow limitation. Asthmatic patients (n = 171), patients with COPD (n = 81), and healthy subjects (n = 49) recruited from a single center underwent QCT and clinical characterization. Proximal airway percentage wall area (%WA) was significantly increased in asthmatic patients (62.5% [SD, 2.2]) and patients with COPD (62.7% [SD, 2.3]) compared with that in healthy control subjects (60.3% [SD, 2.2]). Emphysema assessed based on lung density measured by using Hounsfield units below which 15% of the voxels lie (Perc15) was a feature of COPD only (patients with COPD: mean, -964 [SD, 19.62] vs asthmatic patients: mean, -937 [SD, 22.7] and healthy subjects: mean, -937 [SD, 17.1], P < .001). Multiple regression analyses showed that the strongest predictor of lung function impairment in asthmatic patients was %WA, whereas in the subgroup of patients with COPD and asthmatic patients with a postbronchodilator FEV1 of less than 80% of the predicted value, it was air trapping. Factor analysis of QCT parameters in asthmatic patients and patients with COPD combined determined 3 components, with %WA, air trapping, and Perc15 values being the highest loading factors. Cluster analysis identified 3 clusters with mild, moderate, or severe lung function impairment with corresponding decreased lung density (Perc15 values) and increased air trapping. In asthmatic patients and patients with COPD, lung function impairment is strongly associated with air trapping, with a contribution from proximal airway narrowing in asthmatic patients. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. [Academic procrastination in clients of a psychotherapeutic student counselling center].

    Science.gov (United States)

    Jamrozinski, Katja; Kuda, Manfred; Mangholz, Astrid

    2009-01-01

    The start of university education is the beginning of a new phase of life for young adults, one which requires significant psychosocial adjustments. Sociobiographical data, clinical symptoms, characteristics of education, work attitude, and career perspectives were gathered from 152 clients of a psychotherapeutic student counselling center to evaluate characteristics of students with and without academic procrastination. The procrastination group comprised a higher number of students who had changed universities, and of people with suboptimal career prospects and career targets. These subjects were more often male and showed increased incidences of drug and alcohol problems, as well as a lack of planning for the future. Furthermore, they self-financed a larger share of their studies. On the basis of these results, concrete recommendations are presented for preventive measures to improve on-time completion of studies and to prevent student drop-out. Georg Thieme Verlag KG Stuttgart-New York.

  3. The Soviet center of astronomical data

    International Nuclear Information System (INIS)

    Dluzhnevskaya, O.B.

    1982-01-01

    On the basis of the current French-Soviet cooperation in science and technology, the Astronomical Council of the U.S.S.R. Academy of Sciences and the Strasbourg Center signed an agreement in 1977 on setting up the Soviet Center of Astronomical Data as its filial branch. The Soviet Center was created on the basis of a computation center at the Zvenigorod station of the Astronomical Council of the U.S.S.R. Academy of Sciences, which already had considerable experience of working with stellar catalogues. In 1979 the Center was equipped with an EC-1033 computer. In 1978-1979 the Soviet Center of Astronomical Data (C.A.D.) received 96 of the most important catalogues from Strasbourg. By September 1981 the list of catalogues available at the Soviet Center had reached 140, some of which are described. (Auth.)

  4. Colorado Learning Disabilities Research Center.

    Science.gov (United States)

    DeFries, J. C.; And Others

    1997-01-01

    Results obtained from the center's six research projects are reviewed, including research on psychometric assessment of twins with reading disabilities, reading and language processes, attention deficit-hyperactivity disorder and executive functions, linkage analysis and physical mapping, computer-based remediation of reading disabilities, and…

  5. Rapid guiding center calculations

    International Nuclear Information System (INIS)

    White, R.B.

    1995-04-01

    Premature loss of high energy particles, and in particular fusion alpha particles, is very deleterious in a fusion reactor. Because of this it is necessary to make long-time simulations, on the order of the alpha particle slowing down time, with a number of test particles sufficient to give predictions with reasonable statistical accuracy. Furthermore it is desirable to do this for a large number of equilibria with different characteristic magnetic field ripple, to best optimize engineering designs. In addition, modification of the particle distribution due to magnetohydrodynamic (MHD) modes such as the saw tooth mode present in the plasma can be important, and this effect requires additional simulation. Thus the large number of necessary simulations means any increase of computing speed in guiding center codes is an important improvement in predictive capability. Previous guiding center codes using numerical equilibria such as ORBIT evaluated the local field strength and ripple magnitude using Lagrangian interpolation on a grid. Evaluation of these quantities four times per time step (using a fourth order Runge-Kutta routine) constitutes the major computational effort of the code. In the present work the authors represent the field quantities through an expansion in terms of pseudo-cartesian coordinates formed from the magnetic coordinates. The simplicity of the representation gives four important advantages over previous methods
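
    The speed-up at stake is easy to picture: a fourth-order Runge-Kutta push needs four field evaluations per step, so replacing grid interpolation with a cheap series evaluation attacks the dominant cost directly. The following Python sketch illustrates the idea on a toy one-dimensional drift; the field profile, drift law, and expansion degree are illustrative assumptions, not taken from ORBIT.

        import numpy as np

        # Toy setup: field strength samples on a grid, as a numerical
        # equilibrium would provide them.
        x_grid = np.linspace(0.0, 1.0, 257)
        B_grid = 1.0 + 0.1 * np.cos(2 * np.pi * x_grid)

        coeffs = np.polyfit(x_grid, B_grid, deg=8)   # series expansion, done once

        def B_interp(x):
            return np.interp(x, x_grid, B_grid)      # table lookup on every call

        def B_series(x):
            return np.polyval(coeffs, x)             # cheap polynomial evaluation

        def drift(x, B_eval):
            return 0.05 / B_eval(x)                  # toy drift law (assumption)

        def rk4_step(x, dt, B_eval):
            # Four field evaluations per step: this is where the cost sits.
            k1 = drift(x, B_eval)
            k2 = drift(x + 0.5 * dt * k1, B_eval)
            k3 = drift(x + 0.5 * dt * k2, B_eval)
            k4 = drift(x + dt * k3, B_eval)
            return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

        x_a = x_b = 0.3
        for _ in range(1000):
            x_a = rk4_step(x_a, 1e-3, B_interp)
            x_b = rk4_step(x_b, 1e-3, B_series)
        print(x_a, x_b)   # the two field representations agree closely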

  6. ICASE/LaRC/NSF/ARO Workshop, conducted by the Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, The National Science Foundation and the Army Research Office

    CERN Document Server

    Anderson, W

    2000-01-01

    Over the last decade, the role of computational simulations in all aspects of aerospace design has steadily increased. However, despite the many advances, the time required for computations is far too long. This book examines new ideas and methodologies that may, in the next twenty years, revolutionize scientific computing. The book specifically looks at trends in algorithm research, human computer interface, network-based computing, surface modeling and grid generation and computer hardware and architecture. The book provides a good overview of the current state-of-the-art and provides guidelines for future research directions. The book is intended for computational scientists active in the field and program managers making strategic research decisions.

  7. [Imaging center - optimization of the imaging process].

    Science.gov (United States)

    Busch, H-P

    2013-04-01

    Hospitals around the world are under increasing pressure to optimize the economic efficiency of treatment processes. Imaging is responsible for a great part of the success, but also of the costs, of treatment. In routine work, an excessive supply of imaging methods leads to an "as well as" strategy, applied up to the limit of capacity without critical reflection. Exams that have no predictable influence on the clinical outcome are an unjustified burden for the patient. They are useless and threaten the financial situation and existence of the hospital. In recent years the focus of process optimization was exclusively on the quality and efficiency of performed single examinations. In the future, critical discussion of the effectiveness of single exams in relation to the clinical outcome will be more important. Unnecessary exams can be avoided only if, in addition to the optimization of single exams (efficiency), there is an optimization strategy for the total imaging process (efficiency and effectiveness). This requires a new definition of processes (Imaging Pathway), new structures for organization (Imaging Center), and a new kind of thinking on the part of the medical staff. Motivation has to be shifted from gratification for performed exams to gratification for process quality (medical quality, service quality, economics), including the avoidance of additional (unnecessary) exams. © Georg Thieme Verlag KG Stuttgart · New York.

  8. [Text mining, a method for computer-assisted analysis of scientific texts, demonstrated by an analysis of author networks].

    Science.gov (United States)

    Hahn, P; Dullweber, F; Unglaub, F; Spies, C K

    2014-06-01

    Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications concerning a specific subject using text analysis programs are illustrated graphically by two worked examples. © Georg Thieme Verlag KG Stuttgart · New York.
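
    As a rough illustration of the author-network side of such an analysis, the following Python sketch builds a weighted co-authorship graph from a list of publication records. The records, and the choice to weight edges by joint papers, are invented for the example; they do not reproduce the programs used in the article.

        from collections import Counter
        from itertools import combinations

        # Hypothetical publication records: (title, list of authors).
        records = [
            ("Paper A", ["Hahn P", "Unglaub F"]),
            ("Paper B", ["Hahn P", "Spies C K", "Unglaub F"]),
            ("Paper C", ["Dullweber F", "Spies C K"]),
        ]

        # Each shared paper adds one unit of weight to the author pair.
        edges = Counter()
        for _, authors in records:
            for a, b in combinations(sorted(set(authors)), 2):
                edges[(a, b)] += 1

        for (a, b), w in edges.most_common():
            print(f"{a} -- {b}: {w} joint paper(s)")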

  9. Visual problems in young adults due to computer use.

    Science.gov (United States)

    Moschos, M M; Chatziralli, I P; Siasou, G; Papazisis, L

    2012-04-01

    Computer use can cause visual problems. The purpose of our study was to evaluate visual problems due to computer use in young adults. Participants in our study were 87 adults, 48 male and 39 female, with a mean age of 31.3 years (SD 7.6). All participants completed a questionnaire regarding visual problems detected after computer use. The mean daily use of computers was 3.2 hours (SD 2.7). 65.5 % of the participants complained of dry eye, mainly after more than 2.5 hours of computer use. 32 persons (36.8 %) had a foreign body sensation in their eyes, while 15 participants (17.2 %) complained of blurred vision, which caused difficulties in driving, after 3.25 hours of continuous computer use. 10.3 % of the participants sought medical advice for their problem. There was a statistically significant correlation between the frequency of visual problems and the duration of computer use (p = 0.021). 79.3 % of the participants use artificial tears during or after long use of computers so as not to feel any ocular discomfort. The main symptom after computer use in young adults was dry eye. All visual problems were associated with the duration of computer use. Artificial tears play an important role in the treatment of ocular discomfort after computer use. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Performance indicators for call centers with impatience

    NARCIS (Netherlands)

    Jouini, O.; Koole, G.M.; Roubos, A.

    2013-01-01

    An important feature of call center modeling is the presence of impatient customers. This article considers single-skill call centers including customer abandonments. A number of different service-level definitions are structured, including all those used in practice, and the explicit computation of
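
    One service-level definition of the kind the article structures is the fraction of served calls answered within an acceptable waiting time (AWT), reported alongside the abandonment probability. The Python sketch below estimates both by simulating a single-skill M/M/c queue with exponentially distributed patience; every parameter value is an illustrative assumption.

        import heapq, random

        random.seed(1)
        LAM, MU, THETA = 5.0, 1.0, 0.5   # arrival, service, patience rates (per minute)
        C, AWT = 6, 20.0 / 60.0          # number of agents; acceptable wait of 20 s

        def simulate(n_calls=100_000):
            free, queue, events = C, [], []   # queue: (arrival time, abandonment deadline)
            served = served_in_awt = abandoned = arrived = 0
            heapq.heappush(events, (random.expovariate(LAM), "a"))
            while events:
                t, kind = heapq.heappop(events)
                if kind == "a":               # arrival
                    arrived += 1
                    if arrived < n_calls:
                        heapq.heappush(events, (t + random.expovariate(LAM), "a"))
                    queue.append((t, t + random.expovariate(THETA)))
                else:                         # departure frees an agent
                    free += 1
                while free and queue:         # start service, FCFS
                    a_time, deadline = queue.pop(0)
                    if deadline < t:
                        abandoned += 1        # hung up before an agent freed
                        continue
                    free -= 1
                    served += 1
                    served_in_awt += (t - a_time) <= AWT
                    heapq.heappush(events, (t + random.expovariate(MU), "d"))
            return served_in_awt / served, abandoned / arrived

        sl, ab = simulate()
        print(f"P(wait <= AWT | served) = {sl:.3f}, abandonment fraction = {ab:.3f}")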

  11. Analytic reducibility of nondegenerate centers: Cherkas systems

    Directory of Open Access Journals (Sweden)

    Jaume Giné

    2016-07-01

    This article concerns Cherkas polynomial differential systems of the form $\dot{x} = y$, $\dot{y} = P_0(x) + P_1(x)y + P_2(x)y^2$, where $P_i(x)$ are polynomials of degree $n$, $P_0(0)=0$ and $P_0'(0)<0$. Computing the focal values we find the center conditions for such systems for degree $3$, and using modular arithmetic for degree $4$. Finally we state a conjecture about the center conditions for Cherkas polynomial differential systems of degree $n$.

  12. Networking at NASA. Johnson Space Center

    Science.gov (United States)

    Garman, John R.

    1991-01-01

    A series of viewgraphs on computer networks at the Johnson Space Center (JSC) are given. Topics covered include information resource management (IRM) at JSC, the IRM budget by NASA center, networks evolution, networking as a strategic tool, the Information Services Directorate charter, and SSC network requirements, challenges, and status.

  13. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  14. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  15. Magnetic fusion energy and computers

    International Nuclear Information System (INIS)

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups

  16. Virtual Meteorological Center

    Directory of Open Access Journals (Sweden)

    Marius Brinzila

    2007-10-01

    Full Text Available A computer-based virtual meteorological center that can transmit its information over the Internet is presented. Environmental data are collected with a logging field meteorological station. The station collects and automatically saves data on air temperature, relative humidity, pressure, wind speed and direction, rainfall, solar radiation, and air quality. It can also perform sensor tests, analyze historical data, and evaluate statistical information. The novelty of the system is that it can publish data over the Internet using LabVIEW Web Server capabilities and deliver a video signal to the school TV network. The system also performs redundant measurements of temperature and humidity and was improved with new sensors and an original signal-conditioning module.
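
    Two of the station's ideas, redundant measurement and publishing the latest readings over the network, carry over to any tool chain. The Python sketch below simulates both; the original system uses LabVIEW and its Web Server, so the sensor readings, logging interval, and port here are all assumptions for illustration.

        import json, random, threading, time
        from http.server import BaseHTTPRequestHandler, HTTPServer

        latest = {}   # most recent readings, shared with the web handler

        def read_redundant(read_fn, n=2):
            # Redundancy in its simplest form: average n independent sensors.
            return sum(read_fn() for _ in range(n)) / n

        def acquisition_loop():
            while True:
                latest.update({
                    "temperature_C": round(read_redundant(lambda: random.gauss(21.0, 0.2)), 2),
                    "humidity_pct":  round(read_redundant(lambda: random.gauss(45.0, 1.0)), 2),
                    "pressure_hPa":  round(random.gauss(1013.0, 0.5), 1),
                    "timestamp":     time.time(),
                })
                time.sleep(5)   # logging interval (assumed)

        class Handler(BaseHTTPRequestHandler):
            def do_GET(self):
                body = json.dumps(latest).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        threading.Thread(target=acquisition_loop, daemon=True).start()
        HTTPServer(("", 8080), Handler).serve_forever()   # publish over HTTP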

  17. Interactive design center.

    Energy Technology Data Exchange (ETDEWEB)

    Pomplun, Alan R. (Sandia National Laboratories, Livermore, CA)

    2005-07-01

    Sandia's advanced computing resources provide researchers, engineers and analysts with the ability to develop and render highly detailed large-scale models and simulations. To take full advantage of these multi-million data point visualizations, display systems with comparable pixel counts are needed. The Interactive Design Center (IDC) is a second generation visualization theater designed to meet this need. The main display integrates twenty-seven projectors in a 9-wide by 3-high array with a total display resolution of more than 35 million pixels. Six individual SmartBoard displays offer interactive capabilities that include on-screen annotation and touch panel control of the facility's display systems. This report details the design, implementation and operation of this innovative facility.

  18. Security in cloud computing

    OpenAIRE

    Moreno Martín, Oriol

    2016-01-01

    Security in Cloud Computing is becoming a challenge for next generation Data Centers. This project focuses on investigating new security strategies for Cloud Computing systems. Cloud Computing is a recent paradigm to deliver services over the Internet. Businesses grow drastically because of it. Researchers focus their work on it. The rapid access to flexible and low-cost IT resources in an on-demand fashion allows users to avoid planning ahead for provisioning, and enterprises to save money ...

  19. Building a cluster computer for the computing grid of tomorrow

    International Nuclear Information System (INIS)

    Wezel, J. van; Marten, H.

    2004-01-01

    The Grid Computing Centre Karlsruhe takes part in the development, testing, and deployment of hardware and cluster infrastructure, grid computing middleware, and applications for particle physics. The construction of a large cluster computer with thousands of nodes and several PB of data storage capacity is a major task and focus of research. CERN-based accelerator experiments will use GridKa, one of only 8 worldwide Tier-1 computing centers, for their huge computing demands. Computing and storage are already provided for several other running physics experiments on the exponentially expanding cluster. (orig.)

  20. Computer Technology for Industry

    Science.gov (United States)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  1. Iowa Water Center | Iowa Water Center

    Science.gov (United States)

    Advancing the state of water knowledge and management: the Iowa Water Center is part of a nationwide network of university-based water centers created to encourage interdisciplinary water research

  2. How to Bill Your Computer Services.

    Science.gov (United States)

    Dooskin, Herbert P.

    1981-01-01

    A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)
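
    A minimal example of the costing idea, recovering the full operating cost with charges proportional to metered usage, might look like the following Python sketch; the budget and usage figures are invented.

        # Full-cost charge-back: one rate recovers the whole budget.
        def charge_back(total_cost, usage_by_user):
            rate = total_cost / sum(usage_by_user.values())   # cost per CPU-hour
            return {user: rate * hours for user, hours in usage_by_user.items()}

        bills = charge_back(120_000.0, {"physics": 900.0, "admin": 300.0, "library": 50.0})
        for user, amount in bills.items():
            print(f"{user}: ${amount:,.2f}")
        # The bills sum to the full cost, so expenses are spread equitably
        # (here: proportionally) across the users.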

  3. Energy efficient thermal management of data centers

    CERN Document Server

    Kumar, Pramod

    2012-01-01

    Energy Efficient Thermal Management of Data Centers examines energy flow in today's data centers. Particular focus is given to the state-of-the-art thermal management and thermal design approaches now being implemented across the multiple length scales involved. The impact of future trends in information technology hardware, and emerging software paradigms such as cloud computing and virtualization, on thermal management are also addressed. The book explores computational and experimental characterization approaches for determining temperature and air flow patterns within data centers. Thermodynamic analyses using the second law to improve energy efficiency are introduced and used in proposing improvements in cooling methodologies. Reduced-order modeling and robust multi-objective design of next generation data centers are discussed. This book also: Provides in-depth treatment of energy efficiency ideas based on  fundamental heat transfer, fluid mechanics, thermodynamics, controls, and computer science Focus...

  4. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  5. Stephenson Cancer Center

    Science.gov (United States)

    Stephenson Cancer Center at the University of Oklahoma in Oklahoma City is an NCI-designated cancer center at the forefront of NCI-supported cancer research. Learn more about the Stephenson Cancer Center's mission.

  6. Carbon Monoxide Information Center

    Medline Plus

    Full Text Available Carbon monoxide, known as the "Invisible Killer," ... Install a CO alarm and check its batteries regularly.

  7. Carbon Monoxide Information Center

    Medline Plus

    Full Text Available Carbon monoxide, also known as CO, is called the "Invisible Killer" ...

  8. Computed tomography system

    International Nuclear Information System (INIS)

    Lambert, T.W.; Blake, J.E.

    1981-01-01

    This invention relates to computed tomography and is particularly concerned with determining the CT numbers of zones of interest in an image displayed on a cathode ray tube which zones lie in the so-called level or center of the gray scale window. (author)
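
    For readers unfamiliar with the terminology, the gray scale window maps CT numbers (Hounsfield units) to display gray levels with a linear ramp defined by the window's center (level) and width; a zone of interest "lies in the center of the window" when its CT number is near the level. The Python sketch below uses assumed level/width values for illustration and is not drawn from the patent itself.

        import numpy as np

        def window_to_gray(hu, level=40.0, width=400.0, n_gray=256):
            # Linear ramp from (level - width/2) to (level + width/2),
            # clipped to the displayable gray levels.
            lo = level - width / 2.0
            g = (np.asarray(hu, dtype=float) - lo) / width
            return np.clip(np.round(g * (n_gray - 1)), 0, n_gray - 1).astype(int)

        roi = np.array([[35.0, 42.0], [38.0, 45.0]])   # CT numbers in a zone of interest
        print("mean CT number:", roi.mean())           # ~40 HU, i.e. at the window level
        print("gray levels:\n", window_to_gray(roi))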

  9. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    Open-access article. The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network while offloading cloud data centers and reducing service latency for end users. However, the characteristics of fog computing give rise to new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  10. Womens Business Center

    Data.gov (United States)

    Small Business Administration — Women's Business Centers (WBCs) represent a national network of nearly 100 educational centers throughout the United States and its territories, which are designed...

  11. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  12. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  13. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  14. Management Needs for Computer Support.

    Science.gov (United States)

    Irby, Alice J.

    University management has many and varied needs for effective computer services in support of their processing and information functions. The challenge for the computer center managers is to better understand these needs and assist in the development of effective and timely solutions. Management needs can range from accounting and payroll to…

  15. Data Center Consolidation: A Step towards Infrastructure Clouds

    Science.gov (United States)

    Winter, Markus

    Application service providers face enormous challenges and rising costs in managing and operating a growing number of heterogeneous system and computing landscapes. Limitations of traditional computing environments force IT decision-makers to reorganize computing resources within the data center, as continuous growth leads to an inefficient utilization of the underlying hardware infrastructure. This paper discusses a way for infrastructure providers to improve data center operations based on the findings of a case study on resource utilization of very large business applications and presents an outlook beyond server consolidation endeavors, transforming corporate data centers into compute clouds.
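
    Server consolidation of this kind is, at its core, a bin-packing problem: place workloads on as few hosts as possible without exceeding a utilization cap. The Python sketch below uses first-fit decreasing, one standard heuristic; the demand figures and the 0.8 cap are assumptions, not data from the case study.

        # First-fit decreasing: sort workloads by peak demand, place each
        # on the first host with room, opening a new host only when needed.
        def consolidate(demands, host_capacity=0.8):
            hosts = []
            for d in sorted(demands, reverse=True):
                for h in hosts:
                    if sum(h) + d <= host_capacity:
                        h.append(d)
                        break
                else:
                    hosts.append([d])
            return hosts

        workloads = [0.42, 0.07, 0.31, 0.12, 0.05, 0.23, 0.18, 0.09]
        for i, h in enumerate(consolidate(workloads)):
            print(f"host {i}: load {sum(h):.2f} <- {h}")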

  16. The Development of University Computing in Sweden 1965-1985

    Science.gov (United States)

    Dahlstrand, Ingemar

    In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.

  17. Information center as a technical institute unifying a user community

    International Nuclear Information System (INIS)

    Maskewitz, B.F.; McGill, B.; Hatmaker, N.A.

    1976-01-01

    The historical background to the information analysis center concept is presented first. The Radiation Shielding Information Center (RSIC) at ORNL is cited as an example of the information analysis center. RSIC objectives and scope are described, and RSIC's role in unification of the field of shielding is discussed. Some problems in handling information exchange with respect to computer codes are examined

  18. Center conditions and limit cycles for BiLiénard systems

    Directory of Open Access Journals (Sweden)

    Jaume Gine

    2017-03-01

    Full Text Available In this article we study the center problem for polynomial BiLiénard systems of degree n. Computing the focal values and using Gröbner bases we find the center conditions for such systems for n=6. We also establish a conjecture about the center conditions for polynomial BiLiénard systems of arbitrary degree.

  19. USSR Report, Cybernetics Computers and Automation Technology

    Science.gov (United States)

    1985-09-05

    organization, the SKALD program utilizes a dictionary or data base to generate SKALD poetry at the computer center of Minsk State Pedagogical ... wonderful capabilities at the Krasnoyarsk branch of the USSR AN [Academy of Sciences] Siberian section's Computer Center. They began training the kids

  20. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012, in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 was a forum for presentation of new research results of intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies was strongly encouraged. The forum aimed to bring together researchers, developers, and users from around the world in both industry and academia for sharing state-of-the-art results, for exploring new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  1. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  2. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  3. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  4. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides a non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  5. Cloud Computing Security

    OpenAIRE

    Ngongang, Guy

    2011-01-01

    This project aimed to show that it is possible to use a network intrusion detection system in the cloud. Security in the cloud is a concern nowadays, and security professionals are still finding means to make cloud computing more secure. First of all, the installation of ESX 4.0, vCenter Server, and vCenter Lab Manager on server hardware succeeded in building the platform. This allowed the creation and deployment of many virtual servers. Those servers have operating systems and a...

  6. Software package as an information center product

    International Nuclear Information System (INIS)

    Butler, M.K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, and to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems, reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables

  7. Quantum computing with defects

    Science.gov (United States)

    Varley, Joel

    2011-03-01

    The development of a quantum computer is contingent upon the identification and design of systems for use as qubits, the basic units of quantum information. One of the most promising candidates consists of a defect in diamond known as the nitrogen-vacancy (NV-1) center, since it is an individually-addressable quantum system that can be initialized, manipulated, and measured with high fidelity at room temperature. While the success of the NV-1 stems from its nature as a localized ``deep-center'' point defect, no systematic effort has been made to identify other defects that might behave in a similar way. We provide guidelines for identifying other defect centers with similar properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate systems. To elucidate these points, we compare electronic structure calculations of the NV-1 center in diamond with those of several deep centers in 4H silicon carbide (SiC). Using hybrid functionals, we report formation energies, configuration-coordinate diagrams, and defect-level diagrams to compare and contrast the properties of these defects. We find that the NC VSi - 1 center in SiC, a structural analog of the NV-1 center in diamond, may be a suitable center with very different optical transition energies. We also discuss how the proposed criteria can be translated into guidelines to discover NV analogs in other tetrahedrally coordinated materials. This work was performed in collaboration with J. R. Weber, W. F. Koehl, B. B. Buckley, A. Janotti, C. G. Van de Walle, and D. D. Awschalom. This work was supported by ARO, AFOSR, and NSF.

  8. Datacenter Changes vs. Employment Rates for Datacenter Managers In the Cloud Computing Era

    OpenAIRE

    Mirzoev, Timur; Benson, Bruce; Hillhouse, David; Lewis, Mickey

    2014-01-01

    Due to the evolving Cloud Computing paradigm, there is a prevailing concern that in the near future data center managers may be in short supply. Cloud computing, as a whole, is becoming more prevalent in today's computing world. In fact, cloud computing has become so popular that some are now referring to data centers as cloud centers. How does this interest in cloud computing translate into employment rates for data center managers? The popularity of the public and private cloud models is...

  9. Applied technology center business plan and market survey

    Science.gov (United States)

    Hodgin, Robert F.; Marchesini, Roberto

    1990-01-01

    A business plan and market survey for the Applied Technology Center (ATC), a non-profit corporation for computer technology transfer and development, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.

  10. Tornadoes: A Center Approach.

    Science.gov (United States)

    Christman-Rothlein, Liz; Meinbach, Anita M.

    1981-01-01

    Information is given on how to put together a learning center. Discusses information and activity packets for a complete learning center on tornadoes including objectives, directions, materials, photographs of physical arrangements, and posttest. (DC)

  11. Tehran Nuclear Research Center

    International Nuclear Information System (INIS)

    Taherzadeh, M.

    1977-01-01

    The Tehran Nuclear Research Center was formerly managed by the University of Tehran. After its transfer to the Atomic Energy Organization of Iran (AEOI), the Center has now become a focal point for basic research in the area of nuclear energy in Iran

  12. Day Care Centers

    Data.gov (United States)

    Department of Homeland Security — This database contains locations of day care centers for 50 states and Washington D.C. and Puerto Rico. The dataset only includes center based day care locations...

  13. Center for Functional Nanomaterials

    Data.gov (United States)

    Federal Laboratory Consortium — The Center for Functional Nanomaterials (CFN) explores the unique properties of materials and processes at the nanoscale. The CFN is a user-oriented research center...

  14. Hydrologic Engineering Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Hydrologic Engineering Center (HEC), an organization within the Institute for Water Resources, is the designated Center of Expertise for the U.S. Army Corps of...

  15. Carbon Monoxide Information Center

    Medline Plus


  16. MARYLAND ROBOTICS CENTER

    Data.gov (United States)

    Federal Laboratory Consortium — The Maryland Robotics Center is an interdisciplinary research center housed in the Institute for Systems Research within the A. James Clark School...

  17. Find a Health Center

    Data.gov (United States)

    U.S. Department of Health & Human Services — HRSA Health Centers care for you, even if you have no health insurance – you pay what you can afford based on your income. Health centers provide services that...

  18. NIH Clinical Centers

    Data.gov (United States)

    Federal Laboratory Consortium — The NIH Clinical Center consists of two main facilities: The Mark O. Hatfield Clinical Research Center, which opened in 2005, houses inpatient units, day hospitals,...

  19. Genetic Science Learning Center

    Science.gov (United States)

    Making science and health easy for everyone to understand: the Genetic Science Learning Center at The University of Utah is a ...

  20. Poison Control Centers

    Science.gov (United States)

    ... call 1-800-222-1222 immediately. The record is a directory of poison control centers by name and state; entries include the American Association of Poison Control Centers (AAPCC Central Office; not a poison emergency contact, not for emergency use) and the ASPCA Animal Poison Control Center, 1717 S. Philo Road, Suite 36, Urbana, ...

  1. Evaluation of a High Concentrated Contrast Media Injection Protocol in Combination with Low Tube Current for Dose Reduction in Coronary Computed Tomography Angiography: A Randomized, Two-center Prospective Study.

    Science.gov (United States)

    Sun, Yibo; Hua, Yanqing; Wang, Mingpeng; Mao, Dingbiao; Jin, Xiu; Li, Cheng; Shi, Kailei; Xu, Jianrong

    2017-12-01

    The study aimed to prospectively evaluate the radiation dose reduction potential and image quality (IQ) of a high-concentration contrast media (HCCM) injection protocol in combination with a low tube current (mAs) in coronary computed tomography angiography. Eighty-one consecutive patients (mean age: 62 years; 34 females; body mass index: 18-31) were included and randomly assigned to two groups. All computed tomography (CT) examinations in the two groups were performed with the same tube voltage (100 kV), contrast medium flow rate (5.0 mL/s), and iodine dose (22.8 g). Automatic mAs selection and a low-concentration contrast medium (300 mgI/mL) were used in group A, whereas effective mAs was reduced by a factor of 0.6 along with HCCM (400 mgI/mL) in group B. Radiation dose was assessed (CT dose index [CTDIvol] and dose-length product [DLP]), and vessel-based objective IQ for various regions of interest (enhancement, noise, signal-to-noise ratio, and contrast-to-noise ratio) as well as subjective IQ, noise, and motion artifacts were analyzed overall and per vessel with a 5-point Likert scale. The CT attenuation of coronary arteries and image noise in group B were significantly higher than those in group A (ranges: 507.5-548.1 Hounsfield units vs 407.5-444.5 Hounsfield units; and 20.3 ± 8.6 vs 17.7 ± 8.0) (P ≤ 0.0166). There was no significant difference between the two groups in signal-to-noise ratio, contrast-to-noise ratio, and subjective IQ of coronary arteries (29.4-31.7, 30.0-37.0, and median score of 5 in group A vs 29.4-32.4, 27.7-36.3, and median score of 5 in group B, respectively, P ≥ 0.1859). Both mean CTDIvol and DLP in group B were 58% of those in group A. HCCM combined with low tube current allows dose reduction in coronary computed tomography angiography without compromising IQ. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
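
    The reported 58% figure can be sanity-checked with standard dose bookkeeping: the dose-length product (DLP) is CTDIvol times scan length, and effective dose is commonly estimated as DLP times a body-region conversion factor (about 0.014 mSv/(mGy*cm) for the chest). In the Python sketch below the CTDIvol and scan length are invented; only the 58% ratio comes from the study.

        def effective_dose(ctdi_vol_mGy, scan_length_cm, k=0.014):
            dlp = ctdi_vol_mGy * scan_length_cm   # DLP in mGy*cm
            return dlp, dlp * k                   # (DLP, effective dose in mSv)

        dlp_a, e_a = effective_dose(30.0, 14.0)          # hypothetical group A
        dlp_b, e_b = effective_dose(30.0 * 0.58, 14.0)   # group B at 58% of A
        print(f"group B / group A effective dose: {e_b / e_a:.2f}")   # 0.58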

  2. Outline of Toshiba Business Information Center

    Science.gov (United States)

    Nagata, Yoshihiro

    The Toshiba Business Information Center gathers and stores in-house and external business information used in common within Toshiba Corp., and provides company-wide circulation, reference, and other services. The Center established a centralized information management system employing decentralized computers, electronic filing equipment (30 cm laser discs), and other office automation equipment. Online retrieval through a LAN is available for searching the stored documents, and the growing number of copying requests is processed by the electronic file. This paper describes the purpose of establishing the Center, its facilities, its management scheme, the systematization of the files, and the present situation and plans of each information service.

  3. Institute for Plasma Research, Stuttgart University. Annual report 1986

    International Nuclear Information System (INIS)

    1987-01-01

    In 1986 the IPP-IPF cooperation in the field of fusion-related research and development was successfully continued. The ECRH project, as the largest activity, is concentrated on the completion of the 1 MW/70 GHz long-pulse ECRH system for the W VII-AS stellarator. The component tests promise optimum technical and, hopefully, physical results. Theoretical investigations mainly concern ray propagation and time-dependent transport calculations for stellarators under ECRH conditions. A first successful experimental demonstration of the LIDAR scattering concept on JET was rather encouraging progress for the laser diagnostic group. Besides this project, FIR scattering was continued on ASDEX with measurements of the k-spectra of the low-frequency fluctuations during the various kinds of plasma heating. On the WEGA stellarator a programme of flux surface measurements with electron beams has been started. A meanwhile improved detection method will be applied, in cooperation with the Wendelstein team, at the coming W VII-AS stellarator. In the last year the plasma focus experiment POSEIDON achieved successful operation at full bank energy. (orig.)

  4. Forecast model for electromobile loads at Stuttgart airport and fair

    OpenAIRE

    Triebke, Henriette; Siehler, Elias; Staebler, Elmar

    2017-01-01

    To achieve national climate protection goals, the decarbonisation of the transport sector is of paramount importance. In this regard, electromobility has become one of the most promising automotive trends. However, a large-scale adoption of electric vehicles (EVs) would considerably burden the existing energy grid, especially in high-traffic areas. From the power industry's perspective it is essential to anticipate the power capacities required for EV charging in order to ensure sufficient p...

  5. Gunter Pritsch - "Bienenweide", published by Kosmos, Stuttgart, 2007

    Directory of Open Access Journals (Sweden)

    Anna Wróblewska

    2012-12-01

    Full Text Available The reviewed book “Bienenweide” (“Bee pastures”) is a compendium of knowledge on honey bee plants. Its advantage is its rich photographic documentation in the form of excellently reproduced colourful photographs and tabulated data which are easily accessible for the reader. It can be a valuable scientific resource for researchers involved in evaluation of the apicultural value of both crop plants and plants found in natural plant communities. This book can also provide valuable information to practising beekeepers interested in species selection designed to expand food resources for the honey bee and other pollinators. It can also be recommended as literature on the subject to university students in the fields of agriculture and horticulture.

  6. Conjunto de dos escuelas, en Zellerstrasse, Stuttgart, Alemania

    Directory of Open Access Journals (Sweden)

    Kammerer, Hans

    1969-04-01

    Full Text Available This integrated project brings together two new schools, the «Lehrschule» for underdeveloped children and the «Sprachschule» for deaf children and children who have speech defects. Each school has its own gymnasium, and the «Lehrschule» has eight classrooms, a natural science and a carpentry classroom, a kitchen for cookery instruction, a sewing room and another for religious teaching, as well as a library. The «Sprachschule» has four classrooms for deaf children and four for children having speech defects. There are rooms for natural science tuition, for carpentry, and also for religious instruction and the use of hearing devices. There is also a library. There are no significant novelties in the construction method, but the schools include all modern advances, as well as works of art that enhance their quality. [Spanish abstract, translated:] The built complex groups two new schools: the «Lehenschule» for backward children and the «Sprachheilschule» for children who are deaf or have pronunciation defects. Each school has its own gymnasium. Specifically, the «Lehenschule» contains eight classrooms; a natural science room and two carpentry rooms; a kitchen for teaching the preparation of domestic dishes; a sewing room and another for religious instruction; and a library. The «Sprachheilschule» comprises five classrooms for deaf children and four for children with speech defects; a natural science room, a sewing room, two carpentry rooms, a room for religious instruction and another for audiometry; and a library. The construction system offers no novelties of interest, but the building has been provided with every technical advance, incorporating works of art that highlight its quality.

  7. Headquarters, Special Operations Command, Africa Stuttgart, Germany (redacted)

    Science.gov (United States)

    2016-08-09

    make sure the boss is not stepping over the line in terms of reprisal." - documented that conversation on November 4, 2011, in an MFR. stated that...play nice and wait until I'm gone. Smile. Act like you're going to work ... but if you continue to undermine my authority as a commander, I'm going to

  8. Data Center Tasking.

    Science.gov (United States)

    Temares, M. Lewis; Lutheran, Joseph A.

    Operations tasking for data center management is discussed. The original and revised organizational structures of the data center at the University of Miami are also described. The organizational strategy addresses the functions that should be performed by the data center, anticipates the specialized skills required, and addresses personnel…

  9. Center of buoyancy definition

    International Nuclear Information System (INIS)

    Sandberg, V.

    1988-12-01

    The center of buoyancy of an arbitrary shaped body is defined in analogy to the center of gravity. The definitions of the buoyant force and center of buoyancy in terms of integrals over the area of the body are converted to volume integrals and shown to have simple intuitive interpretations
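
    In standard notation the analogy reads as follows; this is a sketch of the usual textbook definitions (uniform fluid density rho, z-axis vertical, V the displaced volume), not a transcription of the report's own equations.

        % Buoyant force as a surface-pressure integral, converted to a
        % volume integral by the divergence theorem:
        \[
          \mathbf{F}_B \;=\; -\oint_{S} p\,\hat{\mathbf{n}}\,dA
                       \;=\; \rho g V\,\hat{\mathbf{z}} .
        \]
        % Center of buoyancy: the centroid of the displaced volume,
        % in direct analogy to the center of gravity:
        \[
          \mathbf{r}_B \;=\; \frac{1}{V}\int_{V} \mathbf{r}\,dV .
        \]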

  10. Wound care centers

    Science.gov (United States)

    Pressure ulcer - wound care center; Decubitus ulcer - wound care center; Diabetic ulcer - wound care center; Surgical wound - wound ... Common types of non-healing wounds include: pressure sores, surgical ... flow, or swollen legs. Certain wounds may not heal well due to: ...

  11. Nuclear Reaction Data Centers

    International Nuclear Information System (INIS)

    McLane, V.; Nordborg, C.; Lemmel, H.D.; Manokhin, V.N.

    1988-01-01

    The cooperating Nuclear Reaction Data Centers are involved in the compilation and exchange of nuclear reaction data for incident neutrons, charged particles and photons. Individual centers may also have services in other areas, e.g., evaluated data, nuclear structure and decay data, reactor physics, nuclear safety; some of this information may also be exchanged between interested centers. 20 refs., 1 tab

  12. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  13. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  14. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  15. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  16. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers, and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  17. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve: A single-center prospective study.

    Science.gov (United States)

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-12-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels using visual semiquantitative interpretations of corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images without and with CT-AC (r = -0.584 and r = -0.568, respectively, both P < .001). These findings suggest that Tl-201 MPI-IQ-SPECT, both without and with CT-AC, can predict clinically significant and insignificant FFR at an optimal rSDS cut-off. All rights reserved.
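
    A common way to obtain such an optimal cut-off, though not necessarily the method used in this study, is to dichotomize FFR at the clinical 0.80 threshold and choose the score threshold that maximizes the Youden index (sensitivity + specificity - 1). The Python sketch below runs on synthetic data; none of the numbers reproduce the article's results.

        import numpy as np

        rng = np.random.default_rng(0)
        ffr = rng.uniform(0.5, 1.0, 136)                 # per-vessel FFR (synthetic)
        rsds = np.clip(10 * (0.9 - ffr) + rng.normal(0, 1, 136), 0, None)
        significant = ffr <= 0.80                        # reference standard

        # Scan every observed score value as a candidate cut-off.
        cutoffs = np.unique(rsds)
        youden = [
            (rsds >= c)[significant].mean()              # sensitivity
            + (rsds < c)[~significant].mean() - 1        # + specificity - 1
            for c in cutoffs
        ]
        best = int(np.argmax(youden))
        print(f"Youden index {youden[best]:.2f} at rSDS cut-off {cutoffs[best]:.2f}")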

  18. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  19. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  20. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  1. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  2. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today's microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  3. Computer Training for Seniors: An Academic-Community Partnership

    Science.gov (United States)

    Sanders, Martha J.; O'Sullivan, Beth; DeBurra, Katherine; Fedner, Alesha

    2013-01-01

    Computer technology is integral to information retrieval, social communication, and social interaction. However, only 47% of seniors aged 65 and older use computers. The purpose of this study was to determine the impact of a client-centered computer program on computer skills, attitudes toward computer use, and generativity in novice senior…

  4. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  5. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
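
    For the single tridiagonal systems mentioned above, the standard serial starting point is the Thomas algorithm; the parallel and vectorized variants discussed in such volumes restructure this recurrence. A minimal Python sketch (a generic illustration, not code from the book):

        import numpy as np

        def thomas(a, b, c, d):
            """Solve a tridiagonal system with sub-, main- and super-diagonals
            a, b, c (a[0] and c[-1] unused) and right-hand side d."""
            n = len(d)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):                      # forward elimination
                m = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / m if i < n - 1 else 0.0
                dp[i] = (d[i] - a[i] * dp[i - 1]) / m
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):             # back substitution
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        # 1-D Poisson-style test: -x_{i-1} + 2 x_i - x_{i+1} = h^2
        n, h = 8, 1.0
        a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0)
        d = np.full(n, h * h)
        x = thomas(a, b, c, d)
        A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
        print(np.allclose(A @ x, d))                   # True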

  6. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  7. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computational tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  8. Energy-oriented modernisation of a school building at Stuttgart-Plieningen; Energiegerechte Sanierung eines Schulgebaeudes in Stuttgart-Plieningen

    Energy Technology Data Exchange (ETDEWEB)

    Kienzlen, V. [Stadt Stuttgart (Germany). Amt fuer Umweltschutz; Erhorn, H. [Fraunhofer-Institut fuer Bauphysik, Stuttgart (Germany); Biegert, B. [Inst. fuer Kernenergie und Energiesysteme, Lehrstuhl fuer Heiz- und Raumlufttechnik, Stuttgart (Germany)

    1997-12-31

    A typical school building serves to exemplify the opportunities connected with the modernization of the shell of a building and its technical equipment. By means of simultaneous, integrated planning and execution steps, positive interactions are to be obtained. The target set is to maximize energy conservation while optimizing overall economy. The following individual concepts are dealt with in detail: thermal protection, lighting, heating system, control of space heating. (MSK) [German] Using a typical school building as an example, it is shown what possibilities a modernization of the building shell and building services offers. Simultaneous, integrated planning and execution of the modernization measures is intended to exploit positive interactions. The aim is the maximum possible energy saving at optimized overall economy. The concepts for thermal protection, lighting, the heating system and the control of space heating are explained in more detail.

  9. 6th Stuttgart international symposium on automotive and engine technology. Proceedings; 6. Internationales Stuttgarter Symposium: Kraftfahrwesen und Verbrennungsmotoren. Konferenzband

    Energy Technology Data Exchange (ETDEWEB)

    Bargende, M.; Reuss, H.C.; Wiedemann, J. (eds.)

    2005-07-01

    The proceedings volume has three sections: Part 1, 'Engines', discusses diesel engines, engine mechanics, gasoline engines, FVV projects, analysis and simulation. Part 2, 'Vehicles', contains papers on car concepts, dynamics, acoustics and vibrations, aeroacoustics, aerodynamics, thermomanagement and engine cooling. Part 3, 'Motor car mechatronics', covers the on-board electrical system and energy management, sensors and actuators, dynamics and control, testing and diagnosis, software and design tools, networking and architecture. (orig.)

  10. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  12. PREFACE: IUPAP C20 Conference on Computational Physics (CCP 2011)

    Science.gov (United States)

    Troparevsky, Claudia; Stocks, George Malcolm

    2012-12-01

    Increasingly, computational physics stands alongside experiment and theory as an integral part of the modern approach to solving the great scientific challenges of the day on all scales - from cosmology and astrophysics, through climate science, to materials physics, and the fundamental structure of matter. Computational physics touches aspects of science and technology with direct relevance to our everyday lives, such as communication technologies and securing a clean and efficient energy future. This volume of Journal of Physics: Conference Series contains the proceedings of the scientific contributions presented at the 23rd Conference on Computational Physics held in Gatlinburg, Tennessee, USA, in November 2011. The annual Conferences on Computational Physics (CCP) are dedicated to presenting an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas and from around the world. The CCP series has been in existence for more than 20 years, serving as a lively forum for computational physicists. The topics covered by this conference were: Materials/Condensed Matter Theory and Nanoscience, Strongly Correlated Systems and Quantum Phase Transitions, Quantum Chemistry and Atomic Physics, Quantum Chromodynamics, Astrophysics, Plasma Physics, Nuclear and High Energy Physics, Complex Systems: Chaos and Statistical Physics, Macroscopic Transport and Mesoscopic Methods, Biological Physics and Soft Materials, Supercomputing and Computational Physics Teaching, Computational Physics and Sustainable Energy. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), IUPAP Commission on Computational Physics (C20), American Physical Society Division of Computational Physics (APS-DCOMP), Oak Ridge National Laboratory (ORNL), Center for Defect Physics (CDP), the University of Tennessee (UT)/ORNL Joint Institute for Computational Sciences (JICS) and Cray, Inc

  13. St. Luke's Medical Center: technologizing health care

    International Nuclear Information System (INIS)

    Tumanguil, S.S.

    1994-01-01

    The computerization of the St. Luke's Medical Center improved the hospital administration and management, particularly in nuclear medicine department. The use of computer-aided X-ray simulator machine and computerized linear accelerator machine in diagnosing and treating cancer are the most recent medical technological breakthroughs that benefited thousands of Filipino cancer patients. 4 photos

  14. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  15. Agreement on economic and technological cooperation between the Federal Republic of Germany and the GDR. Project part 3.2, ''NDT and QA''. Project task 2.11. Experiments with the full-size vessel in Stuttgart for selection of practice-relevant non-destructive testing methods for evaluation of the value and performance of recurrent inspections of reactor components. Final report

    International Nuclear Information System (INIS)

    Betzold, K.; Brinette, R.; Bonitz, F.

    1992-01-01

    The efficiency of NDT methods such as ALOK, SAFT, EMUS, LLT, phased array, and multi-frequency eddy current testing which are generally used for reactor components recurrent inspection has been verified with experiments using two test specimens. These are a section of a main coolant pipe and the full-size vessel installed at MPA-Stuttgart, furnished with PWR test bodies with artificial defects and artificially applied natural defects. The defects have been detected with commercial probes as well as with probes optimized for the NDT methods EMUS, LLT, phased array, and multi-frequency eddy current testing. Type, location, orientation and geometry of the defects have been measured, also recording the influence of type of defect on the efficiency of the NDT methods, in order to reveal problems linked with the various methods as well as their advantages. Further tests have been made for evaluation of a combination of ALOK and SAFT using novel, specifically developed test probes, and a combination of ALOK and phased array testing. (orig.)

  16. Call Center Capacity Planning

    DEFF Research Database (Denmark)

    Nielsen, Thomas Bang

    The main topics of the thesis are theoretical and applied queueing theory within a call center setting. Call centers have in recent years become the main means of communication between customers and companies, and between citizens and public institutions. The extensively computerized infrastructure in modern call centers allows for a high level of customization, but also induces complicated operational processes. The size of the industry together with the complex and labor-intensive nature of large call centers motivates the research carried out to understand the underlying processes. The customizable ... in order to relate the results to the service levels used in call centers. Furthermore, the generic nature of the approximation is demonstrated by applying it to a system incorporating a dynamic priority scheme. In the last paper, Optimization of overflow policies in call centers, overflows between agent...
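
    The service levels referred to here are conventionally computed from the Erlang-C formula for an M/M/c queue. The Python sketch below is the textbook calculation, not the thesis's approximation; the staffing numbers are illustrative.

        from math import exp, factorial

        def erlang_c(c, lam, mu):
            """Probability that an arriving call must wait in an M/M/c queue
            (lam = arrival rate, mu = service rate per agent, c = agents)."""
            a = lam / mu                              # offered load in Erlangs
            rho = a / c                               # agent occupancy, must be < 1
            series = sum(a**k / factorial(k) for k in range(c))
            tail = a**c / factorial(c) / (1 - rho)
            return tail / (series + tail)

        def service_level(c, lam, mu, awt):
            """Fraction of calls answered within awt time units
            (the classic '80 % in 20 seconds'-style target)."""
            return 1 - erlang_c(c, lam, mu) * exp(-(c * mu - lam) * awt)

        # 100 calls/hour, 6-minute mean handling time (mu = 10/hour), 13 agents:
        print(f"P(wait)    = {erlang_c(13, 100, 10):.3f}")
        print(f"SL(20 sec) = {service_level(13, 100, 10, 20 / 3600):.3f}")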

  17. The guiding center Lagrangian

    International Nuclear Information System (INIS)

    Larsson, J.

    1986-01-01

    Recursion relations determining the guiding center Lagrangian Λ and the associated guiding center variables to all orders are derived. We consider some particularly simple forms of Λ obtainable by specific choices of certain arbitrary functions appearing as free parameters in the theory. It is, for example, possible to locally define the guiding center variables so that the expression for the corresponding Lagrangian is unchanged by all higher order terms. (orig.)
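
    For orientation, the guiding center Lagrangian referred to here is usually written, to lowest order and in Gaussian units, in the Littlejohn phase-space form (a standard expression from the literature, not quoted from this paper):

        \Lambda = \left(\frac{e}{c}\,\mathbf{A} + m v_{\parallel}\,\hat{\mathbf{b}}\right)\cdot\dot{\mathbf{R}}
                  + \frac{mc}{e}\,\mu\,\dot{\theta}
                  - \left(\tfrac{1}{2}\, m v_{\parallel}^{2} + \mu B + e\phi\right)

    where R is the guiding center position, v_parallel the parallel velocity, mu the magnetic moment, and theta the gyrophase; the paper's recursion relations generate the higher-order corrections to such an expression.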

  18. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  19. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Anisenkov, A; Belov, S; Kaplin, V; Korol, A; Skovpen, K; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2012-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of computing farms involved in its research field, currently we've got several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks, of which the largest are the NSU Supercomputer Center, Siberian Supercomputer Center (ICM and MG), and a Grid Computing Facility of BINP. A dedicated optical network with the initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  20. Quantum computing with defects.

    Science.gov (United States)

    Weber, J R; Koehl, W F; Varley, J B; Janotti, A; Buckley, B B; Van de Walle, C G; Awschalom, D D

    2010-05-11

    Identifying and designing physical systems for use as qubits, the basic units of quantum information, are critical steps in the development of a quantum computer. Among the possibilities in the solid state, a defect in diamond known as the nitrogen-vacancy (NV(-1)) center stands out for its robustness--its quantum state can be initialized, manipulated, and measured with high fidelity at room temperature. Here we describe how to systematically identify other deep center defects with similar quantum-mechanical properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate defect systems. To illustrate these points in detail, we compare electronic structure calculations of the NV(-1) center in diamond with those of several deep centers in 4H silicon carbide (SiC). We then discuss the proposed criteria for similar defects in other tetrahedrally coordinated semiconductors.

  1. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  2. Chemical Security Analysis Center

    Data.gov (United States)

    Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...

  3. Center for Women Veterans

    Science.gov (United States)

  4. Small Business Development Center

    Data.gov (United States)

    Small Business Administration — Small Business Development Centers (SBDCs) provide assistance to small businesses and aspiring entrepreneurs throughout the United States and its territories. SBDCs...

  5. Center for Deployment Psychology

    Data.gov (United States)

    Federal Laboratory Consortium — The Center for Deployment Psychology was developed to promote the education of psychologists and other behavioral health specialists about issues pertaining to the...

  6. Advanced Simulation Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Simulation Center consists of 10 individual facilities which provide missile and submunition hardware-in-the-loop simulation capabilities. The following...

  7. Electron Microscopy Center (EMC)

    Data.gov (United States)

    Federal Laboratory Consortium — The Electron Microscopy Center (EMC) at Argonne National Laboratory develops and maintains unique capabilities for electron beam characterization and applies those...

  8. Audio Visual Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Audiovisual Services Center provides still photographic documentation with laboratory support, video documentation, video editing, video duplication, photo/video...

  9. Test Control Center (TCC)

    Data.gov (United States)

    Federal Laboratory Consortium — The Test Control Center (TCC) provides a consolidated facility for planning, coordinating, controlling, monitoring, and analyzing distributed test events. ,The TCC...

  10. Great Lakes Science Center

    Data.gov (United States)

    Federal Laboratory Consortium — Since 1927, Great Lakes Science Center (GLSC) research has provided critical information for the sound management of Great Lakes fish populations and other important...

  11. Magnetic-fusion energy and computers

    International Nuclear Information System (INIS)

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups

  12. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic; that as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  13. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany, a forum to discuss the latest advancements in parallel tools.

  14. Climate Prediction Center (CPC) Palmer Drought and Crop Moisture Indices

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Climate Prediction Center (CPC) Palmer Drought Severity and Crop Moisture Indices are computed for the 344 U.S. Climate Divisions on a weekly basis based on a...

  15. Carbon Monoxide Information Center

    Medline Plus

    Full Text Available The Invisible Killer: Carbon monoxide, also known as ...

  16. The Revitalized Tutoring Center

    Science.gov (United States)

    Koselak, Jeremy

    2017-01-01

    One high-leverage strategy rooted in a strong research base--the revitalized tutoring center--provides a wealth of opportunity to students who may be otherwise underserved. This embedded, open-all-day tutoring center supports collaborative teacher teams by using peer tutors and community volunteers. By centralizing resources and providing supports…

  17. From Periphery To Center

    DEFF Research Database (Denmark)

    Carré, David

    2014-01-01

    ... the notions of Center/Periphery. As Hermans (2001) proposed, center and periphery are not fixed ‘I-positions’ of the self; in this vein, these notions are explored as relevant theoretical tools for addressing the developmental trajectories involved in the construction of scientific identities. In sum...

  18. ENERGY RESOURCES CENTER

    Energy Technology Data Exchange (ETDEWEB)

    Sternberg, Virginia

    1979-11-01

    First I will give a short history of this Center which has had three names and three moves (and one more in the offing) in three years. Then I will tell you about the accomplishments made in the past year. And last, I will discuss what has been learned and what is planned for the future. The Energy and Environment Information Center (EEIC), as it was first known, was organized in August 1975 in San Francisco as a cooperative venture by the Federal Energy Administration (FEA), Energy Research and Development Administration (ERDA) and the Environmental Protection Agency (EPA). These three agencies planned this effort to assist the public in obtaining information about energy and the environmental aspects of energy. The Public Affairs Offices of FEA, ERDA and EPA initiated the idea of the Center. One member from each agency worked at the Center, with assistance from the Lawrence Berkeley Laboratory Information Research Group (LBL IRG) and with on-site help from the EPA Library. The Center was set up in a corner of the EPA Library. FEA and ERDA each contributed one staff member on a rotating basis to cover the daily operation of the Center and money for books and periodicals. EPA contributed space, staff time for ordering, processing and indexing publications, and additional money for acquisitions. The LBL Information Research Group received funds from ERDA on a 189 FY 1976 research project to assist in the development of the Center as a model for future energy centers.

  19. Accredited Birth Centers

    Science.gov (United States)

    A directory of accredited birth centers listing addresses, phone numbers, and accreditation dates, including centers in Danbury, CT (accredited since March 1998), Corvallis Birth & Women's Health Center in Corvallis, OR, a center in Washington, DC (accredited since March 2001), and Flagstaff Birth and Women's Center in Flagstaff, AZ.

  20. Technology Information Center

    International Nuclear Information System (INIS)

    Emerson, E.L.; Shepherd, E.W.; Minor, E.E.

    1980-01-01

    A Transportation Technology Center (TTC) has been established at Sandia to address the transportation of nuclear waste and spent fuel. The Technology Information Center (TIC) acts as TTC's clearing house for nuclear material transportation information. TIC's activities are divided into three areas: public information, policy information, and technical information. Some of the uses of TIC's activities are briefly outlined

  1. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
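
    The grid/block/thread execution model that CUDA introduced can be sketched from Python using the numba package (this assumes numba with CUDA support is installed and a CUDA-capable GPU is present; the vendor C/C++ API expresses the same structure):

        from numba import cuda          # assumes numba with CUDA support
        import numpy as np

        @cuda.jit
        def vector_add(a, b, out):
            i = cuda.grid(1)            # absolute index of this thread in the grid
            if i < out.size:
                out[i] = a[i] + b[i]

        n = 1_000_000
        a = np.random.rand(n).astype(np.float32)
        b = np.random.rand(n).astype(np.float32)
        out = np.zeros_like(a)

        threads_per_block = 256
        blocks = (n + threads_per_block - 1) // threads_per_block
        vector_add[blocks, threads_per_block](a, b, out)   # kernel launch
        assert np.allclose(out, a + b)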

  2. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance - Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  3. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  4. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.
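
    A concrete taste of such an algorithm: Deutsch's problem asks whether a one-bit function is constant or balanced, and a quantum computer decides it with a single oracle query. The NumPy simulation below is a minimal illustration, not code from the article.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

        def oracle(f):
            """Two-qubit unitary U_f |x, y> = |x, y XOR f(x)>."""
            U = np.zeros((4, 4))
            for x in (0, 1):
                for y in (0, 1):
                    U[2 * x + (y ^ f(x)), 2 * x + y] = 1
            return U

        def deutsch(f):
            state = np.kron([1, 0], [0, 1])              # |0>|1>
            state = np.kron(H, H) @ state                # Hadamard both qubits
            state = oracle(f) @ state                    # a single oracle query
            state = np.kron(H, np.eye(2)) @ state        # Hadamard the first qubit
            p0 = state[0] ** 2 + state[1] ** 2           # P(first qubit = 0); real amplitudes here
            return "constant" if p0 > 0.5 else "balanced"

        print(deutsch(lambda x: 0))      # constant
        print(deutsch(lambda x: x))      # balanced
        print(deutsch(lambda x: 1 - x))  # balanced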

  5. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  6. Funding Opportunity: Genomic Data Centers

    Science.gov (United States)

    Funding opportunity: Center for Cancer Genomics (CCG) RFA for genomic data analysis network centers.

  7. Techbelt Energy Innovation Center

    Energy Technology Data Exchange (ETDEWEB)

    Marie, Hazel [Youngstown State Univ., OH (United States); Nestic, Dave [TechBelt Energy Innovation Center, Warren, OH (United States); Hripko, Michael [Youngstown State Univ., OH (United States); Abraham, Martin [Youngstown State Univ., OH (United States)

    2017-06-30

    This project consisted of three main components: 1) the primary goal of the project was to renovate and upgrade an existing commercial building to the highest possible environmentally sustainable level for the purpose of creating an energy incubator; this initiative was part of the Infrastructure Technologies Program, through which a sustainable energy demonstration facility was to be created and used as a research and community outreach base for sustainable energy product and process incubation; 2) in addition, fundamental energy-related research on wind energy was performed, a shrouded wind turbine on the Youngstown State University campus was commissioned, and educational initiatives were implemented; and 3) the project also included an education and outreach component to inform and educate the public on sustainable energy production and career opportunities. Youngstown State University and the Tech Belt Energy Innovation Center (TBEIC) renovated a 37,000 square foot urban building which is now being used as a research and development hub for the region’s energy technology innovation industry. The building houses basic research facilities and business development in an incubator format. In addition, the TBEIC performs community outreach and education initiatives in advanced and sustainable energy. The building is linked to a back warehouse which will eventually be used as a build-out for energy laboratory facilities. The project's research component investigated shrouded wind turbines, specifically the “Windcube”, which was renamed the “Wind Sphere” during the course of the project. There was a specific focus on developing the theory of shrouded wind turbines. The goal of this work was to increase the potential efficiency of wind turbines by improving the lift and drag characteristics. The work included computational modeling, scale models, and full-sized design and construction of a test turbine. The full-sized turbine was built on the YSU

  8. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...
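
    The halting problem and diagonalization mentioned here fit in a few lines: given any claimed halting oracle, one constructs a program that does the opposite of whatever the oracle predicts about it. A Python sketch of the standard argument (illustrative, not from the book):

        def make_diagonal(halts):
            """Given any purported halting oracle halts(prog, arg) -> bool,
            build a program the oracle must misjudge (Turing's diagonal trick)."""
            def d(x):
                if halts(d, x):      # the oracle claims d halts on x...
                    while True:      # ...so d loops forever,
                        pass
                return 0             # ...otherwise d halts at once.
            return d

        # Any concrete guess the "oracle" makes is wrong on d itself:
        d = make_diagonal(lambda prog, arg: False)
        # The oracle said "loops", yet d(0) halts immediately:
        print("oracle said 'loops', but d(0) =", d(0))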

  9. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
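
    Among the computational ingredients listed, path-planning heuristics are perhaps the easiest to make concrete. Below is a compact grid-based A* search in Python with a Manhattan-distance heuristic, a generic sketch rather than code from the review.

        import heapq

        def astar(grid, start, goal):
            """A* on a 4-connected grid; grid[r][c] == 1 marks an obstacle."""
            h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
            frontier = [(h(start), 0, start)]          # (f = g + h, g, node)
            came, cost = {start: None}, {start: 0}
            while frontier:
                _, g, cur = heapq.heappop(frontier)
                if cur == goal:                        # reconstruct the path
                    path = []
                    while cur is not None:
                        path.append(cur)
                        cur = came[cur]
                    return path[::-1]
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nxt = (cur[0] + dr, cur[1] + dc)
                    if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                            and grid[nxt[0]][nxt[1]] == 0
                            and g + 1 < cost.get(nxt, float("inf"))):
                        cost[nxt] = g + 1
                        came[nxt] = cur
                        heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt))
            return None                                # goal unreachable

        grid = [[0, 0, 0, 0],
                [1, 1, 1, 0],
                [0, 0, 0, 0]]
        print(astar(grid, (0, 0), (2, 0)))             # routes around the wall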

  10. Trip attraction rates of shopping centers in Northern New Castle County, Delaware.

    Science.gov (United States)

    2004-07-01

    This report presents the trip attraction rates of the shopping centers in Northern New Castle County in Delaware. The study aims to provide an alternative to the ITE Trip Generation Manual (1997) for computing the trip attraction of shopping centers ...
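
    Trip attraction in the ITE manual style reduces to applying either an average rate or a fitted regression to gross leasable area. A toy Python sketch with made-up coefficients follows; the report derives local alternatives to such numbers, so both the rate and the regression constants here are purely illustrative.

        import math

        def trips_by_rate(gla_ksf, rate_per_ksf):
            # Average-rate method: trips = rate x gross leasable area (1000 sq ft).
            return rate_per_ksf * gla_ksf

        def trips_by_regression(gla_ksf, a=0.65, b=5.83):
            # ITE-style fitted curve ln(T) = a ln(X) + b; a and b are illustrative only.
            return math.exp(a * math.log(gla_ksf) + b)

        print(trips_by_rate(400, 42.9))          # hypothetical daily rate per 1000 sq ft
        print(round(trips_by_regression(400)))   # ~ 16,700 trips for a 400,000 sq ft center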

  11. Computer-assisted optimization of chest fluoroscopy

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Filippova, N.V.; Kirillov, L.P.; Momsenko, S.F.

    1987-01-01

    The main trends in the use of computers for the optimization of chest fluorography among employees and workers of a large industrial enterprise are considered. The following directions were determined: automated sorting of fluorograms, formalization of X-ray signs in describing fluorograms, and organization of a special system of fluorographic data management. Four levels of algorithms to solve the problems of fluorography were considered: 1) shops, personnel department, etc.; 2) an automated center for mass screening and a medical unit; 3) a computer center; and 4) planning and management service. The results of computer use over a 3-year period were analyzed. The efficacy of computer use was shown

  12. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and right to affirmation in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and the increase in the number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  13. Quality and Dose Optimized CT Trauma Protocol - Recommendation from a University Level-I Trauma Center.

    Science.gov (United States)

    Kahn, Johannes; Kaul, David; Böning, Georg; Rotzinger, Roman; Freyhardt, Patrick; Schwabe, Philipp; Maurer, Martin H; Renz, Diane Miriam; Streitparth, Florian

    2017-09-01

    Purpose: As a supra-regional level-I trauma center, we evaluated computed tomography (CT) acquisitions of polytraumatized patients for quality and dose optimization purposes. Adapted statistical iterative reconstruction [(AS)IR] levels, tube voltage reduction as well as a split-bolus contrast agent (CA) protocol were applied. Materials and Methods: 61 patients were split into 3 different groups that differed with respect to tube voltage (120 - 140 kVp) and level of applied ASIR reconstruction (ASIR 20 - 50 %). The CT protocol included a native acquisition of the head followed by a single contrast-enhanced acquisition of the whole body (64-MSCT). CA (350 mg/ml iodine) was administered as a split-bolus injection of 100 ml (2 ml/s), 20 ml NaCl (1 ml/s), 60 ml (4 ml/s), 40 ml NaCl (4 ml/s) with a scan delay of 85 s to detect injuries of both the arterial system and parenchymal organs in a single acquisition. Both the quantitative (SNR/CNR) and qualitative (5-point Likert scale) image quality was evaluated in parenchymal organs that are often injured in trauma patients. Radiation exposure was assessed. Results: The use of IR combined with a reduction of tube voltage resulted in good qualitative and quantitative image quality and a significant reduction in radiation exposure of more than 40 % (DLP 1087 vs. 647 mGy×cm). Image quality could be improved due to a dedicated protocol that included different levels of IR adapted to different slice thicknesses, kernels and the examined area for the evaluation of head, lung, body and bone injury patterns. In synopsis of our results, we recommend the implementation of a polytrauma protocol with a tube voltage of 120 kVp and the following IR levels: cCT 5 mm: ASIR 20; cCT 0.625 mm: ASIR 40; lung 2.5 mm: ASIR 30; body 5 mm: ASIR 40; body 1.25 mm: ASIR 50; body 0.625 mm: ASIR 0. Conclusion: A dedicated adaptation of the CT trauma protocol (level of reduction of tube voltage and of IR
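
    The quoted 40 % figure follows directly from the reported DLP values; as a quick check in Python:

        dlp_reference, dlp_optimized = 1087, 647       # mGy·cm, as reported above
        print(f"dose reduction: {1 - dlp_optimized / dlp_reference:.1%}")   # 40.5 %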

  14. Handbook for the Computer Security Certification of Trusted Systems

    National Research Council Canada - National Science Library

    Weissman, Clark

    1995-01-01

    Penetration testing is required for National Computer Security Center (NCSC) security evaluations of systems and products for the B2, B3, and A1 class ratings of the Trusted Computer System Evaluation Criteria (TCSEC...

  15. CILT2000: Ubiquitous Computing--Spanning the Digital Divide.

    Science.gov (United States)

    Tinker, Robert; Vahey, Philip

    2002-01-01

    Discusses the role of ubiquitous and handheld computers in education. Summarizes the contributions of the Center for Innovative Learning Technologies (CILT) and describes the ubiquitous computing sessions at the CILT2000 Conference. (Author/YDS)

  16. Information sharing guidebook for transportation management centers, emergency operations centers, and fusion centers

    Science.gov (United States)

    2010-06-01

    This guidebook provides an overview of the mission and functions of transportation management centers, emergency operations centers, and fusion centers. The guidebook focuses on the types of information these centers produce and manage and how the sh...

  18. User-centered design

    International Nuclear Information System (INIS)

    Baik, Joo Hyun; Kim, Hyeong Heon

    2008-01-01

    The simplification philosophy that both the EPRI-URD and the EUR emphasize, for example, is treated mostly as a means of cost reduction for nuclear power plants, not as simplification of the structure of the user's tasks, which is one of the principles of user-centered design. User-centered design is a philosophy based on the needs and interests of the user, with an emphasis on making products usable and understandable. However, the nuclear power plants offered these days by the predominant reactor vendors are hardly user-centered but still designer-centered or technology-centered from the viewpoint of fulfilling user requirements. The main goal of user-centered design is that user requirements are elicited correctly, reflected properly in the system requirements, and verified thoroughly by the tests. Starting from the user requirements through to the final test, each requirement should be traceable. That is why requirement traceability is a key to user-centered design, and the main theme of a requirement management program, which is suggested to be added into EPRI-URD and EUR in the section on Design Process. (author)

  19. Planning for the Automation of School Library Media Centers.

    Science.gov (United States)

    Caffarella, Edward P.

    1996-01-01

    Geared for school library media specialists whose centers are in the early stages of automation or conversion to a new system, this article focuses on major components of media center automation: circulation control; online public access catalogs; machine readable cataloging; retrospective conversion of print catalog cards; and computer networks…

  20. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  1. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
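
    Two of the techniques named here, finite differences and numerical quadrature, fit in a few lines of Python (a generic illustration, not an excerpt from the book):

        import numpy as np

        # Central finite difference: f'(x) ~ (f(x+h) - f(x-h)) / (2h), accurate to O(h^2).
        f, x, h = np.sin, 1.0, 1e-5
        print((f(x + h) - f(x - h)) / (2 * h), np.cos(1.0))   # agree to ~10 digits

        # Composite trapezoidal quadrature of sin on [0, pi]; the exact integral is 2.
        xs = np.linspace(0.0, np.pi, 1001)
        ys = np.sin(xs)
        print(np.sum((ys[1:] + ys[:-1]) / 2.0 * np.diff(xs)))  # ~ 2.0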

  2. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  3. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  4. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations, and notation to be used in the computational applications. The second part presents the most important computational techniques: finite element formulation, boundary element formulation, and the solution of viscoelastic problems with Abaqus.

  5. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  6. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  7. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order

  8. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms; Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals; Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  9. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  10. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered

  11. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some

  12. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  13. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an ... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical layer.

  14. Montessori Transformation at Computer Associates.

    Science.gov (United States)

    Mars, Lisa

    2002-01-01

    Describes the growth of the all-day Montessori program for children ages 6 weeks to 6 years at Computer Associates' corporate headquarters and multiple sites worldwide. Focuses on placement of AMI Montessori-trained teachers, refurbishing of the child development centers to fit Montessori specifications, and the Nido--the children's community--and…

  15. Portability and the National Energy Software Center

    International Nuclear Information System (INIS)

    Butler, M.K.

    1978-01-01

    The software portability problem is examined from the viewpoint of experience gained in the operation of a software exchange and information center. First, the factors contributing to program interchange to date are identified; then the major remaining problem areas are noted. The importance of the development of programming language and documentation standards is noted, and the program packaging procedures and dissemination practices employed by the Center to facilitate successful software transport are described. Organization, or installation, dependencies of the computing environment, often hidden from the program author, and data interchange complexities are seen as today's primary issues, with dedicated processors and network communications offering an alternative solution.

  16. Space Flight Operations Center local area network

    Science.gov (United States)

    Goodman, Ross V.

    1988-01-01

    The existing Mission Control and Computer Center at JPL will be replaced by the Space Flight Operations Center (SFOC). One part of the SFOC is the LAN-based distribution system. The purpose of the LAN is to distribute the processed data among the various elements of the SFOC. The SFOC LAN will provide a robust subsystem that will support the Magellan launch configuration and future project adaptation. Its capabilities include (1) a proven cable medium as the backbone for the entire network; (2) hardware components that are reliable, varied, and follow OSI standards; (3) accurate and detailed documentation for fault isolation and future expansion; and (4) proven monitoring and maintenance tools.

  17. National Nuclear Data Center status report

    International Nuclear Information System (INIS)

    2002-01-01

    This paper is the status report of the US National Nuclear Data Center, Brookhaven. It describes the new NDS approach to customer services, which is based on users initiating wish lists on topics of interest, with the option of receiving reports in hardcopy or electronic form. After completion, within the next two years, of the multi-platform software for managing and retrieving data from shared databases, users will be able to install their own local nuclear data center for desktop applications. The paper describes the computer facilities, the nuclear reaction data structure, the database migration and the customer services. (a.n.)

  18. Scientific activities 1980 Nuclear Research Center ''Democritos''

    International Nuclear Information System (INIS)

    1982-01-01

    The scientific activities and achievements of the Nuclear Research Center Democritos for the year 1980 are presented in the form of a list of 76 projects, giving for each project its title, objectives, the person responsible, the activities carried out, and the pertaining list of publications. The 16 chapters of this work cover the activities of the main Divisions of the Democritos NRC: Electronics, Biology, Physics, Chemistry, Health Physics, Reactor, Scientific Directorate, Radioisotopes, Environmental Radioactivity, Soil Science, Computer Center, Uranium Exploration, Medical Service, Technological Applications, Radioimmunoassay and Training. (N.C.)

  19. Multiple single-centered attractors

    International Nuclear Information System (INIS)

    Dominic, Pramod; Mandal, Taniya; Tripathy, Prasanta K.

    2014-01-01

    In this paper we study spherically symmetric single-centered attractors in N=2 supergravity in four dimensions. The attractor points are obtained by extremising the effective black hole potential in the moduli space. Both supersymmetric as well as non-supersymmetric attractors exist in mutually exclusive domains of the charge lattice. We construct axion-free supersymmetric as well as non-supersymmetric multiple attractors in a simple two-parameter model. We further obtain explicit examples of two distinct non-supersymmetric attractors in type IIA string theory compactified on K3×T² carrying D0−D4−D6 charges. We compute the entropy of these attractors and analyse their stability in detail.
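
    For orientation, the extremization step mentioned in this abstract can be made concrete. Below is a minimal sketch in standard N=2 supergravity notation (Z is the central charge of the given charges, D_i the Kähler-covariant derivative, g^{i j-bar} the moduli-space metric); these are textbook relations, not equations quoted from the paper:

    ```latex
    % Effective black hole potential and attractor conditions (standard N=2
    % supergravity conventions; a sketch, not the paper's own equations).
    \begin{align}
      V_{\mathrm{BH}} &= |Z|^{2} + g^{i\bar{j}}\, D_{i}Z\, \bar{D}_{\bar{j}}\bar{Z} , \\
      \partial_{i} V_{\mathrm{BH}} &= 0
        \quad \text{(attractor condition; } D_{i}Z = 0 \text{ on the SUSY branch)} , \\
      S_{\mathrm{BH}} &= \pi \left. V_{\mathrm{BH}} \right|_{\mathrm{attractor}}
        \quad \text{(Bekenstein--Hawking entropy at the attractor point).}
    \end{align}
    ```

    Non-supersymmetric attractors are critical points with D_i Z ≠ 0; their stability then depends on the Hessian of V_BH, which is the kind of analysis the abstract refers to.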

  20. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of, and developments in, transmission and emission computed tomography are presented. On the basis of the available literature and private communications, the various transmission tomographs are compared. A new technique of emission computed tomography (ECT), as yet unknown in Poland, is described. Two ECT methods, positron emission tomography and single-photon emission tomography, are evaluated. (author)
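
    As background to the transmission/emission distinction drawn above, here is a minimal sketch of the transmission-CT measurement model, assuming the usual Beer-Lambert attenuation law (standard textbook relations, not taken from this article):

    ```latex
    % A ray at angle \theta and lateral offset s through the attenuation map
    % \mu(x,y): the detected intensity I relates to the source intensity I_0 by
    % Beer-Lambert decay, so the log-ratio yields a line integral of \mu,
    % i.e. the Radon transform of the attenuation map.
    \begin{equation}
      p(\theta, s) = \ln\frac{I_{0}}{I(\theta, s)}
                   = \int_{L(\theta, s)} \mu(x, y)\, \mathrm{d}\ell .
    \end{equation}
    % Image reconstruction inverts the Radon transform, classically by filtered
    % back-projection. Emission CT (PET/SPECT) instead reconstructs an internal
    % activity distribution from photon counts along similar lines of response.
    ```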

  1. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields face large computing challenges. In recent years, PCs have achieved performance comparable to that of high-end UNIX workstations at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing.

  2. Top scientific research center deploys Zambeel Aztera (TM) network storage system in high performance environment

    CERN Multimedia

    2002-01-01

    " The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory has implemented a Zambeel Aztera storage system and software to accelerate the productivity of scientists running high performance scientific simulations and computations" (1 page).

  3. Sensitive Data Protection Based on Intrusion Tolerance in Cloud Computing

    OpenAIRE

    Jingyu Wang; Xuefeng Zheng; Dengliang Luo

    2011-01-01

    Service integration and on-demand provisioning in cloud computing can significantly improve the utilization of computing resources, reduce the power consumption per service, and effectively avoid errors in computing resources. However, cloud computing still faces the problem of intrusion tolerance for the cloud computing platform and for the sensitive data of the new enterprise data center. In order to address the problem of intrusion tolerance of cloud computing platform and sensitive data in...

  4. Advanced Cancer Detection Center

    National Research Council Canada - National Science Library

    Ruckdeschel, John

    1999-01-01

    ... through screening, and the testing of methods to prevent cancer. In addition, the Center created and supports education programs to provide increased cancer awareness and established working collaborations with the James...

  5. Advanced Missile Signature Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Missile Signature Center (AMSC) is a national facility supporting the Missile Defense Agency (MDA) and other DoD programs and customers with analysis,...

  6. FEMA Disaster Recovery Centers

    Data.gov (United States)

    Department of Homeland Security — This is a search site for FEMA's Disaster Recovery Centers (DRC). A DRC is a readily accessible facility or mobile office set up by FEMA where applicants may go for...

  7. World Trade Center

    Index Scriptorium Estoniae

    2006-01-01

    Premiere of the disaster film "World Trade Center" : screenwriter Andrea Berloff : director Oliver Stone : production designer Jan Roelfs : starring Nicholas Cage, Michael Pena, Stephen Dorff and many others : United States 2006. Also about the film's real-life prototypes.

  8. USU Patient Simulation Center

    Data.gov (United States)

    Federal Laboratory Consortium — he National Capital Area (NCA) Medical Simulation Center is a state-of-the-art training facility located near the main USU campus. It uses simulated patients (i.e.,...

  9. Carbon Monoxide Information Center

    Medline Plus

    Full Text Available

  10. Center for Contaminated Sediments

    Data.gov (United States)

    Federal Laboratory Consortium — The U.S. Army Corps of Engineers Center for Contaminated Sediments serves as a clearinghouse for technology and expertise concerned with contaminated sediments. The...

  11. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the Lean Six Sigma (L6S) tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. This article explores these methodologies.

  12. Mental Health Screening Center

    Science.gov (United States)

    These online screening tools are not ... If you have any concerns, see your doctor or mental health professional.

  13. Carbon Monoxide Information Center

    Medline Plus

    Full Text Available

  14. Climate Prediction Center - Outlooks

    Science.gov (United States)

    Climate Prediction Center web resources and services. Publications include the Climate Diagnostics Bulletin (Tropics and Forecast sections).

  15. Heart Information Center

    Science.gov (United States)

  16. Cooperative Tagging Center (CTC)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Cooperative Tagging Center (CTC) began as the Cooperative Game Fish Tagging Program (GTP) at Woods Hole Oceanographic Institution (WHOI) in 1954. The GTP was...

  17. National Automotive Center - NAC

    Data.gov (United States)

    Federal Laboratory Consortium — Encouraged by the advantages of collaboration, the U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC) worked with the Secretary of the...

  18. Center for Hydrogen Storage.

    Science.gov (United States)

    2013-06-01

    The main goals of this project were to (1) establish a Center for Hydrogen Storage Research at Delaware State University for the preparation and characterization of selected complex metal hydrides and the determination of their suitability for hydrogen ...

  19. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    The Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning from the massive data are the key to the data engine. The ultimate goal of underst...

  20. HUD Homeownership Centers

    Data.gov (United States)

    Department of Housing and Urban Development — HUD Homeownership Centers (HOCs) insure single family Federal Housing Administration (FHA) mortgages and oversee the selling of HUD homes. FHA has four Homeownership...