WorldWideScience

Sample records for supercomputers topics considered

  1. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  2. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  3. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.
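
    The record above sketches the idea only in prose. As a loose, hypothetical illustration of carving a "virtual private supercomputer" out of shared hardware with light-weight virtualization, the Python below assembles a small cluster of containers via Docker's CLI; the node names, image, network, and resource limits are all assumptions, not the authors' tooling.

    ```python
    import subprocess

    # Hypothetical sketch: build a small "virtual cluster" of light-weight
    # containers, each acting as a virtual node with a fixed resource slice.
    NODES = 4

    def launch_virtual_node(i: int) -> None:
        subprocess.run([
            "docker", "run", "-d",
            "--name", f"vnode{i}",            # hypothetical node name
            "--cpus", "2", "--memory", "4g",  # per-node resource slice (assumed)
            "--network", "vcluster",          # user-defined bridge network
            "python:3.11-slim", "sleep", "infinity",
        ], check=True)

    if __name__ == "__main__":
        # Create the private network once; ignore the error if it exists.
        subprocess.run(["docker", "network", "create", "vcluster"], check=False)
        for i in range(NODES):
            launch_virtual_node(i)
    ```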

  4. Desktop supercomputer: what can it do?

    International Nuclear Information System (INIS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-01-01

    The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  5. Status reports of supercomputing astrophysics in Japan

    International Nuclear Information System (INIS)

    Nakamura, Takashi; Nagasawa, Mikio

    1990-01-01

    The Workshop on Supercomputing Astrophysics was held at the National Laboratory for High Energy Physics (KEK, Tsukuba) from August 31 to September 2, 1989. More than 40 physicists and astronomers attended and discussed many topics in an informal atmosphere. The main purpose of this workshop was to review the theoretical activities in computational astrophysics in Japan. It also aimed to promote effective collaboration among numerical experimentalists working on supercomputing techniques. The presented papers, covering hydrodynamics, plasma physics, gravitating systems, radiative transfer and general relativity, are all stimulating. In fact, these numerical calculations have become possible in Japan owing to the power of Japanese supercomputers such as the HITAC S820, Fujitsu VP400E and NEC SX-2. (J.P.N.)

  6. Computational Dimensionalities of Global Supercomputing

    Directory of Open Access Journals (Sweden)

    Richard S. Segall

    2013-12-01

    Full Text Available This Invited Paper pertains to the subject of my Plenary Keynote Speech at the 17th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI 2013), held in Orlando, Florida on July 9-12, 2013. The title of my Plenary Keynote Speech was "Dimensionalities of Computation: from Global Supercomputing to Data, Text and Web Mining", but this Invited Paper focuses only on the "Computational Dimensionalities of Global Supercomputing" and is based upon a summary of the contents of several articles previously written with myself as lead author and published in [75], [76], [77], [78], [79], [80] and [11]. The topics of the Plenary Speech included an Overview of Current Research in Global Supercomputing [75], Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing [76], Data Mining Supercomputing with SAS™ JMP® Genomics ([77], [79], [80]), and Visualization by Supercomputing Data Mining [81]. ______________________ [11] Committee on the Future of Supercomputing, National Research Council (2003), The Future of Supercomputing: An Interim Report, ISBN-13: 978-0-309-09016-2, http://www.nap.edu/catalog/10784.html [75] Segall, Richard S.; Zhang, Qingyu and Cook, Jeffrey S. (2013), "Overview of Current Research in Global Supercomputing", Proceedings of the Forty-Fourth Meeting of Southwest Decision Sciences Institute (SWDSI), Albuquerque, NM, March 12-16, 2013. [76] Segall, Richard S. and Zhang, Qingyu (2010), "Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing", Proceedings of the 5th INFORMS Workshop on Data Mining and Health Informatics, Austin, TX, November 6, 2010. [77] Segall, Richard S.; Zhang, Qingyu and Pierce, Ryan M. (2010), "Data Mining Supercomputing with SAS™ JMP® Genomics: Research-in-Progress", Proceedings of the 2010 Conference on Applied Research in Information Technology, sponsored by

  7. What is supercomputing ?

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1992-01-01

    Supercomputing means high-speed computation using a supercomputer. Supercomputers and the technical term ''supercomputing'' have spread over the past ten years. The performances of the main computers installed so far at the Japan Atomic Energy Research Institute are compared. There are two methods to increase computing speed using existing circuit elements: the parallel processor system and the vector processor system. CRAY-1 was the first successful vector computer. Supercomputing technology was first applied to meteorological organizations in foreign countries, and to aviation and atomic energy research institutes in Japan. Supercomputing for atomic energy depends on the trend of technical development in atomic energy, and its contents are divided into increasing the computing speed of existing simulation calculations and accelerating new technical developments in atomic energy. Examples of supercomputing at the Japan Atomic Energy Research Institute are reported. (K.I.)

  8. KfK seminar series on supercomputing and visualization from May to September 1992

    International Nuclear Information System (INIS)

    Hohenhinnebusch, W.

    1993-05-01

    From May 1992 to September 1992 a series of seminars was held at KfK on several topics of supercomputing in different fields of application. The aim was to demonstrate the importance of supercomputing and visualization in numerical simulations of complex physical and technical phenomena. This report contains the collection of all submitted seminar papers. (orig./HP) [de

  9. Supercomputational science

    CERN Document Server

    Wilson, S

    1990-01-01

    In contemporary research, the supercomputer now ranks, along with radio telescopes, particle accelerators and the other apparatus of "big science", as an expensive resource, which is nevertheless essential for state-of-the-art research. Supercomputers are usually provided as shared central facilities. However, unlike telescopes and accelerators, they find a wide range of applications which extends across a broad spectrum of research activity. The difference in performance between a "good" and a "bad" computer program on a traditional serial computer may be a factor of two or three, but on a contemporary supercomputer it can easily be a factor of one hundred or even more! Furthermore, this factor is likely to increase with future generations of machines. In keeping with the large capital and recurrent costs of these machines, it is appropriate to devote effort to training and familiarization so that supercomputers are employed to best effect. This volume records the lectures delivered at a Summer School ...

  10. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee

    2011-11-15

    KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world's fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.

  11. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee; Kaushik, Dinesh; Winfer, Andrew

    2011-01-01

    KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world's fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.

  12. Guide to dataflow supercomputing basic concepts, case studies, and a detailed example

    CERN Document Server

    Milutinovic, Veljko; Trifunovic, Nemanja; Giorgi, Roberto

    2015-01-01

    This unique text/reference describes an exciting and novel approach to supercomputing in the DataFlow paradigm. The major advantages and applications of this approach are clearly described, and a detailed explanation of the programming model is provided using simple yet effective examples. The work is developed from a series of lecture courses taught by the authors in more than 40 universities across more than 20 countries, and from research carried out by Maxeler Technologies, Inc. Topics and features: presents a thorough introduction to DataFlow supercomputing for big data problems; revie

  13. Enabling department-scale supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

    The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  14. Mathematical methods and supercomputing in nuclear applications. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    Kuesters, H.; Stein, E.; Werner, W.

    1993-04-01

    All papers of the two volumes are separately indexed in the data base. Main topics are: progress in advanced numerical techniques, fluid mechanics, on-line systems, artificial intelligence applications, nodal methods, reactor kinetics, reactor design, supercomputer architecture, probabilistic risk assessment, methods in transport theory, advances in Monte Carlo techniques, and man-machine interface. (orig.)

  15. Mathematical methods and supercomputing in nuclear applications. Proceedings. Vol. 1

    International Nuclear Information System (INIS)

    Kuesters, H.; Stein, E.; Werner, W.

    1993-04-01

    All papers of the two volumes are separately indexed in the data base. Main topics are: progress in advanced numerical techniques, fluid mechanics, on-line systems, artificial intelligence applications, nodal methods, reactor kinetics, reactor design, supercomputer architecture, probabilistic risk assessment, methods in transport theory, advances in Monte Carlo techniques, and man-machine interface. (orig.)

  16. Japanese supercomputer technology

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Ewald, R.H.; Worlton, W.J.

    1982-01-01

    In February 1982, computer scientists from the Los Alamos National Laboratory and Lawrence Livermore National Laboratory visited several Japanese computer manufacturers. The purpose of these visits was to assess the state of the art of Japanese supercomputer technology and to advise Japanese computer vendors of the needs of the US Department of Energy (DOE) for more powerful supercomputers. The Japanese foresee a domestic need for large-scale computing capabilities for nuclear fusion, image analysis for the Earth Resources Satellite, meteorological forecasting, electrical power system analysis (power flow, stability, optimization), structural and thermal analysis of satellites, and very-large-scale integrated circuit design and simulation. To meet this need, Japan has launched an ambitious program to advance supercomputer technology. This program is described.

  17. An assessment of worldwide supercomputer usage

    Energy Technology Data Exchange (ETDEWEB)

    Wasserman, H.J.; Simmons, M.L.; Hayes, A.H.

    1995-01-01

    This report provides a comparative study of advanced supercomputing usage in Japan and the United States as of Spring 1994. It is based on the findings of a group of US scientists whose careers have centered on programming, evaluating, and designing high-performance supercomputers for over ten years. The report is a follow-on to an assessment of supercomputing technology in Europe and Japan that was published in 1993. Whereas the previous study focused on supercomputer manufacturing capabilities, the primary focus of the current work was to compare where and how supercomputers are used. Research for this report was conducted through both literature studies and field research in Japan.

  18. Status of supercomputers in the US

    International Nuclear Information System (INIS)

    Fernbach, S.

    1985-01-01

    Current supercomputers, that is, the Class VI machines which first became available in 1976, are being delivered in greater quantity than ever before. In addition, manufacturers are busily working on Class VII machines to be ready for delivery in CY 1987. Mainframes are being modified or designed to take on some features of the supercomputers, and new companies, with the intent of either competing directly in the supercomputer arena or providing entry-level systems from which to graduate to supercomputers, are springing up everywhere. Even well-founded organizations like IBM and CDC are adding machines with vector instructions to their repertoires. Japanese-manufactured supercomputers are also being introduced into the U.S. Will these begin to compete with those of U.S. manufacture? Are they truly competitive? It turns out that, from both the hardware and software points of view, they may be superior. We may be facing the same problems in supercomputers that we faced in video systems.

  19. Supercomputing and related national projects in Japan

    International Nuclear Information System (INIS)

    Miura, Kenichi

    1985-01-01

    Japanese supercomputer development activities in the industry and research projects are outlined. Architecture, technology, software, and applications of Fujitsu's Vector Processor Systems are described as an example of Japanese supercomputers. Applications of supercomputers to high energy physics are also discussed. (orig.)

  20. TOP500 Supercomputers for June 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-06-23

    23rd Edition of TOP500 List of World's Fastest Supercomputers Released: Japan's Earth Simulator Enters Third Year in Top Position. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 23rd edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2004) at the International Supercomputer Conference in Heidelberg, Germany.

  1. Porting Ordinary Applications to Blue Gene/Q Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Maheshwari, Ketan C.; Wozniak, Justin M.; Armstrong, Timothy; Katz, Daniel S.; Binkowski, T. Andrew; Zhong, Xiaoliang; Heinonen, Olle; Karpeyev, Dmitry; Wilde, Michael

    2015-08-31

    Efficiently porting ordinary applications to Blue Gene/Q supercomputers is a significant challenge. Codes are often originally developed without considering advanced architectures and related tool chains. Science needs frequently lead users to want to run large numbers of relatively small jobs (often called many-task computing, an ensemble, or a workflow), which can conflict with supercomputer configurations. In this paper, we discuss techniques developed to execute ordinary applications over leadership-class supercomputers. We use the high-performance Swift parallel scripting framework and build two workflow execution techniques: sub-jobs and main-wrap. The sub-jobs technique, built on top of the IBM Blue Gene/Q resource manager Cobalt's sub-block jobs, lets users submit multiple, independent, repeated smaller jobs within a single larger resource block. The main-wrap technique is a scheme that enables C/C++ programs to be defined as functions that are wrapped by a high-performance Swift wrapper and that are invoked from a Swift script. We discuss the needs, benefits, technicalities, and current limitations of these techniques. We further discuss the real-world science enabled by these techniques and the results obtained.
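
    The sub-jobs technique itself is tied to Swift and Cobalt, but the underlying many-task pattern can be sketched generically: pack many small independent runs into one large allocation. In the hypothetical Python sketch below, the binary ./app and its input files stand in for an ordinary application; this is not the paper's Swift/Cobalt implementation.

    ```python
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # Many-task computing sketch: fan a large batch of small, independent runs
    # out over a fixed number of concurrent slots within one resource block.
    TASKS = [f"input_{i:04d}.dat" for i in range(512)]  # hypothetical inputs

    def run_task(input_file: str) -> int:
        # "./app" stands in for the ordinary (unmodified) application binary.
        return subprocess.run(["./app", input_file]).returncode

    with ThreadPoolExecutor(max_workers=64) as pool:    # 64 concurrent slots
        codes = list(pool.map(run_task, TASKS))

    print(f"{codes.count(0)}/{len(TASKS)} tasks succeeded")
    ```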

  2. TOP500 Supercomputers for June 2005

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2005-06-22

    25th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/LLNL BlueGene/L and IBM Gain Top Positions. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 25th edition of the TOP500 list of the world's fastest supercomputers was released today (June 22, 2005) at the 20th International Supercomputing Conference (ISC2005) in Heidelberg, Germany.

  3. TOP500 Supercomputers for November 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-11-16

    22nd Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 22nd edition of the TOP500 list of the world's fastest supercomputers was released today (November 16, 2003). The Earth Simulator supercomputer retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (''teraflops'' or trillions of calculations per second). It was built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan.

  4. A training program for scientific supercomputing users

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, F.; Moher, T.; Sabelli, N.; Solem, A.

    1988-01-01

    There is need for a mechanism to transfer supercomputing technology into the hands of scientists and engineers in such a way that they will acquire a foundation of knowledge that will permit integration of supercomputing as a tool in their research. Most computing center training emphasizes computer-specific information about how to use a particular computer system; most academic programs teach concepts to computer scientists. Only a few brief courses and new programs are designed for computational scientists. This paper describes an eleven-week training program aimed principally at graduate and postdoctoral students in computationally intensive fields. The program is designed to balance the specificity of computing center courses, the abstractness of computer science courses, and the personal contact of traditional apprentice approaches. It is based on the experience of computer scientists and computational scientists, and consists of seminars and clinics given by many visiting and local faculty. It covers a variety of supercomputing concepts, issues, and practices related to architecture, operating systems, software design, numerical considerations, code optimization, graphics, communications, and networks. Its research component encourages understanding of scientific computing and supercomputer hardware issues. Flexibility in thinking about computing needs is emphasized by the use of several different supercomputer architectures, such as the Cray X-MP/48 at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, the IBM 3090 600E/VF at the Cornell National Supercomputer Facility, and the Alliant FX/8 at the Advanced Computing Research Facility at Argonne National Laboratory. 11 refs., 6 tabs.

  5. INTEL: Intel based systems move up in supercomputing ranks

    CERN Multimedia

    2002-01-01

    "The TOP500 supercomputer rankings released today at the Supercomputing 2002 conference show a dramatic increase in the number of Intel-based systems being deployed in high-performance computing (HPC) or supercomputing areas" (1/2 page).

  6. World's fastest supercomputer opens up to users

    Science.gov (United States)

    Xin, Ling

    2016-08-01

    China's latest supercomputer - Sunway TaihuLight - has claimed the crown as the world's fastest computer according to the latest TOP500 list, released at the International Supercomputer Conference in Frankfurt in late June.

  7. OpenMP Performance on the Columbia Supercomputer

    Science.gov (United States)

    Haoqiang, Jin; Hood, Robert

    2005-01-01

    This presentation discusses the Columbia supercomputer, one of the world's fastest, providing 61 TFLOPs (10/20/04). It was conceived, designed, built, and deployed in just 120 days. Columbia is a 20-node supercomputer built on proven 512-processor nodes and is the largest SGI system in the world, with over 10,000 Intel Itanium 2 processors. It provides the largest node size incorporating commodity parts (512 processors) and the largest shared-memory environment (2,048 processors), and with 88% efficiency it tops the scalar systems on the Top500 list.

  8. Supercomputing - Use Cases, Advances, The Future (1/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the first day, we will focus on the history and theory of supercomputing, the top500 list and the hardware that makes supercomputers tick. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP an...

  9. Supercomputing - Use Cases, Advances, The Future (2/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the second day, we will focus on software and software paradigms driving supercomputers, workloads that need supercomputing treatment, advances in technology and possible future developments. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and i...

  10. Advanced parallel processing with supercomputer architectures

    International Nuclear Information System (INIS)

    Hwang, K.

    1987-01-01

    This paper investigates advanced parallel processing techniques and innovative hardware/software architectures that can be applied to boost the performance of supercomputers. Critical issues on architectural choices, parallel languages, compiling techniques, resource management, concurrency control, programming environment, parallel algorithms, and performance enhancement methods are examined and the best answers are presented. The authors cover advanced processing techniques suitable for supercomputers, high-end mainframes, minisupers, and array processors. The coverage emphasizes vectorization, multitasking, multiprocessing, and distributed computing. In order to achieve these operation modes, parallel languages, smart compilers, synchronization mechanisms, load balancing methods, mapping parallel algorithms, operating system functions, application library, and multidiscipline interactions are investigated to ensure high performance. At the end, they assess the potentials of optical and neural technologies for developing future supercomputers

  11. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently, in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The rapid computation of supercomputers is based on vector computation. The authors have investigated the adaptability of about 40 typical atomic energy codes to vector computation over the past six years. Based on the results of this investigation, the adaptability of atomic energy codes to the vector computation function of supercomputers, problems regarding utilization, and the future prospects are explained. The adaptability of individual calculation codes to vector computation depends largely on the algorithms and program structures used in the codes. The speed-up achieved by pipeline vector systems, the investigation at the Japan Atomic Energy Research Institute and its results, and examples of expressing codes for atomic energy, environmental safety and nuclear fusion in vector form are reported. The speed-up factors for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)
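
    As a present-day analogue of the vector adaptability discussed above, the sketch below contrasts an element-by-element Python loop with the equivalent whole-array (vector) expression in NumPy. The mechanism is the same one pipelined vector hardware exploited: uniform arithmetic over long arrays.

    ```python
    import time
    import numpy as np

    # Scalar loop vs. vectorized form of the same update (a toy smoothing step).
    n = 1_000_000
    u = np.random.rand(n)

    t0 = time.perf_counter()
    out = np.empty(n - 2)
    for i in range(1, n - 1):                      # scalar: one element at a time
        out[i - 1] = 0.5 * (u[i - 1] + u[i + 1]) - u[i]
    t_scalar = time.perf_counter() - t0

    t0 = time.perf_counter()
    out_vec = 0.5 * (u[:-2] + u[2:]) - u[1:-1]     # whole-array (vector) form
    t_vector = time.perf_counter() - t0

    assert np.allclose(out, out_vec)
    print(f"vector speed-up: {t_scalar / t_vector:.1f}x")
    ```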

  12. Development of seismic tomography software for hybrid supercomputers

    Science.gov (United States)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on
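
    The "regularized and solved" step mentioned above is commonly a damped least-squares problem. The sketch below shows its structure with SciPy's LSQR; the sparse matrix G is a random placeholder for the real tomographic matrix, which would come from tracing rays through the eikonal solution.

    ```python
    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import lsqr

    # Linearized tomography step: solve G * dm = dt with Tikhonov damping.
    # G (rays x model cells) is a random stand-in for the tomographic matrix;
    # dt plays the role of the travel-time residuals.
    rng = np.random.default_rng(0)
    n_rays, n_cells = 2000, 500
    G = sparse_random(n_rays, n_cells, density=0.02, random_state=0)
    dt = rng.normal(size=n_rays)

    # damp > 0 regularizes the ill-posed system (larger damp = smoother update).
    dm = lsqr(G, dt, damp=0.1)[0]
    print("model update norm:", np.linalg.norm(dm))
    ```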

  13. TOP500 Supercomputers for November 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-11-08

    24th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/IBM BlueGene/L and NASA/SGI's Columbia Gain Top Positions. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 24th edition of the TOP500 list of the world's fastest supercomputers was released today (November 8, 2004) at the SC2004 Conference in Pittsburgh, Pa.

  14. TOP500 Supercomputers for June 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-06-23

    21st Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 21st edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2003). The Earth Simulator supercomputer built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan, with its Linpack benchmark performance of 35.86 Tflop/s (teraflops or trillions of calculations per second), retains the number one position. The number 2 position is held by the re-measured ASCI Q system at Los Alamos National Laboratory. With 13.88 Tflop/s, it is the second system ever to exceed the 10 Tflop/s mark. ASCI Q was built by Hewlett-Packard and is based on the AlphaServer SC computer system.

  15. TOP500 Supercomputers for June 2002

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2002-06-20

    19th Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 19th edition of the TOP500 list of the world's fastest supercomputers was released today (June 20, 2002). The recently installed Earth Simulator supercomputer at the Earth Simulator Center in Yokohama, Japan, is, as expected, the clear new number 1. Its performance of 35.86 Tflop/s (trillions of calculations per second) running the Linpack benchmark is almost five times higher than the performance of the now No. 2 IBM ASCI White system at Lawrence Livermore National Laboratory (7.2 Tflop/s). This powerful leapfrogging to the top by a system so much faster than the previous top system is unparalleled in the history of the TOP500.

  16. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    Science.gov (United States)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to significant idle time of computational resources and, in turn, to a decrease in the speed of scientific research. This paper presents three approaches to studying the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach performs an analysis of computing resource utilization statistics, which makes it possible to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer's behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since the efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are detected. For each approach, the results obtained in practice at the Supercomputer Center of Moscow State University are demonstrated.
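
    A minimal sketch of the third approach's general idea: flag jobs whose monitored metrics deviate strongly from the overall job flow, here via a robust z-score on synthetic data. The metric, threshold, and data are assumptions for illustration, not the paper's algorithm.

    ```python
    import numpy as np

    # Flag "abnormal" jobs: CPU utilization more than 3 robust standard
    # deviations away from the job-flow median.
    rng = np.random.default_rng(1)
    cpu_util = np.clip(rng.normal(0.7, 0.1, size=1000), 0, 1)  # synthetic jobs
    cpu_util[:5] = 0.01                                        # injected idlers

    median = np.median(cpu_util)
    mad = np.median(np.abs(cpu_util - median))                 # robust spread
    robust_z = 0.6745 * (cpu_util - median) / mad

    abnormal = np.flatnonzero(np.abs(robust_z) > 3)
    print(f"{abnormal.size} abnormal jobs, e.g. indices {abnormal[:5]}")
    ```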

  17. Comments on the parallelization efficiency of the Sunway TaihuLight supercomputer

    OpenAIRE

    Végh, János

    2016-01-01

    In the world of supercomputers, the large number of processors requires minimizing the inefficiencies of parallelization, which appear as a sequential part of the program from the point of view of Amdahl's law. The recently suggested new figure of merit is applied to the recently presented supercomputer, and the timeline of "Top 500" supercomputers is scrutinized using the metric. It is demonstrated that, in addition to the computing performance and power consumption, the new supercomputer i...
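
    For reference, Amdahl's law bounds the speed-up of a program whose fraction p is parallelizable. The small helper below evaluates it and shows why even a tiny sequential part caps the benefit of very large processor counts.

    ```python
    def amdahl_speedup(p: float, n: int) -> float:
        """Amdahl's law: speed-up on n processors when fraction p is parallel."""
        return 1.0 / ((1.0 - p) + p / n)

    # Even with 99.9% parallel code, ten million cores give at most ~1000x.
    for p in (0.99, 0.999, 0.9999):
        print(p, round(amdahl_speedup(p, 10_000_000)))
    ```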

  18. The ETA10 supercomputer system

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

    The ETA Systems, Inc. ETA 10 is a next-generation supercomputer featuring multiprocessing, a large hierarchical memory system, high-performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid-nitrogen-cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high-density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed. (orig.)

  19. Integration of Panda Workload Management System with supercomputers

    Science.gov (United States)

    De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.

    2016-09-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads
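
    The "light-weight MPI wrapper" idea, running many independent single-threaded payloads as one MPI job so the batch system sees a single large allocation, can be sketched with mpi4py. The payload binary and its arguments below are hypothetical, not PanDA's actual pilot code.

    ```python
    from mpi4py import MPI
    import subprocess

    # Each MPI rank runs one single-threaded payload; the batch system sees a
    # single big parallel job. This mirrors the wrapper idea, not PanDA itself.
    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Hypothetical payload: an event-generation binary with a per-rank seed.
    result = subprocess.run(["./generate_events", "--seed", str(rank)])

    # Gather exit codes on rank 0 to report overall success.
    codes = comm.gather(result.returncode, root=0)
    if rank == 0:
        print(f"{codes.count(0)}/{len(codes)} payloads succeeded")
    ```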

  20. Applications of supercomputing and the utility industry: Calculation of power transfer capabilities

    International Nuclear Information System (INIS)

    Jensen, D.D.; Behling, S.R.; Betancourt, R.

    1990-01-01

    Numerical models and iterative simulation using supercomputers can furnish cost-effective answers to utility industry problems that are all but intractable using conventional computing equipment. An example of the use of supercomputers by the utility industry is the determination of power transfer capability limits for power transmission systems. This work has the goal of markedly reducing the run time of transient stability codes used to determine power distributions following major system disturbances. To date, run times of several hours on a conventional computer have been reduced to several minutes on state-of-the-art supercomputers, with further improvements anticipated to reduce run times to less than a minute. In spite of the potential advantages of supercomputers, few utilities have sufficient need for a dedicated in-house supercomputing capability. This problem is resolved by using a supercomputer center serving a geographically distributed user base coupled via high-speed communication networks.

  1. Supercomputers to transform Science

    CERN Multimedia

    2006-01-01

    "New insights into the structure of space and time, climate modeling, and the design of novel drugs, are but a few of the many research areas that will be transforned by the installation of three supercomputers at the Unversity of Bristol." (1/2 page)

  2. Convex unwraps its first grown-up supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Manuel, T.

    1988-03-03

    Convex Computer Corp.'s new supercomputer family is even more of an industry blockbuster than its first system. At a tenfold jump in performance, it's far from just an incremental upgrade over its first minisupercomputer, the C-1. The heart of the new family, the new C-2 processor, churning at 50 million floating-point operations/s, spawns a group of systems whose performance could pass for some fancy supercomputers, namely those of the Cray Research Inc. family. When added to the C-1, Convex's five new supercomputers create the C series, a six-member product group offering a performance range from 20 to 200 Mflops. They mark an important transition for Convex from a one-product high-tech startup to a multinational company with a wide-ranging product line. It's a tough transition, but the Richardson, Texas, company seems to be doing it. The extended product line propels Convex into the upper end of the minisupercomputer class and nudges it into the low end of the big supercomputers. It positions Convex in an uncrowded segment of the market, in the $500,000 to $1 million range, offering 50 to 200 Mflops of performance. The company is making this move because the minisuper area, which it pioneered, quickly became crowded with new vendors, causing prices and gross margins to drop drastically.

  3. The ETA systems plans for supercomputers

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

    The ETA Systems ETA 10 is a Class VII supercomputer featuring multiprocessing, a large hierarchical memory system, high-performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid-nitrogen-cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high-density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed.

  4. Automatic discovery of the communication network topology for building a supercomputer model

    Science.gov (United States)

    Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim

    2016-10-01

    The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.
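
    The model-building step can be illustrated with a toy graph construction: neighbor reports become nodes and edges of the communication-network model. The sketch below uses networkx, with a hand-written report list standing in for data harvested from switches; it is not the Octotron implementation.

    ```python
    import networkx as nx

    # Hypothetical neighbor reports: (device, device type, connected neighbor).
    reports = [
        ("node001", "compute", "sw-leaf1"),
        ("node002", "compute", "sw-leaf1"),
        ("sw-leaf1", "switch", "sw-spine1"),
        ("node003", "compute", "sw-leaf2"),
        ("sw-leaf2", "switch", "sw-spine1"),
    ]

    g = nx.Graph()
    for dev, kind, neighbor in reports:
        g.add_node(dev, kind=kind)
        g.add_edge(dev, neighbor)    # one link in the communication network

    # The resulting graph is the network part of the supercomputer model.
    print(g.number_of_nodes(), "devices,", g.number_of_edges(), "links")
    ```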

  5. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  6. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    Energy Technology Data Exchange (ETDEWEB)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S [Earth Sciences Department, Barcelona Supercomputing Center, Barcelona (Spain)]; Cuevas, E [Izaña Atmospheric Research Center, Agencia Estatal de Meteorologia, Tenerife (Spain)]; Nickovic, S [Atmospheric Research and Environment Branch, World Meteorological Organization, Geneva (Switzerland)], E-mail: carlos.perez@bsc.es

    2009-03-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  7. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    International Nuclear Information System (INIS)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S; Cuevas, E; Nickovic, S

    2009-01-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  8. Supercomputers Of The Future

    Science.gov (United States)

    Peterson, Victor L.; Kim, John; Holst, Terry L.; Deiwert, George S.; Cooper, David M.; Watson, Andrew B.; Bailey, F. Ron

    1992-01-01

    Report evaluates supercomputer needs of five key disciplines: turbulence physics, aerodynamics, aerothermodynamics, chemistry, and mathematical modeling of human vision. Predicts these fields will require computer speed greater than 10^18 floating-point operations per second (FLOPS) and memory capacity greater than 10^15 words. Also, new parallel computer architectures and new structured numerical methods will make necessary speed and capacity available.

  9. NASA Advanced Supercomputing Facility Expansion

    Science.gov (United States)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  10. ATLAS Software Installation on Supercomputers

    CERN Document Server

    Undrus, Alexander; The ATLAS collaboration

    2018-01-01

    PowerPC and high performance computers (HPC) are important resources for computing in the ATLAS experiment. The future LHC data processing will require more resources than Grid computing, currently using approximately 100,000 cores at well over 100 sites, can provide. Supercomputers are extremely powerful, as they use the resources of hundreds of thousands of CPUs joined together. However, their architectures have different instruction sets. ATLAS binary software distributions for x86 chipsets do not fit these architectures, as emulation of these chipsets results in huge performance loss. This presentation describes the methodology of ATLAS software installation from source code on supercomputers. The installation procedure includes downloading the ATLAS code base as well as the source of about 50 external packages, such as ROOT and Geant4, followed by compilation, and rigorous unit and integration testing. The presentation reports the application of this procedure at the Titan HPC and Summit PowerPC at Oak Ridge Computin...

  11. JINR supercomputer of the module type for event parallel analysis

    International Nuclear Information System (INIS)

    Kolpakov, I.F.; Senner, A.E.; Smirnov, V.A.

    1987-01-01

    A model of a supercomputer with 50 million operations per second is suggested. Its realization would allow one to solve JINR data analysis problems for large spectrometers (in particular for the DELPHI collaboration). The suggested modular supercomputer is based on commercially available 32-bit microprocessors with a processing rate of about 1 MFLOPS. The processors are combined by means of VME standard buses. A MicroVAX II is the host computer organizing the operation of the system. Data input and output are realized via the MicroVAX II computer periphery. Users' software is based on FORTRAN-77. The supercomputer is connected to a JINR network port, and all JINR users get access to the suggested system

  12. Supercomputers and quantum field theory

    International Nuclear Information System (INIS)

    Creutz, M.

    1985-01-01

    A review is given of why recent simulations of lattice gauge theories have resulted in substantial demands from particle theorists for supercomputer time. These calculations have yielded first principle results on non-perturbative aspects of the strong interactions. An algorithm for simulating dynamical quark fields is discussed. 14 refs

  13. Supercomputer applications in nuclear research

    International Nuclear Information System (INIS)

    Ishiguro, Misako

    1992-01-01

    The utilization of supercomputers at the Japan Atomic Energy Research Institute is mainly reported. The fields of atomic energy research which frequently use supercomputers and the contents of their computations are outlined. Vectorization is briefly explained, and nuclear fusion, nuclear reactor physics, the hydrothermal safety of nuclear reactors, the parallelism inherent in atomic energy computations of fluids and others, the algorithms for vector processing, and the speed-up achieved by vectorization are discussed. At present the Japan Atomic Energy Research Institute uses two FACOM VP 2600/10 systems and three M-780 systems. The contents of computation have changed from criticality computations around 1970, through the analysis of LOCA after the TMI accident, to nuclear fusion research, the design of new types of reactors and reactor safety assessment at present. Also the method of using computers has advanced from batch processing to time-sharing processing, from one-dimensional to three-dimensional computation, from steady, linear to unsteady, nonlinear computation, from experimental analysis to numerical simulation, and so on. (K.I.)

  14. Computational plasma physics and supercomputers

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1984-09-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular codes, but parallel processing poses new coding difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematics

  15. Mistral Supercomputer Job History Analysis

    OpenAIRE

    Zasadziński, Michał; Muntés-Mulero, Victor; Solé, Marc; Ludwig, Thomas

    2018-01-01

    In this technical report, we show insights and results of operational data analysis from the petascale supercomputer Mistral, which is ranked as the 42nd most powerful in the world as of January 2018. Data sources include hardware monitoring data, job scheduler history, topology, and hardware information. We explore job state sequences, spatial distribution, and electric power patterns.

  16. Interactive real-time nuclear plant simulations on a UNIX based supercomputer

    International Nuclear Information System (INIS)

    Behling, S.R.

    1990-01-01

    Interactive real-time nuclear plant simulations are critically important to train nuclear power plant engineers and operators. In addition, real-time simulations can be used to test the validity and timing of plant technical specifications and operational procedures. To accurately and confidently simulate a nuclear power plant transient in real-time, sufficient computer resources must be available. Since some important transients cannot be simulated using preprogrammed responses or non-physical models, commonly used simulation techniques may not be adequate. However, the power of a supercomputer allows one to accurately calculate the behavior of nuclear power plants even during very complex transients. Many of these transients can be calculated in real-time or quicker on the fastest supercomputers. The concept of running interactive real-time nuclear power plant transients on a supercomputer has been tested. This paper describes the architecture of the simulation program, the techniques used to establish real-time synchronization, and other issues related to the use of supercomputers in a new and potentially very important area. (author)
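
    A common way to hold a fast-running simulation to real time is to pace each step against the wall clock. The generic Python sketch below shows this synchronization pattern; it is not the paper's implementation, and the step function is a placeholder.

    ```python
    import time

    DT = 0.1  # simulated seconds per step, and the wall-clock budget per step

    def step_plant_model(t: float) -> None:
        """Placeholder for one transient-simulation time step."""

    sim_time = 0.0
    next_deadline = time.monotonic() + DT
    while sim_time < 60.0:                 # simulate one minute in real time
        step_plant_model(sim_time)         # runs faster than DT on a big machine
        sim_time += DT
        sleep = next_deadline - time.monotonic()
        if sleep > 0:
            time.sleep(sleep)              # idle until the wall clock catches up
        next_deadline += DT                # fixed cadence avoids cumulative drift
    ```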

  17. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many-core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors, starting with NVIDIA GPUs and, more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.
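
    The staggered conjugate gradient mentioned above is a tuned variant of the standard Krylov iteration. A plain NumPy version on a small symmetric positive definite system shows the structure of the solver; the production fermion operator is far larger and applied matrix-free.

    ```python
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
        """Textbook CG for A x = b with A symmetric positive definite."""
        x = np.zeros_like(b)
        r = b - A @ x                     # initial residual
        p = r.copy()                      # initial search direction
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)         # optimal step along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p     # conjugate next direction
            rs = rs_new
        return x

    rng = np.random.default_rng(0)
    M = rng.normal(size=(100, 100))
    A = M @ M.T + 100 * np.eye(100)       # SPD stand-in for the fermion matrix
    b = rng.normal(size=100)
    x = conjugate_gradient(A, b)
    print("residual:", np.linalg.norm(b - A @ x))
    ```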

  18. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many-core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors, starting with NVIDIA GPUs and, more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  19. Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer

    Science.gov (United States)

    Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the “shallow” FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in “Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments” by Catanzarite et al. (#2494058).Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
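
    The quoted throughput figures can be checked with back-of-the-envelope arithmetic, as in the sketch below. The total Kepler target count (about 200,000 stars) is an assumption supplied for illustration; the other numbers come from the abstract.

    ```python
    # Back-of-the-envelope check of the quoted FLTI throughput.
    injections_per_core_hour = 16     # from the abstract
    injections_per_star = 2000        # "shallow" experiment, per the abstract
    total_targets = 200_000           # assumed Kepler target count
    fraction = 0.16                   # 16% of targets, per the abstract
    wall_hours = 200                  # per the abstract

    stars = total_targets * fraction
    core_hours = stars * injections_per_star / injections_per_core_hour
    cores_needed = core_hours / wall_hours
    print(f"{core_hours:.2e} core-hours -> ~{cores_needed:,.0f} concurrent cores")
    # ~4e6 core-hours over 200 wall-clock hours -> roughly 20,000 cores.
    ```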

  20. Extracting the Textual and Temporal Structure of Supercomputing Logs

    Energy Technology Data Exchange (ETDEWEB)

    Jain, S; Singh, I; Chandra, A; Zhang, Z; Bronevetsky, G

    2009-05-26

Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format make it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
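
    The masking idea behind such syntactic log templating can be sketched in a few lines of Python. This is not the authors' online clustering algorithm, only an illustration of how messages that differ only in variable fields collapse onto a single template; the regular expressions and sample messages are invented.

        import re
        from collections import defaultdict

        # Mask variable fields (hex addresses, numbers) so that messages
        # differing only in those fields share one syntactic template.
        MASKS = [
            (re.compile(r"0x[0-9a-fA-F]+"), "<HEX>"),
            (re.compile(r"\d+"), "<NUM>"),
        ]

        def template(message):
            for pattern, token in MASKS:
                message = pattern.sub(token, message)
            return message

        def cluster(log_lines):
            groups = defaultdict(list)
            for line in log_lines:
                groups[template(line)].append(line)
            return groups

        logs = [
            "node 1042 memory error at 0xdeadbeef",
            "node 17 memory error at 0xcafebabe",
            "link 3 retrain count 12",
        ]
        for tmpl, members in cluster(logs).items():
            print(len(members), tmpl)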

  1. Introduction to Reconfigurable Supercomputing

    CERN Document Server

    Lanzagorta, Marco; Rosenberg, Robert

    2010-01-01

This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigur

  2. SUPERCOMPUTERS FOR AIDING ECONOMIC PROCESSES WITH REFERENCE TO THE FINANCIAL SECTOR

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2014-12-01

Full Text Available The article discusses the use of supercomputers to support business processes, with particular emphasis on the financial sector. Reference is made to selected projects that support economic development. In particular, we propose the use of supercomputers to perform artificial intelligence methods in banking. The proposed methods, combined with modern technology, enable a significant increase in the competitiveness of enterprises and banks by adding new functionality.

  3. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jacobsen, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

The incorporation of increasing core counts in the modern processors used to build state-of-the-art supercomputers is driving application development towards the exploitation of thread parallelism, in addition to distributed-memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe our experiences exploiting threading in a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.
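
    MPAS-Ocean itself is Fortran with OpenMP directives; purely as an illustration of how threading over a mesh is organized, the Python sketch below computes the contiguous per-thread cell ranges that an OpenMP static schedule would assign. The cell and thread counts are invented.

        def static_chunks(n_cells, n_threads):
            """Contiguous per-thread ranges, as in an OpenMP static schedule."""
            base, extra = divmod(n_cells, n_threads)
            start = 0
            for t in range(n_threads):
                size = base + (1 if t < extra else 0)
                yield t, start, start + size
                start += size

        # Each thread sweeps its own contiguous block of mesh cells.
        for t, lo, hi in static_chunks(n_cells=1_000_003, n_threads=8):
            print(f"thread {t}: cells [{lo}, {hi})")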

  4. Visualization environment of the large-scale data of JAEA's supercomputer system

    Energy Technology Data Exchange (ETDEWEB)

    Sakamoto, Kensaku [Japan Atomic Energy Agency, Center for Computational Science and e-Systems, Tokai, Ibaraki (Japan); Hoshi, Yoshiyuki [Research Organization for Information Science and Technology (RIST), Tokai, Ibaraki (Japan)

    2013-11-15

On research and development in various fields of nuclear energy, visualization of calculated data is especially useful for understanding simulation results in an intuitive way. Many researchers who run simulations on the supercomputer at the Japan Atomic Energy Agency (JAEA) are used to transferring calculated data files from the supercomputer to their local PCs for visualization. In recent years, as calculated data have grown larger with improvements in supercomputer performance, both a reduction of the visualization processing time and efficient use of the JAEA network have been required. As a solution, we introduced a remote visualization system that can utilize parallel processors on the supercomputer and reduce network usage by transferring only the data of the intermediate visualization process. This paper reports a study on the performance of image processing with the remote visualization system. The visualization processing time is measured and the influence of network speed is evaluated by varying the drawing mode, the size of the visualization data and the number of processors. Based on this study, a guideline is provided to show how the remote visualization system can be used effectively. An upgrade policy for the next system is also shown. (author)

  5. Multi-petascale highly efficient parallel supercomputer

    Science.gov (United States)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2015-07-14

A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaOPS-scale computing, at decreased cost, power and footprint, that allows for a maximum packaging density of processing nodes from an interconnect point of view. The Supercomputer exploits technological advances in VLSI that enable a computing model where many processors can be integrated into a single Application Specific Integrated Circuit (ASIC). Each ASIC computing node comprises a system-on-chip ASIC utilizing four or more processors integrated into one die, each having full access to all system resources, enabling adaptive partitioning of the processors to functions such as compute or messaging I/O on an application-by-application basis, and preferably enabling adaptive partitioning of functions in accordance with various algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that optimally maximizes the throughput of packet communications between nodes and minimizes latency.

  6. QCD on the BlueGene/L Supercomputer

    International Nuclear Information System (INIS)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-01-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented

  7. QCD on the BlueGene/L Supercomputer

    Science.gov (United States)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-03-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented.

  8. Considering a new domain for antimicrobial stewardship: Topical antibiotics in the open surgical wound.

    Science.gov (United States)

    Edmiston, Charles E; Leaper, David; Spencer, Maureen; Truitt, Karen; Litz Fauerbach, Loretta; Graham, Denise; Johnson, Helen Boehm

    2017-11-01

    The global push to combat the problem of antimicrobial resistance has led to the development of antimicrobial stewardship programs (ASPs), which were recently mandated by The Joint Commission and the Centers for Medicare and Medicaid Services. However, the use of topical antibiotics in the open surgical wound is often not monitored by these programs nor is it subject to any evidence-based standardization of care. Survey results indicate that the practice of using topical antibiotics intraoperatively, in both irrigation fluids and powders, is widespread. Given the risks inherent in their use and the lack of evidence supporting it, the practice should be monitored as a core part of ASPs, and alternative agents, such as antiseptics, should be considered. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  9. Proceedings of the first energy research power supercomputer users symposium

    International Nuclear Information System (INIS)

    1991-01-01

    The Energy Research Power Supercomputer Users Symposium was arranged to showcase the richness of science that has been pursued and accomplished in this program through the use of supercomputers and now high performance parallel computers over the last year: this report is the collection of the presentations given at the Symposium. ''Power users'' were invited by the ER Supercomputer Access Committee to show that the use of these computational tools and the associated data communications network, ESNet, go beyond merely speeding up computations. Today the work often directly contributes to the advancement of the conceptual developments in their fields and the computational and network resources form the very infrastructure of today's science. The Symposium also provided an opportunity, which is rare in this day of network access to computing resources, for the invited users to compare and discuss their techniques and approaches with those used in other ER disciplines. The significance of new parallel architectures was highlighted by the interesting evening talk given by Dr. Stephen Orszag of Princeton University

  10. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support the Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer, a PC-486 DX2 with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room has been converted to a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  11. A workbench for tera-flop supercomputing

    International Nuclear Information System (INIS)

    Resch, M.M.; Kuester, U.; Mueller, M.S.; Lang, U.

    2003-01-01

    Supercomputers currently reach a peak performance in the range of TFlop/s. With but one exception - the Japanese Earth Simulator - none of these systems has so far been able to also show a level of sustained performance for a variety of applications that comes close to the peak performance. Sustained TFlop/s are therefore rarely seen. The reasons are manifold and are well known: Bandwidth and latency both for main memory and for the internal network are the key internal technical problems. Cache hierarchies with large caches can bring relief but are no remedy to the problem. However, there are not only technical problems that inhibit the full exploitation by scientists of the potential of modern supercomputers. More and more organizational issues come to the forefront. This paper shows the approach of the High Performance Computing Center Stuttgart (HLRS) to deliver a sustained performance of TFlop/s for a wide range of applications from a large group of users spread over Germany. The core of the concept is the role of the data. Around this we design a simulation workbench that hides the complexity of interacting computers, networks and file systems from the user. (authors)

  12. A visual analytics system for optimizing the performance of large-scale networks in supercomputing systems

    Directory of Open Access Journals (Sweden)

    Takanori Fujiwara

    2018-03-01

Full Text Available The overall efficiency of an extreme-scale supercomputer largely relies on the performance of its network interconnects. Several state-of-the-art supercomputers use networks based on the increasingly popular Dragonfly topology. It is crucial to study the behavior and performance of different parallel applications running on Dragonfly networks in order to make optimal system configurations and design choices, such as job scheduling and routing strategies. However, studying this temporal network behavior requires a tool to analyze and correlate the numerous sets of multivariate time-series data collected from the Dragonfly's multi-level hierarchies. This paper presents such a tool, a visual analytics system, for investigating the temporal behavior and optimizing the communication performance of a supercomputer with a Dragonfly network. We coupled interactive visualization with time-series analysis methods to help reveal hidden patterns in the network behavior with respect to different parallel applications and system configurations. Our system also provides multiple coordinated views for connecting behaviors observed at different levels of the network hierarchies, which effectively supports visual analysis tasks. We demonstrate the effectiveness of the system with a set of case studies. Our system and findings can help improve not only the communication performance of supercomputing applications, but also the network performance of next-generation supercomputers. Keywords: Supercomputing, Parallel communication network, Dragonfly networks, Time-series data, Performance analysis, Visual analytics
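
    A minimal version of the time-series correlation that underlies such analyses can be sketched as follows. The counter names, window length, and synthetic data are invented for illustration; the paper's system couples far richer analysis with interactive views.

        import numpy as np

        def windowed_correlation(a, b, window):
            """Pearson correlation of two counter series over sliding windows."""
            out = []
            for start in range(0, len(a) - window + 1):
                wa = a[start:start + window]
                wb = b[start:start + window]
                out.append(np.corrcoef(wa, wb)[0, 1])
            return np.array(out)

        # Two hypothetical per-router counters sampled over time.
        rng = np.random.default_rng(0)
        link_stalls = rng.poisson(5.0, size=600).astype(float)
        router_busy = 0.7 * link_stalls + rng.normal(0.0, 1.0, size=600)

        corr = windowed_correlation(link_stalls, router_busy, window=60)
        print(f"correlation range: {corr.min():.2f} to {corr.max():.2f}")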

  13. Computational plasma physics and supercomputers. Revision 1

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1985-01-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular models, but parallel processing poses new programming difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematical models

  14. Application of Supercomputer Technologies for Simulation Of Socio-Economic Systems

    Directory of Open Access Journals (Sweden)

    Vladimir Valentinovich Okrepilov

    2015-06-01

Full Text Available To date, extensive experience has been accumulated in the investigation of problems related to quality, the assessment of management systems, and the modeling of economic system sustainability. These studies have created the basis for the development of a new research area, the Economics of Quality. Its tools allow model simulation to be used for constructing mathematical models that adequately reflect the role of quality in the natural, technical, and social regularities governing the functioning of complex socio-economic systems. It is our firm belief that the extensive application and development of such models, together with system modeling using supercomputer technologies, will bring research on socio-economic systems to an essentially new level. Moreover, the current research makes a significant contribution to the model simulation of multi-agent social systems and, no less importantly, belongs to the priority areas in the development of science and technology in our country. This article is devoted to the application of supercomputer technologies in the social sciences, first of all to the technical realization of large-scale agent-focused models (AFM). The essence of this tool is that, owing to the increase in computing power, it has become possible to describe the behavior of the many separate fragments of a complex system, as socio-economic systems are. The article also reviews the experience of foreign scientists and practitioners in running AFMs on supercomputers, analyzes an AFM developed at CEMI RAS, and describes the stages and methods of efficiently mapping the computational kernel of a multi-agent system onto the architecture of a modern supercomputer. Experiments based on model simulation of forecasting the population of St. Petersburg according to three scenarios, as one of the major factors influencing the development of the socio-economic system and the quality of life of the population, are presented in the

  15. Use of QUADRICS supercomputer as embedded simulator in emergency management systems

    International Nuclear Information System (INIS)

    Bove, R.; Di Costanzo, G.; Ziparo, A.

    1996-07-01

The experience gained in implementing MRBT, an atmospheric dispersion model for short-duration releases, is reported. The model was implemented on a QUADRICS-Q1 supercomputer. A description of the MRBT model is given first. It is an analytical model for studying the spreading of light gases released into the atmosphere by accidents. The solution of the diffusion equation is Gaussian-like and yields the concentration of the released pollutant as a function of space and time. The QUADRICS architecture is then introduced and the implementation of the model is described. Finally, the integration of the QUADRICS-based model as an embedded simulator in an emergency management system is considered.

  16. Centralized supercomputer support for magnetic fusion energy research

    International Nuclear Information System (INIS)

    Fuss, D.; Tull, G.G.

    1984-01-01

    High-speed computers with large memories are vital to magnetic fusion energy research. Magnetohydrodynamic (MHD), transport, equilibrium, Vlasov, particle, and Fokker-Planck codes that model plasma behavior play an important role in designing experimental hardware and interpreting the resulting data, as well as in advancing plasma theory itself. The size, architecture, and software of supercomputers to run these codes are often the crucial constraints on the benefits such computational modeling can provide. Hence, vector computers such as the CRAY-1 offer a valuable research resource. To meet the computational needs of the fusion program, the National Magnetic Fusion Energy Computer Center (NMFECC) was established in 1974 at the Lawrence Livermore National Laboratory. Supercomputers at the central computing facility are linked to smaller computer centers at each of the major fusion laboratories by a satellite communication network. In addition to providing large-scale computing, the NMFECC environment stimulates collaboration and the sharing of computer codes and data among the many fusion researchers in a cost-effective manner

  17. Extending ATLAS Computing to Commercial Clouds and Supercomputers

    CERN Document Server

    Nilsson, P; The ATLAS collaboration; Filipcic, A; Klimentov, A; Maeno, T; Oleynik, D; Panitkin, S; Wenaus, T; Wu, W

    2014-01-01

The Large Hadron Collider will resume data collection in 2015 with substantially increased computing requirements relative to its first 2009-2013 run. A near doubling of the energy and the data rate, a high level of event pile-up, and detector upgrades will mean the number and complexity of events to be analyzed will increase dramatically. A naive extrapolation of the Run 1 experience would suggest that a 5-6 fold increase in computing resources is needed - impossible within the anticipated flat computing budgets in the near future. Consequently ATLAS is engaged in an ambitious program to expand its computing to all available resources, notably including opportunistic use of commercial clouds and supercomputers. Such resources present new challenges in managing heterogeneity, supporting data flows, parallelizing workflows, provisioning software, and other aspects of distributed computing, all while minimizing operational load. We will present the ATLAS experience to date with clouds and supercomputers, and des...

  18. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Energy Technology Data Exchange (ETDEWEB)

    De, K [University of Texas at Arlington; Jha, S [Rutgers University; Klimentov, A [Brookhaven National Laboratory (BNL); Maeno, T [Brookhaven National Laboratory (BNL); Nilsson, P [Brookhaven National Laboratory (BNL); Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Wells, Jack C [ORNL; Wenaus, T [Brookhaven National Laboratory (BNL)

    2016-01-01

The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation
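
    In the spirit of the light-weight MPI wrappers described above, the following mpi4py sketch launches one single-threaded payload per MPI rank and gathers the return codes. The payload script and its argument are placeholders, not part of the PanDA pilot framework.

        from mpi4py import MPI
        import subprocess
        import sys

        # One single-threaded payload per MPI rank.
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        # "payload.py" is a hypothetical stand-in for the real workload.
        result = subprocess.run(
            [sys.executable, "payload.py", f"--task-id={rank}"],
            capture_output=True, text=True,
        )

        # Rank 0 collects return codes and reports failures.
        statuses = comm.gather(result.returncode, root=0)
        if rank == 0:
            failed = [r for r, rc in enumerate(statuses) if rc != 0]
            print(f"{len(statuses)} tasks, {len(failed)} failed: {failed}")

    Launched under the batch system with, e.g., mpiexec -n 16 python wrapper.py, this fills a multi-core node with independent serial tasks.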

  19. Topics and topic prominence in two sign languages

    NARCIS (Netherlands)

    Kimmelman, V.

    2015-01-01

    In this paper we describe topic marking in Russian Sign Language (RSL) and Sign Language of the Netherlands (NGT) and discuss whether these languages should be considered topic prominent. The formal markers of topics in RSL are sentence-initial position, a prosodic break following the topic, and

  20. Tryton Supercomputer Capabilities for Analysis of Massive Data Streams

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2015-09-01

Full Text Available The recently deployed supercomputer Tryton, located in the Academic Computer Center of Gdansk University of Technology, provides great means for massively parallel processing. Moreover, the status of the Center as one of the main network nodes in the PIONIER network enables the fast and reliable transfer of data produced by miscellaneous devices scattered across the whole country. Typical examples of such data are streams containing radio-telescope and satellite observations. Their analysis, especially under real-time constraints, can be challenging and requires the use of dedicated software components. We propose a solution for such parallel analysis using the supercomputer, supervised by the KASKADA platform, which in conjunction with immersive 3D visualization techniques can be used to solve problems such as pulsar detection and chronometry or oil-spill simulation on the sea surface.

  1. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
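
    A toy model in the same spirit, combining a compute term, a contended-memory term, and a communication term under weak scaling, is sketched below. The functional form and every parameter value are illustrative assumptions, not the paper's calibrated model.

        import math

        def predicted_time(n_cores, cores_per_node, work_per_core, flop_rate,
                           bytes_per_core, node_bandwidth,
                           latency, msg_bytes, link_bandwidth):
            """Toy weak-scaling model: compute + contended memory + communication."""
            t_compute = work_per_core / flop_rate
            # All cores on a node contend for its shared memory bandwidth.
            t_memory = bytes_per_core / (node_bandwidth / cores_per_node)
            # A log-depth collective stands in for the communication model.
            t_comm = math.log2(n_cores) * (latency + msg_bytes / link_bandwidth)
            return t_compute + t_memory + t_comm

        for cores in (64, 128, 256, 512):
            t = predicted_time(cores, cores_per_node=4,
                               work_per_core=2.0e9, flop_rate=3.4e9,
                               bytes_per_core=4.0e8, node_bandwidth=1.3e10,
                               latency=5.0e-6, msg_bytes=1.0e6,
                               link_bandwidth=1.0e9)
            print(cores, f"{t:.3f} s")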

  2. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  3. SUPERCOMPUTER SIMULATION OF CRITICAL PHENOMENA IN COMPLEX SOCIAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Petrus M.A. Sloot

    2014-09-01

Full Text Available The paper describes the problem of computer simulation of critical phenomena in complex social systems on petascale computing systems within the framework of the complex networks approach. A three-layer system of nested models of complex networks is proposed, including an aggregated analytical model to identify critical phenomena, a detailed model of individualized network dynamics, and a model to adjust the topological structure of a complex network. A scalable parallel algorithm covering all layers of complex network simulation is proposed. The performance of the algorithm is studied on different supercomputing systems. The issues of software and information infrastructure for complex network simulation are discussed, including the organization of distributed calculations, crawling data in social networks, and the visualization of results. Applications of the developed methods and technologies are considered, including simulation of criminal network disruption, fast rumor spreading in social networks, evolution of financial networks, and epidemic spreading.

  4. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find that single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  5. Cellular-automata supercomputers for fluid-dynamics modeling

    International Nuclear Information System (INIS)

    Margolus, N.; Toffoli, T.; Vichniac, G.

    1986-01-01

    We report recent developments in the modeling of fluid dynamics, and give experimental results (including dynamical exponents) obtained using cellular automata machines. Because of their locality and uniformity, cellular automata lend themselves to an extremely efficient physical realization; with a suitable architecture, an amount of hardware resources comparable to that of a home computer can achieve (in the simulation of cellular automata) the performance of a conventional supercomputer
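
    The flavor of such cellular-automata fluid models can be conveyed with the classic HPP lattice gas, sketched below in Python; the lattice size and particle density are arbitrary choices. Each site carries four boolean channels (north, east, south, west), head-on collisions rotate particle pairs by 90 degrees, and streaming moves each channel one site per step. Particle number is conserved exactly.

        import numpy as np

        rng = np.random.default_rng(1)
        # Four boolean occupation channels per lattice site.
        N, E, S, W = (rng.random((64, 64)) < 0.2 for _ in range(4))

        def step(N, E, S, W):
            # Collision: head-on pairs rotate 90 degrees.
            ns = N & S & ~E & ~W
            ew = E & W & ~N & ~S
            N, S = (N ^ ns) | ew, (S ^ ns) | ew
            E, W = (E ^ ew) | ns, (W ^ ew) | ns
            # Streaming: each channel moves one site (periodic boundaries).
            N = np.roll(N, -1, axis=0)
            S = np.roll(S, 1, axis=0)
            E = np.roll(E, 1, axis=1)
            W = np.roll(W, -1, axis=1)
            return N, E, S, W

        for _ in range(100):
            N, E, S, W = step(N, E, S, W)
        print("particles:", int(N.sum() + E.sum() + S.sum() + W.sum()))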

  6. The TeraGyroid Experiment – Supercomputing 2003

    Directory of Open Access Journals (Sweden)

    R.J. Blake

    2005-01-01

Full Text Available Amphiphiles are molecules with hydrophobic tails and hydrophilic heads. When dispersed in solvents, they self-assemble into complex mesophases including the beautiful cubic gyroid phase. The goal of the TeraGyroid experiment was to study defect pathways and dynamics in these gyroids. The UK's supercomputing and USA's TeraGrid facilities were coupled together, through a dedicated high-speed network, into a single computational Grid for research work that peaked around the Supercomputing 2003 conference. The gyroids were modeled using lattice Boltzmann methods, with parameter spaces explored using many simulations on 128³ and other grid sizes, this data being used to inform the world's largest three-dimensional time-dependent simulation, with 1024³ grid points. The experiment generated some 2 TBytes of useful data. In terms of Grid technology the project demonstrated the migration of simulations (using Globus middleware) to and fro across the Atlantic, exploiting the availability of resources. Integration of the systems accelerated the time to insight. Distributed visualisation of the output datasets enabled the parameter space of the interactions within the complex fluid to be explored from a number of sites, informed by discourse over the Access Grid. The project was sponsored by EPSRC (UK) and NSF (USA), with trans-Atlantic optical bandwidth provided by British Telecommunications.

  7. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Science.gov (United States)

    Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.

    2016-10-01

The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  8. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    International Nuclear Information System (INIS)

    Klimentov, A; Maeno, T; Nilsson, P; Panitkin, S; Wenaus, T; De, K; Oleynik, D; Jha, S; Wells, J

    2016-01-01

The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  9. Analyzing the Interplay of Failures and Workload on a Leadership-Class Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Meneses, Esteban [University of Pittsburgh; Ni, Xiang [University of Illinois at Urbana-Champaign; Jones, Terry R [ORNL; Maxwell, Don E [ORNL

    2015-01-01

The unprecedented computational power of current supercomputers now makes possible the exploration of complex problems in many scientific fields, from genomic analysis to computational fluid dynamics. Modern machines are powerful because they are massive: they assemble millions of cores and a huge quantity of disks, cards, routers, and other components. But it is precisely the size of these machines that clouds the future of supercomputing. A system that comprises many components has a high chance to fail, and fail often. In order to make the next generation of supercomputers usable, it is imperative to use some type of fault tolerance platform to run applications on large machines. Most fault tolerance strategies can be optimized for the peculiarities of each system and boost efficacy by keeping the system productive. In this paper, we aim to understand how failure characterization can improve resilience in several layers of the software stack: applications, runtime systems, and job schedulers. We examine the Titan supercomputer, one of the fastest systems in the world. We analyze a full year of Titan in production and distill the failure patterns of the machine. By looking into Titan's log files and using the criteria of experts, we provide a detailed description of the types of failures. In addition, we inspect the job submission files and describe how the system is used. Using those two sources, we cross-correlate failures in the machine to executing jobs and provide a picture of how failures affect the user experience. We believe such characterization is fundamental in developing appropriate fault tolerance solutions for Cray systems similar to Titan.
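
    At its simplest, cross-correlating failures with executing jobs reduces to an interval-overlap join like the sketch below. The field layout, node identifiers, and timestamps are invented for illustration; the paper's analysis works from real Titan log and scheduler records.

        def overlapping_jobs(failures, jobs):
            """Attribute each failure to the jobs running on that node
            at the moment it occurred.

            failures: list of (timestamp, node) tuples
            jobs:     list of (job_id, start, end, nodes) tuples
            """
            hits = []
            for when, node in failures:
                for job_id, start, end, nodes in jobs:
                    if start <= when <= end and node in nodes:
                        hits.append((when, node, job_id))
            return hits

        # Hypothetical job records and failure events.
        jobs = [("j1", 0, 100, {"c0-0", "c0-1"}),
                ("j2", 50, 200, {"c1-0"})]
        failures = [(75, "c0-1"), (150, "c1-0"), (150, "c0-0")]
        print(overlapping_jobs(failures, jobs))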

  10. Role of supercomputers in magnetic fusion and energy research programs

    International Nuclear Information System (INIS)

    Killeen, J.

    1985-06-01

    The importance of computer modeling in magnetic fusion (MFE) and energy research (ER) programs is discussed. The need for the most advanced supercomputers is described, and the role of the National Magnetic Fusion Energy Computer Center in meeting these needs is explained

  11. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr. (.,; .); Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  12. Ultrascalable petaflop parallel supercomputer

    Science.gov (United States)

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Chiu, George [Cross River, NY; Cipolla, Thomas M [Katonah, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Hall, Shawn [Pleasantville, NY; Haring, Rudolf A [Cortlandt Manor, NY; Heidelberger, Philip [Cortlandt Manor, NY; Kopcsay, Gerard V [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Salapura, Valentina [Chappaqua, NY; Sugavanam, Krishnan [Mahopac, NY; Takken, Todd [Brewster, NY

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.

  13. Direct exploitation of a top 500 Supercomputer for Analysis of CMS Data

    International Nuclear Information System (INIS)

    Cabrillo, I; Cabellos, L; Marco, J; Fernandez, J; Gonzalez, I

    2014-01-01

The Altamira Supercomputer hosted at the Instituto de Fisica de Cantabria (IFCA) entered operation in summer 2012. Its latest-generation FDR InfiniBand network, used for message passing in parallel jobs, also supports the connection to General Parallel File System (GPFS) servers, enabling efficient simultaneous processing of multiple data-demanding jobs. Sharing a common GPFS system and a single LDAP-based identification with the existing Grid clusters at IFCA allows CMS researchers to exploit the large instantaneous capacity of this supercomputer to execute analysis jobs. The detailed experience of this opportunistic use for skimming and final analysis of CMS 2012 data for a specific physics channel, resulting in an order of magnitude reduction of the waiting time, is presented.

  14. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    International Nuclear Information System (INIS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-01-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers. (paper)

  15. Plane-wave electronic structure calculations on a parallel supercomputer

    International Nuclear Information System (INIS)

    Nelson, J.S.; Plimpton, S.J.; Sears, M.P.

    1993-01-01

    The development of iterative solutions of Schrodinger's equation in a plane-wave (pw) basis over the last several years has coincided with great advances in the computational power available for performing the calculations. These dual developments have enabled many new and interesting condensed matter phenomena to be studied from a first-principles approach. The authors present a detailed description of the implementation on a parallel supercomputer (hypercube) of the first-order equation-of-motion solution to Schrodinger's equation, using plane-wave basis functions and ab initio separable pseudopotentials. By distributing the plane-waves across the processors of the hypercube many of the computations can be performed in parallel, resulting in decreases in the overall computation time relative to conventional vector supercomputers. This partitioning also provides ample memory for large Fast Fourier Transform (FFT) meshes and the storage of plane-wave coefficients for many hundreds of energy bands. The usefulness of the parallel techniques is demonstrated by benchmark timings for both the FFT's and iterations of the self-consistent solution of Schrodinger's equation for different sized Si unit cells of up to 512 atoms
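
    The partitioning idea can be illustrated with a small mpi4py sketch in which each rank owns a slice of the plane-wave coefficients and a global reduction assembles the kinetic energy. The sizes, |G|^2 values, and random wave function are placeholders; this is an illustration of the distribution strategy, not the authors' hypercube code.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n_pw = 4096                                    # total plane waves
        local = slice(rank * n_pw // size, (rank + 1) * n_pw // size)

        rng = np.random.default_rng(rank)
        g2 = np.linspace(0.0, 25.0, n_pw)[local]       # |G|^2, local slice
        c = rng.normal(size=g2.size) + 1j * rng.normal(size=g2.size)

        # Kinetic energy sum(|G|^2 |c(G)|^2)/2: local partial sum,
        # then one collective reduction across all processors.
        local_ekin = 0.5 * np.sum(g2 * np.abs(c) ** 2)
        ekin = comm.allreduce(local_ekin, op=MPI.SUM)
        if rank == 0:
            print("kinetic energy:", ekin)

    Run with, e.g., mpiexec -n 4 python planewave.py; only the reduction touches the network, while the coefficient arithmetic stays local to each processor.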

  16. Problem solving in nuclear engineering using supercomputers

    International Nuclear Information System (INIS)

    Schmidt, F.; Scheuermann, W.; Schatz, A.

    1987-01-01

    The availability of supercomputers enables the engineer to formulate new strategies for problem solving. One such strategy is the Integrated Planning and Simulation System (IPSS). With the integrated systems, simulation models with greater consistency and good agreement with actual plant data can be effectively realized. In the present work some of the basic ideas of IPSS are described as well as some of the conditions necessary to build such systems. Hardware and software characteristics as realized are outlined. (orig.) [de

  17. FPS scientific and supercomputers computers in chemistry

    International Nuclear Information System (INIS)

    Curington, I.J.

    1987-01-01

    FPS Array Processors, scientific computers, and highly parallel supercomputers are used in nearly all aspects of compute-intensive computational chemistry. A survey is made of work utilizing this equipment, both published and current research. The relationship of the computer architecture to computational chemistry is discussed, with specific reference to Molecular Dynamics, Quantum Monte Carlo simulations, and Molecular Graphics applications. Recent installations of the FPS T-Series are highlighted, and examples of Molecular Graphics programs running on the FPS-5000 are shown

  18. Visualizing quantum scattering on the CM-2 supercomputer

    International Nuclear Information System (INIS)

    Richardson, J.L.

    1991-01-01

    We implement parallel algorithms for solving the time-dependent Schroedinger equation on the CM-2 supercomputer. These methods are unconditionally stable as well as unitary at each time step and have the advantage of being spatially local and explicit. We show how to visualize the dynamics of quantum scattering using techniques for visualizing complex wave functions. Several scattering problems are solved to demonstrate the use of these methods. (orig.)
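
    One common way to visualize complex wave functions, mapping phase to hue and magnitude to brightness, can be sketched as follows. This illustrates the general technique rather than the paper's CM-2 implementation; the wave packet parameters are arbitrary.

        import numpy as np
        import colorsys

        def wavefunction_to_rgb(psi):
            """Map a complex wave function to colors: hue = phase, value = |psi|."""
            mag = np.abs(psi)
            mag = mag / mag.max() if mag.max() > 0 else mag
            phase = (np.angle(psi) + np.pi) / (2 * np.pi)   # phase in [0, 1)
            rgb = np.empty(psi.shape + (3,))
            for idx in np.ndindex(psi.shape):
                rgb[idx] = colorsys.hsv_to_rgb(phase[idx], 1.0, mag[idx])
            return rgb

        # A Gaussian wave packet with momentum k: magnitude decays away
        # from the center while the phase winds, so the hue cycles.
        x = np.linspace(-10.0, 10.0, 256)
        psi = np.exp(-x**2 / 4.0) * np.exp(1j * 3.0 * x)
        print(wavefunction_to_rgb(psi).shape)   # (256, 3)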

  19. Integration of Titan supercomputer at OLCF with ATLAS Production System

    CERN Document Server

    AUTHOR|(SzGeCERN)643806; The ATLAS collaboration; De, Kaushik; Klimentov, Alexei; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Wenaus, Torre

    2017-01-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of Petabytes of data and the rate of data processing already exceeds Exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we will describe a project aimed at integration of ATLAS Production System with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). Current approach utilizes modified PanDA Pilot framework for jo...

  20. Integration of Titan supercomputer at OLCF with ATLAS production system

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration

    2016-01-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of Petabytes of data and the rate of data processing already exceeds Exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this talk we will describe a project aimed at integration of ATLAS Production System with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). Current approach utilizes modified PanDA Pilot framework for job...

  1. Supercomputer algorithms for reactivity, dynamics and kinetics of small molecules

    International Nuclear Information System (INIS)

    Lagana, A.

    1989-01-01

Even for small systems, the accurate characterization of reactive processes is so demanding of computer resources as to suggest the use of supercomputers having vector and parallel facilities. The full advantages of vector and parallel architectures can sometimes be obtained by simply modifying existing programs, vectorizing the manipulation of vectors and matrices, and requiring the parallel execution of independent tasks. More often, however, a significant time saving can be obtained only when the computer code undergoes a deeper restructuring, requiring a change in the computational strategy or, more radically, the adoption of a different theoretical treatment. This book discusses supercomputer strategies based upon exact and approximate methods aimed at calculating the electronic structure and the reactive properties of small systems. The book shows how, in recent years, intense design activity has led to the ability to calculate accurate electronic structures for reactive systems, exact and high-level approximations to three-dimensional reactive dynamics, and to efficient directive and declaratory software for the modelling of complex systems

  2. Design of multiple sequence alignment algorithms on parallel, distributed memory supercomputers.

    Science.gov (United States)

    Church, Philip C; Goscinski, Andrzej; Holt, Kathryn; Inouye, Michael; Ghoting, Amol; Makarychev, Konstantin; Reumann, Matthias

    2011-01-01

    The challenge of comparing two or more genomes that have undergone recombination and substantial amounts of segmental loss and gain has recently been addressed for small numbers of genomes. However, datasets of hundreds of genomes are now common and their sizes will only increase in the future. Multiple sequence alignment of hundreds of genomes remains an intractable problem due to quadratic increases in compute time and memory footprint. To date, most alignment algorithms are designed for commodity clusters without parallelism. Hence, we propose the design of a multiple sequence alignment algorithm on massively parallel, distributed memory supercomputers to enable research into comparative genomics on large data sets. Following the methodology of the sequential progressiveMauve algorithm, we design data structures including sequences and sorted k-mer lists on the IBM Blue Gene/P supercomputer (BG/P). Preliminary results show that we can reduce the memory footprint so that we can potentially align over 250 bacterial genomes on a single BG/P compute node. We verify our results on a dataset of E.coli, Shigella and S.pneumoniae genomes. Our implementation returns results matching those of the original algorithm but in 1/2 the time and with 1/4 the memory footprint for scaffold building. In this study, we have laid the basis for multiple sequence alignment of large-scale datasets on a massively parallel, distributed memory supercomputer, thus enabling comparison of hundreds instead of a few genome sequences within reasonable time.
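
    The sorted k-mer lists mentioned above serve as seed indexes for finding anchors shared between genomes. A toy Python sketch of the idea follows, with tiny invented sequences; the actual data structures are distributed across Blue Gene/P compute nodes.

        from collections import defaultdict

        def sorted_kmer_list(sequence, k):
            """All (k-mer, offset) pairs of one sequence, sorted lexicographically."""
            kmers = [(sequence[i:i + k], i)
                     for i in range(len(sequence) - k + 1)]
            kmers.sort()
            return kmers

        def shared_seeds(seq_a, seq_b, k):
            """Offset pairs of k-mers common to both sequences: anchor candidates."""
            index = defaultdict(list)
            for kmer, pos in sorted_kmer_list(seq_a, k):
                index[kmer].append(pos)
            return [(pos_a, pos_b)
                    for kmer, pos_b in sorted_kmer_list(seq_b, k)
                    for pos_a in index.get(kmer, [])]

        print(shared_seeds("ACGTACGGA", "TTACGTA", k=4))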

  3. Harnessing Petaflop-Scale Multi-Core Supercomputing for Problems in Space Science

    Science.gov (United States)

    Albright, B. J.; Yin, L.; Bowers, K. J.; Daughton, W.; Bergen, B.; Kwan, T. J.

    2008-12-01

    The particle-in-cell kinetic plasma code VPIC has been migrated successfully to the world's fastest supercomputer, Roadrunner, a hybrid multi-core platform built by IBM for the Los Alamos National Laboratory. How this was achieved will be described and examples of state-of-the-art calculations in space science, in particular, the study of magnetic reconnection, will be presented. With VPIC on Roadrunner, we have performed, for the first time, plasma PIC calculations with over one trillion particles, >100× larger than calculations considered "heroic" by community standards. This allows examination of physics at unprecedented scale and fidelity. Roadrunner is an example of an emerging paradigm in supercomputing: the trend toward multi-core systems with deep hierarchies and where memory bandwidth optimization is vital to achieving high performance. Getting VPIC to perform well on such systems is a formidable challenge: the core algorithm is memory bandwidth limited with low compute-to-data ratio and requires random access to memory in its inner loop. That we were able to get VPIC to perform and scale well, achieving >0.374 Pflop/s and linear weak scaling on real physics problems on up to the full 12240-core Roadrunner machine, bodes well for harnessing these machines for our community's needs in the future. Many of the design considerations encountered carry over to other multi-core and accelerated (e.g., via GPU) platforms, and we modified VPIC with flexibility in mind. These will be summarized and strategies for how one might adapt a code for such platforms will be shared. Work performed under the auspices of the U.S. DOE by the LANS LLC Los Alamos National Laboratory. Dr. Bowers is a LANL Guest Scientist; he is presently at D. E. Shaw Research LLC, 120 W 45th Street, 39th Floor, New York, NY 10036.

  4. Novel Supercomputing Approaches for High Performance Linear Algebra Using FPGAs, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Supercomputing plays a major role in many areas of science and engineering, and it has had tremendous impact for decades in areas such as aerospace, defense, energy,...

  5. BSMBench: a flexible and scalable supercomputer benchmark from computational particle physics

    CERN Document Server

    Bennett, Ed; Del Debbio, Luigi; Jordan, Kirk; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2016-01-01

    Benchmarking plays a central role in the evaluation of High Performance Computing architectures. Several benchmarks have been designed that allow users to stress various components of supercomputers. In order for the figures they provide to be useful, benchmarks need to be representative of the most common real-world scenarios. In this work, we introduce BSMBench, a benchmarking suite derived from Monte Carlo code used in computational particle physics. The advantage of this suite (which can be freely downloaded from http://www.bsmbench.org/) over others is the capacity to vary the relative importance of computation and communication. This enables the tests to simulate various practical situations. To showcase BSMBench, we perform a wide range of tests on various architectures, from desktop computers to state-of-the-art supercomputers, and discuss the corresponding results. Possible future directions of development of the benchmark are also outlined.

  6. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati

  7. Intelligent Personal Supercomputer for Solving Scientific and Technical Problems

    Directory of Open Access Journals (Sweden)

    Khimich, O.M.

    2016-09-01

    Full Text Available A new domestic intelligent personal supercomputer of hybrid architecture, Inparkom_pg, was developed for the mathematical modeling of processes in the defense industry, engineering, construction, etc. Intelligent software for the automatic investigation of computational mathematics problems with approximate data of different structures was designed. Applied software for mathematical modeling problems in construction, welding and filtration processes was implemented.

  8. Supercomputers and the future of computational atomic scattering physics

    International Nuclear Information System (INIS)

    Younger, S.M.

    1989-01-01

    The advent of the supercomputer has opened new vistas for the computational atomic physicist. Problems of hitherto unparalleled complexity are now being examined using these new machines, and important connections with other fields of physics are being established. This talk briefly reviews some of the most important trends in computational scattering physics and suggests some exciting possibilities for the future. 7 refs., 2 figs

  9. Visualization on supercomputing platform level II ASC milestone (3537-1B) results from Sandia.

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk (Kitware, Inc., Clifton Park, NY); Fabian, Nathan; Marion, Patrick (Kitware, Inc., Clifton Park, NY); Moreland, Kenneth D.

    2010-09-01

    This report provides documentation for the completion of the Sandia portion of the ASC Level II Visualization on the platform milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratories and Los Alamos National Laboratories. This milestone contains functionality required for performing visualization directly on a supercomputing platform, which is necessary for peta-scale visualization. Sandia's contribution concerns in-situ visualization, running a visualization in tandem with a solver. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk, we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. Scientific simulation on parallel supercomputers is traditionally performed in four

  10. Multi-petascale highly efficient parallel supercomputer

    Science.gov (United States)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen-Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2018-05-15

    A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaflop scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC). The ASIC nodes are interconnected by a five-dimensional torus network that maximizes the throughput of packet communications between nodes and minimizes latency. The network implements a collective network and a global asynchronous network that provides global barrier and notification functions. Integrated in the node design is a list-based prefetcher. The memory system implements transactional memory, thread-level speculation, and a multiversioning cache that improves the soft error rate while supporting DMA functionality for parallel message passing.

  11. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-05-15

    Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or evolves over a long time scale, we need a cluster computer or a supercomputer. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, previously used only to calculate display data, has a calculation capability superior to a PC's CPU; its performance matches that of a supercomputer of 2000. Although it has such great calculation potential, it is not easy to program a simulation code for the GPU because of the difficult programming techniques needed to convert a calculation matrix to a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) as the programming environment for NVIDIA's graphics cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for Monte Carlo simulation.

  12. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    International Nuclear Information System (INIS)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho

    2008-01-01

    Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or evolves over a long time scale, we need a cluster computer or a supercomputer. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, previously used only to calculate display data, has a calculation capability superior to a PC's CPU; its performance matches that of a supercomputer of 2000. Although it has such great calculation potential, it is not easy to program a simulation code for the GPU because of the difficult programming techniques needed to convert a calculation matrix to a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) as the programming environment for NVIDIA's graphics cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for Monte Carlo simulation
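
    The kind of workload that benefits from this GPU offload is an embarrassingly parallel Monte Carlo kernel, in which every sample is independent. The sketch below shows such a kernel (pi estimation) on the CPU with NumPy; it illustrates the workload class only and is not the paper's benchmark code.

```python
# Embarrassingly parallel Monte Carlo kernel (pi estimation): every
# sample is independent, which is the property that lets such kernels
# be spread across thousands of GPU threads in CUDA. CPU/NumPy sketch
# for illustration only; this is not the paper's benchmark code.
import numpy as np

def monte_carlo_pi(n_samples: int, seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    xy = rng.random((n_samples, 2))     # uniform points in the unit square
    inside = np.count_nonzero((xy ** 2).sum(axis=1) <= 1.0)
    return 4.0 * inside / n_samples     # area ratio -> pi estimate

print(monte_carlo_pi(10_000_000))       # ~3.1416
```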

  13. Design and performance characterization of electronic structure calculations on massively parallel supercomputers

    DEFF Research Database (Denmark)

    Romero, N. A.; Glinsvad, Christian; Larsen, Ask Hjorth

    2013-01-01

    Density functional theory (DFT) is the most widely employed electronic structure method because of its favorable scaling with system size and accuracy for a broad range of molecular and condensed-phase systems. The advent of massively parallel supercomputers has enhanced the scientific community...

  14. Computational Science with the Titan Supercomputer: Early Outcomes and Lessons Learned

    Science.gov (United States)

    Wells, Jack

    2014-03-01

    Modeling and simulation with petascale computing has supercharged the process of innovation and understanding, dramatically accelerating time-to-insight and time-to-discovery. This presentation will focus on early outcomes from the Titan supercomputer at the Oak Ridge National Laboratory. Titan has over 18,000 hybrid compute nodes consisting of both CPUs and GPUs. In this presentation, I will discuss the lessons we have learned in deploying Titan and preparing applications to move from conventional CPU architectures to a hybrid machine. I will present early results of materials applications running on Titan and the implications for the research community as we prepare for exascale supercomputers in the next decade. Lastly, I will provide an overview of user programs at the Oak Ridge Leadership Computing Facility with specific information on how researchers may apply for allocations of computing resources. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.

  15. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    De, Kaushik; Klimentov, Alexei; Oleynik, Danila; Panitkin, Sergey; Petrosyan, Artem; Vaniachine, Alexandre; Wenaus, Torre; Schovancova, Jaroslava

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the next LHC data-taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA a new capability to collect, in real time, information about unused...

  16. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration; Klimentov, Alexei; Oleynik, Danila; Petrosyan, Artem; Schovancova, Jaroslava; Vaniachine, Alexandre; Wenaus, Torre

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently uses more than 100,000 cores at well over 100 Grid sites with a peak performance of 0.3 petaFLOPS, the next LHC data-taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA new capability to collect, in real tim...

  17. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  18. ParaBTM: A Parallel Processing Framework for Biomedical Text Mining on Supercomputers.

    Science.gov (United States)

    Xing, Yuting; Wu, Chengkun; Yang, Xi; Wang, Wei; Zhu, En; Yin, Jianping

    2018-04-27

    A prevailing way of extracting valuable information from biomedical literature is to apply text mining methods on unstructured texts. However, the massive amount of literature that needs to be analyzed poses a big data challenge to the processing efficiency of text mining. In this paper, we address this challenge by introducing parallel processing on a supercomputer. We developed paraBTM, a runnable framework that enables parallel text mining on the Tianhe-2 supercomputer. It employs a low-cost yet effective load balancing strategy to maximize the efficiency of parallel processing. We evaluated the performance of paraBTM on several datasets, utilizing three types of named entity recognition tasks as demonstration. Results show that, in most cases, the processing efficiency can be greatly improved with parallel processing, and the proposed load balancing strategy is simple and effective. In addition, our framework can be readily applied to other tasks of biomedical text mining besides NER.
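
    The abstract does not publish the exact load-balancing strategy, so the sketch below shows a standard low-cost baseline of the same flavor: greedy longest-processing-time (LPT) assignment of documents of uneven length to workers, which keeps per-worker processing times roughly even.

```python
# Greedy longest-processing-time (LPT) load balancing: repeatedly give
# the next-largest document to the currently least-loaded worker. A
# generic baseline for illustration, not paraBTM's published strategy.
import heapq

def lpt_assign(doc_lengths, n_workers):
    """Assign document indices to workers; returns (load, worker, docs)."""
    heap = [(0, w, []) for w in range(n_workers)]   # (load, worker id, docs)
    heapq.heapify(heap)
    for doc, length in sorted(enumerate(doc_lengths), key=lambda t: -t[1]):
        load, w, docs = heapq.heappop(heap)         # least-loaded worker
        docs.append(doc)
        heapq.heappush(heap, (load + length, w, docs))
    return sorted(heap, key=lambda t: t[1])

for load, worker, docs in lpt_assign([80, 10, 55, 5, 42, 97, 33], 3):
    print(f"worker {worker}: load={load} docs={docs}")
```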

  19. Explaining the gap between theoretical peak performance and real performance for supercomputer architectures

    International Nuclear Information System (INIS)

    Schoenauer, W.; Haefner, H.

    1993-01-01

    The basic architectures of vector and parallel computers with their properties are presented. Then the memory size and the arithmetic operations in the context of memory bandwidth are discussed. For the exemplary discussion of a single operation, micro-measurements of the vector triad for the IBM 3090 VF and the CRAY Y-MP/8 are presented. They reveal the details of the losses for a single operation. Then we analyze the global performance of a whole supercomputer by identifying reduction factors that bring down the theoretical peak performance to the poor real performance. The responsibilities of the manufacturer and of the user for these losses are discussed. Then the price-performance ratio for different architectures in a snapshot of January 1991 is briefly mentioned. Finally, some remarks on a user-friendly architecture for a supercomputer are made. (orig.)
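
    The vector triad referred to above is conventionally the kernel a(i) = b(i) + c(i)*d(i): three loads, one store, and two floating-point operations per element, which makes it a sensitive probe of memory bandwidth. A minimal timing sketch follows; absolute numbers are machine-dependent and purely illustrative.

```python
# The vector triad a(i) = b(i) + c(i)*d(i): two flops per element
# against three loads and one store, so measured speed tracks memory
# bandwidth rather than peak arithmetic. Illustrative sketch only.
import time
import numpy as np

def triad_mflops(n: int, reps: int = 20) -> float:
    b, c, d = (np.random.rand(n) for _ in range(3))
    a = np.empty(n)
    t0 = time.perf_counter()
    for _ in range(reps):
        np.multiply(c, d, out=a)   # a = c * d
        np.add(b, a, out=a)        # a = b + c * d
    dt = time.perf_counter() - t0
    return 2.0 * n * reps / dt / 1e6

# performance typically drops as n grows past each cache level:
for n in (10**3, 10**5, 10**7):
    print(f"n={n:>9}: {triad_mflops(n):10.1f} MFLOP/s")
```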

  20. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-01-01

    SANAM supercomputer was jointly built by KACST and FIAS in 2012 ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.

  1. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-03-13

    SANAM supercomputer was jointly built by KACST and FIAS in 2012 ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.

  2. An efficient implementation of a backpropagation learning algorithm on quadrics parallel supercomputer

    International Nuclear Information System (INIS)

    Taraglio, S.; Massaioli, F.

    1995-08-01

    A parallel implementation of a library to build and train Multi Layer Perceptrons via the Back Propagation algorithm is presented. The target machine is the SIMD massively parallel supercomputer Quadrics. Performance measures are provided on three different machines with different numbers of processors, for two network examples. A sample source code is given
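
    For orientation, the serial mathematics that such a library parallelizes is compact. The sketch below trains a one-hidden-layer perceptron by backpropagation with NumPy; the Quadrics SIMD decomposition itself is not reproduced, and all sizes are toy values.

```python
# Minimal serial sketch of backpropagation for a one-hidden-layer
# perceptron (NumPy). The Quadrics library parallelizes this across
# SIMD processors; the serial math is shown here for orientation only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((64, 4))                  # 64 training patterns, 4 inputs
y = (X.sum(axis=1, keepdims=True) > 2).astype(float)  # toy target

W1, W2 = rng.normal(0, 0.5, (4, 8)), rng.normal(0, 0.5, (8, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):
    h = sigmoid(X @ W1)                  # forward pass, hidden layer
    out = sigmoid(h @ W2)                # forward pass, output layer
    err = out - y                        # output error
    # backward pass: propagate deltas through the sigmoid derivatives
    d_out = err * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out              # gradient-descent weight updates
    W1 -= 0.5 * X.T @ d_hid

print("final MSE:", float(np.mean(err ** 2)))
```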

  3. Supercomputing Centers and Electricity Service Providers

    DEFF Research Database (Denmark)

    Patki, Tapasya; Bates, Natalie; Ghatikar, Girish

    2016-01-01

    Supercomputing Centers (SCs) have high and variable power demands, which increase the challenges of the Electricity Service Providers (ESPs) with regards to efficient electricity distribution and reliable grid operation. High penetration of renewable energy generation further exacerbates this problem. In order to develop a symbiotic relationship between the SCs and their ESPs and to support effective power management at all levels, it is critical to understand and analyze how the existing relationships were formed and how these are expected to evolve. In this paper, we first present results from a detailed, quantitative survey-based analysis and compare the perspectives of the European grid and SCs to those of the United States (US). We then show that, contrary to expectation, SCs in the US are more open toward cooperating and developing demand-management strategies with their ESPs.

  4. Integration of PanDA workload management system with Titan supercomputer at OLCF

    Science.gov (United States)

    De, K.; Klimentov, A.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.

    2015-12-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the future LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multicore worker nodes. It also gives PanDA a new capability to collect, in real time, information about unused worker nodes on Titan, which allows precise definition of the size and duration of jobs submitted to Titan according to available free resources. This capability significantly reduces PanDA job wait time while improving Titan's utilization efficiency. This implementation was tested with a variety of Monte-Carlo workloads on Titan and is being tested on several other supercomputing platforms. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
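
    The backfill idea described above can be pictured as a small sizing function: query the free resources, then shape the job to fit. The sketch below is hypothetical pseudologic with invented names and thresholds; the real logic lives inside the modified PanDA pilot.

```python
# Hypothetical sketch of backfill job sizing: read the free-resource
# report, then shape a job that fits inside the available nodes and the
# remaining wall-time window. All names and thresholds are invented for
# illustration; the real implementation is in the modified PanDA pilot.
from dataclasses import dataclass

@dataclass
class BackfillSlot:
    free_nodes: int          # worker nodes currently idle on Titan
    window_minutes: int      # minutes until the next large reservation

def shape_job(slot, min_nodes=15, max_nodes=300, events_per_node_hour=600):
    """Return (nodes, n_events) sized to the slot, or None if too small."""
    if slot.free_nodes < min_nodes or slot.window_minutes < 30:
        return None                             # not worth submitting
    nodes = min(slot.free_nodes, max_nodes)
    usable_hours = (slot.window_minutes - 10) / 60.0  # safety margin
    return nodes, int(nodes * usable_hours * events_per_node_hour)

print(shape_job(BackfillSlot(free_nodes=120, window_minutes=95)))
```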

  5. Building more powerful less expensive supercomputers using Processing-In-Memory (PIM) LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Richard C.

    2009-09-01

    This report details the accomplishments of the 'Building More Powerful Less Expensive Supercomputers Using Processing-In-Memory (PIM)' LDRD ('PIM LDRD', number 105809) for FY07-FY09. Latency dominates all levels of supercomputer design. Within a node, increasing memory latency, relative to processor cycle time, limits CPU performance. Between nodes, the same increase in relative latency impacts scalability. Processing-In-Memory (PIM) is an architecture that directly addresses this problem using enhanced chip fabrication technology and machine organization. PIMs combine high-speed logic and dense, low-latency, high-bandwidth DRAM, and lightweight threads that tolerate latency by performing useful work during memory transactions. This work examines the potential of PIM-based architectures to support mission critical Sandia applications and an emerging class of more data intensive informatics applications. This work has resulted in a stronger architecture/implementation collaboration between 1400 and 1700. Additionally, key technology components have impacted vendor roadmaps, and we are in the process of pursuing these new collaborations. This work has the potential to impact future supercomputer design and construction, reducing power and increasing performance. This final report is organized as follows: this summary chapter discusses the impact of the project (Section 1), provides an enumeration of publications and other public discussion of the work (Section 1), and concludes with a discussion of future work and impact from the project (Section 1). The appendix contains reprints of the refereed publications resulting from this work.

  6. Supercomputers and the mathematical modeling of high complexity problems

    International Nuclear Information System (INIS)

    Belotserkovskii, Oleg M

    2010-01-01

    This paper is a review of many works carried out by members of our scientific school in past years. The general principles of constructing numerical algorithms for high-performance computers are described. Several techniques are highlighted and these are based on the method of splitting with respect to physical processes and are widely used in computing nonlinear multidimensional processes in fluid dynamics, in studies of turbulence and hydrodynamic instabilities and in medicine and other natural sciences. The advances and developments related to the new generation of high-performance supercomputing in Russia are presented.

  7. The Pawsey Supercomputer geothermal cooling project

    Science.gov (United States)

    Regenauer-Lieb, K.; Horowitz, F.; Western Australian Geothermal Centre Of Excellence, T.

    2010-12-01

    The Australian Government has funded the Pawsey supercomputer in Perth, Western Australia, providing computational infrastructure intended to support the future operations of the Australian Square Kilometre Array radiotelescope and to boost next-generation computational geosciences in Australia. Supplementary funds have been directed to the development of a geothermal exploration well to research the potential for direct heat use applications at the Pawsey Centre site. Cooling the Pawsey supercomputer may be achieved by geothermal heat exchange rather than by conventional electrical power cooling, thus reducing the carbon footprint of the Pawsey Centre and demonstrating an innovative green technology that is widely applicable in industry and urban centres across the world. The exploration well is scheduled to be completed in 2013, with drilling due to commence in the third quarter of 2011. One year is allocated to finalizing the design of the exploration, monitoring and research well. Success in the geothermal exploration and research program will result in an industrial-scale geothermal cooling facility at the Pawsey Centre, and will provide a world-class student training environment in geothermal energy systems. A similar system is partially funded and in advanced planning to provide base-load air-conditioning for the main campus of the University of Western Australia. Both systems are expected to draw ~80-95 degrees C water from aquifers lying between 2000 and 3000 meters depth from naturally permeable rocks of the Perth sedimentary basin. The geothermal water will be run through absorption chilling devices, which only require heat (as opposed to mechanical work) to power a chilled water stream adequate to meet the cooling requirements. Once the heat has been removed from the geothermal water, licensing issues require the water to be re-injected back into the aquifer system. These systems are intended to demonstrate the feasibility of powering large-scale air

  8. Heat dissipation computations of a HVDC ground electrode using a supercomputer

    International Nuclear Information System (INIS)

    Greiss, H.; Mukhedkar, D.; Lagace, P.J.

    1990-01-01

    This paper reports on the temperature of the soil surrounding a High Voltage Direct Current (HVDC) toroidal ground electrode of practical dimensions, in both homogeneous and non-homogeneous soils, computed at incremental points in time using finite difference methods on a supercomputer. Curves of the response were computed and plotted at several locations within the soil in the vicinity of the ground electrode for various values of the soil parameters
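
    The class of computation described above is transient heat conduction solved by explicit finite differences. The one-dimensional sketch below shows the time-stepping scheme only; the study's toroidal electrode geometry, layered soils, and parameter values are not reproduced.

```python
# One-dimensional explicit finite-difference sketch of transient heat
# conduction. Illustrates only the time-stepping scheme; geometry and
# material values are illustrative, not those of the HVDC study.
import numpy as np

alpha = 1.0e-6         # soil thermal diffusivity, m^2/s (illustrative)
dx, dt = 0.1, 1000.0   # grid spacing (m), time step (s)
r = alpha * dt / dx**2
assert r <= 0.5, "explicit scheme unstable for r > 1/2"

T = np.full(200, 10.0)   # initial soil temperature, deg C
T[0] = 60.0              # hot boundary at the electrode surface

for _ in range(5000):
    # interior update: T_i <- T_i + r*(T_{i+1} - 2*T_i + T_{i-1})
    T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[-1] = 10.0         # far field held at ambient temperature

print("temperature 1 m from electrode:", round(T[10], 2), "deg C")
```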

  9. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  10. A supercomputing application for reactors core design and optimization

    International Nuclear Information System (INIS)

    Hourcade, Edouard; Gaudier, Fabrice; Arnaud, Gilles; Funtowiez, David; Ammar, Karim

    2010-01-01

    Advanced nuclear reactor designs are often intuition-driven processes where designers first develop or use simplified simulation tools for each physical phenomenon involved. Through the project development, complexity in each discipline increases, and implementation of chaining/coupling capabilities adapted to supercomputing optimization processes is often postponed to a later step, so that the task gets increasingly challenging. In the context of renewal in reactor designs, first-realization projects are often run in parallel with advanced design although very dependent on final options. As a consequence, tools to globally assess/optimize reactor core features, with the accuracy of the on-going design methods, are needed. This should be possible within reasonable simulation time and without advanced computer skills at the project management scale. Also, these tools should easily cope with modeling progress in each discipline throughout the project life-time. An early-stage development of a multi-physics package adapted to supercomputing is presented. The URANIE platform, developed at CEA and based on the Data Analysis Framework ROOT, is very well adapted to this approach. It allows diversified sampling techniques (SRS, LHS, qMC), fitting tools (neural networks...) and optimization techniques (genetic algorithms). Also, database management and visualization are made very easy. In this paper, we'll present the various implementation steps of this core physics tool, where neutronics, thermo-hydraulics, and fuel mechanics codes are run simultaneously. A relevant example of optimization of nuclear reactor safety characteristics will be presented. Also, the flexibility of the URANIE tool will be illustrated with the presentation of several approaches to improve Pareto front quality. (author)
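
    Of the sampling techniques named above, Latin hypercube sampling (LHS) is easy to show in a few lines: each parameter range is divided into n equal-probability strata and every stratum is sampled exactly once. A minimal sketch of the technique, not URANIE's implementation:

```python
# Minimal Latin hypercube sampling (LHS): each of d parameters is cut
# into n equal-probability strata and every stratum is sampled exactly
# once. A sketch of the technique, not URANIE's implementation.
import numpy as np

def latin_hypercube(n: int, d: int, rng=None) -> np.ndarray:
    rng = rng or np.random.default_rng()
    samples = np.empty((n, d))
    for j in range(d):
        # one random point inside each stratum, with strata shuffled
        samples[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return samples

# e.g. 8 design points over 3 normalized core-design parameters
print(latin_hypercube(8, 3, np.random.default_rng(42)))
```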

  11. Performance Evaluation of Supercomputers using HPCC and IMB Benchmarks

    Science.gov (United States)

    Saini, Subhash; Ciotti, Robert; Gunney, Brian T. N.; Spelce, Thomas E.; Koniges, Alice; Dossa, Don; Adamidis, Panagiotis; Rabenseifner, Rolf; Tiyyagura, Sunil R.; Mueller, Matthias

    2006-01-01

    The HPC Challenge (HPCC) benchmark suite and the Intel MPI Benchmark (IMB) are used to compare and evaluate the combined performance of processor, memory subsystem and interconnect fabric of five leading supercomputers - SGI Altix BX2, Cray X1, Cray Opteron Cluster, Dell Xeon cluster, and NEC SX-8. These five systems use five different networks (SGI NUMALINK4, Cray network, Myrinet, InfiniBand, and NEC IXS). The complete set of HPCC benchmarks is run on each of these systems. Additionally, we present Intel MPI Benchmark (IMB) results to study the performance of 11 MPI communication functions on these systems.

  12. An Interface for Biomedical Big Data Processing on the Tianhe-2 Supercomputer.

    Science.gov (United States)

    Yang, Xi; Wu, Chengkun; Lu, Kai; Fang, Lin; Zhang, Yong; Li, Shengkang; Guo, Guixin; Du, YunFei

    2017-12-01

    Big data, cloud computing, and high-performance computing (HPC) are on the verge of convergence. Cloud computing is already playing an active part in big data processing with the help of big data frameworks like Hadoop and Spark. The recent upsurge of high-performance computing in China provides extra possibilities and capacity to address the challenges associated with big data. In this paper, we propose Orion, a big data interface on the Tianhe-2 supercomputer, to enable big data applications to run on Tianhe-2 via a single command or a shell script. Orion supports multiple users, and each user can launch multiple tasks. It minimizes the effort needed to initiate big data applications on the Tianhe-2 supercomputer via automated configuration. Orion follows the "allocate-when-needed" paradigm, and it avoids the idle occupation of computational resources. We tested the utility and performance of Orion using a big genomic dataset and achieved satisfactory performance on Tianhe-2 with very few modifications to existing applications that were implemented in Hadoop/Spark. In summary, Orion provides a practical and economical interface for big data processing on Tianhe-2.

  13. Correlated Topic Vector for Scene Classification.

    Science.gov (United States)

    Wei, Pengxu; Qin, Fei; Wan, Fang; Zhu, Yi; Jiao, Jianbin; Ye, Qixiang

    2017-07-01

    Scene images usually involve semantic correlations, particularly when considering large-scale image data sets. This paper proposes a novel generative image representation, the correlated topic vector, to model such semantic correlations. Derived from the correlated topic model, the correlated topic vector is intended to naturally utilize the correlations among topics, which are seldom considered in conventional feature encoding, e.g., the Fisher vector, but do exist in scene images. It is expected that the involvement of correlations can increase the discriminative capability of the learned generative model and consequently improve the recognition accuracy. Incorporated with the Fisher kernel method, the correlated topic vector inherits the advantages of the Fisher vector. The contributions to the topics of visual words have been further employed by incorporating the Fisher kernel framework to indicate the differences among scenes. Combined with deep convolutional neural network (CNN) features and a Gibbs sampling solution, the correlated topic vector shows great potential when processing large-scale and complex scene image data sets. Experiments on two scene image data sets demonstrate that the correlated topic vector significantly improves the deep CNN features and outperforms existing Fisher kernel-based features.

  14. Supercomputations and big-data analysis in strong-field ultrafast optical physics: filamentation of high-peak-power ultrashort laser pulses

    Science.gov (United States)

    Voronin, A. A.; Panchenko, V. Ya; Zheltikov, A. M.

    2016-06-01

    High-intensity ultrashort laser pulses propagating in gas media or in condensed matter undergo complex nonlinear spatiotemporal evolution where temporal transformations of optical field waveforms are strongly coupled to an intricate beam dynamics and ultrafast field-induced ionization processes. At the level of laser peak powers orders of magnitude above the critical power of self-focusing, the beam exhibits modulation instabilities, producing random field hot spots and breaking up into multiple noise-seeded filaments. This problem is described by a (3 + 1)-dimensional nonlinear field evolution equation, which needs to be solved jointly with the equation for ultrafast ionization of a medium. Analysis of this problem, which is equivalent to solving a billion-dimensional evolution problem, is only possible by means of supercomputer simulations augmented with coordinated big-data processing of large volumes of information acquired through theory-guiding experiments and supercomputations. Here, we review the main challenges of supercomputations and big-data processing encountered in strong-field ultrafast optical physics and discuss strategies to confront these challenges.

  15. Quantum Hamiltonian Physics with Supercomputers

    International Nuclear Information System (INIS)

    Vary, James P.

    2014-01-01

    The vision of solving the nuclear many-body problem in a Hamiltonian framework with fundamental interactions tied to QCD via Chiral Perturbation Theory is gaining support. The goals are to preserve the predictive power of the underlying theory, to test fundamental symmetries with the nucleus as laboratory and to develop new understandings of the full range of complex quantum phenomena. Advances in theoretical frameworks (renormalization and many-body methods) as well as in computational resources (new algorithms and leadership-class parallel computers) signal a new generation of theory and simulations that will yield profound insights into the origins of nuclear shell structure, collective phenomena and complex reaction dynamics. Fundamental discovery opportunities also exist in such areas as physics beyond the Standard Model of Elementary Particles, the transition between hadronic and quark–gluon dominated dynamics in nuclei and signals that characterize dark matter. I will review some recent achievements and present ambitious consensus plans along with their challenges for a coming decade of research that will build new links between theory, simulations and experiment. Opportunities for graduate students to embark upon careers in the fast developing field of supercomputer simulations are also discussed

  16. Quantum Hamiltonian Physics with Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Vary, James P.

    2014-06-15

    The vision of solving the nuclear many-body problem in a Hamiltonian framework with fundamental interactions tied to QCD via Chiral Perturbation Theory is gaining support. The goals are to preserve the predictive power of the underlying theory, to test fundamental symmetries with the nucleus as laboratory and to develop new understandings of the full range of complex quantum phenomena. Advances in theoretical frameworks (renormalization and many-body methods) as well as in computational resources (new algorithms and leadership-class parallel computers) signal a new generation of theory and simulations that will yield profound insights into the origins of nuclear shell structure, collective phenomena and complex reaction dynamics. Fundamental discovery opportunities also exist in such areas as physics beyond the Standard Model of Elementary Particles, the transition between hadronic and quark–gluon dominated dynamics in nuclei and signals that characterize dark matter. I will review some recent achievements and present ambitious consensus plans along with their challenges for a coming decade of research that will build new links between theory, simulations and experiment. Opportunities for graduate students to embark upon careers in the fast developing field of supercomputer simulations are also discussed.

  17. Credibility improves topical blog post retrieval

    NARCIS (Netherlands)

    Weerkamp, W.; de Rijke, M.

    2008-01-01

    Topical blog post retrieval is the task of ranking blog posts with respect to their relevance for a given topic. To improve topical blog post retrieval we incorporate textual credibility indicators in the retrieval process. We consider two groups of indicators: post level (determined using

  18. Coherent 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an Optimal Supercomputer Optical Switch Fabric

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko

    2013-01-01

    We demonstrate, for the first time, the feasibility of using 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an optimized cell switching supercomputer optical interconnect architecture based on semiconductor optical amplifiers as ON/OFF gates.

  19. Cooperative visualization and simulation in a supercomputer environment

    International Nuclear Information System (INIS)

    Ruehle, R.; Lang, U.; Wierse, A.

    1993-01-01

    The article takes a closer look at the requirements imposed by the idea to integrate all the components into a homogeneous software environment. To this end, several methods for the distribution of applications depending on certain problem types are discussed. The methods currently available at the University of Stuttgart Computer Center for the distribution of applications are further explained. Finally, the aims and characteristics of a European-sponsored project, called PAGEIN, are explained, which fits perfectly into the line of developments at RUS. The aim of the project is to experiment with future cooperative working modes of aerospace scientists in a high speed distributed supercomputing environment. Project results will have an impact on the development of real future scientific application environments. (orig./DG)

  20. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods and algorithms developed will, however, be of wider interest.

  1. Feynman diagrams sampling for quantum field theories on the QPACE 2 supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Rappl, Florian

    2016-08-01

    This work discusses the application of Feynman diagram sampling in quantum field theories. The method uses a computer simulation to sample the diagrammatic space obtained in a series expansion. Since running large physical simulations requires powerful computers, the thesis is effectively split into two parts. The first part deals with the method of Feynman diagram sampling. Here the theoretical background of the method itself is discussed. Additionally, important statistical concepts and the theory of the strong force, quantum chromodynamics, are introduced. This sets the context of the simulations. We create and evaluate a variety of models to estimate the applicability of diagrammatic methods. The method is then applied to sample the perturbative expansion of the vertex correction. In the end we obtain the value for the anomalous magnetic moment of the electron. The second part looks at the QPACE 2 supercomputer. This includes a short introduction to supercomputers in general, as well as a closer look at the architecture and the cooling system of QPACE 2. Guiding benchmarks of the InfiniBand network are presented. At the core of this part, a collection of best practices and useful programming concepts is outlined, which enables the development of efficient, yet easily portable, applications for the QPACE 2 system.

  2. Use of high performance networks and supercomputers for real-time flight simulation

    Science.gov (United States)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  3. Federal Market Information Technology in the Post Flash Crash Era: Roles for Supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; Leinweber, David; Ruebel, Oliver; Wu, Kesheng

    2011-09-16

    This paper describes collaborative work between active traders, regulators, economists, and supercomputing researchers to replicate and extend investigations of the Flash Crash and other market anomalies in a National Laboratory HPC environment. Our work suggests that supercomputing tools and methods will be valuable to market regulators in achieving the goal of market safety, stability, and security. Research results using high frequency data and analytics are described, and directions for future development are discussed. Currently the key mechanisms for preventing catastrophic market action are “circuit breakers.” We believe a more graduated approach, similar to the “yellow light” approach in motorsports to slow down traffic, might be a better way to achieve the same goal. To enable this objective, we study a number of indicators that could foresee hazards in market conditions and explore options to confirm such predictions. Our tests confirm that Volume Synchronized Probability of Informed Trading (VPIN) and a volume-based version of the Herfindahl-Hirschman Index (HHI) for measuring market fragmentation can indeed give strong signals ahead of the Flash Crash event on May 6, 2010. This is a preliminary step toward a full-fledged early-warning system for unusual market conditions.
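
    In outline, VPIN groups trades into equal-volume buckets, classifies each bucket's volume as buyer- or seller-initiated, and averages the absolute order imbalance over a rolling window of buckets. The sketch below uses the simple tick rule for classification (the published metric uses bulk volume classification), so it is an approximation for illustration only.

```python
# Sketch of VPIN: equal-volume buckets, buy/sell classification by the
# tick rule (the published metric uses bulk volume classification),
# and a rolling mean of absolute order imbalance.
from collections import deque

def vpin(trades, bucket_volume=10_000, window=50):
    """trades: iterable of (price, volume) in time order.
    Yields the updated VPIN each time a volume bucket completes."""
    buckets = deque(maxlen=window)      # rolling bucket imbalances
    buy = sell = filled = 0.0
    last_price = None
    for price, volume in trades:
        if last_price is None or price >= last_price:
            buy += volume               # tick rule: upticks count as buys
        else:
            sell += volume
        last_price = price
        filled += volume
        if filled >= bucket_volume:     # bucket complete: record imbalance
            buckets.append(abs(buy - sell) / (buy + sell))
            buy = sell = filled = 0.0
            yield sum(buckets) / len(buckets)

# toy usage: persistently one-sided flow drives VPIN toward 1
ticks = [(100 + 0.01 * i, 500) for i in range(200)]
print(list(vpin(ticks, bucket_volume=5_000, window=10))[-1])
```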

  4. A fast random number generator for the Intel Paragon supercomputer

    Science.gov (United States)

    Gutbrod, F.

    1995-06-01

    A pseudo-random number generator is presented which makes optimal use of the architecture of the i860 microprocessor and which is expected to have a very long period. It is therefore a good candidate for use on the parallel supercomputer Paragon XP. In the assembler version, it needs 6.4 cycles for a real*4 random number. There is a FORTRAN routine which yields identical numbers up to rare and minor rounding discrepancies, and it needs 28 cycles. The FORTRAN performance on other microprocessors is somewhat better. Arguments for the quality of the generator and some numerical tests are given.
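
    The abstract does not state the recurrence, so the sketch below shows a classic long-period construction of that era: an additive lagged Fibonacci generator, x_n = (x_{n-24} + x_{n-55}) mod 2^32. This is an assumption for illustration, not the Paragon generator itself.

```python
# Illustrative assumption: an additive lagged Fibonacci generator,
# x_n = (x_{n-24} + x_{n-55}) mod 2**32, a classic long-period
# construction of that era. Not the actual Paragon generator.
class LaggedFibonacci:
    def __init__(self, seed: int = 12345, short: int = 24, long_: int = 55):
        self.short, self.long = short, long_
        state, self.buf = seed, []
        for _ in range(long_):           # seed the lag table with an LCG
            state = (1103515245 * state + 12345) % 2**31
            self.buf.append(state)
        self.i = 0

    def next_u32(self) -> int:
        n = self.long
        v = (self.buf[(self.i - self.short) % n]
             + self.buf[(self.i - n) % n]) % 2**32
        self.buf[self.i % n] = v         # overwrite the oldest entry
        self.i += 1
        return v

    def next_float(self) -> float:       # uniform in [0, 1)
        return self.next_u32() / 2**32

rng = LaggedFibonacci()
print([round(rng.next_float(), 4) for _ in range(5)])
```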

  5. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    Science.gov (United States)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection, such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model on heterogeneous supercomputers using GPUs (Graphics Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access, and loop refactoring to a higher abstraction level. We will present early performance results, lessons learned, and optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, wherein each node has 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and explore its full potential to scientifically and computationally advance climate simulation and prediction.
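
    Of the optimization strategies listed above, loop fusion is the simplest to illustrate: merging two loops so each array element is produced and consumed while it is still in registers or cache, instead of writing an intermediate array to memory and reading it back. A toy Python rendering of the transformation follows; the real SAM kernels are Fortran with OpenACC directives.

```python
# Toy rendering of loop fusion (the real SAM kernels are Fortran with
# OpenACC directives). Fusing the two loops removes the intermediate
# array: each element is produced and consumed while still in cache.
import numpy as np

def unfused(a, b):
    tmp = np.empty_like(a)
    for i in range(len(a)):       # loop 1: writes tmp out to memory
        tmp[i] = 2.0 * a[i]
    for i in range(len(a)):       # loop 2: reads tmp back from memory
        b[i] = tmp[i] + a[i]

def fused(a, b):
    for i in range(len(a)):       # fused: no intermediate array traffic
        t = 2.0 * a[i]
        b[i] = t + a[i]

a = np.arange(5, dtype=float)
b = np.empty_like(a)
fused(a, b)
print(b)                          # [ 0.  3.  6.  9. 12.]
```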

  6. Topics in quantum field theory

    NARCIS (Netherlands)

    Dams, C.J.F.

    2006-01-01

    In this PhD-thesis some topics in quantum field theory are considered. The first chapter gives a background to these topics. The second chapter discusses renormalization. In particular it is shown how loop calculations can be performed when using the axial gauge fixing. Fermion creation and

  7. Vector and parallel processors in computational science

    International Nuclear Information System (INIS)

    Duff, I.S.; Reid, J.K.

    1985-01-01

    This book presents the papers given at a conference which reviewed the new developments in parallel and vector processing. Topics considered at the conference included hardware (array processors, supercomputers), programming languages, software aids, numerical methods (e.g., Monte Carlo algorithms, iterative methods, finite elements, optimization), and applications (e.g., neutron transport theory, meteorology, image processing)

  8. Frequently updated noise threat maps created with use of supercomputing grid

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2014-09-01

    Full Text Available An innovative supercomputing grid services devoted to noise threat evaluation were presented. The services described in this paper concern two issues, first is related to the noise mapping, while the second one focuses on assessment of the noise dose and its influence on the human hearing system. The discussed serviceswere developed within the PL-Grid Plus Infrastructure which accumulates Polish academic supercomputer centers. Selected experimental results achieved by the usage of the services proposed were presented. The assessment of the environmental noise threats includes creation of the noise maps using either ofline or online data, acquired through a grid of the monitoring stations. A concept of estimation of the source model parameters based on the measured sound level for the purpose of creating frequently updated noise maps was presented. Connecting the noise mapping grid service with a distributed sensor network enables to automatically update noise maps for a specified time period. Moreover, a unique attribute of the developed software is the estimation of the auditory effects evoked by the exposure to noise. The estimation method uses a modified psychoacoustic model of hearing and is based on the calculated noise level values and on the given exposure period. Potential use scenarios of the grid services for research or educational purpose were introduced. Presentation of the results of predicted hearing threshold shift caused by exposure to excessive noise can raise the public awareness of the noise threats.

  9. Performance characteristics of hybrid MPI/OpenMP implementations of NAS parallel benchmarks SP and BT on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

    The NAS Parallel Benchmarks (NPB) are well-known applications with fixed algorithms for evaluating parallel systems and tools. Multicore supercomputers provide a natural programming paradigm for hybrid programs, whereby OpenMP is used for data sharing among the cores that comprise a node and MPI for communication between nodes. In this paper, we use the SP and BT benchmarks of MPI NPB 3.3 as a basis for a comparative approach to implementing hybrid MPI/OpenMP versions of SP and BT, which allows us to compare the performance of the hybrid SP and BT with their MPI counterparts on large-scale multicore supercomputers. Our performance results indicate that the hybrid SP outperforms the MPI SP by up to 20.76%, and the hybrid BT outperforms the MPI BT by up to 8.58%, on up to 10,000 cores on BlueGene/P at Argonne National Laboratory and Jaguar (Cray XT4/5) at Oak Ridge National Laboratory. We also use performance tools and MPI trace libraries available on these supercomputers to further investigate the performance characteristics of the hybrid SP and BT.
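
    The hybrid structure referred to above follows a standard pattern: OpenMP threads share data within a node while MPI handles inter-node communication. A minimal hedged sketch (not taken from NPB; the reduction is a stand-in for real work):

```c
/* Minimal hybrid MPI/OpenMP pattern: one MPI rank per node, OpenMP
   threads inside the node, MPI calls only outside the threaded region.
   Illustrative only; SP/BT are far more elaborate. Build: mpicc -fopenmp */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, rank, nranks;
    /* FUNNELED: only the master thread makes MPI calls */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    double local = 0.0;
    #pragma omp parallel for reduction(+:local)   /* data sharing within the node */
    for (int i = 0; i < 1000000; i++)
        local += 1.0 / (1.0 + i + rank);

    double global;
    MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0,
               MPI_COMM_WORLD);                   /* communication between nodes */
    if (rank == 0)
        printf("sum=%g over %d ranks x %d threads\n",
               global, nranks, omp_get_max_threads());
    MPI_Finalize();
    return 0;
}
```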

  11. Simulation of x-rays in refractive structure by the Monte Carlo method using the supercomputer SKIF

    International Nuclear Information System (INIS)

    Yaskevich, Yu.R.; Kravchenko, O.I.; Soroka, I.I.; Chembrovskij, A.G.; Kolesnik, A.S.; Serikova, N.V.; Petrov, P.V.; Kol'chevskij, N.N.

    2013-01-01

    Software 'Xray-SKIF' for simulating X-rays in refractive structures by the Monte Carlo method on the supercomputer SKIF BSU has been developed. The program generates a large number of rays propagating from a source to the refractive structure. Ray trajectories are calculated under the assumption of geometrical optics, and absorption is computed for each ray inside the refractive structure. Dynamic arrays are used to store the calculated ray parameters, which makes it possible to reconstruct the X-ray field distribution very quickly for different detector positions. It was found that increasing the number of processors leads to a proportional decrease in calculation time: simulating 10^8 X-rays on the supercomputer with 1 and 30 processors takes 3 hours and 6 minutes, respectively. 10^9 X-rays were calculated with 'Xray-SKIF', which allows reconstruction of the X-ray field behind the refractive structure with a spatial resolution of 1 micron. (authors)
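
    The per-ray bookkeeping of such a simulation can be sketched as follows. This is a hedged toy version: a single slab, Beer-Lambert attenuation and made-up coefficients, not the actual 'Xray-SKIF' code:

```c
/* Sketch of Monte Carlo ray absorption: each ray accumulates
   attenuation exp(-mu * path) along its path through the material
   (Beer-Lambert law). Geometry and coefficients are illustrative. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const double mu = 3.0;        /* linear attenuation coefficient, 1/cm */
    const double thickness = 0.1; /* slab thickness, cm */
    const long nrays = 1000000;
    double transmitted = 0.0;

    srand(12345);
    for (long i = 0; i < nrays; i++) {
        /* random incidence angle, up to 60 degrees from the normal */
        double theta = (rand() / (double)RAND_MAX) * (M_PI / 3.0);
        double path = thickness / cos(theta);  /* geometric path in slab */
        transmitted += exp(-mu * path);        /* surviving intensity    */
    }
    printf("mean transmission = %.4f\n", transmitted / nrays);
    return 0;
}
```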

  12. The design and implementation of cost-effective algorithms for direct solution of banded linear systems on the vector processor system 32 supercomputer

    Science.gov (United States)

    Samba, A. S.

    1985-01-01

    The problem of solving banded linear systems by direct (non-iterative) techniques on the Vector Processor System (VPS) 32 supercomputer is considered. Two efficient direct methods for solving banded linear systems on the VPS 32 are described. The vector cyclic reduction (VCR) algorithm, an adaptation of the conventional point cyclic reduction algorithm, is discussed in detail, and its performance on a three-parameter model problem is illustrated. The second direct method is the Customized Reduction of Augmented Triangles (CRAT). CRAT has the dominant characteristics of an efficient VPS 32 algorithm: it is tailored to the pipeline architecture of the VPS 32 and, as a consequence, is implicitly vectorizable.
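
    For reference, the serial point cyclic reduction that VCR adapts can be sketched as below (illustrative, for a tridiagonal system of size n = 2^k - 1; the inner loops at each stride are independent, which is what makes the method vectorizable):

```c
/* Sketch of point cyclic reduction for a tridiagonal system, the
   serial ancestor of VCR. Requires N = 2^k - 1. Not the VPS 32 code. */
#include <math.h>
#include <stdio.h>

#define N 7  /* 2^3 - 1 */

int main(void)
{
    double a[N], b[N], c[N], d[N], x[N], xt[N]; /* sub/main/super diag, rhs */
    for (int i = 0; i < N; i++) { a[i] = -1.0; b[i] = 2.0; c[i] = -1.0; }
    a[0] = c[N-1] = 0.0;
    for (int i = 0; i < N; i++) xt[i] = i + 1.0;  /* manufactured solution */
    for (int i = 0; i < N; i++)                   /* rhs d = A * xt        */
        d[i] = b[i]*xt[i] + (i > 0 ? a[i]*xt[i-1] : 0.0)
                          + (i < N-1 ? c[i]*xt[i+1] : 0.0);

    /* forward reduction: eliminate odd-level unknowns at each stride */
    int stride;
    for (stride = 1; 2*stride - 1 < N; stride *= 2) {
        for (int i = 2*stride - 1; i < N; i += 2*stride) {
            int im = i - stride, ip = i + stride;
            double al = -a[i] / b[im];
            double be = (ip < N) ? -c[i] / b[ip] : 0.0;
            b[i] += al*c[im] + (ip < N ? be*a[ip] : 0.0);
            d[i] += al*d[im] + (ip < N ? be*d[ip] : 0.0);
            a[i]  = al*a[im];
            c[i]  = (ip < N) ? be*c[ip] : 0.0;
        }
    }
    /* back substitution; the inner loop iterations are independent */
    for (; stride >= 1; stride /= 2) {
        for (int i = stride - 1; i < N; i += 2*stride) {
            double xm = (i - stride >= 0) ? x[i - stride] : 0.0;
            double xp = (i + stride < N) ? x[i + stride] : 0.0;
            x[i] = (d[i] - a[i]*xm - c[i]*xp) / b[i];
        }
    }
    double err = 0.0;
    for (int i = 0; i < N; i++) err = fmax(err, fabs(x[i] - xt[i]));
    printf("max error = %.2e\n", err);
    return 0;
}
```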

  13. Communication Characterization and Optimization of Applications Using Topology-Aware Task Mapping on Large Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; D'Azevedo, Eduardo [ORNL; Philip, Bobby [ORNL; Worley, Patrick H [ORNL

    2016-01-01

    On large supercomputers, job scheduling systems may assign a non-contiguous node allocation to user applications depending on available resources. With parallel applications using MPI (Message Passing Interface), the default process ordering does not take into account the actual physical node layout available to the application. This contributes to non-locality in terms of physical network topology and impacts the communication performance of the application. In order to mitigate such performance penalties, this work describes techniques to identify a suitable task mapping that takes both the layout of the allocated nodes and the application's communication behavior into account. During the first phase of this research, we instrumented and collected performance data to characterize the communication behavior of critical US DOE (United States Department of Energy) applications using an augmented version of the mpiP tool. Subsequently, we developed several reordering methods (spectral bisection, neighbor-join tree, etc.) to combine node layout and application communication data for optimized task placement. We developed a tool called mpiAproxy to facilitate detailed evaluation of the various reordering algorithms without requiring full application executions. This work presents a comprehensive performance evaluation (14,000 experiments) of the various task mapping techniques in lowering communication costs on Titan, the leadership-class supercomputer at Oak Ridge National Laboratory.
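
    A related, generic mechanism for communication-aware placement is MPI's distributed graph topology, which lets the library reorder ranks given the application's communication graph. The sketch below (a plain ring, not the spectral-bisection or neighbor-join reordering developed in this work) shows the interface:

```c
/* Sketch: describing an application's communication graph to MPI and
   allowing rank reordering (reorder = 1). Generic MPI-3 mechanism,
   not the paper's mapping tools. Each rank talks to ring neighbors. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int nbrs[2]    = { (rank - 1 + size) % size, (rank + 1) % size };
    int weights[2] = { 1, 1 };  /* could carry measured message volumes */

    MPI_Comm graph;
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   2, nbrs, weights,      /* in-edges  */
                                   2, nbrs, weights,      /* out-edges */
                                   MPI_INFO_NULL, 1 /* reorder */, &graph);
    int newrank;
    MPI_Comm_rank(graph, &newrank);
    printf("world rank %d -> graph rank %d\n", rank, newrank);
    MPI_Comm_free(&graph);
    MPI_Finalize();
    return 0;
}
```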

  14. Plasma turbulence calculations on supercomputers

    International Nuclear Information System (INIS)

    Carreras, B.A.; Charlton, L.A.; Dominguez, N.; Drake, J.B.; Garcia, L.; Leboeuf, J.N.; Lee, D.K.; Lynch, V.E.; Sidikman, K.

    1991-01-01

    Although the single-particle picture of magnetic confinement is helpful in understanding some basic physics of plasma confinement, it does not give a full description. Collective effects dominate plasma behavior, and any analysis of plasma confinement requires a self-consistent treatment of the particles and fields. The general picture is further complicated because the plasma, in general, is turbulent. The study of fluid turbulence is a rather complex field by itself. In addition to the difficulties of classical fluid turbulence, plasma turbulence studies face the problems caused by the induced magnetic turbulence, which couples back to the fluid. Since the fluid is not a perfect conductor, this turbulence can lead to changes in the topology of the magnetic field structure, causing the magnetic field lines to wander radially. Because the plasma fluid flows along field lines, it carries the particles with it, and this enhances the losses caused by collisions. The changes in topology are critical for plasma confinement. The study of plasma turbulence and the concomitant transport is a challenging problem. Because of the importance of solving the plasma turbulence problem for controlled thermonuclear research, the high complexity of the problem, and the necessity of attacking it with supercomputers, the study of plasma turbulence in magnetic confinement devices is a Grand Challenge problem.

  15. Reactive flow simulations in complex geometries with high-performance supercomputing

    International Nuclear Information System (INIS)

    Rehm, W.; Gerndt, M.; Jahn, W.; Vogelsang, R.; Binninger, B.; Herrmann, M.; Olivier, H.; Weber, M.

    2000-01-01

    In this paper, we report on a modern field code cluster consisting of state-of-the-art reactive Navier-Stokes and reactive Euler solvers that has been developed on vector and parallel supercomputers at the research center Juelich. This field code cluster is used for hydrogen safety analyses of technical systems, for example in the fields of nuclear reactor safety and conventional hydrogen demonstration plants with fuel cells. Emphasis is put on the assessment of combustion loads, which could result from slow, fast or rapid flames, including transition from deflagration to detonation. As proof tests, the special tools have been validated on specific tasks by comparing experimental and numerical results, which are in reasonable agreement. (author)

  16. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063, Storage Intensive Supercomputing, during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool, iotrace, developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of software-only and GPU-accelerated implementations. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification; the n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the Fusion-io 40GB parallel NAND Flash disk array. The Fusion system specs are as follows

  17. Parallel Multivariate Spatio-Temporal Clustering of Large Ecological Datasets on Hybrid Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; Kumar, Jitendra [ORNL; Mills, Richard T. [Argonne National Laboratory; Hoffman, Forrest M. [ORNL; Sripathi, Vamsi [Intel Corporation; Hargrove, William Walter [United States Department of Agriculture (USDA), United States Forest Service (USFS)

    2017-09-01

    A proliferation of data from vast networks of remote sensing platforms (satellites, unmanned aircraft systems (UAS), airborne, etc.), observational facilities (meteorological, eddy covariance, etc.), state-of-the-art sensors, and simulation models offers unprecedented opportunities for scientific discovery. Unsupervised classification is a widely applied data mining approach for deriving insights from such data. However, classification of very large data sets is a complex computational problem that requires efficient numerical algorithms and implementations on high performance computing (HPC) platforms. Additionally, increasing power, space, cooling and efficiency requirements have led to the deployment of hybrid supercomputing platforms with complex architectures and memory hierarchies, like the Titan system at Oak Ridge National Laboratory. The advent of such accelerated computing architectures offers new challenges and opportunities for big data analytics in general and, specifically, for large-scale cluster analysis in our case. Although there is an existing body of work on parallel cluster analysis, those approaches do not fully meet the needs imposed by the nature and size of our large data sets; moreover, they had scaling limitations and were mostly limited to traditional distributed-memory computing platforms. We present a parallel Multivariate Spatio-Temporal Clustering (MSTC) technique based on k-means cluster analysis that can target hybrid supercomputers like Titan. We developed a hybrid MPI, CUDA and OpenACC implementation that can utilize both CPU and GPU resources on computational nodes. We describe performance results on Titan that demonstrate the scalability and efficacy of our approach in processing large ecological data sets.
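
    The computational core of such k-means-based clustering distributes naturally: each rank assigns its local observations to the nearest centroid, and a global reduction recomputes the centroids. A hedged CPU/MPI-only sketch (the paper's implementation additionally offloads the distance computation to GPUs via CUDA/OpenACC):

```c
/* Sketch of one distributed k-means iteration. Dimensions and data
   are illustrative; identical seeding keeps centroids consistent
   across ranks in this toy setup. Build: mpicc kmeans.c */
#include <mpi.h>
#include <float.h>
#include <stdlib.h>

#define D 4   /* variables per observation */
#define K 8   /* clusters */

void kmeans_step(const double *pts, int nlocal, double cent[K][D])
{
    double sum[K][D] = {{0}};
    long   cnt[K]    = {0};
    for (int i = 0; i < nlocal; i++) {            /* assignment phase */
        int best = 0; double bestd = DBL_MAX;
        for (int k = 0; k < K; k++) {
            double d2 = 0.0;
            for (int j = 0; j < D; j++) {
                double t = pts[i*D + j] - cent[k][j];
                d2 += t * t;
            }
            if (d2 < bestd) { bestd = d2; best = k; }
        }
        for (int j = 0; j < D; j++) sum[best][j] += pts[i*D + j];
        cnt[best]++;
    }
    /* update phase: global sums and counts across all ranks */
    MPI_Allreduce(MPI_IN_PLACE, sum, K * D, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
    MPI_Allreduce(MPI_IN_PLACE, cnt, K, MPI_LONG, MPI_SUM, MPI_COMM_WORLD);
    for (int k = 0; k < K; k++)
        if (cnt[k] > 0)
            for (int j = 0; j < D; j++) cent[k][j] = sum[k][j] / cnt[k];
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int n = 1000;
    double *pts = malloc(n * D * sizeof *pts);
    for (int i = 0; i < n * D; i++) pts[i] = rand() / (double)RAND_MAX;
    double cent[K][D];
    for (int k = 0; k < K; k++)
        for (int j = 0; j < D; j++) cent[k][j] = rand() / (double)RAND_MAX;
    for (int it = 0; it < 10; it++) kmeans_step(pts, n, cent);
    free(pts);
    MPI_Finalize();
    return 0;
}
```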

  18. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance; they therefore represent an efficient platform for implementing a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for
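
    The elementary operation underlying this kind of imaging is the cross-correlation of two station records. A minimal time-domain sketch (production systems work in the frequency domain over very large numbers of station pairs, which is what demands the supercomputer):

```c
/* Sketch of cross-correlating two station records in the time domain.
   Illustrative only; signals are synthetic impulses. */
#include <stdio.h>

/* c[l + maxlag] = sum_t a[t] * b[t + l],  l = -maxlag .. +maxlag */
void xcorr(const double *a, const double *b, int n, int maxlag, double *c)
{
    for (int l = -maxlag; l <= maxlag; l++) {
        double s = 0.0;
        for (int t = 0; t < n; t++) {
            int u = t + l;
            if (u >= 0 && u < n) s += a[t] * b[u];
        }
        c[l + maxlag] = s;
    }
}

int main(void)
{
    enum { N = 64, LAG = 8 };
    double a[N] = {0}, b[N] = {0}, c[2*LAG + 1];
    a[20] = 1.0;   /* impulse at t = 20 ...       */
    b[25] = 1.0;   /* ... arrives 5 samples later */
    xcorr(a, b, N, LAG, c);
    for (int l = -LAG; l <= LAG; l++)
        if (c[l + LAG] != 0.0) printf("peak at lag %d\n", l);
    return 0;
}
```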

  19. Integration of Titan supercomputer at OLCF with ATLAS Production System

    Science.gov (United States)

    Barreiro Megino, F.; De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wells, J.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA-managed resources are distributed worldwide over hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data-taking runs will require more resources than Grid computing can possibly provide; additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we describe a project aimed at integrating the ATLAS Production System with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job submission to Titan's batch queues and for local data management, with lightweight MPI wrappers to run single-node workloads in parallel on Titan's multi-core worker nodes. It provides for running standard ATLAS production jobs on unused resources (backfill) on Titan. The system has already allowed ATLAS to collect millions of core-hours per month on Titan and to execute hundreds of thousands of jobs, while simultaneously improving Titan's utilization efficiency. We discuss the details of the implementation, current experience with running the system, and future plans aimed at improvements in scalability and efficiency. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher, by accepting the manuscript for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to

  20. Lectures in Supercomputational Neurosciences Dynamics in Complex Brain Networks

    CERN Document Server

    Graben, Peter beim; Thiel, Marco; Kurths, Jürgen

    2008-01-01

    Computational Neuroscience is a burgeoning field of research where only the combined effort of neuroscientists, biologists, psychologists, physicists, mathematicians, computer scientists, engineers and other specialists, e.g. from linguistics and medicine, seems able to expand the limits of our knowledge. The present volume is an introduction, largely from the physicists' perspective, to the subject matter with in-depth contributions by system neuroscientists. A conceptual model for complex networks of neurons is introduced that incorporates many important features of the real brain, such as various types of neurons, various brain areas, inhibitory and excitatory coupling and the plasticity of the network. The computational implementation on supercomputers, which is introduced and discussed in detail in this book, will enable the readers to modify and adapt the algorithm for their own research. Worked-out examples of applications are presented for networks of Morris-Lecar neurons to model the cortical co...
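
    As an illustration of the model class involved, a single Morris-Lecar neuron can be integrated in a few lines; the parameter values below are typical textbook choices, not taken from the book:

```c
/* Sketch: forward-Euler integration of one Morris-Lecar neuron.
   Parameters are common textbook values (assumed, not from the book);
   network simulations couple many such units. Link with -lm. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double C = 20.0, gL = 2.0, gCa = 4.4, gK = 8.0;   /* capacitance, conductances */
    const double VL = -60.0, VCa = 120.0, VK = -84.0;       /* reversal potentials, mV   */
    const double V1 = -1.2, V2 = 18.0, V3 = 2.0, V4 = 30.0, phi = 0.04;
    const double I = 90.0;                                  /* applied current */
    double V = -60.0, w = 0.0, dt = 0.05;

    for (int step = 0; step < 20000; step++) {
        double minf = 0.5 * (1.0 + tanh((V - V1) / V2));
        double winf = 0.5 * (1.0 + tanh((V - V3) / V4));
        double tauw = 1.0 / cosh((V - V3) / (2.0 * V4));
        double dV = (I - gL*(V - VL) - gCa*minf*(V - VCa) - gK*w*(V - VK)) / C;
        double dw = phi * (winf - w) / tauw;
        V += dt * dV;
        w += dt * dw;
        if (step % 400 == 0) printf("%g %g\n", step * dt, V);  /* t, V trace */
    }
    return 0;
}
```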

  1. Neutron spectroscopy, nuclear structure, related topics. Abstracts

    International Nuclear Information System (INIS)

    Sukhovoj, A.M.

    1996-01-01

    Neutron spectroscopy, nuclear structure and related topics are considered. P,T-breaking, neutron beta decay, neutron radiative capture and neutron polarizability are discussed. Reactions with fast neutrons, methodical aspects and low-energy fission are considered as well

  2. Symbolic simulation of engineering systems on a supercomputer

    International Nuclear Information System (INIS)

    Ragheb, M.; Gvillo, D.; Makowitz, H.

    1986-01-01

    Model-based production-rule systems are developed for the symbolic simulation of complex engineering systems on a CRAY X-MP supercomputer. The fault-tree and event-tree analysis methodologies from systems analysis are used for problem representation and are coupled to the rule-based system paradigm from knowledge engineering to provide modelling of engineering devices. Modelling is based on knowledge of the structure and function of the device rather than on human expertise alone. To implement the methodology, we developed a production-rule analysis system, HAL-1986, that uses both backward chaining and forward chaining. The inference engine uses an induction-deduction-oriented antecedent-consequent logic and is programmed in Portable Standard Lisp (PSL). The inference engine is general and can accommodate modifications and additions to the knowledge base. The methodologies are demonstrated using a model for the identification of faults, and subsequent recovery from abnormal situations, in nuclear reactor safety analysis. The use of these methodologies for the prognostication of future device responses under operational and accident conditions, using coupled symbolic and procedural programming, is discussed

  3. Micro-mechanical Simulations of Soils using Massively Parallel Supercomputers

    Directory of Open Access Journals (Sweden)

    David W. Washington

    2004-06-01

    In this research a computer program, Trubal version 1.51, based on the Discrete Element Method, was converted to run on a Connection Machine (CM-5), a massively parallel supercomputer with 512 nodes, to reduce the computation time of simulating geotechnical boundary value problems. The dynamic memory algorithm in the Trubal program did not perform efficiently on the CM-2 machine with its Single Instruction Multiple Data (SIMD) architecture. This was due to the communication overhead involving global array reductions, global array broadcasts and random data movement. Therefore, the dynamic memory algorithm in the Trubal program was converted to a static memory arrangement, and the program was successfully ported to CM-5 machines. The converted program was called "TRUBAL for Parallel Machines (TPM)." Simulating two physical triaxial experiments and comparing the results with Trubal simulations validated the TPM program. On a 512-node CM-5 machine, TPM produced a nine-fold speedup, demonstrating the inherent parallelism within algorithms based on the Discrete Element Method.

  4. Large scale simulations of lattice QCD thermodynamics on Columbia Parallel Supercomputers

    International Nuclear Information System (INIS)

    Ohta, Shigemi

    1989-01-01

    The Columbia Parallel Supercomputer project aims at the construction of a parallel-processing, multi-gigaflop computer optimized for numerical simulations of lattice QCD. The project has three stages: a 16-node, 1/4 GF machine completed in April 1985; a 64-node, 1 GF machine completed in August 1987; and a 256-node, 16 GF machine now under construction. The machines all share a common architecture: a two-dimensional torus formed from a rectangular array of N1 x N2 independent and identical processors. A processor is capable of operating in a multi-instruction multi-data mode, except for periods of synchronous interprocessor communication with its four nearest neighbors. Here the thermodynamics simulations on the two working machines are reported. (orig./HSI)

  5. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high-speed (teraflop), large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field, with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to make a quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained with the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability, through multiple MATLAB licenses, to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf cluster. Since the expertise necessary to create the parallel processing applications has only recently been obtained at NIU, this software development effort is at an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and the appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  6. Use of QUADRICS supercomputer as embedded simulator in emergency management systems

    Energy Technology Data Exchange (ETDEWEB)

    Bove, R.; Di Costanzo, G.; Ziparo, A. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia

    1996-07-01

    The experience gained in implementing MRBT, an atmospheric dispersion model for short-duration releases, on a QUADRICS-Q1 supercomputer is reported. First, the MRBT model is described: it is an analytical model for studying the spreading of light gases released into the atmosphere by accidents. The solution of the diffusion equation is Gaussian and yields the concentration of the released pollutant as a function of space and time. The QUADRICS architecture is then introduced and the implementation of the model is described. Finally, the integration of the QUADRICS-based model as an embedded simulator in an emergency management system is considered.
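
    Analytic short-release dispersion models of this family evaluate, at each receptor point, a Gaussian plume formula such as the one sketched below (a generic textbook form with fixed dispersion coefficients, not the MRBT code itself):

```c
/* Sketch of the Gaussian plume concentration formula with a ground-
   reflection image term. A real model derives the sigmas from
   stability class and downwind distance; here they are fixed inputs. */
#include <math.h>
#include <stdio.h>

/* concentration (g/m^3) at crosswind offset y and height z, for source
   strength Q (g/s), wind speed u (m/s), release height H (m),
   dispersion parameters sy, sz (m) */
double plume(double Q, double u, double y, double z, double H,
             double sy, double sz)
{
    double cross = exp(-y*y / (2.0*sy*sy));
    double vert  = exp(-(z-H)*(z-H) / (2.0*sz*sz))
                 + exp(-(z+H)*(z+H) / (2.0*sz*sz));  /* ground reflection */
    return Q / (2.0 * M_PI * u * sy * sz) * cross * vert;
}

int main(void)
{
    /* ground-level centerline concentration, illustrative numbers */
    printf("C = %.3e g/m^3\n", plume(10.0, 5.0, 0.0, 0.0, 50.0, 60.0, 30.0));
    return 0;
}
```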

  7. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems such as earthquake fault zones, volcanoes, and geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) a high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) a front-end Web interface providing the service to the end-users and (5) a data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and

  8. De Novo Ultrascale Atomistic Simulations On High-End Parallel Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, A; Kalia, R K; Nomura, K; Sharma, A; Vashishta, P; Shimojo, F; van Duin, A; Goddard, III, W A; Biswas, R; Srivastava, D; Yang, L H

    2006-09-04

    We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated model transitioning assisted by graph-based event tracking. A tunable hierarchical cellular decomposition parallelization framework then maps the O(N) EDC algorithms onto petaflops computers, while achieving performance tunability through a hierarchy of parameterized cell data/computation structures, as well as its implementation using hybrid Grid remote procedure call + message passing + threads programming. High-end computing platforms such as IBM BlueGene/L, SGI Altix 3000 and the NSF TeraGrid provide an excellent test ground for the framework. On these platforms, we have achieved unprecedented scales of quantum-mechanically accurate and well validated, chemically reactive atomistic simulations--1.06 billion-atom fast reactive force-field MD and 11.8 million-atom (1.04 trillion grid points) quantum-mechanical MD in the framework of the EDC density functional theory on adaptive multigrids--in addition to 134 billion-atom non-reactive space-time multiresolution MD, with parallel efficiency as high as 0.998 on 65,536 dual-processor BlueGene/L nodes. We have also achieved an automated execution of hierarchical QM

  9. An Optimized Parallel FDTD Topology for Challenging Electromagnetic Simulations on Supercomputers

    Directory of Open Access Journals (Sweden)

    Shugang Jiang

    2015-01-01

    It may not be a challenge to run a Finite-Difference Time-Domain (FDTD) code for electromagnetic simulations on a supercomputer with more than 10,000 CPU cores; however, making the FDTD code work with the highest efficiency is a challenge. In this paper, the performance of parallel FDTD is optimized through MPI (message passing interface) virtual topology, based on which a communication model is established. The general rules of optimal topology are presented according to the model. The performance of the method is tested and analyzed on three high performance computing platforms with different architectures in China. Simulations including an airplane with a 700-wavelength wingspan and a complex microstrip antenna array with nearly 2000 elements are performed very efficiently using a maximum of 10240 CPU cores.
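
    The underlying MPI mechanism is the virtual (here Cartesian) process topology: ranks are arranged on a grid and each rank learns its face neighbours. The sketch below shows the generic interface; the optimization in the work above lies in how the grid dimensions are chosen:

```c
/* Sketch: creating a 2-D Cartesian virtual topology so that the FDTD
   halo exchange maps onto grid neighbours. Generic MPI, illustrative. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int size, rank;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[2] = {0, 0}, periods[2] = {0, 0};
    MPI_Dims_create(size, 2, dims);      /* factor 'size' into a 2-D grid */
    MPI_Comm cart;
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1 /* reorder */, &cart);
    MPI_Comm_rank(cart, &rank);

    int left, right, down, up;
    MPI_Cart_shift(cart, 0, 1, &left, &right);  /* neighbours along dim 0 */
    MPI_Cart_shift(cart, 1, 1, &down, &up);     /* neighbours along dim 1 */
    printf("rank %d: x-neighbours (%d,%d), y-neighbours (%d,%d)\n",
           rank, left, right, down, up);

    MPI_Comm_free(&cart);
    MPI_Finalize();
    return 0;
}
```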

  10. Development of a high performance eigensolver on the peta-scale next generation supercomputer system

    International Nuclear Information System (INIS)

    Imamura, Toshiyuki; Yamada, Susumu; Machida, Masahiko

    2010-01-01

    For present supercomputer systems, multicore and multisocket processors are necessary to build a system, and the choice of interconnect is essential. In addition, high-performance, scalable and reliable numerical software is one of the key items for effective development of new codes. ScaLAPACK and PETSc are well-known software packages for distributed-memory parallel computer systems. Needless to say, software highly tuned for new architectures such as many-core processors must be chosen for real computation. In this study, we present a high-performance and highly scalable eigenvalue solver for the next-generation supercomputer system, the so-called 'K computer'. We have developed two versions, the standard version (eigen_s) and the enhanced-performance version (eigen_sx), on the T2K cluster system housed at the University of Tokyo. Eigen_s employs the conventional algorithms: Householder tridiagonalization, the divide-and-conquer (DC) algorithm, and Householder back-transformation. They are carefully implemented with a blocking technique and a flexible two-dimensional data distribution to reduce the overhead of memory traffic and data transfer, respectively. Eigen_s performs excellently on the T2K system with 4096 cores (theoretical peak 37.6 TFLOPS), achieving a fine 3.0 TFLOPS on a matrix of dimension two hundred thousand. The enhanced version, eigen_sx, uses more advanced algorithms: narrow-band reduction, DC for band matrices, and block Householder back-transformation with the WY representation. Although this version is still at a test stage, it reaches 4.7 TFLOPS on a matrix of the same dimension. (author)
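
    The serial analogue of this pipeline (tridiagonalization, divide-and-conquer, back-transformation) is exposed by LAPACK's dsyev driver; a minimal example via the LAPACKE C interface (assumes a LAPACKE installation; link with -llapacke):

```c
/* Sketch: dense symmetric eigensolve with LAPACK's dsyev, the serial
   counterpart of what eigen_s/eigen_sx distribute across a machine. */
#include <lapacke.h>
#include <stdio.h>

int main(void)
{
    /* small symmetric test matrix (row-major); eigenvalues are
       2 - sqrt(2), 2, 2 + sqrt(2) */
    double a[9] = { 2.0, -1.0,  0.0,
                   -1.0,  2.0, -1.0,
                    0.0, -1.0,  2.0 };
    double w[3];  /* eigenvalues, ascending */
    lapack_int info = LAPACKE_dsyev(LAPACK_ROW_MAJOR, 'V', 'U', 3, a, 3, w);
    if (info != 0) { fprintf(stderr, "dsyev failed: %d\n", (int)info); return 1; }
    for (int i = 0; i < 3; i++) printf("lambda[%d] = %.6f\n", i, w[i]);
    return 0;
}
```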

  11. Task-Driven Comparison of Topic Models.

    Science.gov (United States)

    Alexander, Eric; Gleicher, Michael

    2016-01-01

    Topic modeling, a method of statistically extracting thematic content from a large collection of texts, is used for a wide variety of tasks within text analysis. Though there are a growing number of tools and techniques for exploring single models, comparisons between models are generally reduced to a small set of numerical metrics. These metrics may or may not reflect a model's performance on the analyst's intended task, and can therefore be insufficient to diagnose what causes differences between models. In this paper, we explore task-centric topic model comparison, considering how we can both provide detail for a more nuanced understanding of differences and address the wealth of tasks for which topic models are used. We derive comparison tasks from single-model uses of topic models, which predominantly fall into the categories of understanding topics, understanding similarity, and understanding change. Finally, we provide several visualization techniques that facilitate these tasks, including buddy plots, which combine color and position encodings to allow analysts to readily view changes in document similarity.

  12. Watson will see you now: a supercomputer to help clinicians make informed treatment decisions.

    Science.gov (United States)

    Doyle-Lindrud, Susan

    2015-02-01

    IBM has collaborated with several cancer care providers to develop and train the IBM supercomputer Watson to help clinicians make informed treatment decisions. When a patient is seen in clinic, the oncologist can input all of the clinical information into the computer system. Watson will then review all of the data and recommend treatment options based on the latest evidence and guidelines. Once the oncologist makes the treatment decision, this information can be sent directly to the insurance company for approval. Watson has the ability to standardize care and accelerate the approval process, a benefit to the healthcare provider and the patient.

  13. Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers

    Science.gov (United States)

    Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi

    2018-03-01

    Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.

  14. Topics in elementary particle physics

    International Nuclear Information System (INIS)

    Dugan, M.J.

    1985-01-01

    Topics in elementary particle physics are discussed. Models with N = 2 supersymmetry are constructed. The CP violation properties of a class of N = 1 supergravity models are analyzed. The structure of a composite Higgs model is investigated. The implications of a 17 keV neutrino are considered

  15. Parallel simulation of tsunami inundation on a large-scale supercomputer

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2013-12-01

    An accurate prediction of tsunami inundation is important for disaster mitigation purposes. One approach is to approximate the tsunami wave source through an instant inversion analysis using real-time observation data (e.g., Tsushima et al., 2009) and then use the resulting wave source data in an instant tsunami inundation simulation. A bottleneck of this approach, however, is the large computational cost of the non-linear inundation simulation, and the computational power of recent massively parallel supercomputers is helpful for enabling faster-than-real-time execution of such simulations. Parallel computers have become approximately 1000 times faster in 10 years (www.top500.org), so very fast parallel computers are expected to become more and more prevalent in the near future. It is therefore important to investigate how to conduct a tsunami simulation efficiently on parallel computers. In this study, we target very fast tsunami inundation simulations on the K computer, currently the fastest Japanese supercomputer, which has a theoretical peak performance of 11.2 PFLOPS. One computing node of the K computer consists of 1 CPU with 8 cores that share memory, and the nodes are connected through a high-performance torus-mesh network. The K computer is designed for distributed-memory parallel computation, so we have developed a parallel tsunami model. Our model is based on the TUNAMI-N2 model of Tohoku University, which uses a leap-frog finite difference method. A grid nesting scheme is employed to apply high-resolution grids only in the coastal regions. To balance the computational load across CPUs, CPUs are first allocated to each nested layer in proportion to the number of grid points of that layer. Using the CPUs allocated to each layer, 1-D domain decomposition is performed on each layer. In the parallel computation, three types of communication are necessary: (1) communication to adjacent neighbours for the
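
    The neighbour communication mentioned above reduces, per step, to a ghost-row (halo) swap with each adjacent rank. A hedged sketch of this exchange for a 1-D decomposition (illustrative field and sizes, not the actual model code):

```c
/* Sketch of the per-step halo exchange in a 1-D domain decomposition:
   rows 1..NY are interior, rows 0 and NY+1 are ghost rows filled from
   the neighbouring ranks. Build: mpicc halo.c */
#include <mpi.h>
#include <string.h>

#define NX 1024   /* grid points per row    */
#define NY 64     /* local interior rows    */

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    int lo = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int hi = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    static double h[NY + 2][NX];   /* e.g. water height + ghost rows */
    memset(h, 0, sizeof h);

    /* send top interior row up, receive bottom ghost row from below */
    MPI_Sendrecv(h[NY], NX, MPI_DOUBLE, hi, 0,
                 h[0],  NX, MPI_DOUBLE, lo, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    /* send bottom interior row down, receive top ghost row from above */
    MPI_Sendrecv(h[1],      NX, MPI_DOUBLE, lo, 1,
                 h[NY + 1], NX, MPI_DOUBLE, hi, 1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    MPI_Finalize();
    return 0;
}
```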

  16. Adventures in supercomputing: An innovative program for high school teachers

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C.E.; Hicks, H.R.; Summers, B.G. [Oak Ridge National Lab., TN (United States); Staten, D.G. [Wartburg Central High School, TN (United States)

    1994-12-31

    Within the realm of education, seldom does an innovative program become available with the potential to change an educator's teaching methodology. Adventures in Supercomputing (AiS), sponsored by the U.S. Department of Energy (DOE), is such a program. It is a program for high school teachers that changes the teacher paradigm from a teacher-directed approach to a student-centered approach. "A student-centered classroom offers better opportunities for development of internal motivation, planning skills, goal setting and perseverance than does the traditional teacher-directed mode." Not only is the process of teaching changed, but the cross-curricular integration within the AiS materials is remarkable. Written from a teacher's perspective, this paper describes the AiS program and its effects on teachers and students, primarily at Wartburg Central High School in Wartburg, Tennessee. The AiS program in Tennessee is sponsored by Oak Ridge National Laboratory (ORNL).

  17. Re-inventing electromagnetics - Supercomputing solution of Maxwell's equations via direct time integration on space grids

    International Nuclear Information System (INIS)

    Taflove, A.

    1992-01-01

    This paper summarizes the present state and future directions of applying finite-difference and finite-volume time-domain techniques for Maxwell's equations on supercomputers to model complex electromagnetic wave interactions with structures. Applications so far have been dominated by radar cross section technology, but by no means are limited to this area. In fact, the gains we have made place us on the threshold of being able to make tremendous contributions to non-defense electronics and optical technology. Some of the most interesting research in these commercial areas is summarized. 47 refs
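
    The core of these techniques is the Yee leap-frog update, shown here in its simplest 1-D free-space form (normalized units, hard Gaussian source, no absorbing boundaries or materials):

```c
/* Sketch: 1-D FDTD (Yee leap-frog) update for Maxwell's equations in
   free space. Illustrative; production codes are 3-D with materials,
   absorbing boundaries and parallel decomposition. */
#include <math.h>
#include <stdio.h>

#define N 400

int main(void)
{
    double ez[N] = {0}, hy[N] = {0};
    const double c = 0.5;                  /* Courant number, <= 1 in 1-D */

    for (int t = 0; t < 1000; t++) {
        for (int i = 0; i < N - 1; i++)    /* H half-step */
            hy[i] += c * (ez[i + 1] - ez[i]);
        for (int i = 1; i < N; i++)        /* E half-step */
            ez[i] += c * (hy[i] - hy[i - 1]);
        ez[N / 4] += exp(-(t - 30.0) * (t - 30.0) / 100.0); /* Gaussian source */
    }
    printf("ez[mid] = %g\n", ez[N / 2]);
    return 0;
}
```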

  18. Earth and environmental science in the 1980's: Part 1: Environmental data systems, supercomputer facilities and networks

    Science.gov (United States)

    1986-01-01

    Overview descriptions of on-line environmental data systems, supercomputer facilities, and networks are presented. Each description addresses the concepts of content, capability, and user access relevant to the point of view of potential utilization by the Earth and environmental science community. The information on similar systems or facilities is presented in parallel fashion to encourage and facilitate intercomparison. In addition, summary sheets are given for each description, and a summary table precedes each section.

  19. Efficient development of memory bounded geo-applications to scale on modern supercomputers

    Science.gov (United States)

    Räss, Ludovic; Omlin, Samuel; Licul, Aleksandar; Podladchikov, Yuri; Herman, Frédéric

    2016-04-01

    Numerical modeling is now a key tool in the geosciences. The current challenge is to solve problems that are multi-physics and for which the length scale and the place of occurrence might not be known in advance. The spatial extent of the investigated domain may also vary strongly in size, ranging from millimeters for reactive transport to kilometers for glacier erosion dynamics. An efficient way to proceed is to develop simple but robust algorithms that perform well and scale on modern supercomputers, thereby permitting very high resolution simulations. We propose an efficient approach to solving memory-bound real-world applications on modern supercomputer architectures. We optimize the software to run on our newly acquired state-of-the-art GPU cluster "octopus". Our approach shows promising preliminary results on important geodynamical and geomechanical problems: we have developed a Stokes solver for glacier flow and a poromechanical solver, including complex rheologies, for nonlinear waves in stressed porous rocks. We solve the system of partial differential equations on a regular Cartesian grid and use an iterative finite difference scheme with preconditioning of the residuals. The MPI communication happens only locally (point-to-point); this method is known to scale linearly by construction. The "octopus" GPU cluster, which we use for the computations, has been designed to achieve maximal data transfer throughput at minimal hardware cost. It is composed of twenty compute nodes, each hosting four Nvidia Titan X GPU accelerators. These high-density nodes are interconnected with a parallel (dual-rail) FDR InfiniBand network. Our efforts show promising preliminary results for the different physics investigated: the glacier flow solver achieves good accuracy in the relevant benchmarks, and the coupled poromechanical solver makes it possible to explain previously unresolvable focused fluid flow as a natural outcome of the porosity setup. In both cases

  20. Research center Juelich to install Germany's most powerful supercomputer new IBM System for science and research will achieve 5.8 trillion computations per second

    CERN Multimedia

    2002-01-01

    "The Research Center Juelich, Germany, and IBM today announced that they have signed a contract for the delivery and installation of a new IBM supercomputer at the Central Institute for Applied Mathematics" (1/2 page).

  1. MEGADOCK 4.0: an ultra-high-performance protein-protein docking software for heterogeneous supercomputers.

    Science.gov (United States)

    Ohue, Masahito; Shimoda, Takehiro; Suzuki, Shuji; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka

    2014-11-15

    The application of protein-protein docking in large-scale interactome analysis is a major challenge in structural bioinformatics and requires huge computing resources. In this work, we present MEGADOCK 4.0, an FFT-based docking software package that makes extensive use of recent heterogeneous supercomputers and shows powerful, scalable performance (>97% strong-scaling efficiency). MEGADOCK 4.0 is written in C++ with OpenMPI and NVIDIA CUDA 5.0 (or later) and is freely available to all academic and non-profit users at: http://www.bi.cs.titech.ac.jp/megadock. Contact: akiyama@cs.titech.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  2. Wavelet transform-vector quantization compression of supercomputer ocean model simulation output

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J N; Brislawn, C M

    1992-11-12

    We describe a new procedure for efficient compression of digital information for storage and transmission purposes. The algorithm involves a discrete wavelet transform subband decomposition of the data set, followed by vector quantization of the wavelet transform coefficients using application-specific vector quantizers. The new vector quantizer design procedure optimizes the assignment of both memory resources and vector dimensions to the transform subbands by minimizing an exponential rate-distortion functional subject to constraints on both overall bit rate and encoder complexity. The wavelet-vector quantization method, which originates in digital image compression, is applicable to the compression of other multidimensional data sets possessing some degree of smoothness. In this paper we discuss the use of this technique for compressing the output of supercomputer simulations of global climate models. The data presented here come from Semtner-Chervin global ocean models run at the National Center for Atmospheric Research and at the Los Alamos Advanced Computing Laboratory.
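
    The subband decomposition step can be illustrated with the simplest wavelet, the Haar filter pair: one level splits the signal into a low (approximation) band and a high (detail) band, which would then be vector-quantized. Real codecs use longer filters and several decomposition levels:

```c
/* Sketch: one level of a 1-D Haar wavelet transform. The first half
   of the output holds the low (approximation) band, the second half
   the high (detail) band. Illustrative; n must be even. */
#include <math.h>
#include <stdio.h>

void haar_level(const double *in, double *out, int n)
{
    const double s = sqrt(0.5);  /* orthonormal scaling */
    for (int i = 0; i < n / 2; i++) {
        out[i]       = s * (in[2*i] + in[2*i + 1]);  /* approximation */
        out[n/2 + i] = s * (in[2*i] - in[2*i + 1]);  /* detail        */
    }
}

int main(void)
{
    double x[8] = { 4, 6, 10, 12, 8, 6, 5, 5 }, y[8];
    haar_level(x, y, 8);
    for (int i = 0; i < 8; i++) printf("%6.3f ", y[i]);
    printf("\n");
    return 0;
}
```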

  3. Topical melatonin for treatment of androgenetic alopecia.

    Science.gov (United States)

    Fischer, Tobias W; Trüeb, Ralph M; Hänggi, Gabriella; Innocenti, Marcello; Elsner, Peter

    2012-10-01

    In the search for alternatives to oral finasteride and topical minoxidil for the treatment of androgenetic alopecia (AGA), melatonin, a potent antioxidant and growth modulator, was identified as a promising candidate based on in vitro and in vivo studies. One pharmacodynamic study on topical application of melatonin and four clinical pre-post studies were performed in patients with androgenetic alopecia or general hair loss and evaluated by standardised questionnaires, TrichoScan, the 60-second hair count test and the hair pull test. Five clinical studies showed positive effects of a topical melatonin solution in the treatment of AGA in men and women, with good tolerability: (1) pharmacodynamics under once-daily topical application in the evening showed no significant influence on endogenous serum melatonin levels; (2) an observational study involving 30 men and women showed a significant reduction in the degree of severity of alopecia after 30 and 90 days (P […]. A topical melatonin solution can be considered as a treatment option in androgenetic alopecia.

  4. Computational fluid dynamics: complex flows requiring supercomputers. January 1975-July 1988 (Citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Report for January 1975-July 1988

    International Nuclear Information System (INIS)

    1988-08-01

    This bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets, and missiles, and automobiles; heat-transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid-generation techniques required to apply CFD numerical solutions. Numerical methods for fluid dynamics, not requiring supercomputers, are found in a separate published search. (Contains 83 citations fully indexed and including a title list.)

  5. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel Benchmarks Multi-Zone codes, SP-MZ and BT-MZ; an earthquake simulation, PEQdyna; an aerospace application, PMLB; and a 3D particle-in-cell application, GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications as the number of OpenMP threads per node increases, and find that increasing the number of threads beyond a certain point saturates or worsens the performance of these hybrid applications. For the strong-scaling hybrid applications SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node; as the number of threads per node increases, the FPU (Floating Point Unit) percentage decreases, while the MPI percentage (except for PMLB) and the IPC (instructions per cycle) per core (except for BT-MZ) increase. For the weak-scaling hybrid application GTC, the performance trend (relative speedup) is very similar with increasing number of threads per node no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.

  7. Research to application: Supercomputing trends for the 90's - Opportunities for interdisciplinary computations

    International Nuclear Information System (INIS)

    Shankar, V.

    1991-01-01

    The progression of supercomputing is reviewed from the point of view of computational fluid dynamics (CFD), and multidisciplinary problems impacting the design of advanced aerospace configurations are addressed. The application of full potential and Euler equations to transonic and supersonic problems in the 70s and early 80s is outlined, along with the Navier-Stokes computations that became widespread during the late 80s and early 90s. Multidisciplinary computations currently in progress are discussed, including CFD and aeroelastic coupling for both static and dynamic flexible computations; CFD, aeroelastic and controls coupling for flutter suppression and active control; and the development of a computational electromagnetics technology based on CFD methods. Attention is given to the computational challenges standing in the way of establishing a computational environment that encompasses many technologies. 40 refs

  8. Car2x with software defined networks, network functions virtualization and supercomputers technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer R.J.; Cushing R.; De Laat C.; Jackson P.; Klous S.; Koning R.; Makkes M.X.; Meerwijk A.

    2015-01-01

    In the invited talk 'Car2x with SDN, NFV and supercomputers' we report on how our past work with SDN [1, 2] allows the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  9. A user-friendly web portal for T-Coffee on supercomputers

    Directory of Open Access Journals (Sweden)

    Koetsier Jos

    2011-05-01

    Background: Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed-memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results: In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture using a web portal developed using Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications or to deploy PTC on different HPC environments. Conclusions: The PTC portal allows users to upload large numbers of sequences to be aligned by the parallel version of T-Coffee that could not be aligned on a single machine due to memory and execution time constraints. The web portal provides a user-friendly solution.

  10. 2011 annual meeting on nuclear technology. Pt. 4. Topical sessions

    International Nuclear Information System (INIS)

    Schoenfelder, Christian; Dams, Wolfgang

    2011-01-01

    Summary report on the Topical Session of the Annual Conference on Nuclear Technology held in Berlin, 17 to 19 May 2011: - Nuclear Competence in Germany and Europe. The Topical Session: - Sodium Cooled Fast Reactors -- will be covered in a report in a further issue of atw. The reports on the Topical Sessions: - CFD-Simulations for Safety Relevant Tasks; and - Final Disposal: From Scientific Basis to Application; - Characteristics of a High Reliability Organization (HRO) Considering Experience Gained from Events at Nuclear Power Stations -- have been covered in atw 7, 8/9, and 10 (2011). (orig.)

  11. Meatotomy using topical anesthesia: A painless option

    Directory of Open Access Journals (Sweden)

    Vinod Priyadarshi

    2015-01-01

    Conclusion: Use of topical anesthesia in the form of Prilox (EMLA) cream for meatotomy is a safe and effective method that avoids painful injections and the anxiety related to them, and should be considered in most such patients as an alternative to conventional penile blocks or general anesthesia.

  12. Portable implementation model for CFD simulations. Application to hybrid CPU/GPU supercomputers

    Science.gov (United States)

    Oyarzun, Guillermo; Borrell, Ricard; Gorobets, Andrey; Oliva, Assensi

    2017-10-01

    Nowadays, high performance computing (HPC) systems are experiencing a disruptive moment, with a variety of novel architectures and frameworks and no clarity about which one is going to prevail. In this context, the portability of codes across different architectures is of major importance. This paper presents a portable implementation model based on an algebraic operational approach for direct numerical simulation (DNS) and large eddy simulation (LES) of incompressible turbulent flows using unstructured hybrid meshes. The proposed strategy consists of representing the whole time-integration algorithm using only three basic algebraic operations: the sparse matrix-vector product, linear combinations of vectors, and the dot product. The main idea is to decompose the nonlinear operators into a concatenation of two SpMV operations. This provides high modularity and portability. An exhaustive analysis of the proposed implementation for hybrid CPU/GPU supercomputers has been conducted, with tests using up to 128 GPUs. The main objective is to understand the challenges of implementing CFD codes on new architectures.
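
    The algebraic operational approach is easy to picture: once the operator is assembled as a sparse matrix, an explicit time step reduces to the three kernels named above. A minimal sketch in Python follows, for a toy problem du/dt = A u; the operator, density, and step count are illustrative assumptions, not the paper's solver.

    ```python
    # Time integration expressed through three algebraic kernels only:
    # SpMV (A @ u), linear combinations of vectors, and the dot product.
    import numpy as np
    import scipy.sparse as sp

    n = 10_000
    A = sp.random(n, n, density=1e-3, format="csr")   # stand-in operator
    u = np.ones(n)
    dt = 1e-3

    for step in range(100):
        Au = A @ u                 # kernel 1: sparse matrix-vector product
        u = u + dt * Au            # kernel 2: linear combination of vectors
        nrm = np.sqrt(u @ u)       # kernel 3: dot product (for monitoring)

    print(f"norm after 100 steps: {nrm:.6e}")
    ```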

  13. Topics in Nonlinear Dynamics

    DEFF Research Database (Denmark)

    Mosekilde, Erik

    Through a significant number of detailed and realistic examples, this book illustrates how the insights gained over the past couple of decades in the fields of nonlinear dynamics and chaos theory can be applied in practice. Among the topics considered are microbiological reaction systems, ecological food-web systems, nephron pressure and flow regulation, pulsatile secretion of hormones, thermostatically controlled radiator systems, post-stall maneuvering of aircraft, transfer electron devices for microwave generation, economic long waves, human decision making behavior, and pattern formation in chemical reaction-diffusion systems.

  14. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    International Nuclear Information System (INIS)

    Delbecq, J.M.; Banner, D.

    2003-01-01

    Nuclear power plants are a major asset of the EDF company. To remain so, in particular in a context of deregulation, competitiveness, safety and public acceptance are three conditions. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF to satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating the material deterioration mechanisms and moving forward with numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF long term R and D in the field of numerical simulation is given and especially of five challenges taken up by EDF together with its industrial and scientific partners. (author)

  15. Link-topic model for biomedical abbreviation disambiguation.

    Science.gov (United States)

    Kim, Seonho; Yoon, Juntae

    2015-02-01

    The ambiguity of biomedical abbreviations is one of the challenges in biomedical text mining systems. In particular, the handling of term variants and abbreviations without nearby definitions is a critical issue. In this study, we adopt the concepts of document topic and word link to disambiguate biomedical abbreviations. We propose the link topic model, inspired by the latent Dirichlet allocation model, in which each document is perceived as a random mixture of topics, where each topic is characterized by a distribution over words. Thus, the most probable expansions of the abbreviations in a given abstract are determined by word-topic, document-topic, and word-link distributions estimated from a document collection through the link topic model. The model allows two distinct modes of word generation to incorporate semantic dependencies among words, particularly between the long-form words of abbreviations and their sentential co-occurring words; a word can be generated either dependently on the long form of the abbreviation or independently. The semantic dependency between two words is defined as a link, and a new random parameter for the link is assigned to each word as well as a topic parameter. Because the link status indicates whether the word constitutes a link with a given specific long form, it has the effect of determining whether a word forms a unigram or a skipping/consecutive bigram with respect to the long form. Furthermore, we place a constraint on the model so that a word has the same topic as a specific long form if it is generated in reference to the long form. Consequently, documents are generated from the two hidden parameters, i.e. topic and link, and the most probable expansion of a specific abbreviation is estimated from the parameters. Our model relaxes the bag-of-words assumption of the standard topic model, in which word order is neglected, and it captures a richer structure of text than does the standard topic model by considering

  16. Topical tags vs non-topical tags : Towards a bipartite classification?

    NARCIS (Netherlands)

    Basile, Valerio; Peroni, Silvio; Tamburini, Fabio; Vitali, Fabio

    2015-01-01

    In this paper we investigate whether it is possible to create a computational approach that allows us to distinguish topical tags (i.e. talking about the topic of a resource) and non-topical tags (i.e. describing aspects of a resource that are not related to its topic) in folksonomies, in a way that

  17. Analyzing research trends on drug safety using topic modeling.

    Science.gov (United States)

    Zou, Chen

    2018-04-06

    Published drug safety data have evolved in the past decade due to scientific and technological advances in the relevant research fields. Considering that a vast amount of scientific literature has been published in this area, it is not easy to identify the key information. Topic modeling has emerged as a powerful tool to extract meaningful information from a large volume of unstructured texts. Areas covered: We analyzed the titles and abstracts of 4347 articles in four journals dedicated to drug safety from 2007 to 2016. We applied the latent Dirichlet allocation (LDA) model to extract 50 main topics, and conducted trend analysis to explore the temporal popularity of these topics over the years. Expert Opinion/Commentary: We found that 'benefit-risk assessment and communication', 'diabetes' and 'biologic therapy for autoimmune diseases' are the top 3 most published topics. The topics relevant to the use of electronic health records/observational data for safety surveillance are becoming increasingly popular over time. Meanwhile, there is a slight decrease in research on signal detection based on spontaneous reporting, although spontaneous reporting still plays an important role in benefit-risk assessment. The topics related to medical conditions and treatment showed highly dynamic patterns over time.
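
    The workflow described (titles/abstracts in, LDA with a fixed topic count, top words per topic out) can be reproduced in miniature. The sketch below uses scikit-learn on a toy corpus; the library choice and the four mock abstracts are assumptions for illustration, since the record does not name the authors' toolchain.

    ```python
    # LDA topic extraction over a toy corpus of "abstracts", mirroring the
    # titles/abstracts -> LDA -> top-words-per-topic workflow described above.
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    abstracts = [
        "benefit risk assessment and risk communication in pharmacovigilance",
        "signal detection from spontaneous reporting of adverse events",
        "electronic health records for post marketing safety surveillance",
        "biologic therapy safety in autoimmune disease patients",
    ]

    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(abstracts)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)  # 50 in the paper
    lda.fit(X)

    terms = vec.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top = [terms[i] for i in topic.argsort()[::-1][:5]]
        print(f"topic {k}: {', '.join(top)}")
    ```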

  18. SOFTWARE FOR SUPERCOMPUTER SKIF “ProLit-lC” and “ProNRS-lC” FOR FOUNDRY AND METALLURGICAL PRODUCTIONS

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2008-01-01

    Modeling data obtained on the SKIF supercomputer system for the technological process of mold filling using the computer system 'ProLIT-lc', as well as modeling data for the steel pouring process using 'ProNRS-lc', are presented. The influence of the number of processors of the multi-core SKIF computer system on the acceleration and time of modeling of technological processes connected with the production of castings and slugs is shown.

  19. Palacios and Kitten : high performance operating systems for scalable virtualized and native supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Widener, Patrick (University of New Mexico); Jaconette, Steven (Northwestern University); Bridges, Patrick G. (University of New Mexico); Xia, Lei (Northwestern University); Dinda, Peter (Northwestern University); Cui, Zheng.; Lange, John (Northwestern University); Hudson, Trammell B.; Levenhagen, Michael J.; Pedretti, Kevin Thomas Tauke; Brightwell, Ronald Brian

    2009-09-01

    Palacios and Kitten are new open source tools that enable applications, whether ported or not, to achieve scalable high performance on large machines. They provide a thin layer over the hardware to support both full-featured virtualized environments and native code bases. Kitten is an OS under development at Sandia that implements a lightweight kernel architecture to provide predictable behavior and increased flexibility on large machines, while also providing Linux binary compatibility. Palacios is a VMM that is under development at Northwestern University and the University of New Mexico. Palacios, which can be embedded into Kitten and other OSes, supports existing, unmodified applications and operating systems by using virtualization that leverages hardware technologies. We describe the design and implementation of both Kitten and Palacios. Our benchmarks show that they provide near native, scalable performance. Palacios and Kitten provide an incremental path to using supercomputer resources that is not performance-compromised.

  20. 1984 CERN school of computing

    International Nuclear Information System (INIS)

    1985-01-01

    The eighth CERN School of Computing covered subjects mainly related to computing for elementary-particle physics. These proceedings contain written versions of most of the lectures delivered at the School. Notes on the following topics are included: trigger and data-acquisition plans for the LEP experiments; unfolding methods in high-energy physics experiments; Monte Carlo techniques; relational data bases; data networks and open systems; the Newcastle connection; portable operating systems; expert systems; microprocessors - from basic chips to complete systems; algorithms for parallel computers; trends in supercomputers and computational physics; supercomputing and related national projects in Japan; application of VLSI in high-energy physics, and single-user systems. See hints under the relevant topics. (orig./HSI)

  1. Supercomputer debugging workshop `92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  2. Topics in the Foundations of General Relativity and Newtonian Gravitation Theory

    CERN Document Server

    Malament, David B

    2012-01-01

    In Topics in the Foundations of General Relativity and Newtonian Gravitation Theory, David B. Malament presents the basic logical-mathematical structure of general relativity and considers a number of special topics concerning the foundations of general relativity and its relation to Newtonian gravitation theory. These special topics include the geometrized formulation of Newtonian theory (also known as Newton-Cartan theory), the concept of rotation in general relativity, and Gödel spacetime. One of the highlights of the book is a no-go theorem that can be understood to show that there is

  3. Oral Versus Topical Diclofenac Sodium in the Treatment of Osteoarthritis.

    Science.gov (United States)

    Tieppo Francio, Vinicius; Davani, Saeid; Towery, Chris; Brown, Tony L

    2017-06-01

    Osteoarthritis (OA) is one of the most common causes of joint pain in the United States, and non-steroidal anti-inflammatory drugs (NSAIDs) such as diclofenac sodium, currently available in two main routes of administration, oral and topical, have been established as one of the standard treatments for OA. Generally, oral NSAIDs are well tolerated; however, our narrative review suggests that the topical solution has better tolerability than oral diclofenac sodium, especially given the gastrointestinal bleeding associated with the oral format. In addition, the topical route may be considered a reasonable selection by clinicians for management of musculoskeletal pain in patients with a history of potential risk and adverse side effects. Most studies reviewed comparing oral versus topical solutions of diclofenac sodium revealed comparable efficacy, with minimal side effects via the topical route. The key point of this narrative review is to help clinicians who currently must decide between very inexpensive oral diclofenac presentations and expensive topical presentations, especially in the elderly population, and to weigh the pros and cons of such a decision-making process.

  4. Performance Analysis and Scaling Behavior of the Terrestrial Systems Modeling Platform TerrSysMP in Large-Scale Supercomputing Environments

    Science.gov (United States)

    Kollet, S. J.; Goergen, K.; Gasper, F.; Shresta, P.; Sulis, M.; Rihani, J.; Simmer, C.; Vereecken, H.

    2013-12-01

    In studies of the terrestrial hydrologic, energy and biogeochemical cycles, integrated multi-physics simulation platforms take a central role in characterizing non-linear interactions, variances and uncertainties of system states and fluxes in reciprocity with observations. Recently developed integrated simulation platforms attempt to honor the complexity of the terrestrial system across multiple time and space scales from the deeper subsurface including groundwater dynamics into the atmosphere. Technically, this requires the coupling of atmospheric, land surface, and subsurface-surface flow models in supercomputing environments, while ensuring a high-degree of efficiency in the utilization of e.g., standard Linux clusters and massively parallel resources. A systematic performance analysis including profiling and tracing in such an application is crucial in the understanding of the runtime behavior, to identify optimum model settings, and is an efficient way to distinguish potential parallel deficiencies. On sophisticated leadership-class supercomputers, such as the 28-rack 5.9 petaFLOP IBM Blue Gene/Q 'JUQUEEN' of the Jülich Supercomputing Centre (JSC), this is a challenging task, but even more so important, when complex coupled component models are to be analysed. Here we want to present our experience from coupling, application tuning (e.g. 5-times speedup through compiler optimizations), parallel scaling and performance monitoring of the parallel Terrestrial Systems Modeling Platform TerrSysMP. The modeling platform consists of the weather prediction system COSMO of the German Weather Service; the Community Land Model, CLM of NCAR; and the variably saturated surface-subsurface flow code ParFlow. The model system relies on the Multiple Program Multiple Data (MPMD) execution model where the external Ocean-Atmosphere-Sea-Ice-Soil coupler (OASIS3) links the component models. TerrSysMP has been instrumented with the performance analysis tool Scalasca and analyzed

  5. Changing the Topic. Topic Position in Ancient Greek Word Order

    NARCIS (Netherlands)

    Allan, R.J.

    2014-01-01

    In Ancient Greek, topics can be expressed as intra-clausal constituents, but they can also precede or follow the main clause as extra-clausal constituents. Together, these various topic expressions constitute a coherent system of complementary pragmatic functions. For a comprehensive account of topic

  6. Assessment techniques for a learning-centered curriculum: evaluation design for adventures in supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Helland, B. [Ames Lab., IA (United States); Summers, B.G. [Oak Ridge National Lab., TN (United States)

    1996-09-01

    As the classroom paradigm shifts from being teacher-centered to being learner-centered, student assessments are evolving from typical paper and pencil testing to other methods of evaluation. Students should be probed for understanding, reasoning, and critical thinking abilities rather than their ability to return memorized facts. The assessment of the Department of Energy's pilot program, Adventures in Supercomputing (AiS), offers one example of assessment techniques developed for learner-centered curricula. This assessment has employed a variety of methods to collect student data. Methods of assessment used were traditional testing, performance testing, interviews, short questionnaires via email, and student presentations of projects. The data obtained from these sources have been analyzed by a professional assessment team at the Center for Children and Technology. The results have been used to improve the AiS curriculum and establish the quality of the overall AiS program. This paper will discuss the various methods of assessment used and the results.

  7. The BlueGene/L Supercomputer and Quantum ChromoDynamics

    International Nuclear Information System (INIS)

    Vranas, P; Soltz, R

    2006-01-01

    In summary our update contains: (1) Perfect speedup sustaining 19.3% of peak for the Wilson D D-slash Dirac operator. (2) Measurements of the full Conjugate Gradient (CG) inverter that inverts the Dirac operator. The CG inverter contains two global sums over the entire machine. Nevertheless, our measurements retain perfect speedup scaling demonstrating the robustness of our methods. (3) We ran on the largest BG/L system, the LLNL 64 rack BG/L supercomputer, and obtained a sustained speed of 59.1 TFlops. Furthermore, the speedup scaling of the Dirac operator and of the CG inverter are perfect all the way up to the full size of the machine, 131,072 cores (please see Figure II). The local lattice is rather small (4 x 4 x 4 x 16) while the total lattice has been a lattice QCD vision for thermodynamic studies (a total of 128 x 128 x 256 x 32 lattice sites). This speed is about five times larger compared to the speed we quoted in our submission. As we have pointed out in our paper QCD is notoriously sensitive to network and memory latencies, has a relatively high communication to computation ratio which can not be overlapped in BGL in virtual node mode, and as an application is in a class of its own. The above results are thrilling to us and a 30 year long dream for lattice QCD

  8. Modeling radiative transport in ICF plasmas on an IBM SP2 supercomputer

    International Nuclear Information System (INIS)

    Johansen, J.A.; MacFarlane, J.J.; Moses, G.A.

    1995-01-01

    At the University of Wisconsin-Madison the authors have integrated a collisional-radiative-equilibrium model into their CONRAD radiation-hydrodynamics code. This integrated package allows them to accurately simulate the transport processes involved in ICF plasmas; including the important effects of self-absorption of line-radiation. However, as they increase the amount of atomic structure utilized in their transport models, the computational demands increase nonlinearly. In an attempt to meet this increased computational demand, they have recently embarked on a mission to parallelize the CONRAD program. The parallel CONRAD development is being performed on an IBM SP2 supercomputer. The parallelism is based on a message passing paradigm, and is being implemented using PVM. At the present time they have determined that approximately 70% of the sequential program can be executed in parallel. Accordingly, they expect that the parallel version will yield a speedup on the order of three times that of the sequential version. This translates into only 10 hours of execution time for the parallel version, whereas the sequential version required 30 hours
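
    The quoted numbers are internally consistent with Amdahl's law: with a parallel fraction p = 0.7, speedup is bounded by 1/(1 - p), about 3.33, matching the expected threefold gain (30 hours of sequential execution down to about 10). A quick check in Python:

    ```python
    # Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N), p = parallel fraction.
    def amdahl(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    p = 0.70  # fraction of CONRAD reported as parallelizable
    for n in (4, 16, 64):
        print(f"{n:3d} processors: speedup = {amdahl(p, n):.2f}")
    # The limit as n grows is 1/0.3, about 3.33, matching the quoted ~3x
    # (30 h sequential -> ~10 h parallel).
    ```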

  9. New guidelines for topical NSAIDs in the osteoarthritis treatment paradigm.

    Science.gov (United States)

    Altman, Roy D

    2010-12-01

    Osteoarthritis (OA), the most common form of arthritis, often affects hands, hips, and knees and involves an estimated 26.9 million US adults. Women have a higher prevalence of OA, and the risk of developing OA increases with age, obesity, and joint malalignment. OA typically presents with pain and reduced function. Therapeutic programs are often multimodal and must take into account pharmaceutical toxicities and patient comorbidities. For example, nonsteroidal anti-inflammatory drugs (NSAIDs) are associated with cardiovascular, gastrointestinal, and renal adverse events. Topical NSAIDs offer efficacy with reduced systemic drug exposure. This is a review of current guideline recommendations regarding the use of topical NSAIDs in OA of the hand and knee. Articles were identified by PubMed search (January 1, 2000 to May 21, 2010). Several current guidelines for management of OA recommend topical NSAIDs, indicating them as a safe and effective treatment. One guideline recommends that topical NSAIDs be considered as first-line pharmacologic therapy. A US guideline for knee OA recommends topical NSAIDs in older patients and in patients with increased gastrointestinal risk. The consensus across US and European OA guidelines is that topical NSAIDs are a safe and effective treatment for OA. Because the research base on topical NSAIDs for OA is small, guidelines will continue to evolve.

  10. Exploiting the functional and taxonomic structure of genomic data by probabilistic topic modeling.

    Science.gov (United States)

    Chen, Xin; Hu, Xiaohua; Lim, Tze Y; Shen, Xiajiong; Park, E K; Rosen, Gail L

    2012-01-01

    In this paper, we present a method that enables both the homology-based approach and the composition-based approach to further study the functional core (i.e., the microbial core and the gene core, correspondingly). In the proposed method, the identification of major functionality groups is achieved by generative topic modeling, which is able to extract useful information from unlabeled data. We first show that a generative topic model can be used to model the taxon abundance information obtained by the homology-based approach and study the microbial core. The model considers each sample as a “document,” which has a mixture of functional groups, while each functional group (also known as a “latent topic”) is a weighted mixture of species. Therefore, estimating the generative topic model for taxon abundance data will uncover the distribution over latent functions (latent topics) in each sample. Second, we show that a generative topic model can also be used to study the genome-level composition of “N-mer” features (DNA subreads obtained by composition-based approaches). The model considers each genome as a mixture of latent genetic patterns (latent topics), while each functional pattern is a weighted mixture of the “N-mer” features; thus the existence of core genomes can be indicated by a set of common N-mer features. After studying the mutual information between latent topics and gene regions, we provide an explanation of the functional roles of the uncovered latent genetic patterns. The experimental results demonstrate the effectiveness of the proposed method.

  11. Optical systems for synchrotron radiation. Lecture 1. Introductory topics. Revision

    International Nuclear Information System (INIS)

    Howells, M.R.

    1986-02-01

    Various fundamental topics are considered which underlie the design and use of optical systems for synchrotron radiation. The point of view of linear system theory is chosen which acts as a unifying concept throughout the series. In this context the important optical quantities usually appear as either impulse response functions (Green's functions) or frequency transfer functions (Fourier Transforms of the Green's functions). Topics include the damped harmonic oscillator, free-space optical field propagation, optical properties of materials, dispersion, and the Kramers-Kronig relations

  12. Solving sparse linear least squares problems on some supercomputers by using large dense blocks

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Ostromsky, T; Sameh, A

    1997-01-01

    Efficient subroutines for dense matrix computations have recently been developed and are available on many high-speed computers. On some computers the speed of many dense matrix operations is near to the peak-performance. For sparse matrices, storage and operations can be saved by operating on and storing only the nonzero elements. However, the price is a great degradation of the speed of computations on supercomputers (due to the use of indirect addresses, the need to insert new nonzeros into the sparse storage scheme, the lack of data locality, etc.). On many high-speed computers a dense matrix technique is therefore preferable to a sparse matrix technique when the matrices are not large, because the high computational speed fully compensates for the disadvantages of using more arithmetic operations and more storage. For very large matrices the computations must be organized as a sequence of tasks in each...
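
    The trade-off the abstract weighs (dense speed versus sparse storage) can be seen on a small problem. The sketch below solves the same least squares problem both ways in Python; the sizes and density are illustrative, and scipy's LSQR is used as a generic sparse solver, not the authors' block-based method.

    ```python
    # Same least squares problem min ||Ax - b|| solved with a sparse technique
    # (LSQR on CSR storage) and a dense technique (LAPACK via numpy).
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import lsqr

    m, n = 2000, 500
    A = sp.random(m, n, density=0.01, format="csr", random_state=0)
    b = np.random.rand(m)

    x_sparse = lsqr(A, b)[0]                                  # sparse solve
    x_dense = np.linalg.lstsq(A.toarray(), b, rcond=None)[0]  # dense solve

    print("sparse residual:", np.linalg.norm(A @ x_sparse - b))
    print("dense  residual:", np.linalg.norm(A.toarray() @ x_dense - b))
    ```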

  13. Los Alamos Science, Fall 1983 No. 9

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, N G [ed.

    1983-10-01

    Topics covered in this issue include: cellular automata, gene expression, GenBank and its promise for molecular genetics, and frontiers of supercomputing. Abstracts have been prepared for the individual items. (GHT)

  14. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  15. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis Workload Management System, has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when run on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run in a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates data. We dramatically decreased the total walltime thanks to automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
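
    The split/process/merge pattern the record describes is independent of PanDA itself and is easy to sketch. Below, Python's multiprocessing stands in for PanDA's brokering of chunks across nodes, and a trivial per-chunk transform stands in for PALEOMIX; every name here is illustrative.

    ```python
    # Split input into chunks, process chunks independently (as PanDA would
    # broker them to different nodes), then merge the per-chunk outputs.
    from multiprocessing import Pool

    def run_pipeline(chunk):
        # Stand-in for invoking PALEOMIX on one chunk of reads.
        return [line.upper() for line in chunk]

    def split(lines, n):
        k = max(1, len(lines) // n)
        return [lines[i:i + k] for i in range(0, len(lines), k)]

    if __name__ == "__main__":
        reads = [f"read_{i}" for i in range(1000)]
        with Pool(processes=8) as pool:
            parts = pool.map(run_pipeline, split(reads, 8))
        merged = [line for part in parts for line in part]  # final merge step
        print(len(merged), "records after merge")
    ```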

  16. Recent advances and perspectives in topical oral anesthesia.

    Science.gov (United States)

    Franz-Montan, Michelle; Ribeiro, Lígia Nunes de Morais; Volpato, Maria Cristina; Cereda, Cintia Maria Saia; Groppo, Francisco Carlos; Tofoli, Giovana Randomille; de Araújo, Daniele Ribeiro; Santi, Patrizia; Padula, Cristina; de Paula, Eneida

    2017-05-01

    Topical anesthesia is widely used in dentistry to reduce pain caused by needle insertion and injection of the anesthetic. However, successful anesthesia is not always achieved using the formulations that are currently commercially available. As a result, local anesthesia is still one of the procedures that is most feared by dental patients. Drug delivery systems (DDSs) provide ways of improving the efficacy of topical agents. Areas covered: An overview of the structure and permeability of oral mucosa is given, followed by a review of DDSs designed for dental topical anesthesia and their related clinical trials. Chemical approaches to enhance permeation and anesthesia efficacy, or to promote superficial anesthesia, include nanostructured carriers (liposomes, cyclodextrins, polymeric nanoparticle systems, solid lipid nanoparticles, and nanostructured lipid carriers) and different pharmaceutical dosage forms (patches, bio- and mucoadhesive systems, and hydrogels). Physical methods include pre-cooling, vibration, iontophoresis, and microneedle arrays. Expert opinion: The combination of different chemical and physical methods is an attractive option for effective topical anesthesia in oral mucosa. This comprehensive review should provide the readers with the most relevant options currently available to assist pain-free dental anesthesia. The findings should be considered for future clinical trials.

  17. Topical phenytoin for treating pressure ulcers.

    Science.gov (United States)

    Hao, Xiang Yong; Li, Hong Ling; Su, He; Cai, Hui; Guo, Tian Kang; Liu, Ruifeng; Jiang, Lei; Shen, Yan Fei

    2017-02-22

    reduced healing. We therefore considered it insufficient to determine the effect of topical phenytoin on ulcer healing. One study compared topical phenytoin with triple antibiotic ointment; however, none of the outcomes of interest to this review were reported. No adverse drug reactions or interactions were detected in any of the three RCTs. Minimal pain was reported in all groups in one trial that compared topical phenytoin with hydrocolloid dressings and triple antibiotic ointment. This review has considered the available evidence, and the results show that it is uncertain whether topical phenytoin improves ulcer healing for patients with grade I and II pressure ulcers. No adverse events were reported in three small trials, and minimal pain was reported in one trial. Therefore, further rigorous, adequately powered RCTs examining the effects of topical phenytoin for treating pressure ulcers, and reporting on adverse events, quality of life and costs, are necessary.

  18. Topics in supersymmetric theories

    International Nuclear Information System (INIS)

    Nemeschansky, D.D.

    1984-01-01

    This thesis discusses four different topics in supersymmetric theories. In the first part models in which supersymmetry is broken by the Fayet-Iliopoulos mechanism are considered. The possibility that scalar quark and lepton masses might arise radiatively in such theories is explored. In the second part supersymmetric grand unified models with a sliding singlet are considered. The author reviews the argument that the sliding singlet does not work in models with large supersymmetry breaking. Then he considers the possibility of using a sliding singlet with low energy supersymmetry breaking. The third part of the thesis deals with the entropy problem of supersymmetric theories. Most supersymmetric models possess a decoupled particle with mass of order 100 GeV which is copiously produced in the early universe and whose decay produces huge amounts of entropy. The author shows how this problem can be avoided in theories in which the hidden sector contains several light fields. In the fourth part effective Lagrangians for supersymmetric theories are studied. The anomalous pion interaction for supersymmetric theories is written down. General properties of this term are studied both on compact and non-compact manifolds

  19. Freshman Health Topics

    Science.gov (United States)

    Hovde, Karen

    2011-01-01

    This article examines a cluster of health topics that are frequently selected by students in lower division classes. Topics address issues relating to addictive substances, including alcohol and tobacco, eating disorders, obesity, and dieting. Analysis of the topics examines their interrelationships and organization in the reference literature.

  20. Topical anti-infective sinonasal irrigations: update and literature review.

    Science.gov (United States)

    Lee, Jivianne T; Chiu, Alexander G

    2014-01-01

    Sinonasal anti-infective irrigations have emerged as a promising therapeutic modality in the comprehensive management of chronic rhinosinusitis (CRS), particularly in the context of recalcitrant disease. The purpose of this article was to delineate the current spectrum of topical anti-infective therapies available and evaluate their role in the treatment of CRS. A systematic literature review was performed on all studies investigating the use of topical antimicrobial solutions in the medical therapy of CRS. Anti-infective irrigations were stratified into topical antibacterial, antifungal, and additive preparations according to their composition and respective microbicidal properties. The use of topical antibiotic irrigations has been supported by low-level studies in the treatment of refractory CRS, with optimal results achieved in patients who have undergone prior functional endoscopic sinus surgery and received culture-directed therapy. Multiple evidence-based reviews have not established any clinical benefit with the administration of topical antifungals, and their use is not currently recommended in the management of routine CRS. Topical additives including surfactants may be beneficial as adjunctive treatment for recalcitrant CRS, but additional research is needed to investigate their efficacy in comparison with other agents and establish safety profiles. Topical anti-infective solutions are not recommended as first-line therapy for routine CRS but may be considered as a potential option for patients with refractory CRS who have failed traditional medical and surgical intervention. Additional research is necessary to determine which patient populations would derive the most benefit from each respective irrigation regimen and identify potential toxicities associated with prolonged use.

  1. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M.; Banner, D. [Electricite de France (EDF)- R and D Division, 92 - Clamart (France)

    2003-07-01

    Nuclear power plants are a major asset of the EDF company. To remain so, in particular in a context of deregulation, competitiveness, safety and public acceptance are three conditions. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF to satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating the material deterioration mechanisms and moving forward with numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF long term R and D in the field of numerical simulation is given and especially of five challenges taken up by EDF together with its industrial and scientific partners. (author)

  2. Performance Evaluation of an Intel Haswell- and Ivy Bridge-Based Supercomputer Using Scientific and Engineering Applications

    Science.gov (United States)

    Saini, Subhash; Hood, Robert T.; Chang, Johnny; Baron, John

    2016-01-01

    We present a performance evaluation, conducted on a production supercomputer, of the Intel Xeon Processor E5-2680v3, a twelve-core implementation of the fourth-generation Haswell architecture, and compare it with the Intel Xeon Processor E5-2680v2, an Ivy Bridge implementation of the third-generation Sandy Bridge architecture. Several new architectural features have been incorporated in Haswell, including improvements in all levels of the memory hierarchy as well as improvements to vector instructions and power management. We critically evaluate these new features of Haswell and compare with Ivy Bridge using several low-level benchmarks, including a subset of HPCC, HPCG, and four full-scale scientific and engineering applications. We also present a model that predicts the performance of HPCG and Cart3D within 5% accuracy, and Overflow within 10%.

  3. Evaluating topic models with stability

    CSIR Research Space (South Africa)

    De Waal, A

    2008-11-01

    Topic models are unsupervised techniques that extract likely topics from text corpora, by creating probabilistic word-topic and topic-document associations. Evaluation of topic models is a challenge because (a) topic models are often employed...

  4. Topical report review status

    International Nuclear Information System (INIS)

    1997-08-01

    This report provides industry with procedures for submitting topical reports, guidance on how the U.S. Nuclear Regulatory Commission (NRC) processes and responds to topical report submittals, and an accounting, with review schedules, of all topical reports currently accepted for review by the NRC. This report will be published annually. Each sponsoring organization with one or more topical reports accepted for review receives copies.

  5. 369 TFlop/s molecular dynamics simulations on the Roadrunner general-purpose heterogeneous supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Swaminarayan, Sriram [Los Alamos National Laboratory; Germann, Timothy C [Los Alamos National Laboratory; Kadau, Kai [Los Alamos National Laboratory; Fossum, Gordon C [IBM CORPORATION

    2008-01-01

    The authors present timing and performance numbers for a short-range parallel molecular dynamics (MD) code, SPaSM, that has been rewritten for the heterogeneous Roadrunner supercomputer. Each Roadrunner compute node consists of two AMD Opteron dual-core microprocessors and four PowerXCell 8i enhanced Cell microprocessors, so that there are four MPI ranks per node, each with one Opteron and one Cell. The interatomic forces are computed on the Cells (each with one PPU and eight SPU cores), while the Opterons are used to direct inter-rank communication and perform I/O-heavy periodic analysis, visualization, and checkpointing tasks. The performance measured for the initial implementation of a standard Lennard-Jones pair potential benchmark reached a peak of 369 TFlop/s double-precision floating-point performance on the full Roadrunner system (27.7% of peak), corresponding to 124 MFlop/s per watt at a price of approximately 3.69 MFlop/s per dollar. The authors demonstrate an initial target application, the jetting and ejection of material from a shocked surface.
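
    For reference, the physics of the benchmark is compact: the Lennard-Jones potential V(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6) and its pairwise forces. A minimal all-pairs evaluation in Python/numpy is sketched below; the production SPaSM code instead uses cell lists, cutoffs, and the Cell SPUs, none of which are shown, and the particle count is illustrative.

    ```python
    # Lennard-Jones pair potential and forces, the kernel benchmarked above:
    # V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)
    import numpy as np

    rng = np.random.default_rng(0)
    pos = rng.random((64, 3)) * 10.0   # 64 particles in a 10^3 box (toy size)
    eps, sigma = 1.0, 1.0

    # All pair displacements and squared distances (O(N^2); real MD uses cell lists).
    d = pos[:, None, :] - pos[None, :, :]
    r2 = np.sum(d * d, axis=-1)
    np.fill_diagonal(r2, np.inf)        # exclude self-interaction

    inv6 = (sigma * sigma / r2) ** 3    # (sigma/r)^6 for every pair
    energy = 0.5 * np.sum(4.0 * eps * (inv6 * inv6 - inv6))

    # Force on particle i: sum over j of 24*eps*(2*(sigma/r)^12 - (sigma/r)^6)/r^2 * d_ij
    coef = 24.0 * eps * (2.0 * inv6 * inv6 - inv6) / r2
    forces = np.sum(coef[:, :, None] * d, axis=1)

    print(f"E = {energy:.4f}, max |F| = {np.abs(forces).max():.4f}")
    ```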

  6. Syntacticized topics in Kurmuk

    DEFF Research Database (Denmark)

    Andersen, Torben

    2015-01-01

    This article argues that Kurmuk, a little-described Western Nilotic language, is characterized by a syntacticized topic whose grammatical relation is variable. In this language, declarative clauses have as topic an obligatory preverbal NP which is either a subject, an object or an adjunct. The grammatical relation of the topic is expressed by a voice-like inflection of the verb, here called orientation. While subject-orientation is morphologically unmarked, object-oriented and adjunct-oriented verbs are marked by a subject suffix or by a suffix indicating that the topic is not subject, and adjunct-orientation differs from object-orientation by a marked tone pattern. Topic choice largely reflects information structure by indicating topic continuity. The topic also plays a crucial role in relative clauses and in clauses with contrastive constituent focus, in that objects and adjuncts can only be relativized...

  7. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-01-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  8. Topical Drugs for Pain Relief

    Directory of Open Access Journals (Sweden)

    Anjali Srinivasan

    2015-03-01

    Topical therapy helps patients with oral and perioral pain problems such as ulcers, burning mouth syndrome, temporomandibular disorders, neuromas, neuropathies and neuralgias. Topical drugs used in the field of dentistry include topical anaesthetics, topical analgesics, topical antibiotics and topical corticosteroids. They provide a symptomatic or curative effect. Topical drugs are easy to apply, avoid hepatic first-pass metabolism, and are more site-specific. However, they can only be used for medications that require low plasma concentrations to achieve a therapeutic effect.

  9. New Mexico High School Supercomputing Challenge, 1990--1995: Five years of making a difference to students, teachers, schools, and communities. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Foster, M.; Kratzer, D.

    1996-02-01

    The New Mexico High School Supercomputing Challenge is an academic program dedicated to increasing interest in science and math among high school students by introducing them to high performance computing. This report provides a summary and evaluation of the first five years of the program, describes the program and shows the impact that it has had on high school students, their teachers, and their communities. Goals and objectives are reviewed and evaluated, growth and development of the program are analyzed, and future directions are discussed.

  10. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large-scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large-scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  11. Topical report review status

    International Nuclear Information System (INIS)

    1982-08-01

    A Topical Report Review Status is scheduled to be published semi-annually. The primary purpose of this document is to provide periodic progress reports of on-going topical report reviews, to identify those topical reports for which the Nuclear Regulatory Commission (NRC) staff review has been completed and, to the extent practicable, to provide NRC management with sufficient information regarding the conduct of the topical report program to permit taking whatever actions deemed necessary or appropriate. This document is also intended to be a source of information to NRC Licensing Project Managers and other NRC personnel regarding the status of topical reports which may be referenced in applications for which they have responsibility. This status report is published primarily for internal NRC use in managing the topical report program, but is also used by NRC to advise the industry of report review status

  12. Topical report review status

    International Nuclear Information System (INIS)

    1983-01-01

    A Topical Report Review Status is scheduled to be published semi-annually. The primary purpose of this document is to provide periodic progress reports of on-going topical report reviews, to identify those topical reports for which the Nuclear Regulatory Commission (NRC) staff review has been completed and, to the extent practicable, to provide NRC management with sufficient information regarding the conduct of the topical report program to permit taking whatever actions deemed necessary or appropriate. This document is also intended to be a source of information to NRC Licensing Project Managers and other NRC personnel regarding the status of topical reports which may be referenced in applications for which they have responsibility. This status report is published primarily for internal NRC use in managing the topical report program, but is also used by NRC to advise the industry of report review status

  13. A criticality safety analysis code using a vectorized Monte Carlo method on the HITAC S-810 supercomputer

    International Nuclear Information System (INIS)

    Morimoto, Y.; Maruyama, H.

    1987-01-01

    A vectorized Monte Carlo criticality safety analysis code has been developed for the vector supercomputer HITAC S-810. In this code, a multi-particle tracking algorithm was adopted for effective utilization of the vector processor. A flight analysis with pseudo-scattering was developed to reduce the computational time needed for flight analysis, which represents the bulk of the computational time. This new algorithm achieved a speed-up by a factor of 1.5 over conventional flight analysis. The code also adopted a multigroup cross-section library of the Bondarenko type with 190 groups: 132 for the fast and epithermal regions and 58 for the thermal region. Evaluation work showed that this code reproduces experimental results for the effective neutron multiplication factor to an accuracy of about 1%. (author)
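
    Multi-particle tracking replaces the scalar one-history-at-a-time loop with operations over a whole batch of particles, which is what a vector processor rewards. A toy 1-D sketch in Python/numpy follows; the cross section, geometry, and absorption model are illustrative stand-ins, not the code described above.

    ```python
    # Vectorized multi-particle tracking: sample flight distances and apply
    # collisions for a whole batch at once, as on a vector processor.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000                      # particles tracked simultaneously
    sigma_t = 0.5                    # toy total macroscopic cross section (1/cm)
    p_absorb = 0.3                   # toy absorption probability per collision

    x = np.zeros(n)                  # positions along a 1-D slab
    alive = np.ones(n, dtype=bool)

    for _ in range(50):
        # Flight analysis for every live particle in one vector operation.
        dist = -np.log(1.0 - rng.random(n)) / sigma_t
        x = np.where(alive, x + dist, x)
        # Collision analysis: absorb a fraction, the rest keep streaming.
        absorbed = alive & (rng.random(n) < p_absorb)
        alive &= ~absorbed
        if not alive.any():
            break

    print(f"fraction absorbed: {1.0 - alive.mean():.3f}")
    ```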

  14. 75 FR 26647 - Ophthalmic and Topical Dosage Form New Animal Drugs; Ivermectin Topical Solution

    Science.gov (United States)

    2010-05-12

    [FDA-2010-N-0002] Ophthalmic and Topical Dosage Form New Animal Drugs; Ivermectin Topical Solution... are treated with a topical solution of ivermectin. DATES: This rule is effective May 12, 2010. FOR... ANADA 200-340 for PRIVERMECTIN (ivermectin), a topical solution used on cattle to control infestations...

  15. Benchmarking Further Single Board Computers for Building a Mini Supercomputer for Simulation of Telecommunication Systems

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2016-01-01

    Parallel Discrete Event Simulation (PDES) with the conservative synchronization method can be efficiently used for the performance analysis of telecommunication systems because of their good lookahead properties. For PDES, a cost-effective execution platform may be built using single board computers (SBCs), which offer relatively high computation capacity compared to their price or power consumption, and especially to the space they take up. A benchmarking method is proposed and its operation demonstrated by benchmarking ten different SBCs, namely Banana Pi, Beaglebone Black, Cubieboard2, Odroid-C1+, Odroid-U3+, Odroid-XU3 Lite, Orange Pi Plus, Radxa Rock Lite, Raspberry Pi Model B+, and Raspberry Pi 2 Model B+. Their benchmarking results are compared to find out which one should be used for building a mini supercomputer for parallel discrete-event simulation of telecommunication systems. The SBCs are also used to build a heterogeneous cluster, and the performance of the cluster is tested, too.

  16. Women's Health Topics

    Science.gov (United States)

    ... Information by Audience For Women Women's Health Topics Women's Health Topics Share Tweet Linkedin Pin it More sharing options Linkedin Pin it Email Print National Women's Health Week May 13 - 19, 2018 Join us ...

  17. Fast and Accurate Simulation of the Cray XMT Multithreaded Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Villa, Oreste; Tumeo, Antonino; Secchi, Simone; Manzano Franco, Joseph B.

    2012-12-31

    Irregular applications, such as data mining and analysis or graph-based computations, show unpredictable memory/network access patterns and control structures. Highly multithreaded architectures with large processor counts, like the Cray MTA-1, MTA-2 and XMT, appear to address their requirements better than commodity clusters. However, the research on highly multithreaded systems is currently limited by the lack of adequate architectural simulation infrastructures due to issues such as size of the machines, memory footprint, simulation speed, accuracy and customization. At the same time, Shared-memory MultiProcessors (SMPs) with multi-core processors have become an attractive platform to simulate large scale machines. In this paper, we introduce a cycle-level simulator of the highly multithreaded Cray XMT supercomputer. The simulator runs unmodified XMT applications. We discuss how we tackled the challenges posed by its development, detailing the techniques introduced to make the simulation as fast as possible while maintaining a high accuracy. By mapping XMT processors (ThreadStorm with 128 hardware threads) to host computing cores, the simulation speed remains constant as the number of simulated processors increases, up to the number of available host cores. The simulator supports zero-overhead switching among different accuracy levels at run-time and includes a network model that takes into account contention. On a modern 48-core SMP host, our infrastructure simulates a large set of irregular applications 500 to 2000 times slower than real time when compared to a 128-processor XMT, while remaining within 10% of accuracy. Emulation is only from 25 to 200 times slower than real time.

  18. PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan; Mills, Richard T.

    2012-04-18

    PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, ability to perform multiple realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.

  19. Film forming systems for topical and transdermal drug delivery

    Directory of Open Access Journals (Sweden)

    Kashmira Kathe

    2017-11-01

    Skin is considered an important route of administration of drugs for both local and systemic effects. The effectiveness of topical therapy depends on the physicochemical properties of the drug and adherence of the patient to the treatment regimen, as well as the system's ability to adhere to skin during the therapy so as to promote drug penetration through the skin barrier. Conventional formulations for topical and dermatological administration of drugs have certain limitations, such as poor adherence to skin, poor permeability and compromised patient compliance. For the treatment of diseases of body tissues and wounds, the drug has to be maintained at the site of treatment for an effective period of time. Topical film forming systems are an emerging class of drug delivery systems meant for topical application to the skin; they adhere to the body, form a thin transparent film, and provide delivery of the active ingredients to the body tissue. They are intended for skin application as emollients or protectants and for local action or transdermal penetration of medicament for systemic action. The transparency is an appreciable feature of this polymeric system which greatly influences patient acceptance. In the current discussion, film forming systems are described as a promising choice for topical and transdermal drug delivery. Further, the various types of film forming systems (sprays/solutions, gels and emulsions) along with their evaluation parameters are also reviewed.

  20. Hierarchical approach to optimization of parallel matrix multiplication on large-scale platforms

    KAUST Repository

    Hasanov, Khalid; Quintin, Jean-Noël; Lastovetsky, Alexey

    2014-01-01

    -scale parallelism in mind. Indeed, while in the 1990s a system with a few hundred cores was considered a powerful supercomputer, modern top supercomputers have millions of cores. In this paper, we present a hierarchical approach to optimization of message-passing parallel

  1. NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development

    Science.gov (United States)

    Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.

    2009-12-01

    NASA's High Performance Computing Portfolio in cooperation with its Modeling, Analysis, and Prediction program intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also to non-typical users who may want to use the models, such as scientists from different domains, policy makers, and teachers. Another obstacle to the use of these models is that access to the high performance computing (HPC) systems on which the models run can be restrictive, with long wait times in job queues and delays caused by an arduous process of obtaining an account, especially for foreign nationals. This project explores the utility of using desktop supercomputers in providing a complete ready-to-use toolkit of climate research products to investigators and on demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting the models. In addition, the prepackaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are

  2. Topics of internal medicine for undergraduate dental education: a qualitative study.

    Science.gov (United States)

    Kunde, A; Harendza, S

    2015-08-01

    Due to the ageing population, internal medicine has become increasingly important for dental education. Although several studies have reported dentists' dissatisfaction with their internal medicine training, no guidelines exist for internal medicine learning objectives in dental education. The aim of this study was to identify topics of internal medicine considered to be relevant for dental education by dentists and internists. Eight dentists from private dental practices in Hamburg and eight experienced internal medicine consultants from Hamburg University Hospital were recruited for semi-structured interviews about internal medicine topics relevant for dentists. Internal diseases were clustered into representative subspecialties. Dentists and internists were also asked to rate medical diseases or emergencies compiled from the literature by their relevance to dental education. Coagulopathy and endocarditis were rated highest by dentists, whilst anaphylaxis was rated highest by internists. Dentists rated hepatitis, HIV, organ transplantation and head/neck neoplasm significantly higher than internists. The largest number of different internal diseases mentioned by dentists or internists could be clustered under cardiovascular diseases. The number of specific diseases dentists considered to be relevant for dental education was higher in the subspecialties cardiovascular diseases, haematology/oncology and infectiology. We identified the internal medicine topics most relevant for dental education by surveying practising dentists and internists. The relevance of these topics should be confirmed by larger quantitative studies to develop guidelines on how to design specific learning objectives for internal medicine in the dental curriculum. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...
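
    The two-stage scheme can be made concrete with a small sketch: enumerate candidate n-grams, then rank them by simple properties. The scoring below (frequency, position of first occurrence, phrase length) is a simplified stand-in for the statistical, semantic, domain-specific and encyclopedic features the thesis combines with machine learning.

        import re
        from collections import Counter

        def candidates(text, max_len=3):
            """Stage 1: enumerate word n-grams as candidate topics."""
            words = re.findall(r"[a-z][a-z-]*", text.lower())
            for n in range(1, max_len + 1):
                for i in range(len(words) - n + 1):
                    yield " ".join(words[i:i + n])

        def rank_topics(text, k=5):
            """Stage 2: rank candidates by simple significance properties."""
            cands = list(candidates(text))
            freq = Counter(cands)
            scores = {}
            for cand in freq:
                first = cands.index(cand) / len(cands)   # earlier occurrence scores higher
                scores[cand] = freq[cand] * (1.0 - first) * len(cand.split())
            return sorted(scores, key=scores.get, reverse=True)[:k]

        print(rank_topics("topic indexing identifies the main topics covered by a document"))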

  4. Topic Visualization and Survival Analysis

    OpenAIRE

    Wang, Ping Jr

    2017-01-01

    Latent semantic structure in a text collection is called a topic. In this thesis, we aim to visualize topics in the scientific literature and detect active or inactive research areas based on their lifetime. Topics were extracted from over 1 million abstracts from the arXiv.org database using Latent Dirichlet Allocation (LDA). Hellinger distance measures similarity between two topics. Topics are determined to be relevant if their pairwise distances are smaller than the threshold of Hellinger ...
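
    For reference, the Hellinger distance between two topic-word distributions P and Q is H(P, Q) = sqrt(0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2), which NumPy computes directly; the three-word vocabulary below is a made-up example, not data from the thesis.

        import numpy as np

        def hellinger(p, q):
            """Hellinger distance between two discrete distributions (0 = identical, 1 = disjoint)."""
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

        topic_a = np.array([0.6, 0.3, 0.1])   # word probabilities for topic A
        topic_b = np.array([0.5, 0.3, 0.2])   # word probabilities for topic B
        print(hellinger(topic_a, topic_b))    # small value -> similar topics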

  5. GeneTopics - interpretation of gene sets via literature-driven topic models

    Science.gov (United States)

    2013-01-01

    Background Annotation of a set of genes is often accomplished through comparison to a library of labelled gene sets such as biological processes or canonical pathways. However, this approach might fail if the employed libraries are not up to date with the latest research, don't capture relevant biological themes or are curated at a different level of granularity than is required to appropriately analyze the input gene set. At the same time, the vast biomedical literature offers an unstructured repository of the latest research findings that can be tapped to provide thematic sub-groupings for any input gene set. Methods Our proposed method relies on a gene-specific text corpus and extracts commonalities between documents in an unsupervised manner using a topic model approach. We automatically determine the number of topics summarizing the corpus and calculate a gene relevancy score for each topic allowing us to eliminate non-specific topics. As a result we obtain a set of literature topics in which each topic is associated with a subset of the input genes providing directly interpretable keywords and corresponding documents for literature research. Results We validate our method based on labelled gene sets from the KEGG metabolic pathway collection and the genetic association database (GAD) and show that the approach is able to detect topics consistent with the labelled annotation. Furthermore, we discuss the results on three different types of experimentally derived gene sets, (1) differentially expressed genes from a cardiac hypertrophy experiment in mice, (2) altered transcript abundance in human pancreatic beta cells, and (3) genes implicated by GWA studies to be associated with metabolite levels in a healthy population. In all three cases, we are able to replicate findings from the original papers in a quick and semi-automated manner. Conclusions Our approach provides a novel way of automatically generating meaningful annotations for gene sets that are directly

  6. Evaluation of contact sensitivity to topical drugs in patients with contact dermatitis

    Directory of Open Access Journals (Sweden)

    Bilge Bülbül Şen

    2013-03-01

    Full Text Available Background and Design: Topical drugs are an important group of contact allergens. The present study aimed to evaluate contact sensitivity to topical drugs in patients with contact dermatitis. Materials and Methods: Between 2003 and 2008, 129 patients were followed up at the Department of Dermatology at Ankara University School of Medicine with clinically suspected contact sensitivity to topical drugs. In this study, the patch test reactions to the European Standard Battery and topical drugs used by the patients and medicament patch test results were evaluated. Results: Positive patch test reaction to one or more allergens was found in 80 (62.0%) of 129 patients included in the study. Sixty-one of the 80 patients (61/129, 47.3%) had positive patch test reaction to medicaments. Medicament sensitivity was detected in 37.9% (49/129) of subjects. Nitrofurazone was found to be the most common allergen (18.6%). Discussion: The present study showed that topical drugs are a frequent cause of allergic contact dermatitis. Therefore, the probability of contact sensitivity to topical drugs should also be considered in patients with the clinical diagnosis of allergic contact dermatitis and, suspected cases should be evaluated further with patch testing in order to find the responsible allergens.

  7. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  8. Face-masks for facial atopic eczema: consider a hydrocolloid dressing.

    Science.gov (United States)

    Rademaker, Marius

    2013-08-01

    Facial involvement of atopic eczema in young children can be difficult to manage. Chronic scratching and rubbing, combined with parental reluctance to use topical corticosteroids on the face, often results in recalcitrant facial eczema. While wet wraps are a useful management option for moderate/severe atopic eczema involving the trunk and limbs, they are difficult to use on the face. We describe the use of a face-mask using a widely available adhesive hydrocolloid dressing (DuoDerm extra thin) in three children with recalcitrant facial atopic eczema. Symptomatic control of itch or soreness was obtained within hours and the facial atopic eczema was markedly improved by 7 days. The face-masks were easy to apply, each lasting 1-4 days. One patient had a single adjuvant application of a potent topical corticosteroid under the hydrocolloid dressing. All three patients had long remissions (greater than 3 months) of their facial eczema, although all continued to have significant eczema involving their trunk and limbs. Face-masks made from hydrocolloid dressings, with or without topical corticosteroids, are worth considering in children with recalcitrant facial eczema. © 2012 The Author. Australasian Journal of Dermatology © 2012 The Australasian College of Dermatologists.

  9. A comparison of evaluation metrics for biomedical journals, articles, and websites in terms of sensitivity to topic.

    Science.gov (United States)

    Fu, Lawrence D; Aphinyanaphongs, Yindalon; Wang, Lily; Aliferis, Constantin F

    2011-08-01

    Evaluating the biomedical literature and health-related websites for quality are challenging information retrieval tasks. Current commonly used methods include impact factor for journals, PubMed's clinical query filters and machine learning-based filter models for articles, and PageRank for websites. Previous work has focused on the average performance of these methods without considering the topic, and it is unknown how performance varies for specific topics or focused searches. Clinicians, researchers, and users should be aware when expected performance is not achieved for specific topics. The present work analyzes the behavior of these methods for a variety of topics. Impact factor, clinical query filters, and PageRank vary widely across different topics while a topic-specific impact factor and machine learning-based filter models are more stable. The results demonstrate that a method may perform excellently on average but struggle when used on a number of narrower topics. Topic-adjusted metrics and other topic robust methods have an advantage in such situations. Users of traditional topic-sensitive metrics should be aware of their limitations. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. A Comparison of Evaluation Metrics for Biomedical Journals, Articles, and Websites in Terms of Sensitivity to Topic

    Science.gov (United States)

    Fu, Lawrence D.; Aphinyanaphongs, Yindalon; Wang, Lily; Aliferis, Constantin F.

    2011-01-01

    Evaluating the biomedical literature and health-related websites for quality are challenging information retrieval tasks. Current commonly used methods include impact factor for journals, PubMed’s clinical query filters and machine learning-based filter models for articles, and PageRank for websites. Previous work has focused on the average performance of these methods without considering the topic, and it is unknown how performance varies for specific topics or focused searches. Clinicians, researchers, and users should be aware when expected performance is not achieved for specific topics. The present work analyzes the behavior of these methods for a variety of topics. Impact factor, clinical query filters, and PageRank vary widely across different topics while a topic-specific impact factor and machine learning-based filter models are more stable. The results demonstrate that a method may perform excellently on average but struggle when used on a number of narrower topics. Topic adjusted metrics and other topic robust methods have an advantage in such situations. Users of traditional topic-sensitive metrics should be aware of their limitations. PMID:21419864

  11. Incorporating Topic Assignment Constraint and Topic Correlation Limitation into Clinical Goal Discovering for Clinical Pathway Mining

    Directory of Open Access Journals (Sweden)

    Xiao Xu

    2017-01-01

    Full Text Available Clinical pathways are widely used around the world for providing quality medical treatment and controlling healthcare cost. However, expert-designed clinical pathways can hardly deal with the variances among hospitals and patients. This calls for a more dynamic and adaptive process, derived from various clinical data. Topic-based clinical pathway mining is an effective approach to discover a concise process model. In this approach, the latent topics found by latent Dirichlet allocation (LDA) represent the clinical goals, and process mining methods are used to extract the temporal relations between these topics. However, the topic quality is usually not desirable due to the low performance of the LDA on clinical data. In this paper, we incorporate a topic assignment constraint and a topic correlation limitation into the LDA to enhance its ability to discover high-quality topics. Two real-world datasets are used to evaluate the proposed method. The results show that the topics discovered by our method have higher coherence, informativeness, and coverage than those of the original LDA. These quality topics are suitable to represent the clinical goals. We also illustrate that our method is effective in generating a comprehensive topic-based clinical pathway model.

  12. Health Topic XML File Description

    Science.gov (United States)

    Health Topic XML File Description: MedlinePlus (https://medlineplus.gov/xmldescription.html). Describes the XML format of MedlinePlus health topic records and the information categories assigned to each topic, including an example of a full health topic record.

  13. Evaluating the networking characteristics of the Cray XC-40 Intel Knights Landing-based Cori supercomputer at NERSC

    Energy Technology Data Exchange (ETDEWEB)

    Doerfler, Douglas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Austin, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Cook, Brandon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Kandalla, Krishna [Cray Inc, Bloomington, MN (United States)]; Mendygral, Peter [Cray Inc, Bloomington, MN (United States)]

    2017-09-12

    There are many potential issues associated with deploying the Intel Xeon Phi™ (code named Knights Landing [KNL]) manycore processor in a large-scale supercomputer. One in particular is the ability to fully utilize the high-speed communications network, given that the serial performance of a Xeon Phi™ core is a fraction of a Xeon® core. In this paper, we take a look at the trade-offs associated with allocating enough cores to fully utilize the Aries high-speed network versus cores dedicated to computation, e.g., the trade-off between MPI and OpenMP. In addition, we evaluate new features of Cray MPI in support of KNL, such as internode optimizations. We also evaluate one-sided programming models such as Unified Parallel C. We quantify the impact of the above trade-offs and features using a suite of National Energy Research Scientific Computing Center applications.

  14. Identifying Topics in Microblogs Using Wikipedia.

    Science.gov (United States)

    Yıldırım, Ahmet; Üsküdarlı, Suzan; Özgür, Arzucan

    2016-01-01

    Twitter is an extremely high volume platform for user generated contributions regarding any topic. The wealth of content created at real-time in massive quantities calls for automated approaches to identify the topics of the contributions. Such topics can be utilized in numerous ways, such as public opinion mining, marketing, entertainment, and disaster management. Towards this end, approaches to relate single or partial posts to knowledge base items have been proposed. However, in microblogging systems like Twitter, topics emerge from the culmination of a large number of contributions. Therefore, identifying topics based on collections of posts, where individual posts contribute to some aspect of the greater topic is necessary. Models, such as Latent Dirichlet Allocation (LDA), propose algorithms for relating collections of posts to sets of keywords that represent underlying topics. In these approaches, figuring out what the specific topic(s) the keyword sets represent remains as a separate task. Another issue in topic detection is the scope, which is often limited to specific domain, such as health. This work proposes an approach for identifying domain-independent specific topics related to sets of posts. In this approach, individual posts are processed and then aggregated to identify key tokens, which are then mapped to specific topics. Wikipedia article titles are selected to represent topics, since they are up to date, user-generated, sophisticated articles that span topics of human interest. This paper describes the proposed approach, a prototype implementation, and a case study based on data gathered during the heavily contributed periods corresponding to the four US election debates in 2012. The manually evaluated results (0.96 precision) and other observations from the study are discussed in detail.
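
    A toy sketch of the aggregation step described above: pool tokens across many posts, keep the most frequent ones, and map them to Wikipedia article titles. The two posts and the tiny title index are invented; the actual system works against the full set of Wikipedia titles.

        import re
        from collections import Counter

        posts = ["obama and romney debate the economy",
                 "romney on taxes tonight #debate"]
        wiki_titles = {"Barack Obama": {"obama"},
                       "Mitt Romney": {"romney"},
                       "Economy": {"economy", "taxes"}}

        # aggregate tokens over the whole collection, not per individual post
        tokens = Counter(t for p in posts for t in re.findall(r"[a-z#]+", p.lower()))
        key_tokens = {t for t, _ in tokens.most_common(5)}

        topics = [title for title, words in wiki_titles.items() if words & key_tokens]
        print(topics)   # e.g. ['Barack Obama', 'Mitt Romney']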

  15. Should Euthanasia Be Considered Iatrogenic?

    Science.gov (United States)

    Barone, Silvana; Unguru, Yoram

    2017-08-01

    As more countries adopt laws and regulations concerning euthanasia, pediatric euthanasia has become an important topic of discussion. Conceptions of what constitutes harm to patients are fluid and highly dependent on a myriad of factors including, but not limited to, health care ethics, family values, and cultural context. Euthanasia could be viewed as iatrogenic insofar as it results in an outcome (death) that some might consider inherently negative. However, this perspective fails to acknowledge that death, the outcome of euthanasia, is not an inadvertent or preventable complication but rather the goal of the medical intervention. Conversely, the refusal to engage in the practice of euthanasia might be conceived as iatrogenic insofar as it might inadvertently prolong patient suffering. This article will explore cultural and social factors informing families', health care professionals', and society's views on pediatric euthanasia in selected countries. © 2017 American Medical Association. All Rights Reserved.

  16. Regulatory Information By Topic

    Science.gov (United States)

    EPA develops and enforces regulations that span many environmental topics, from acid rain reduction to wetlands restoration. Each topic listed below may include related laws and regulations, compliance and enforcement information, and policy guidance.

  17. Identifying Topics in Microblogs Using Wikipedia.

    Directory of Open Access Journals (Sweden)

    Ahmet Yıldırım

    Full Text Available Twitter is an extremely high volume platform for user generated contributions regarding any topic. The wealth of content created at real-time in massive quantities calls for automated approaches to identify the topics of the contributions. Such topics can be utilized in numerous ways, such as public opinion mining, marketing, entertainment, and disaster management. Towards this end, approaches to relate single or partial posts to knowledge base items have been proposed. However, in microblogging systems like Twitter, topics emerge from the culmination of a large number of contributions. Therefore, identifying topics based on collections of posts, where individual posts contribute to some aspect of the greater topic is necessary. Models, such as Latent Dirichlet Allocation (LDA), propose algorithms for relating collections of posts to sets of keywords that represent underlying topics. In these approaches, figuring out what the specific topic(s) the keyword sets represent remains as a separate task. Another issue in topic detection is the scope, which is often limited to specific domain, such as health. This work proposes an approach for identifying domain-independent specific topics related to sets of posts. In this approach, individual posts are processed and then aggregated to identify key tokens, which are then mapped to specific topics. Wikipedia article titles are selected to represent topics, since they are up to date, user-generated, sophisticated articles that span topics of human interest. This paper describes the proposed approach, a prototype implementation, and a case study based on data gathered during the heavily contributed periods corresponding to the four US election debates in 2012. The manually evaluated results (0.96 precision) and other observations from the study are discussed in detail.

  18. Automatic topic identification of health-related messages in online health community using text classification.

    Science.gov (United States)

    Lu, Yingjie

    2013-01-01

    To facilitate patient involvement in online health communities and help patients obtain the informative and emotional support they need, this paper proposes a topic identification approach for automatically identifying the topics of health-related messages in an online health community, thus assisting patients in reaching the most relevant messages for their queries efficiently. A feature-based classification framework is presented for automatic topic identification. We first collected messages related to some predefined topics in an online health community. We then combined three different types of features, n-gram-based features, domain-specific features and sentiment features, to build four feature sets for health-related text representation. Finally, three different text classification techniques, C4.5, Naïve Bayes and SVM, were adopted to evaluate our topic classification model. By comparing different feature sets and different classification techniques, we found that n-gram-based features, domain-specific features and sentiment features were all effective in distinguishing different types of health-related topics. In addition, feature reduction based on information gain was also effective in improving the topic classification performance. In terms of classification techniques, SVM significantly outperformed C4.5 and Naïve Bayes. The experimental results demonstrate that the proposed approach can identify the topics of online health-related messages efficiently.
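
    A minimal sketch of such a feature-based framework using scikit-learn: n-gram features feeding a linear SVM. The two messages and labels are hypothetical stand-ins, and the paper's domain-specific features, sentiment features and information-gain feature reduction are omitted here.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        messages = ["my blood sugar spiked after dinner",
                    "feeling anxious about the diagnosis"]
        topics = ["diabetes", "emotional-support"]

        # unigram + bigram features, then a linear SVM classifier
        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
        clf.fit(messages, topics)
        print(clf.predict(["sugar levels are high again"]))   # likely ['diabetes']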

  19. Labour Market Effects of Employment Protection. IAB Labour Market Research Topics.

    Science.gov (United States)

    Walwei, Ulrich

    The labor market effects of employment protection were examined in a study of Germany's employment protection regulations and their impact on employment practices and patterns. The following topics were considered: (1) the question of whether Germany's labor market problems are a result of regulations; (2) employment security as a subject of labor…

  20. Differential Topic Models.

    Science.gov (United States)

    Chen, Changyou; Buntine, Wray; Ding, Nan; Xie, Lexing; Du, Lan

    2015-02-01

    In applications we may want to compare different document collections: they could have shared content but also different and unique aspects in particular collections. This task has been called comparative text mining or cross-collection modeling. We present a differential topic model for this application that models both topic differences and similarities. For this we use hierarchical Bayesian nonparametric models. Moreover, we found it was important to properly model power-law phenomena in topic-word distributions and thus we used the full Pitman-Yor process rather than just a Dirichlet process. Furthermore, we propose the transformed Pitman-Yor process (TPYP) to incorporate prior knowledge such as vocabulary variations in different collections into the model. To deal with the non-conjugate issue between model prior and likelihood in the TPYP, we thus propose an efficient sampling algorithm using a data augmentation technique based on the multinomial theorem. Experimental results show the model discovers interesting aspects of different collections. We also show the proposed MCMC based algorithm achieves a dramatically reduced test perplexity compared to some existing topic models. Finally, we show our model outperforms the state-of-the-art for document classification/ideology prediction on a number of text collections.

  1. Topic Model for Graph Mining.

    Science.gov (United States)

    Xuan, Junyu; Lu, Jie; Zhang, Guangquan; Luo, Xiangfeng

    2015-12-01

    Graph mining has been a popular research area because of its numerous application scenarios. Many unstructured and structured data can be represented as graphs, such as, documents, chemical molecular structures, and images. However, an issue in relation to current research on graphs is that they cannot adequately discover the topics hidden in graph-structured data which can be beneficial for both the unsupervised learning and supervised learning of the graphs. Although topic models have proved to be very successful in discovering latent topics, the standard topic models cannot be directly applied to graph-structured data due to the "bag-of-word" assumption. In this paper, an innovative graph topic model (GTM) is proposed to address this issue, which uses Bernoulli distributions to model the edges between nodes in a graph. It can, therefore, make the edges in a graph contribute to latent topic discovery and further improve the accuracy of the supervised and unsupervised learning of graphs. The experimental results on two different types of graph datasets show that the proposed GTM outperforms the latent Dirichlet allocation on classification by using the unveiled topics of these two models to represent graphs.
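
    A forward-sampling toy of that idea (all parameters invented; the actual GTM infers them from data rather than sampling): each latent topic places Bernoulli probabilities on node pairs, and a graph's edges are drawn under its topic mixture.

        import numpy as np

        rng = np.random.default_rng(0)
        n_nodes, n_topics = 5, 2
        topic_mix = np.array([0.7, 0.3])   # the graph's mixture over latent topics
        edge_prob = rng.uniform(0.05, 0.6, size=(n_topics, n_nodes, n_nodes))  # per-topic Bernoulli parameters

        p_edge = np.tensordot(topic_mix, edge_prob, axes=1)   # mixture edge probabilities
        adjacency = (rng.random((n_nodes, n_nodes)) < p_edge).astype(int)
        print(adjacency)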

  2. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    OpenAIRE

    Sung-Chien Lin

    2014-01-01

    In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over years, and to compare the core journals. In order to infer the structure of the topics in the field, the data of the papers published in the Journal of Informetrics and Scientometrics during 2007 to 2013 are retrieved from the database of the Web of Science as input of the approach of topic modeling. The results ...

  3. Analyses of Research Topics in the Field of Informetrics Based on the Method of Topic Modeling

    Directory of Open Access Journals (Sweden)

    Sung-Chien Lin

    2014-07-01

    Full Text Available In this study, we used the approach of topic modeling to uncover the possible structure of research topics in the field of Informetrics, to explore the distribution of the topics over years, and to compare the core journals. In order to infer the structure of the topics in the field, the data of the papers published in the Journal of Informetrics and Scientometrics during 2007 to 2013 were retrieved from the database of the Web of Science as input to the topic modeling approach. The results of this study show that when the number of topics was set to 10, the topic model had the smallest perplexity. Although the data scope and analysis methods differ from previous studies, the topics generated in this study are consistent with the results produced by expert analyses. Empirical case studies and measurements of bibliometric indicators were considered important in every year of the analytic period, and the field was increasingly stable. Both core journals broadly paid attention to all of the topics in the field of Informetrics: the Journal of Informetrics put particular emphasis on the construction and application of bibliometric indicators, and Scientometrics focused on the evaluation and the factors of productivity of countries, institutions, domains, and journals.

  4. Topical treatments of skin pain: a general review with a focus on hidradenitis suppurativa with topical agents.

    Science.gov (United States)

    Scheinfeld, Noah

    2014-07-15

    Hidradenitis Suppurativa (HS) is a painful chronic follicular disease. Few papers have addressed pain control for this debilitating condition. Possible topical agents include tricyclic antidepressants, opioids, anticonvulsants, NSAIDs, NMDA receptor antagonists, local anesthetics and other agents. The first-line agents for the topical treatment of the cutaneous pain of HS are diclofenac gel 1% and liposomal xylocaine 4% and 5% cream or 5% ointment. The chief advantage of topical xylocaine is that it is quick acting, i.e. immediate, though with a limited duration of effect of 1-2 hours. The use of topical ketamine, which blocks N-methyl-D-aspartate receptors in a non-competitive fashion, might be a useful tool for the treatment of HS pain. Topical doxepin, which is available in a 5% commercial preparation (Zonalon®), makes patients drowsy and is not useful for controlling the pain of HS. Doxepin is available in a 3% or 3.3% concentration (which causes less drowsiness) from compounding pharmacies and can be used in compounded analgesic preparations with positive effect. Topical doxepin is preferred over topical amitriptyline because topical doxepin is more effective. Nevertheless, topical amitriptyline increases the tactile and mechanical nociceptive thresholds and can be used for topical pain control in a compound mixture of analgesics. Gabapentin and pregabalin can also be compounded with other agents in topical analgesic preparations with positive topical anesthetic effect. Capsaicin is not useful for topical treatment of the pain of HS. Sometimes compounding anesthetic medications such as ketamine 10%, bupivacaine 1%, diclofenac 3%, doxepin 3% or 3.3%, and gabapentin 6% can extend the duration of effect so that medication only needs to be applied 2 or 3 times a day. Still, in my experience the easiest to obtain and most patient-requested agent is topical diclofenac 1% gel.

  5. Using the LANSCE irradiation facility to predict the number of fatal soft errors in one of the world's fastest supercomputers

    International Nuclear Information System (INIS)

    Michalak, S.E.; Harris, K.W.; Hengartner, N.W.; Takala, B.E.; Wender, S.A.

    2005-01-01

    Los Alamos National Laboratory (LANL) is home to the Los Alamos Neutron Science Center (LANSCE). LANSCE is a unique facility because its neutron spectrum closely mimics the neutron spectrum at terrestrial and aircraft altitudes, but is many times more intense. Thus, LANSCE provides an ideal setting for accelerated testing of semiconductor and other devices that are susceptible to cosmic ray induced neutrons. Many industrial companies use LANSCE to estimate device susceptibility to cosmic ray induced neutrons, and it has also been used to test parts from one of LANL's supercomputers, the ASC (Advanced Simulation and Computing Program) Q. This paper discusses our use of the LANSCE facility to study components in Q, including a comparison with failure data from Q.

  6. Topical steroid-damaged skin

    Directory of Open Access Journals (Sweden)

    Anil Abraham

    2014-01-01

    Full Text Available Topical steroids, commonly used for a wide range of skin disorders, are associated with side effects, both systemic and cutaneous. This article aims to raise awareness among practitioners about the cutaneous side effects of easily available, over-the-counter topical steroids. This makes it important for us as dermatologists to weigh the usefulness of topical steroids against their side effects, and to make an informed decision regarding their use in each individual based on other factors such as age, site involved and type of skin disorder.

  7. Mental Mechanisms for Topics Identification

    Directory of Open Access Journals (Sweden)

    Louis Massey

    2014-01-01

    Full Text Available Topics identification (TI) is the process of determining the main themes present in natural language documents. The current TI modeling paradigm aims at acquiring semantic information from statistical properties of large text datasets. We investigate the mental mechanisms responsible for the identification of topics in a single document given existing knowledge. Our main hypothesis is that topics are the result of accumulated neural activation of loosely organized information stored in long-term memory (LTM). We experimentally tested our hypothesis with a computational model that simulates LTM activation. The model assumes activation decay as an unavoidable phenomenon originating from the bioelectric nature of neural systems. Since decay should negatively affect the quality of topics, the model predicts the presence of short-term memory (STM) to keep the focus of attention on a few words, with the expected outcome of restoring quality to a baseline level. Our experiments measured the topic quality of over 300 documents with various decay rates and STM capacities. Our results showed that accumulated activation of loosely organized information was an effective mental computational commodity to identify topics. It was furthermore confirmed that rapid decay is detrimental to topic quality but that limited-capacity STM restores quality to a baseline level, even exceeding it slightly.

  8. Selected Topics in Nuclear Theory. Lectures Given at the International Summer School on Selected Topics in Nuclear Theory

    Energy Technology Data Exchange (ETDEWEB)

    Janouch, F. [ed.]

    1963-01-15

    An International Summer School on Selected Topics in Nuclear Theory was held during the period 20 August to 9 September 1962 in the Low Tatra Mountains, Czechoslovakia, under the auspices of the Nuclear Research Institute of the Czechoslovak Academy of Sciences, with financial support from the International Atomic Energy Agency. In view of the wide interest in the seven topics considered there and of the speed with which the field of theoretical physics is developing, the Agency decided to make available its facilities for rapid publication and to publish the lectures under its own imprint; however, all editorial and composition work has been performed under the supervision of the general editor, Dr. F. Janouch of the Nuclear Research Institute of the Czechoslovak Academy of Sciences. The problem of keeping in touch with the rapidly changing but fundamental field of theoretical physics is a difficult one, particularly for scientists in the developing countries. It is hoped that such publications as the present one and the companion volume containing the lectures presented at the Agency's Seminar on Theoretical Physics at Trieste will help, at least in a modest fashion, to overcome these difficulties.

  9. Improved Collaborative Filtering Algorithm using Topic Model

    Directory of Open Access Journals (Sweden)

    Liu Na

    2016-01-01

    Full Text Available Collaborative filtering algorithms make use of interaction ratings between users and items to generate recommendations. Similarity among users or items is mostly calculated based on ratings, without considering explicit properties of the users or items involved. In this paper, we propose a collaborative filtering algorithm using a topic model. We describe the user-item matrix as a document-word matrix: users are represented as random mixtures over latent topics, and each topic is characterized by a distribution over items. The experiments showed that the proposed algorithm achieved better performance compared to other state-of-the-art algorithms on the MovieLens data sets.
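
    A minimal sketch of the described analogy using scikit-learn: the user-item rating matrix is treated as a document-word matrix and factored with LDA, so users become mixtures over latent item "topics" and recommendations can be scored in topic space. The tiny rating matrix is a hypothetical stand-in for MovieLens data.

        import numpy as np
        from sklearn.decomposition import LatentDirichletAllocation

        ratings = np.array([[5, 4, 0, 0],    # users x items, 0 = unrated
                            [4, 5, 1, 0],
                            [0, 0, 5, 4]])

        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        user_topics = lda.fit_transform(ratings)   # each user as a mixture over item topics
        item_topics = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

        scores = user_topics @ item_topics         # predicted affinity, users x items
        print(np.argsort(-scores[0]))              # items for user 0, best first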

  10. Satisfaction level with topical versus peribulbar anesthesia experienced by same patient for phacoemulsification

    Directory of Open Access Journals (Sweden)

    Nauman Ahmad

    2012-01-01

    Full Text Available Background: Various studies have assessed patient satisfaction with topical versus peribulbar anesthesia, with conflicting results. The aim of this study was to determine the satisfaction level of the same patient who receives topical anesthesia in one eye and a peribulbar block in the other. We propose that evaluation of various indicators of patient satisfaction will enable better selection of cases for topical anesthesia in the future. Methods: Eighty patients scheduled for phacoemulsification were enrolled in a prospective, randomized, double-blind study. Each patient was scheduled twice, for one eye under topical anesthesia and the other under peribulbar block. Pain, discomfort and pressure during application of the local anesthetic, during phacoemulsification and at 2 hours after the procedure were assessed on standard scales. Before discharge, patient satisfaction was assessed with the Iowa satisfaction with anesthesia scale (ISAS). The Student's t-test was used to determine the significance of the ISAS score in both groups. P<0.05 was considered significant. Results: Pain, pressure and discomfort scores during administration of topical anesthesia were all significantly lower than with peribulbar anesthesia (P=0.004, 0.000, 0.002, respectively). In contrast, intraoperative scores were significantly higher in the topical anesthesia group compared to peribulbar anesthesia (P=0.022, 0.000, 0.000, respectively). Patient satisfaction measured with the ISAS showed a strongly significant preference for peribulbar anesthesia (P=0.000). Conclusion: Peribulbar anesthesia provided significantly better patient satisfaction than topical anesthesia when used for cataract surgery.

  11. Treatment with silver nitrate versus topical steroid treatment for umbilical granuloma: A non-inferiority randomized control trial.

    Directory of Open Access Journals (Sweden)

    Chikako Ogawa

    Full Text Available The aim of this prospective multicenter randomized controlled trial was to compare the efficacy of silver nitrate cauterization against that of topical steroid ointment in the treatment of neonatal umbilical granuloma. An open-label, non-inferiority randomized controlled trial was conducted from January 2013 to January 2016. The primary endpoint for the silver nitrate cauterization and topical steroid ointment groups was the healing rate after 2 weeks of treatment, applying a non-inferiority margin of 10%. The healing rate was evaluated until completion of 3 weeks of treatment. Participants comprised 207 neonates with newly diagnosed umbilical granuloma, randomized to receive silver nitrate cauterization (n = 104) or topical steroid ointment (n = 103). Healing rates after 2 weeks of treatment were 87.5% (91/104) in the silver nitrate cauterization group and 82% (82/100) in the topical steroid ointment group. The difference between groups was -5.5% (95% confidence interval, -19.1%, 8.4%), indicating that the non-inferiority criterion was not met. After 3 weeks of treatment, the healing rate with topical steroid ointment treatment was almost identical to that of silver nitrate cauterization (94/104 [90.4%] vs. 91/100 [91.0%]; 0.6% [-13.2 to 14.3]). No major complications occurred in either group. This study did not establish non-inferiority of topical steroid ointment treatment relative to silver nitrate cauterization, presumably due to lower healing rates than expected leading to an underpowered trial. However, considering that silver nitrate cauterization carries a distinct risk of chemical burns and that the overall efficacy of topical steroid ointment treatment is similar to that of silver nitrate cauterization, topical steroid ointment might be considered as a good alternative in the treatment of neonatal umbilical granuloma due to its safety and simplicity. To clarify non-inferiority, a larger study is needed.

  12. Effect of Topical Tacrolimus on Vitiligo in Children

    Directory of Open Access Journals (Sweden)

    Mohammed Ziaur Rahman Bhuiyan

    2016-01-01

    Full Text Available Background: Considering safe treatment modalities for children with vitiligo, the search for newer therapeutic agents continues. Hence, new immunomodulatory agents such as calcineurin antagonists, frequently referred to as topical immunomodulators (TIMs), have recently been introduced as new, promising tools to treat acquired hypopigmentary disorders. Tacrolimus is safe for treating children due to its lack of skin atrophy, and little data are available on the effect of topical tacrolimus on vitiligo. Objective: To see the effect of topical tacrolimus on vitiligo in children. Materials and Methods: This prospective study was done in the outpatient department of Dermatology and Venereology, Chittagong Medical College Hospital (CMCH), Bangladesh. Clinically diagnosed vitiligo patients of up to 12 years of age visiting the Skin & VD OPD, CMCH during the study period were the study population (total 30). The study was carried out from November 2007 to April 2008. Results: A total of 30 patients, 13 (43.33%) males and 17 (56.66%) females with focal, segmental or generalized vitiligo were studied. Seventy percent of study subjects were 7–12 years of age. Topical tacrolimus 0.03% ointment was administered twice daily for 12 weeks to each patient. Repigmentation was complete (>75%) in 43.33% of cases (13/30), moderate (50–75%) in 33.33% (10/30), and mild (<50%) in 13.33% (4/30). Clinical adverse effects were noted in 6.67% (2/30) of cases, with pruritus in 3.33% (1/30) and burning in 3.33% (1/30). None of the reactions was severe; all were mild and well-tolerated, most occurred within the first month of initiation of treatment, and they resolved with continued use of the drug and cleared completely after treatment was finished. Nobody had to discontinue therapy for side effects. Conclusion: In conclusion, tacrolimus ointment may be a rapidly efficacious and safe option for the treatment of vitiligo in children. The ease of topical self-administration with minimal side effects makes this novel

  13. NIC symposium 2010. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Muenster, Gernot [Muenster Univ. (Germany). Inst. fuer Theoretische Physik 1]; Wolf, Dietrich [Duisburg-Essen Univ., Duisburg (Germany). Fakultaet fuer Physik]; Kremer, Manfred (eds.) [Forschungszentrum Juelich GmbH (DE). Juelich Supercomputing Centre (JSC)]

    2012-06-21

    The fifth NIC-Symposium gave an overview of the activities of the John von Neumann Institute for Computing (NIC) and of the results obtained in the last two years by research groups supported by the NIC. The large recent progress in supercomputing is highlighted by the fact that the newly installed Blue Gene/P system in Juelich - with a peak performance of 1 Petaflop/s - currently ranks number four in the TOP500 list. This development opens new dimensions in simulation science for researchers in Germany and Europe. NIC - a joint foundation of Forschungszentrum Juelich, Deutsches Elektronen-Synchrotron (DESY) and Gesellschaft fuer Schwerionenforschung (GSI) - supports with its members' supercomputer facilities about 130 research groups at universities and national labs working on computer simulations in various fields of science. Fifteen invited lectures covered selected topics in the following fields: Astrophysics, Biophysics, Chemistry, Elementary Particle Physics, Condensed Matter, Materials Science, Soft Matter Science, Environmental Research, Hydrodynamics and Turbulence, Plasma Physics, and Computer Science. The talks are intended to inform a broad audience of scientists and the interested public about the research activities at NIC. The proceedings of the symposium cover projects that have been supported by the IBM supercomputers JUMP and IBM Blue Gene/P in Juelich and the APE topical computer at DESY-Zeuthen in an even wider range than the lectures.

  14. NIC symposium 2010. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Muenster, Gernot [Muenster Univ. (Germany). Inst. fuer Theoretische Physik 1]; Wolf, Dietrich [Duisburg-Essen Univ., Duisburg (Germany). Fakultaet fuer Physik]; Kremer, Manfred [Forschungszentrum Juelich GmbH (DE). Juelich Supercomputing Centre (JSC)]

    2012-06-21

    The fifth NIC-Symposium gave an overview of the activities of the John von Neumann Institute for Computing (NIC) and of the results obtained in the last two years by research groups supported by the NIC. The large recent progress in supercomputing is highlighted by the fact that the newly installed Blue Gene/P system in Juelich - with a peak performance of 1 Petaflop/s - currently ranks number four in the TOP500 list. This development opens new dimensions in simulation science for researchers in Germany and Europe. NIC - a joint foundation of Forschungszentrum Juelich, Deutsches Elektronen-Synchrotron (DESY) and Gesellschaft fuer Schwerionenforschung (GSI) - supports with its members' supercomputer facilities about 130 research groups at universities and national labs working on computer simulations in various fields of science. Fifteen invited lectures covered selected topics in the following fields: Astrophysics, Biophysics, Chemistry, Elementary Particle Physics, Condensed Matter, Materials Science, Soft Matter Science, Environmental Research, Hydrodynamics and Turbulence, Plasma Physics, and Computer Science. The talks are intended to inform a broad audience of scientists and the interested public about the research activities at NIC. The proceedings of the symposium cover projects that have been supported by the IBM supercomputers JUMP and IBM Blue Gene/P in Juelich and the APE topical computer at DESY-Zeuthen in an even wider range than the lectures.

  15. NIC symposium 2010. Proceedings

    International Nuclear Information System (INIS)

    Muenster, Gernot

    2012-01-01

    The fifth NIC-Symposium gave an overview of the activities of the John von Neumann Institute for Computing (NIC) and of the results obtained in the last two years by research groups supported by the NIC. The large recent progress in supercomputing is highlighted by the fact that the newly installed Blue Gene/P system in Juelich - with a peak performance of 1 Petaflop/s - currently ranks number four in the TOP500 list. This development opens new dimensions in simulation science for researchers in Germany and Europe. NIC - a joint foundation of Forschungszentrum Juelich, Deutsches Elektronen-Synchrotron (DESY) and Gesellschaft fuer Schwerionenforschung (GSI) - supports with its members' supercomputer facilities about 130 research groups at universities and national labs working on computer simulations in various fields of science. Fifteen invited lectures covered selected topics in the following fields: Astrophysics, Biophysics, Chemistry, Elementary Particle Physics, Condensed Matter, Materials Science, Soft Matter Science, Environmental Research, Hydrodynamics and Turbulence, Plasma Physics, and Computer Science. The talks are intended to inform a broad audience of scientists and the interested public about the research activities at NIC. The proceedings of the symposium cover projects that have been supported by the IBM supercomputers JUMP and IBM Blue Gene/P in Juelich and the APE topical computer at DESY-Zeuthen in an even wider range than the lectures.

  16. Topical Treatment of Degenerative Knee Osteoarthritis.

    Science.gov (United States)

    Meng, Zengdong; Huang, Rongzhong

    2018-01-01

    This article reviews topical management strategies for degenerative osteoarthritis (OA) of the knee. A search of Pubmed, Embase and the Cochrane library using MeSH terms including "topical," "treatment," "knee" and "osteoarthritis" was carried out. Original research and review articles on the effectiveness and safety of topical preparations, recommendations from published international guidelines, and acceptability studies were included. Current topical treatments for the management of knee OA include topical nonsteroidal anti-inflammatory drugs, capsaicin, salicylates and physical treatments such as hot or cold therapy. Current treatment guidelines recommend topical nonsteroidal anti-inflammatory drugs as an alternative and even first-line therapy for OA management, especially among elderly patients. Guidelines on other topical treatments vary, from recommendations against their use to recommendations in favor as alternative or simultaneous therapy, especially for patients with contraindications to other analgesics. Although often well-tolerated and preferred by many patients, clinical care still lags in the adoption of topical treatments. Aspects of efficacy, safety and patient quality of life require further research. Copyright © 2018 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  17. Topical treatment of psoriasis: questionnaire results on topical therapy accessibility and influence of body surface area on usage.

    Science.gov (United States)

    Iversen, L; Lange, M M; Bissonette, R; Carvalho, A V E; van de Kerkhof, P C; Kirby, B; Kleyn, C E; Lynde, C W; van der Walt, J M; Wu, J J

    2017-07-01

    Topical treatment of mild to moderate psoriasis is first-line treatment and exhibits varying degrees of success across patient groups. Key factors influencing treatment success are the physician's topical treatment choice (high efficacy, low adverse events) and strict patient adherence. Currently, no formalized international consensus guidelines exist to direct optimal topical treatment, although many countries have national guidelines. To describe and analyse cross-regional variations in the use of and access to psoriasis topical therapies. The study was conducted as an observational cross-sectional study. A survey was distributed to dermatologists from the International Psoriasis Council (IPC) to assess topical therapy accessibility in 26 countries and to understand how body surface area (BSA) categories guide clinical decisions on topical use. Variation in the availability of tars, topical retinoids, dithranol and balneotherapy was reported. The vast majority of respondents (100% and 88.4%) used topical therapy as first-line monotherapy in situations with BSA <10%. With BSA >10%, the number of respondents who prescribe topical therapy decreased considerably. In addition, combination therapy of a topical drug and a systemic drug was frequently reported when BSA measured >10%. This physician survey provides new evidence on topical access and the influence of disease severity on topical usage in an effort to improve treatment strategies on a global level. © 2017 European Academy of Dermatology and Venereology.

  18. AHRQ series paper 3: identifying, selecting, and refining topics for comparative effectiveness systematic reviews: AHRQ and the effective health-care program.

    Science.gov (United States)

    Whitlock, Evelyn P; Lopez, Sarah A; Chang, Stephanie; Helfand, Mark; Eder, Michelle; Floyd, Nicole

    2010-05-01

    This article discusses the identification, selection, and refinement of topics for comparative effectiveness systematic reviews within the Agency for Healthcare Research and Quality's Effective Health Care (EHC) program. The EHC program seeks to align its research topic selection with the overall goals of the program, impartially and consistently apply predefined criteria to potential topics, involve stakeholders to identify high-priority topics, be transparent and accountable, and continually evaluate and improve processes. A topic prioritization group representing stakeholder and scientific perspectives evaluates topic nominations that fit within the EHC program (are "appropriate") to determine how "important" topics are as considered against seven criteria. The group then judges whether a new comparative effectiveness systematic review would be a duplication of existing research syntheses, and if not duplicative, if there is adequate type and volume of research to conduct a new systematic review. Finally, the group considers the "potential value and impact" of a comparative effectiveness systematic review. As the EHC program develops, ongoing challenges include ensuring the program addresses truly unmet needs for synthesized research because national and international efforts in this arena are uncoordinated, as well as engaging a range of stakeholders in program decisions while also achieving efficiency and timeliness.

  19. The Effect of Topical Structure Analysis Instruction on University Students' Writing Quality

    Science.gov (United States)

    Liangprayoon, Somlak; Chaya, Walaiporn; Thep-ackraphong, Tipa

    2013-01-01

    Coherence is considered one of the characteristics of effective writing. Topical structure analysis (TSA) has been taught to students as a revision strategy to raise their awareness of the importance of textual coherence and to help them clearly understand its concept. This study aimed to investigate the effectiveness of TSA instruction in improving…

  20. Visualization at supercomputing centers: the tale of little big iron and the three skinny guys.

    Science.gov (United States)

    Bethel, E W; van Rosendale, J; Southard, D; Gaither, K; Childs, H; Brugger, E; Ahern, S

    2011-01-01

    Supercomputing centers are unique resources that aim to enable scientific knowledge discovery by employing large computational resources: the "Big Iron." Design, acquisition, installation, and management of the Big Iron are carefully planned and monitored. Because these Big Iron systems produce a tsunami of data, it's natural to colocate the visualization and analysis infrastructure. This infrastructure consists of hardware (Little Iron) and staff (Skinny Guys). Our collective experience suggests that design, acquisition, installation, and management of the Little Iron and Skinny Guys doesn't receive the same level of treatment as that of the Big Iron. This article explores the following questions about the Little Iron: How should we size the Little Iron to adequately support visualization and analysis of data coming off the Big Iron? What sort of capabilities must it have? Related questions concern the size of the visualization support staff: How big should a visualization program be, that is, how many Skinny Guys should it have? What should the staff do? How much of the visualization should be provided as a support service, and how much should applications scientists be expected to do on their own?

  1. Topical botulinum toxin.

    Science.gov (United States)

    Collins, Ashley; Nasir, Adnan

    2010-03-01

    Nanotechnology is a rapidly growing discipline that capitalizes on the unique properties of matter engineered on the nanoscale. Vehicles incorporating nanotechnology have led to great strides in drug delivery, allowing for increased active ingredient stability, bioavailability, and site-specific targeting. Botulinum toxin has historically been used for the correction of neurological and neuromuscular disorders, such as torticollis, blepharospasm, and strabismus. Recent dermatological indications have been for the management of axillary hyperhidrosis and facial rhytides. Traditional methods of botulinum toxin delivery have been needle-based. These have been associated with increased pain and cost. Newer methods of botulinum toxin formulation have yielded topical preparations that are bioactive in small pilot clinical studies. While there are some risks associated with topical delivery, the refinement and standardization of delivery systems and techniques for the topical administration of botulinum toxin using nanotechnology is anticipated in the near future.

  2. Uncovering the Topic Landscape of Product-Service System Research: from Sustainability to Value Creation

    Directory of Open Access Journals (Sweden)

    Hakyeon Lee

    2018-03-01

    Full Text Available As the product-service system (PSS) is considered a promising business model that can create more value for customers, PSS research has enjoyed remarkable growth in its volume and coverage over the last decade. This study aims to delineate the thematic landscape of PSS research by identifying latent topics from a large amount of scholarly data. Ten topics of PSS research are identified by applying the Latent Dirichlet Allocation (LDA) model to 1229 PSS publications published between 2000 and 2016. The ten PSS topics are briefly reviewed to provide an overview of what has previously been studied in PSS research. We also investigate which topics rise or fall in popularity by identifying hot and cold topics of PSS research. It is observed that the focus of discussions on the benefits of PSS has shifted from sustainability to value creation. Also, increasing attention has been paid to more practical topics such as PSS implementation. The areas of subspecialty of the top ten PSS journals are also examined to explore the interdisciplinary nature of PSS research and thematic differences across disciplines. The findings of this study can provide rich implications for both academia and practice in the field of PSS.
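
    To make the pipeline concrete, the sketch below fits an LDA model with scikit-learn and prints the top words per topic, mirroring the kind of analysis the study describes. It is a minimal illustration, not the authors' code: the three stand-in abstracts, the vectorizer settings, and the fixed seed are all assumptions.

      # Minimal LDA topic-extraction sketch (illustrative; not the study's pipeline).
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      abstracts = [
          "product service system sustainability business model design",
          "value creation servitization customer integration service",
          "PSS implementation methodology case study manufacturing",
      ]  # stand-in for the 1229 PSS abstracts used in the study

      vectorizer = CountVectorizer(stop_words="english")
      X = vectorizer.fit_transform(abstracts)

      # Ten topics, matching the number reported in the study.
      lda = LatentDirichletAllocation(n_components=10, random_state=0)
      doc_topics = lda.fit_transform(X)   # rows: documents, columns: topic weights

      terms = vectorizer.get_feature_names_out()
      for k, weights in enumerate(lda.components_):
          top = [terms[i] for i in weights.argsort()[-5:][::-1]]
          print(f"topic {k}: {', '.join(top)}")

    Hot and cold topics of the kind the study reports can then be read off, for example, by regressing each topic's yearly share of the doc_topics mass on publication year.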

  3. Topical methotrexate pretreatment enhances the therapeutic effect of topical 5-aminolevulinic acid-mediated photodynamic therapy on hamster buccal pouch precancers.

    Science.gov (United States)

    Yang, Deng-Fu; Lee, Jeng-Woei; Chen, Hsin-Ming; Hsu, Yih-Chih

    2014-09-01

    Topical 5-aminolevulinic acid-mediated photodynamic therapy (ALA-PDT) is effective for treatment of human oral precancerous lesions. This animal study aimed to assess whether topical methotrexate (MTX) pretreatment could enhance the therapeutic effect of topical ALA-PDT on hamster buccal pouch precancerous lesions. Twenty hamster buccal pouch precancerous lesions were treated with either topical ALA-PDT with topical MTX pretreatment (topical MTX-ALA-PDT group, n = 10) or topical ALA-PDT alone (topical ALA-PDT group, n = 10). The intracellular protoporphyrin IX (PpIX) level in another 12 precancerous lesions (n = 6 for either the topical MTX-ALA or topical ALA group) was monitored by fluorescence spectroscopy. The intracellular PpIX reached its peak level in precancerous lesions 6.5 hours and 2.5 hours after topical ALA application for the topical MTX-ALA group (5.63-fold higher in the lesion than in the normal mucosa) and topical ALA group (2.42-fold higher in the lesion than in the normal mucosa), respectively. The complete response rate of precancerous lesions was 80% for the topical MTX-ALA-PDT group and 70% for the topical ALA-PDT group. In addition, the topical MTX-ALA-PDT group required a significantly lower mean treatment number (2.1 ± 0.6) to achieve complete response than the topical ALA-PDT group (4.4 ± 1.3). The topical MTX-ALA-PDT group also had a lower recurrence rate (12.5%) than the topical ALA-PDT group (28.6%). We conclude that topical MTX pretreatment can increase intracellular PpIX production in hamster buccal pouch precancerous lesions and significantly improves the outcomes of the precancerous lesions treated with topical ALA-PDT. Copyright © 2014. Published by Elsevier B.V.

  4. Systemic vs. Topical Therapy for the Treatment of Vulvovaginal Candidiasis

    Directory of Open Access Journals (Sweden)

    Sebastian Faro

    1994-01-01

    Full Text Available It is estimated that 75% of all women will experience at least 1 episode of vulvovaginal candidiasis (VVC during their lifetimes. Most patients with acute VVC can be treated with short-term regimens that optimize compliance. Since current topical and oral antifungals have shown comparably high efficacy rates, other issues should be considered in determining the most appropriate therapy. It is possible that the use of short-duration narrow-spectrum agents may increase selection of more resistant organisms which will result in an increase of recurrent VVC (RVVC. Women who are known or suspected to be pregnant and women of childbearing age who are not using a reliable means of contraception should receive topical therapy, as should those who are breast-feeding or receiving drugs that can interact with an oral azole and those who have previously experienced adverse effects during azole therapy. Because of the potential risks associated with systemic treatment, topical therapy with a broad-spectrum agent should be the method of choice for VVC, whereas systemic therapy should be reserved for either RVVC or cases where the benefits outweigh any possible adverse reactions.

  5. Sandia's network for Supercomputing '94: Linking the Los Alamos, Lawrence Livermore, and Sandia National Laboratories using switched multimegabit data service

    Energy Technology Data Exchange (ETDEWEB)

    Vahle, M.O.; Gossage, S.A.; Brenkosh, J.P. [Sandia National Labs., Albuquerque, NM (United States). Advanced Networking Integration Dept.

    1995-01-01

    Supercomputing '94, a high-performance computing and communications conference, was held November 14th through 18th, 1994 in Washington DC. For the past four years, Sandia National Laboratories has used this conference to showcase and focus its communications and networking endeavors. At the 1994 conference, Sandia built a Switched Multimegabit Data Service (SMDS) network running at 44.736 megabits per second linking its private SMDS network between its facilities in Albuquerque, New Mexico and Livermore, California to the convention center in Washington, D.C. For the show, the network was also extended from Sandia, New Mexico to Los Alamos National Laboratory and from Sandia, California to Lawrence Livermore National Laboratory. This paper documents and describes this network and how it was used at the conference.

  6. Science teacher orientations and PCK across science topics in grade 9 earth science

    Science.gov (United States)

    Campbell, Todd; Melville, Wayne; Goodwin, Dawne

    2017-07-01

    While the literature is replete with studies examining teacher knowledge and pedagogical content knowledge (PCK), few studies have investigated how science teacher orientations (STOs) shape classroom instruction. Therefore, this research explores the interplay between STOs and the topic specificity of PCK across two science topics within a grade 9 earth science course. Through interviews and observations of one teacher's classroom across two sequentially taught topics, this research contests the notion that teachers hold a single way of conceptualising science teaching and learning. In this, we consider if multiple ontologies can provide potential explanatory power for characterising instructional enactments. In earlier work with the teacher in this study, using generic interview prompts and general discussions about science teaching and learning, we accepted the existence of a unitary STO and its promise of consistent reformed instruction in the classroom. However, upon close examination of instruction focused on different science topics, evidence was found to demonstrate the explanatory power of multiple ontologies for shaping characteristically different epistemological constructions across science topics. This research points to the need for care in generalising about teacher practice, as it reveals that a teacher's practice, and orientation, can vary, dependent on the context and science topics taught.

  7. Recent advances in topical anesthesia

    Science.gov (United States)

    2016-01-01

    Topical anesthetics act on the peripheral nerves and reduce the sensation of pain at the site of application. In dentistry, they are used to control local pain caused by needling, placement of orthodontic bands, the vomiting reflex, oral mucositis, and rubber-dam clamp placement. Traditional topical anesthetics contain lidocaine or benzocaine as active ingredients and are used in the form of solutions, creams, gels, and sprays. Eutectic mixture of local anesthetics (EMLA) cream, a mixture of various topical anesthetics, has been reported to be more potent than other anesthetics. Recently, new products with modified ingredients and application methods have been introduced into the market. These products may be used for mild pain during periodontal treatment, such as scaling. Dentists should be aware that topical anesthetics might, although rarely, induce allergic reactions or side effects as a result of overdose. Topical anesthetics are useful aids during dental treatment, as they reduce dental phobia, especially in children, by mitigating discomfort and pain. PMID:28879311

  8. Discriminative Relational Topic Models.

    Science.gov (United States)

    Chen, Ning; Zhu, Jun; Xia, Fei; Zhang, Bo

    2015-05-01

    Relational topic models (RTMs) provide a probabilistic generative process to describe both the link structure and document contents for document networks, and they have shown promise on predicting network structures and discovering latent topic representations. However, existing RTMs have limitations in both the restricted model expressiveness and incapability of dealing with imbalanced network data. To expand the scope and improve the inference accuracy of RTMs, this paper presents three extensions: 1) unlike the common link likelihood with a diagonal weight matrix that allows the-same-topic interactions only, we generalize it to use a full weight matrix that captures all pairwise topic interactions and is applicable to asymmetric networks; 2) instead of doing standard Bayesian inference, we perform regularized Bayesian inference (RegBayes) with a regularization parameter to deal with the imbalanced link structure issue in real networks and improve the discriminative ability of learned latent representations; and 3) instead of doing variational approximation with strict mean-field assumptions, we present collapsed Gibbs sampling algorithms for the generalized relational topic models by exploring data augmentation without making restricting assumptions. Under the generic RegBayes framework, we carefully investigate two popular discriminative loss functions, namely, the logistic log-loss and the max-margin hinge loss. Experimental results on several real network datasets demonstrate the significance of these extensions on improving prediction performance.
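
    The first extension can be made concrete in a few lines of numpy: given mean topic assignments for two documents, the restricted link model with a diagonal weight matrix scores only same-topic interactions, while a full weight matrix scores every topic pair and need not be symmetric. The sketch below illustrates that scoring step under assumed dimensions and random values; it is not the paper's implementation.

      # Link-probability sketch for a generalized RTM (assumed dimensions and values).
      import numpy as np

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      K = 4                                  # number of topics (assumed)
      rng = np.random.default_rng(0)
      W = rng.normal(size=(K, K))            # full weight matrix: all pairwise topic interactions
      zbar_d = rng.dirichlet(np.ones(K))     # mean topic assignments of document d
      zbar_e = rng.dirichlet(np.ones(K))     # mean topic assignments of document e

      # Restricted model: a diagonal W allows same-topic interactions only.
      p_diag = sigmoid(zbar_d @ np.diag(np.diag(W)) @ zbar_e)
      # Generalized model: a full (possibly asymmetric) W captures every topic pair.
      p_full = sigmoid(zbar_d @ W @ zbar_e)
      print(f"diagonal: {p_diag:.3f}  full: {p_full:.3f}")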

  9. Topical anesthesia

    Directory of Open Access Journals (Sweden)

    Mritunjay Kumar

    2015-01-01

    Full Text Available Topical anesthetics are being widely used in numerous medical and surgical sub-specialties such as anesthesia, ophthalmology, otorhinolaryngology, dentistry, urology, and aesthetic surgery. They cause superficial loss of pain sensation after direct application. Their delivery and effectiveness can be enhanced by using free bases; by increasing the drug concentration, lowering the melting point; by using physical and chemical permeation enhancers and lipid delivery vesicles. Various topical anesthetic agents available for use are eutectic mixture of local anesthetics, ELA-max, lidocaine, epinephrine, tetracaine, bupivanor, 4% tetracaine, benzocaine, proparacaine, Betacaine-LA, topicaine, lidoderm, S-caine patch™ and local anesthetic peel. While using them, careful attention must be paid to their pharmacology, area and duration of application, age and weight of the patients and possible side-effects.

  10. Using Technology to Support Discussions on Sensitive Topics in the Study of Business Ethics

    Directory of Open Access Journals (Sweden)

    Michelle WL Fong

    2015-06-01

    Full Text Available There is a dearth of research into teaching strategies and learning approaches for units involving sensitive topics that can provoke an emotional response in students. In a business ethics unit, attempts to strike a balance between conceptual knowledge and theory and skills training can be challenging because the unit can involve personal, sensitive or controversial topics. When engaging in deep and meaningful face-to-face discussion, students may unknowingly divulge personal opinions that they later regret or become identified with by other students over time. Value-laden topics may also lead to clashes between students if face-to-face discussions are not managed properly. This paper considers the use of technology in blended learning to provide an optimal learning environment for student discussion on sensitive topics via role-play and simulation in a first-year business ethics unit. The Audience Response System (ARS), online discussion boards and blogs, and wikis are assessed for their suitability in supporting online role-play and simulation. Among these online tools, asynchronous online discussion boards and blogs are the ideal tools for supporting student discussion on sensitive topics in online role-play and simulation.

  11. Problematic topic transitions in dysarthric conversation.

    Science.gov (United States)

    Bloch, Steven; Saldert, Charlotta; Ferm, Ulrika

    2015-01-01

    This study examined the nature of topic transition problems associated with acquired progressive dysarthric speech in the everyday conversation of people with motor neurone disease. Using conversation analytic methods, a video collection of five naturally occurring problematic topic transitions was identified, transcribed and analysed. These were extracted from a main collection of over 200 other-initiated repair sequences and a sub-set of 15 problematic topic transition sequences. The sequences were analysed with reference to how the participants both identified and resolved the problems. Analysis revealed that topic transition by people with dysarthria can prove problematic. Conversation partners may find transitions problematic not only because of speech intelligibility but also because of a sequential disjuncture between the dysarthric speech turn and whatever topic has come prior. In addition the treatment of problematic topic transition as a complaint reveals the potential vulnerability of people with dysarthria to judgements of competence. These findings have implications for how dysarthria is conceptualized and how specific actions in conversation, such as topic transition, might be suitable targets for clinical intervention.

  12. 76 FR 81806 - Ophthalmic and Topical Dosage Form New Animal Drugs; Ivermectin Topical Solution

    Science.gov (United States)

    2011-12-29

    .... FDA-2011-N-0003] Ophthalmic and Topical Dosage Form New Animal Drugs; Ivermectin Topical Solution... solution of ivermectin. DATES: This rule is effective December 29, 2011. FOR FURTHER INFORMATION CONTACT... ANADA 200-318 for ...

  13. Topical Session on the Decommissioning and Dismantling Safety Case

    International Nuclear Information System (INIS)

    2002-01-01

    Set up by the Radioactive Waste Management Committee (RWMC), the WPDD brings together senior representatives of national organisations who have a broad overview of Decommissioning and Dismantling (D and D) issues through their work as regulators, implementers, R and D experts or policy makers. These include representatives from regulatory authorities, industrial decommissioners from the NEA Cooperative Programme on Exchange of Scientific and Technical Information on Nuclear Installation Decommissioning Projects (CPD), and cross-representation from the NEA Committee on Nuclear Regulatory Activities, the Committee on Radiation Protection and Public Health, and the RWMC. The EC is a member of the WPDD and the IAEA also participates. This ensures co-ordination amongst activities in these international programmes. Participation from civil society organisations is considered on a case-by-case basis, and has already taken place through the active involvement of the Group of Municipalities with Nuclear Installations at the first meeting of the WPDD. At its second meeting, in Paris, 5-7 December 2001, the WPDD held two topical sessions on the D and D Safety Case and on the Management of Materials from D and D, respectively. This report documents the topical session on the safety case, which was meant to provide an exchange of information and experience on the following issues: What topics should be included in a safety case? Of what should it consist? Is there sufficient and complete guidance nationally and internationally? How do practices differ internationally? The main boundary condition for this session was that it would deal with plants from which spent fuel has been removed. The topical session was also kept at a level that makes the most of the varied constituency of the WPDD: interface issues are important, and issue identification and discussion were the immediate goal. There was less interest in examining areas where variability amongst national

  14. Dressings and topical agents for preventing pressure ulcers.

    Science.gov (United States)

    Moore, Zena E H; Webster, Joan

    2013-08-18

    Pressure ulcers, which are localised injury to the skin, or underlying tissue or both, occur when people are unable to reposition themselves to relieve pressure on bony prominences. Pressure ulcers are often difficult to heal, painful and impact negatively on the individual's quality of life. The cost implications of pressure ulcer treatment are considerable, compounding the challenges in providing cost effective, efficient health services. Efforts to prevent the development of pressure ulcers have focused on nutritional support, pressure redistributing devices, turning regimes and the application of various topical agents and dressings designed to maintain healthy skin, relieve pressure and prevent shearing forces. Although products aimed at preventing pressure ulcers are widely used, it remains unclear which, if any, of these approaches are effective in preventing the development of pressure ulcers. To evaluate the effects of dressings and topical agents on the prevention of pressure ulcers, in people of any age without existing pressure ulcers, but considered to be at risk of developing a pressure ulcer, in any healthcare setting. In February 2013 we searched the following electronic databases to identify reports of relevant randomised clinical trials (RCTs): the Cochrane Wounds Group Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Database of Abstracts of Reviews of Effects (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; and EBSCO CINAHL. We included RCTs evaluating the use of dressings, topical agents, or topical agents with dressings, compared with a different dressing, topical agent, or combined topical agent and dressing, or no intervention or standard care, with the aim of preventing the development of a pressure ulcer. We assessed trials for their appropriateness for inclusion and for their risk of bias. This was done by two review

  15. Topical methotrexate pretreatment enhances the therapeutic effect of topical 5-aminolevulinic acid-mediated photodynamic therapy on hamster buccal pouch precancers

    OpenAIRE

    Deng-Fu Yang; Jeng-Woei Lee; Hsin-Ming Chen; Yih-Chih Hsu

    2014-01-01

    Topical 5-aminolevulinic acid-mediated photodynamic therapy (ALA-PDT) is effective for treatment of human oral precancerous lesions. This animal study aimed to assess whether topical methotrexate (MTX) pretreatment could enhance the therapeutic effect of topical ALA-PDT on hamster buccal pouch precancerous lesions. Methods: Twenty hamster buccal pouch precancerous lesions were treated with either topical ALA-PDT with topical MTX pretreatment (topical MTX-ALA-PDT group, n = 10) or topical A...

  16. Topical immunomodulators in dermatology

    Directory of Open Access Journals (Sweden)

    Khandpur Sujay

    2004-04-01

    Full Text Available Topical immunomodulators are agents that regulate the local immune response of the skin. They are now emerging as the therapy of choice for several immune-mediated dermatoses such as atopic dermatitis, contact allergic dermatitis, alopecia areata, psoriasis, vitiligo, connective tissue disorders such as morphea and lupus erythematosus, disorders of keratinization and several benign and malignant skin tumours, because of their comparable efficacy, ease of application and greater safety than their systemic counterparts. They can be used on a domiciliary basis for longer periods without aggressive monitoring. In this article, we have discussed the mechanism of action, common indications and side-effects of the commonly used topical immunomodulators, excluding topical steroids. Moreover, newer agents, which are still in the experimental stages, have also been described. A MEDLINE search was undertaken using the key words "topical immunomodulators, dermatology" and related articles were also searched. In addition, a manual search for many Indian articles, which are not indexed, was also carried out. Wherever possible, the full article was reviewed. If the full article could not be traced, the abstract was used.

  17. Doctoral Dissertation Topics in Education: Do They Align with Critical Issues?

    Directory of Open Access Journals (Sweden)

    Ethan J Allen

    2016-11-01

    Full Text Available American society faces complex educational issues which impact many facets of its national interests. Institutions of higher education are granting doctoral degrees to educational leaders, but it is not known to what extent their dissertation topics are aligned with both longstanding and critical issues in education. Using a theoretical framework synthesizing Paul and Elder’s critical thinking model and Kuhlthau’s information seeking process, this study examines a set of education doctoral dissertation topical selections and categorizes them by general themes in relationship to many of the recognized educational issues in the United States. Investigators categorized dissertations from four departments within the College of Education of their home institution. The dataset, retrieved from ProQuest Dissertations and Theses Global, consisted of 231 documents published between 2005 and 2014. Through an inter-rater process examining dissertation titles, abstracts, and keywords, the dissertations were assigned critical issue themes culled from nine editions of a college text, and then categorized under a broader topical scheme situated within a well-used educational research website. Findings indicated that most dissertations concentrated in studies that researched problems and issues within schools. Further, some of the issues considered longstanding were not studied by dissertation authors within the sample. For example, privatization of schools and classroom discipline and justice were not selected for study. Findings also suggest new directions for those responsible for dissertation supervision and topic selection. The study adds to the literature on dissertation topic selection that addresses existing educational issues.

  18. Recent Advances In Topical Therapy In Dermatology

    Directory of Open Access Journals (Sweden)

    Mohan Thappa Devinder

    2003-01-01

    Full Text Available With changing times, various newer topical agents have been introduced in the field of dermatology. Tacrolimus and pimecrolimus are immunosuppressants that are effective topically and have been tried in the management of atopic dermatitis as well as other disorders including allergic contact dermatitis, atrophic lichen planus and pyoderma gangrenosum. Imiquimod, an immune response modifier, is presently used for genital warts but has potential as an anti-tumour agent and in various other dermatological conditions when used topically. Tazarotene is a newer addition to the list of topical retinoids; it is effective in psoriasis and works better in combination with calcipotriene, phototherapy and topical corticosteroids. Tazarotene and adapalene are also effective in inflammatory acne. Calcipotriol, a vitamin D analogue, has been introduced as a topical agent in the treatment of psoriasis. Newer steroid compounds, devoid of the usual side effects while retaining adequate anti-inflammatory effect, have also been developed recently. Topical photodynamic therapy also has a wide range of uses in dermatology. Newer topical agents, including cidofovir, capsaicin, topical sensitizers and topical antifungal agents for onychomycosis, are also of use in clinical practice. Other promising developments include skin substitutes and growth factors for wound care.

  19. A Lack of Systemic Absorption Following the Repeated Application of Topical Quetiapine in Healthy Adults.

    Science.gov (United States)

    Kayhart, Bryce; Lapid, Maria I; Nelson, Sarah; Cunningham, Julie L; Thompson, Virginia H; Leung, Jonathan G

    2018-01-01

    In the absence of suitable oral or intravenous access for medication administration and when the intramuscular medications are undesirable, alternative routes for drug delivery may be considered. Antipsychotics administered via an inhaled, intranasal, rectal, or topical route have been described in the literature. Topically administered antipsychotics have been previously reported to produce negligible systemic absorption despite being used in clinical practice for nausea and behavioral symptoms associated with dementia. Additionally, the American Academy of Hospice and Palliative Medicine recommends against the use of topical medications that lack supporting literature. Three studies have assessed the systemic absorption of different antipsychotics after administration of only a single, topically applied dose. To evaluate whether the repeated administration of a topically applied antipsychotic may result in detectable serum levels in an accumulating fashion, a pharmacokinetic study was conducted. Five healthy, adult participants consented to receive extemporaneously prepared topical quetiapine in Lipoderm every 4 hours for a total of 5 doses. Blood samples were drawn at baseline and hours 2, 4, 8, 12, 16, and 24, and serum quetiapine concentrations were measured using high-performance liquid chromatography. Quetiapine was undetectable in every sample from 3 participants. Two participants had minimally detectable serum quetiapine levels no sooner than hour 12 of the study period. Extemporaneously prepared quetiapine in Lipoderm resulted in nonexistent or minimal serum level following repeated topical administration. The use of topically applied quetiapine should still be questioned.

  20. Topic Modeling of Hierarchical Corpora

    OpenAIRE

    Kim, Do-kyum

    2014-01-01

    The sizes of modern digital libraries have grown beyond our capacity to comprehend manually. Thus we need new tools to help us in organizing and browsing large corpora of text that do not require manually examining each document. To this end, machine learning researchers have developed topic models, statistical learning algorithms for automatic comprehension of large collections of text. Topic models provide both global and local views of a corpus; they discover topics that run through the co...

  1. Testosterone Topical

    Science.gov (United States)

    ... not apply any testosterone topical products to your penis or scrotum or to skin that has sores, ... are severe or do not go away: breast enlargement and/or pain, decreased sexual desire, acne, depression ...

  2. Topic prominence in Chinese EFL learners’ interlanguage

    Directory of Open Access Journals (Sweden)

    Shaopeng Li

    2014-01-01

    Full Text Available The present study aims to investigate the general characteristics of topic-prominent typological interlanguage development of Chinese learners of English in terms of acquiring subject-prominent English structures from a discourse perspective. Topic structures mainly appear in Chinese discourse in the form of topic chains (Wang, 2002; 2004). The research targets are the topic chain, which is the main topic-prominent structure in Chinese discourse, and zero anaphora, which is the most common topic anaphora in the topic chain. Two important findings emerged from the present study. First, the characteristics of Chinese topic chains are transferrable to the interlanguage of Chinese EFL learners, thus resulting in overgeneralization of the zero anaphora. Second, the interlanguage discourse of Chinese EFL learners reflects a change of the second language acquisition process from topic-prominence to subject-prominence, thus lending support to the discourse transfer hypothesis.

  3. Systemic component of protoporphyrin IX production in nude mouse skin upon topical application of aminolevulinic acid depends on the application conditions

    NARCIS (Netherlands)

    van den Akker, Johanna T. H. M.; Iani, Vladimir; Star, Willem M.; Sterenborg, Henricus J. C. M.; Moan, Johan

    2002-01-01

    Topical application of 5-aminolevulinic acid (ALA) for protoporphyrin IX (PpIX)-based photodynamic therapy of skin cancer is generally considered not to induce systemic side effects because PpIX is supposed to be formed locally. However, earlier studies with topically applied ALA have revealed that

  4. Final report on the Copper Mountain conference on multigrid methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    The Copper Mountain Conference on Multigrid Methods was held on April 6-11, 1997. It followed the same format used in the previous Copper Mountain conferences on multigrid methods. Over 87 mathematicians from all over the world attended the meeting. Fifty-six half-hour talks on current research topics were presented, and talks with similar content were organized into sessions. Session topics included: fluids; domain decomposition; iterative methods; basics; adaptive methods; non-linear filtering; CFD; applications; transport; algebraic solvers; supercomputing; and student paper winners.

  5. Topical report review status: Volume 10

    International Nuclear Information System (INIS)

    1996-03-01

    This report provides industry with procedures for submitting topical reports, guidance on how the U.S. Nuclear Regulatory Commission (NRC) processes and responds to topical report submittals, and an accounting, with review schedules, of all topical reports currently accepted for review by the NRC. This report is published annually

  6. COMPARATIVE STUDY OF TOPICAL MOMETASONE FUROATE 0.1%, TOPICAL 0.03% TACROLIMUS, TOPICAL BASIC FIBROBLAST GROWTH FACTOR (bFGF) IN CHILDHOOD VITILIGO

    Directory of Open Access Journals (Sweden)

    Kavitha S. B

    2017-08-01

    Full Text Available BACKGROUND Vitiligo, the commonest of all pigmentary disorders, is an idiopathic, acquired cutaneous achromia characterised by circumscribed, chalky-white macules. It may also involve the pigment epithelium of the eyes, the inner ear and the leptomeninges. Although vitiligo can begin at any age, it develops before the age of 20 years in 50% of patients and before the age of 10 years in 25% of patients. MATERIALS AND METHODS The study was conducted over a period of one year with 6 months of active intervention. A group of 60 consecutive children attending the dermatology outpatient department were included in this study, with the same patients acting as controls. RESULTS Grade 4 response was seen in 12 cases (60%) who were on mometasone (VV 20%, focal 30%, segmental 10%), in 10 cases (50%) on tacrolimus (VV 20%, focal 30%) and in 4 cases (20%) on bFGF (focal). Lesions on the face and neck showed grade 4 response in 16 cases (mometasone 8, tacrolimus 6 and bFGF 2), and lesions on the extremities in 6 cases. On the whole, grade 4 response was observed most often with mometasone (60%), followed by tacrolimus (50%); grade 3 response was observed with bFGF (30%). CONCLUSION Topical mometasone was the most effective of the 3 drugs used in childhood vitiligo, showing grade 4 repigmentation in all types of vitiligo except mucosal vitiligo. Tacrolimus proved almost as effective as mometasone in restoring skin colour in lesions of vitiligo in children. Because it does not produce atrophy or other adverse effects, tacrolimus may be very useful for younger patients and for sensitive areas of the skin such as eyelids, and it should be considered in other skin disorders currently treated with topical steroids for prolonged periods. Topical basic fibroblast growth factor, though less effective than mometasone and tacrolimus, can be tried as initial therapy in resistant cases such as segmental vitiligo and for small vitiligo patches when physicians may not like to initiate high risk

  7. Chiropractic: MedlinePlus Health Topic

    Science.gov (United States)


  8. Diets: MedlinePlus Health Topic

    Science.gov (United States)


  9. Colonoscopy: MedlinePlus Health Topic

    Science.gov (United States)


  10. Dialysis: MedlinePlus Health Topic

    Science.gov (United States)


  11. Menopause: MedlinePlus Health Topic

    Science.gov (United States)


  12. Vaginitis: MedlinePlus Health Topic

    Science.gov (United States)


  13. Topical Acne Treatments and Pregnancy

    Science.gov (United States)

    Topical Acne Treatments In every pregnancy, a woman starts out with a 3-5% chance of having a baby ... This sheet talks about whether exposure to topical acne treatments may increase the risk for birth defects ...

  14. Symbiosis: Rich, Exciting, Neglected Topic

    Science.gov (United States)

    Rowland, Jane Thomas

    1974-01-01

    Argues that the topic of symbiosis has been greatly neglected and underemphasized in general-biology textbooks. Discusses many types and examples of symbiosis, and provides an extensive bibliography of the literature related to this topic. (JR)

  15. Prediabetes: MedlinePlus Health Topic

    Science.gov (United States)


  16. Diabetes: MedlinePlus Health Topic

    Science.gov (United States)


  17. Combining density functional theory calculations, supercomputing, and data-driven methods to design new materials (Conference Presentation)

    Science.gov (United States)

    Jain, Anubhav

    2017-04-01

    Density functional theory (DFT) simulations solve for the electronic structure of materials starting from the Schrödinger equation. Many case studies have now demonstrated that researchers can often use DFT to design new compounds in the computer (e.g., for batteries, catalysts, and hydrogen storage) before synthesis and characterization in the lab. In this talk, I will focus on how DFT calculations can be executed on large supercomputing resources in order to generate very large data sets on new materials for functional applications. First, I will briefly describe the Materials Project, an effort at LBNL that has virtually characterized over 60,000 materials using DFT and has shared the results with over 17,000 registered users. Next, I will talk about how such data can help discover new materials, describing how preliminary computational screening led to the identification and confirmation of a new family of bulk AMX2 thermoelectric compounds with measured zT reaching 0.8. I will outline future plans for how such data-driven methods can be used to better understand the factors that control thermoelectric behavior, e.g., for the rational design of electronic band structures, in ways that are different from conventional approaches.
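
    At its core, the screening step described here is a filter over a table of computed properties. The sketch below illustrates the idea on a toy scale; the entries, property names and thresholds are hypothetical, and this is not Materials Project data or its API.

      # Toy computational-screening filter (hypothetical entries and thresholds).
      candidates = [
          {"formula": "CuSbS2",  "band_gap_eV": 0.9, "e_above_hull_eV": 0.00},
          {"formula": "AgBiSe2", "band_gap_eV": 0.6, "e_above_hull_eV": 0.02},
          {"formula": "NaAlO2",  "band_gap_eV": 4.1, "e_above_hull_eV": 0.00},
      ]

      # Keep compounds that are (near-)stable and have a band gap in a range
      # plausibly useful for thermoelectrics (illustrative cutoffs only).
      hits = [c for c in candidates
              if c["e_above_hull_eV"] <= 0.05 and 0.3 <= c["band_gap_eV"] <= 1.5]
      for c in hits:
          print(c["formula"])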

  18. A Parallel Supercomputer Implementation of a Biological Inspired Neural Network and its use for Pattern Recognition

    International Nuclear Information System (INIS)

    De Ladurantaye, Vincent; Lavoie, Jean; Bergeron, Jocelyn; Parenteau, Maxime; Lu Huizhong; Pichevar, Ramin; Rouat, Jean

    2012-01-01

    A parallel implementation of a large spiking neural network is proposed and evaluated. The neural network implements the binding by synchrony process using the Oscillatory Dynamic Link Matcher (ODLM). Scalability, speed and performance are compared for 2 implementations: Message Passing Interface (MPI) and Compute Unified Device Architecture (CUDA) running on clusters of multicore supercomputers and NVIDIA graphical processing units respectively. A global spiking list that represents at each instant the state of the neural network is described. This list indexes each neuron that fires during the current simulation time so that the influence of their spikes is processed simultaneously on all computing units. Our implementation shows good scalability for very large networks. A complex and large spiking neural network has been implemented in parallel with success, thus paving the road towards real-life applications based on networks of spiking neurons. MPI offers better scalability than CUDA, while the CUDA implementation on a GeForce GTX 285 gives the best cost-to-performance ratio. When running the neural network on the GTX 285, the processing speed is comparable to the MPI implementation on RQCHP's Mammouth parallel cluster with 64 nodes (128 cores).
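
    The global spiking list can be pictured as follows: each rank advances its own neurons, collects the indices that fired at this instant, and exchanges them so that every computing unit sees the same list. The mpi4py sketch below illustrates that exchange with toy membrane dynamics; it is not the ODLM code, and the neuron counts, threshold and coupling are assumptions.

      # Global spike-list exchange with mpi4py (toy dynamics; not the ODLM code).
      # Run with, e.g.: mpiexec -n 4 python spikes.py
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      n_local = 1000                       # neurons owned by this rank (assumed)
      offset = rank * n_local              # global index of this rank's first neuron
      rng = np.random.default_rng(rank)
      potential = rng.uniform(0.0, 1.0, n_local)

      for step in range(10):
          potential += rng.uniform(0.0, 0.1, n_local)   # toy membrane dynamics
          fired = np.nonzero(potential >= 1.0)[0]
          potential[fired] = 0.0                        # reset neurons that fired

          # Build the global spiking list for this instant: every rank learns
          # which neurons fired anywhere in the network.
          spike_list = np.concatenate(comm.allgather(fired + offset))

          # Process all spikes simultaneously on every computing unit
          # (here a toy uniform synaptic influence on local potentials).
          potential += 1e-4 * len(spike_list)

      if rank == 0:
          print("global spikes in last step:", len(spike_list))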

  19. Tracking topic birth and death in LDA.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Andrew T.; Robinson, David Gerald

    2011-09-01

    Most topic modeling algorithms that address the evolution of documents over time use the same number of topics at all times. This obscures the common occurrence in the data where new subjects arise and old ones diminish or disappear entirely. We propose an algorithm to model the birth and death of topics within an LDA-like framework. The user selects an initial number of topics, after which new topics are created and retired without further supervision. Our approach also accommodates many of the acceleration and parallelization schemes developed in recent years for standard LDA. In recent years, topic modeling algorithms such as latent semantic analysis (LSA)[17], latent Dirichlet allocation (LDA)[10] and their descendants have offered a powerful way to explore and interrogate corpora far too large for any human to grasp without assistance. Using such algorithms we are able to search for similar documents, model and track the volume of topics over time, search for correlated topics or model them with a hierarchy. Most of these algorithms are intended for use with static corpora where the number of documents and the size of the vocabulary are known in advance. Moreover, almost all current topic modeling algorithms fix the number of topics as one of the input parameters and keep it fixed across the entire corpus. While this is appropriate for static corpora, it becomes a serious handicap when analyzing time-varying data sets where topics come and go as a matter of course. This is doubly true for online algorithms that may not have the option of revising earlier results in light of new data. To be sure, these algorithms will account for changing data one way or another, but without the ability to adapt to structural changes such as entirely new topics they may do so in counterintuitive ways.
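
    The abstract does not spell out the algorithm, but the bookkeeping it implies can be sketched generically: track each topic's mass per time slice, retire topics whose mass falls below a floor, and spawn a topic when the current slice is poorly explained. The following is a generic illustration of that idea under assumed thresholds, not the authors' method.

      # Generic birth/death bookkeeping over time slices (not the paper's algorithm).
      def update_topics(topics, slice_mass, poor_fit_fraction,
                        death_floor=0.01, birth_trigger=0.2):
          # topics: topic_id -> mass share carried from the previous slice
          # slice_mass: topic_id -> mass share observed in the current slice
          # poor_fit_fraction: share of documents no existing topic explains well
          for t in list(topics):
              topics[t] = slice_mass.get(t, 0.0)
              if topics[t] < death_floor:        # death: the topic has faded out
                  del topics[t]
          if poor_fit_fraction > birth_trigger:  # birth: unexplained mass appears
              topics[max(topics, default=-1) + 1] = poor_fit_fraction
          return topics

      topics = update_topics({0: 0.50, 1: 0.40, 2: 0.10},
                             {0: 0.55, 1: 0.005, 2: 0.15},
                             poor_fit_fraction=0.3)
      print(topics)   # topic 1 retired; new topic 3 created -> {0: 0.55, 2: 0.15, 3: 0.3}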

  20. Topics in lightwave transmission systems

    CERN Document Server

    Li, Tingye

    1991-01-01

    Topics in Lightwave Transmission Systems is a second volume of a treatise on optical fiber communications that is devoted to the science, engineering, and application of information transmission via optical fibers. The first volume, published in 1985, dealt exclusively with fiber fabrication. The present volume contains topics that pertain to subsystems and systems. The book contains five chapters and begins with discussions of transmitters and receivers, which are basic to systems now operating in the field. Subsequent chapters cover topics relating to coherent systems: frequency and phase m

  1. Osteoarthritis guidelines: a progressive role for topical nonsteroidal anti-inflammatory drugs

    Directory of Open Access Journals (Sweden)

    Stanos SP

    2013-04-01

    Full Text Available Steven P Stanos Rehabilitation Institute of Chicago, Center for Pain Management, Chicago, IL, USA Abstract: Current treatment guidelines for the treatment of chronic pain associated with osteoarthritis reflect the collective clinical knowledge of international experts in weighing the benefits of pharmacologic therapy options while striving to minimize the negative effects associated with them. Consideration of disease progression, pattern of flares, level of functional impairment or disability, response to treatment, coexisting conditions such as cardiovascular disease or gastrointestinal disorders, and concomitant prescription medication use should be considered when creating a therapeutic plan for a patient with osteoarthritis. Although topical nonsteroidal anti-inflammatory drugs historically have not been prevalent in many of the guidelines for osteoarthritis treatment, recent evidence-based medicine and new guidelines now support their use as a viable option for the clinician seeking alternatives to typical oral formulations. This article provides a qualitative review of these treatment guidelines and the emerging role of topical nonsteroidal anti-inflammatory drugs as a therapy option for patients with localized symptoms of osteoarthritis who may be at risk for oral nonsteroidal anti-inflammatory drug-related serious adverse events. Keywords: osteoarthritis, nonsteroidal anti-inflammatory drugs, guidelines, topical analgesics, diclofenac

  2. Prevalence of Topical Corticosteroids Related Adverse Drug Events and Associated Factors in Selected Community Pharmacies and Cosmetic Shops of Addis Ababa, Ethiopia

    Directory of Open Access Journals (Sweden)

    Mahlet Tsegaye

    2018-03-01

    Conclusion: The majority of topical corticosteroids were obtained without prescription for the purpose of beautification rather than treatment. A higher proportion of cosmetic users reported having experienced at least one adverse event. Safety concerns related to topical corticosteroid use in the city need to be addressed.

  3. Forschungszentrum Juelich. Annual report 2008

    International Nuclear Information System (INIS)

    Frick, Frank; Roegener, Wiebke

    2009-07-01

    The following topics are dealt with: the precise lattice-QCD mass calculation of protons and neutrons by means of the JUGENE supercomputer, the early diagnosis of Alzheimer's disease, the fabrication of vertebral-column implants made of porous titanium, and software for improving the spatial resolution in electron microscopy by means of aberration correction. (HSI)

  4. Forschungszentrum Juelich. Annual report 2008; Forschungszentrum Juelich. Jahresbericht 2008

    Energy Technology Data Exchange (ETDEWEB)

    Frick, Frank; Roegener, Wiebke

    2009-07-15

    The following topics are dealt with: the precise lattice-QCD mass calculation of protons and neutrons by means of the JUGENE supercomputer, the early diagnosis of Alzheimer's disease, the fabrication of vertebral-column implants made of porous titanium, and software for improving the spatial resolution in electron microscopy by means of aberration correction. (HSI)

  5. High Performance Simulation of Large-Scale Red Sea Ocean Bottom Seismic Data on the Supercomputer Shaheen II

    KAUST Repository

    Tonellot, Thierry

    2017-02-27

    A combination of both shallow and deepwater, plus islands and coral reefs, are some of the main features contributing to the complexity of subsalt seismic exploration in the Red Sea transition zone. These features often result in degrading effects on seismic images. State-of-the-art ocean bottom acquisition technologies are therefore required to record seismic data with optimal fold and offset, as well as advanced processing and imaging techniques. Numerical simulations of such complex seismic data can help improve acquisition design and also help in customizing, validating and benchmarking the processing and imaging workflows that will be applied on the field data. Subsequently, realistic simulation of wave propagation is a computationally intensive process requiring a realistic model and an efficient 3D wave equation solver. Large-scale computing resources are also required to meet turnaround time compatible with a production time frame. In this work, we present the numerical simulation of an ocean bottom seismic survey to be acquired in the Red Sea transition zone starting in summer 2016. The survey's acquisition geometry comprises nearly 300,000 unique shot locations and 21,000 unique receiver locations, covering about 760 km2. Using well log measurements and legacy 2D seismic lines in this area, a 3D P-wave velocity model was built, with a maximum depth of 7 km. The model was sampled at 10 m in each direction, resulting in more than 5 billion cells. Wave propagation in this model was performed using a 3D finite difference solver in the time domain based on a staggered grid velocity-pressure formulation of acoustodynamics. To ensure that the resulting data could be generated sufficiently fast, the King Abdullah University of Science and Technology (KAUST) supercomputer Shaheen II Cray XC40 was used. A total of 21,000 three-component (pressure and vertical and horizontal velocity) common receiver gathers with a 50 Hz maximum frequency were computed in less

  6. High Performance Simulation of Large-Scale Red Sea Ocean Bottom Seismic Data on the Supercomputer Shaheen II

    KAUST Repository

    Tonellot, Thierry; Etienne, Vincent; Gashawbeza, Ewenet; Curiel, Emesto Sandoval; Khan, Azizur; Feki, Saber; Kortas, Samuel

    2017-01-01

    A combination of both shallow and deepwater, plus islands and coral reefs, are some of the main features contributing to the complexity of subsalt seismic exploration in the Red Sea transition zone. These features often result in degrading effects on seismic images. State-of-the-art ocean bottom acquisition technologies are therefore required to record seismic data with optimal fold and offset, as well as advanced processing and imaging techniques. Numerical simulations of such complex seismic data can help improve acquisition design and also help in customizing, validating and benchmarking the processing and imaging workflows that will be applied on the field data. Subsequently, realistic simulation of wave propagation is a computationally intensive process requiring a realistic model and an efficient 3D wave equation solver. Large-scale computing resources are also required to meet turnaround time compatible with a production time frame. In this work, we present the numerical simulation of an ocean bottom seismic survey to be acquired in the Red Sea transition zone starting in summer 2016. The survey's acquisition geometry comprises nearly 300,000 unique shot locations and 21,000 unique receiver locations, covering about 760 km2. Using well log measurements and legacy 2D seismic lines in this area, a 3D P-wave velocity model was built, with a maximum depth of 7 km. The model was sampled at 10 m in each direction, resulting in more than 5 billion cells. Wave propagation in this model was performed using a 3D finite difference solver in the time domain based on a staggered grid velocity-pressure formulation of acoustodynamics. To ensure that the resulting data could be generated sufficiently fast, the King Abdullah University of Science and Technology (KAUST) supercomputer Shaheen II Cray XC40 was used. A total of 21,000 three-component (pressure and vertical and horizontal velocity) common receiver gathers with a 50 Hz maximum frequency were computed in less than
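
    A staggered-grid velocity-pressure scheme of the kind mentioned is compact enough to sketch in 1D: pressure and particle velocity live on grids offset by half a cell and leapfrog in time. The toy sketch below uses the 10 m cell size quoted above but an assumed homogeneous velocity, time step and source; it is not the production 3D solver.

      # 1D staggered-grid velocity-pressure acoustic update (toy sketch).
      import numpy as np

      nx, nt = 400, 800
      dx, dt = 10.0, 1e-3          # 10 m cells as quoted; dt gives c*dt/dx = 0.2
      c = 2000.0                   # assumed homogeneous P-wave velocity (m/s)
      rho = 1000.0                 # assumed density (kg/m^3)
      kappa = rho * c**2           # bulk modulus

      p = np.zeros(nx)             # pressure on integer grid points
      v = np.zeros(nx - 1)         # particle velocity on half-offset points

      src = nx // 2
      for it in range(nt):
          # dv/dt = -(1/rho) dp/dx  (velocity leapfrogs pressure by half a step)
          v -= (dt / (rho * dx)) * (p[1:] - p[:-1])
          # dp/dt = -kappa dv/dx
          p[1:-1] -= (dt * kappa / dx) * (v[1:] - v[:-1])
          # Ricker-like source wavelet injected at the centre of the grid.
          t = (it * dt - 0.05) / 0.015
          p[src] += (1.0 - 2.0 * t**2) * np.exp(-t**2)

      print("peak |p| after", nt, "steps:", np.abs(p).max())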

  7. Minoxidil topical solution: an unsafe product for children.

    Science.gov (United States)

    Claudet, Isabelle; Cortey, Caroline; Honorat, Raphaele; Franchitto, Nicolas

    2015-01-01

    Minoxidil hair formulation is commonly used for the treatment of male or female androgenic alopecia. This over-the-counter product is wrongly considered safe. The ingestion of a few milliliters by a child can lead to significant intoxication. We report a case of significant intoxication after the ingestion of topical minoxidil (Alopexy; Pierre Fabre Laboratoires, SA, Switzerland). A 7-year-old girl, who accidentally ingested a teaspoon of minoxidil hair solution, presented to the pediatric emergency department for emesis. At admission, she had a blood pressure of 86/56 mm Hg and a pulse of 149 beats per minute. Hypotension lasted 40 hours with the lowest value 24 hours after ingestion (79/33 mm Hg). She presented electrocardiogram changes (sinus tachycardia and flattening T-waves) but normal cardiac enzymes. Infusion of 20 mL/kg of normal saline fluid had no hemodynamic effect. Her blood pressure normalized on day 2. Minoxidil topical solution is an unsafe product for children. This formulation should be strictly kept out of reach of children and manufacturers should enhance child-resistance security of packaging. The over-the-counter availability must be questioned.

  8. Preliminary stop of the TOPical Imiquimod treatment of high-grade Cervical intraepithelial neoplasia (TOPIC) trial

    NARCIS (Netherlands)

    Koeneman, M. M.; Kruse, Arnold-Jan; Kooreman, L. F. S.; zur Hausen, Axel; Hopman, Anton H N; Sep, S. J. S.; Van Gorp, T.; Slangen, B. F. M.; van Beekhuizen, H. J.; de Sande, Michiel A. J. van; Gerestein, Cornelis G.; Nijman, H. W.; Kruitwagen, R. F. M. P.

    2017-01-01

    The "TOPical Imiquimod treatment of high-grade Cervical intraepithelial neoplasia" (TOPIC) trial was stopped preliminary, due to lagging inclusions. This study aimed to evaluate the treatment efficacy and clinical applicability of imiquimod 5% cream in high-grade cervical intraepithelial neoplasia

  9. How do disease perception, treatment features, and dermatologist–patient relationship impact on patients assuming topical treatment? An Italian survey

    Directory of Open Access Journals (Sweden)

    Burroni AG

    2015-02-01

    Full Text Available Anna Graziella Burroni,1 Mariella Fassino,2 Antonio Torti,3 Elena Visentin4 1IRCCS University Hospital San Martino, IST National Institute for Cancer Research, Genoa, Italy; 2Department of Psychology, Specialization School in Clinical Psychology, University of Turin, Turin, Italy; 3Dermatology practice, Milan, Italy; 4HTA and Scientific Support, CSD Medical Research Srl, Milan, Italy Background: Psoriasis largely affects daily activities and social interactions and has a strong impact on patients’ quality of life. Psoriatic patients have different attitudes toward their condition. Topical medications are essential for the treatment of psoriasis, but the majority of patients do not adhere to these therapies. Objective: The history of treatment success or failure seems to influence patient attitude toward topical therapy. Therefore, it is important to understand the psychological, experiential, and motivational aspects that could be critical for treatment adherence, and to describe the different attitudes toward topical treatment. Furthermore, the physician–patient relationship and the willingness to trust the dermatologist may have a substantial role in encouraging or discouraging patients’ attitudes toward topical therapy. Methods: A survey was designed to collect aspects that could be relevant to understanding different patient attitudes toward psoriasis and its treatments. A total of 495 self-administered questionnaires compiled by psoriatic patients were analyzed from 20 Italian specialized hospital centers in order to provide a nationwide picture. Results: Psoriatic patients have different perceptions and experiences in relation to their condition: half of them consider psoriasis as a disease, while the other half consider psoriasis as a disorder or a nuisance. Topical therapy is the most widely used treatment, even though it is not considered the most effective one and often perceived to be cosmetic. The main findings are: 1

  10. Deep Unfolding for Topic Models.

    Science.gov (United States)

    Chien, Jen-Tzung; Lee, Chao-Hsi

    2018-02-01

    Deep unfolding provides an approach to integrate the probabilistic generative models and the deterministic neural networks. Such an approach benefits from deep representation, easy interpretation, flexible learning and stochastic modeling. This study develops the unsupervised and supervised learning of deep unfolded topic models for document representation and classification. Conventionally, the unsupervised and supervised topic models are inferred via the variational inference algorithm where the model parameters are estimated by maximizing the lower bound of the logarithm of the marginal likelihood using input documents without and with class labels, respectively. The representation capability or classification accuracy is constrained by the variational lower bound and the model parameters tied across the inference procedure. This paper aims to relax these constraints by directly maximizing the end performance criterion and continuously untying the parameters in the learning process via deep unfolding inference (DUI). The inference procedure is treated as the layer-wise learning in a deep neural network. The end performance is iteratively improved by using the estimated topic parameters according to the exponentiated updates. Deep learning of topic models is therefore implemented through a back-propagation procedure. Experimental results show the merits of DUI with increasing number of layers compared with variational inference in unsupervised as well as supervised topic models.
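
    The exponentiated update that drives the unfolding can be illustrated directly: each "layer" multiplies the current topic proportions by an exponentiated gradient of the document log-likelihood and renormalizes, with step sizes untied across layers. The numpy toy below is a sketch under an assumed objective and step sizes, not the authors' DUI implementation.

      # Unfolded exponentiated updates for topic proportions (toy illustration).
      import numpy as np

      rng = np.random.default_rng(1)
      K, V = 5, 50
      beta = rng.dirichlet(np.ones(V), size=K)   # topic-word matrix (held fixed here)
      counts = rng.integers(0, 3, size=V)        # word counts of one document

      theta = np.full(K, 1.0 / K)                # initial topic proportions
      etas = [0.05, 0.05, 0.03, 0.02, 0.01]      # per-layer step sizes, untied across layers

      for eta in etas:                           # each iteration = one unfolded layer
          # Gradient of the log-likelihood sum_w n_w * log(theta @ beta[:, w]).
          grad = beta @ (counts / (theta @ beta))
          # Multiplicative (exponentiated) update; shifting by grad.max() only
          # rescales all components and is removed by the renormalization.
          theta = theta * np.exp(eta * (grad - grad.max()))
          theta /= theta.sum()

      print("inferred proportions:", np.round(theta, 3))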

  11. Bimatoprost Topical

    Science.gov (United States)

    ... not use a cotton swab or any other brush or applicator to apply topical bimatoprost. To use the solution, follow these steps: Wash your hands and face thoroughly with soap and water. Be sure that all makeup is removed. Do not let the tip of ...

  12. Efficacy and tolerability of topical sertaconazole versus topical terbinafine in localized dermatophytosis: A randomized, observer-blind, parallel group study.

    Science.gov (United States)

    Chatterjee, Dattatreyo; Ghosh, Sudip Kumar; Sen, Sukanta; Sarkar, Saswati; Hazra, Avijit; De, Radharaman

    2016-01-01

    Epidermal dermatophyte infections most commonly manifest as tinea corporis or tinea cruris. Topical azole antifungals are commonly used in their treatment but literature suggests that most require twice-daily application and provide lower cure rates than the allylamine antifungal terbinafine. We conducted a head-to-head comparison of the effectiveness of the once-daily topical azole, sertaconazole, with terbinafine in these infections. We conducted a randomized, observer-blind, parallel group study (Clinical Trial Registry India [CTRI]/2014/09/005029) with adult patients of either sex presenting with localized lesions. The clinical diagnosis was confirmed by potassium hydroxide smear microscopy of skin scrapings. After baseline assessment of erythema, scaling, and pruritus, patients applied either of the two study drugs once daily for 2 weeks. If clinical cure was not seen at 2 weeks, but improvement was noted, application was continued for a further 2 weeks. Patients deemed to be clinical failure at 2 weeks were switched to oral antifungals. Overall 88 patients on sertaconazole and 91 on terbinafine were analyzed. At 2 weeks, the clinical cure rates were comparable at 77.27% (95% confidence interval [CI]: 68.52%-86.03%) for sertaconazole and 73.63% (95% CI 64.57%-82.68%) for terbinafine (P = 0.606). Fourteen patients in either group improved and on further treatment showed complete healing by another 2 weeks. The final cure rate at 4 weeks was also comparable at 93.18% (95% CI 88.75%-97.62%) and 89.01% (95% CI 82.59%-95.44%), respectively (P = 0.914). At 2 weeks, 6 (6.82%) sertaconazole and 10 (10.99%) terbinafine recipients were considered as "clinical failure." Tolerability of both preparations was excellent. Despite the limitations of an observer-blind study without microbiological support, the results suggest that once-daily topical sertaconazole is as effective as terbinafine in localized tinea infections.
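
    The confidence intervals quoted above are consistent with simple Wald intervals, p ± 1.96·sqrt(p(1-p)/n). A quick check, assuming that formula and per-arm counts of 68/88 and 67/91 cured at 2 weeks, reproduces the reported bounds:

        # Check the reported 95% CIs with a Wald interval (an assumption about
        # the authors' method; counts inferred from the reported percentages).
        from math import sqrt

        def wald_ci(successes, n, z=1.96):
            p = successes / n
            half = z * sqrt(p * (1 - p) / n)
            return 100 * (p - half), 100 * (p + half)

        print(wald_ci(68, 88))   # sertaconazole: ~(68.52, 86.03), as reported
        print(wald_ci(67, 91))   # terbinafine: ~(64.57, 82.68), as reported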

  13. A web search on environmental topics: what is the role of ranking?

    Science.gov (United States)

    Covolo, Loredana; Filisetti, Barbara; Mascaretti, Silvia; Limina, Rosa Maria; Gelatti, Umberto

    2013-12-01

    Although the Internet is easy to use, the mechanisms and logic behind a Web search are often unknown. Reliable information can be obtained, but it may not be visible as the Web site is not located in the first positions of search results. The possible risks of adverse health effects arising from environmental hazards are issues of increasing public interest, and therefore the information about these risks, particularly on topics for which there is no scientific evidence, is very crucial. The aim of this study was to investigate whether the presentation of information on some environmental health topics differed among various search engines, assuming that the most reliable information should come from institutional Web sites. Five search engines were used: Google, Yahoo!, Bing, Ask, and AOL. The following topics were searched in combination with the word "health": "nuclear energy," "electromagnetic waves," "air pollution," "waste," and "radon." For each topic three key words were used. The first 30 search results for each query were considered. The ranking variability among the search engines and the type of search results were analyzed for each topic and for each key word. The ranking of institutional Web sites was given particular consideration. Variable results were obtained when surfing the Internet on different environmental health topics. Multivariate logistic regression analysis showed that, when searching for radon and air pollution topics, it is more likely to find institutional Web sites in the first 10 positions compared with nuclear power (odds ratio=3.4, 95% confidence interval 2.1-5.4 and odds ratio=2.9, 95% confidence interval 1.8-4.7, respectively) and also when using Google compared with Bing (odds ratio=3.1, 95% confidence interval 1.9-5.1). The increasing use of online information could play an important role in forming opinions. Web users should become more aware of the importance of finding reliable information, and health institutions should be
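
    For readers unfamiliar with where such figures come from: in a multivariate logistic regression, the odds ratio for a predictor is exp(beta) and its 95% confidence interval is exp(beta ± 1.96·SE). A sketch on synthetic data (illustrative only, not the study's dataset):

        # Odds ratios and 95% CIs from a multivariate logistic regression,
        # the kind of analysis reported above (synthetic, illustrative data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(300, 2)).astype(float)  # e.g. topic, engine
        logits = -0.5 + 1.2 * X[:, 0] + 1.1 * X[:, 1]
        y = (rng.random(300) < 1 / (1 + np.exp(-logits))).astype(float)

        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(np.exp(fit.params))      # odds ratios: exp(beta)
        print(np.exp(fit.conf_int()))  # 95% CIs: exp(beta +/- 1.96*SE)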

  14. PREFACE: CEWQO Topical Issue

    Science.gov (United States)

    Bozic, Mirjana; Man'ko, Margarita

    2009-09-01

    This topical issue of Physica Scripta collects selected peer-reviewed contributions based on invited and contributed talks and posters presented at the 15th Central European Workshop on Quantum Optics (CEWQO), which took place in Belgrade, 29 May-3 June 2008 (http://cewqo08.phy.bg.ac.yu). On behalf of the whole community of the workshop, we thank the referees for their careful reading and useful suggestions which helped to improve all of the submitted papers. A brief description of CEWQO The Central European Workshop on Quantum Optics is a series of conferences started informally in Budapest in 1992. Sometimes small events transform into important conferences, as in the case of CEWQO. Professor Jozsef Janszky, from the Research Institute of Solid State Physics and Optics, is the founder of this series. Margarita Man'ko obtained the following information from Jozsef Janszky during her visit to Budapest, within the framework of cooperation between the Russian and Hungarian Academies of Sciences in 2005. He organized a small workshop on quantum optics in Budapest in 1992 with John Klauder as a main speaker. Then, bearing in mind that a year before Janszky himself had been invited by Vladimir Buzek to give a seminar on the same topic in Bratislava, he decided to adopt the name 'Central European Workshop on Quantum Optics', considering the seminar in Bratislava to be the first workshop and the one in Budapest the second. The third formal workshop took place in Bratislava in 1993, organized by Vladimir Buzek, then in 1994 (Budapest, by Jozsef Janszky), 1995 and 1996 (Budmerice, Slovakia, by Vladimir Buzek), 1997 (Prague, by Igor Jex), 1999 (Olomouc, Czech Republic, by Zdenek Hradil), 2000 (Balatonfüred, Hungary, by Jozsef Janszky), 2001 (Prague, by Igor Jex), 2002 (Szeged, Hungary, by Mihaly Benedict), 2003 (Rostock, Germany, by Werner Vogel and

  15. Understanding the Correlations between Social Attention and Topic Trends of Scientific Publications

    Directory of Open Access Journals (Sweden)

    Xianlei Dong

    2016-03-01

    variable selection models, because the stepwise regression method is time consuming, especially for a large number of variables. Practical implications: This paper analyzes publication topic trends from three perspectives: tendency, seasonality, and correlation with social media attention, providing a new perspective for identifying and understanding topical themes in academic publications. Originality/value: To the best of our knowledge, we are the first to apply the state-space model to examine the relationships between healthcare-related publications and social media to investigate the relationships between a topic's evolvement and people's search behavior in social media. This paper thus provides a new viewpoint in the correlation analysis area, and demonstrates the value of considering social media attention in the analysis of publication topic trends.
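
    The tendency and seasonality analysis the abstract mentions is the kind of decomposition a structural (state-space) time-series model provides. The sketch below fits a local-linear-trend model with an annual seasonal component to synthetic monthly publication counts; the paper's exact specification is not given in this excerpt, so treat the model choice as illustrative.

        # Structural state-space model: trend + annual seasonality for a topic's
        # monthly publication counts (synthetic data; illustrative model choice).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        t = np.arange(120)                           # ten years of monthly data
        counts = (np.linspace(10, 30, 120)           # rising tendency
                  + 5 * np.sin(2 * np.pi * t / 12)   # seasonality
                  + rng.normal(0, 2, 120))           # noise

        model = sm.tsa.UnobservedComponents(
            counts, level='local linear trend',
            freq_seasonal=[{'period': 12, 'harmonics': 2}])
        res = model.fit(disp=0)
        print(res.summary())    # estimated trend and seasonal components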

  16. Proceedings of the 2. international conference on simulation methods in nuclear engineering

    International Nuclear Information System (INIS)

    Brais, A.

    1986-10-01

    The fifty papers presented at this conference cover the field of computerized simulation and mathematical modelling of processes in PWR and CANDU type reactors. Topics include thermalhydraulics of system transients, complex geometries, multi-fluid systems, and general situations; reactor physics simulations; reactor simulators; fuel and fuel channel behaviour; and applications of supercomputers, parallel processing, and microcomputers

  17. Engineering Technology Showcase to feature high-tech products from 18 companies

    OpenAIRE

    Gilbert, Karen

    2007-01-01

    The Student Technology Council (STC) at Virginia Tech is sponsoring an Engineering Technology Showcase on Tuesday, March 27. In addition to providing a platform for technology companies to show off their most recent innovations, technology presentations will be offered by Virginia Tech faculty and staff on topics ranging from a virtual greenhouse to the System X supercomputer.

  18. SERS internship fall 1995 abstracts and research papers

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Beverly

    1996-05-01

    This report is a compilation of twenty abstracts and their corresponding full papers of research projects done under the US Department of Energy Science and Engineering Research Semester (SERS) program. Papers cover a broad range of topics, for example, environmental transport, supercomputers, databases, and biology. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database.

  19. Learning topic models by belief propagation.

    Science.gov (United States)

    Zeng, Jia; Cheung, William K; Liu, Jiming

    2013-05-01

    Latent Dirichlet allocation (LDA) is an important hierarchical Bayesian model for probabilistic topic modeling, which attracts worldwide interest and touches on many important applications in text mining, computer vision and computational biology. This paper represents the collapsed LDA as a factor graph, which enables the classic loopy belief propagation (BP) algorithm to be used for approximate inference and parameter estimation. Although the two commonly used approximate inference methods, variational Bayes (VB) and collapsed Gibbs sampling (GS), have gained great success in learning LDA, the proposed BP is competitive in both speed and accuracy, as validated by encouraging experimental results on four large-scale document datasets. Furthermore, the BP algorithm has the potential to become a generic scheme for learning variants of LDA-based topic models in the collapsed space. To this end, we show how to learn two typical variants of LDA-based topic models, author-topic models (ATM) and relational topic models (RTM), using BP based on the factor graph representations.
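
    The message-passing scheme the abstract summarizes can be sketched concretely: each word token carries a message (a distribution over the K topics), and messages are refreshed from expected topic counts that exclude the token's own contribution. The code below is a simplified, generic rendering of loopy BP for collapsed LDA, not the authors' implementation; hyperparameters and the toy corpus are illustrative.

        # Simplified loopy-BP-style updates for collapsed LDA on a toy corpus.
        # (A sketch of the general scheme, not the paper's exact algorithm.)
        import numpy as np

        def lda_bp(docs, V, K, alpha=0.1, beta=0.01, iters=50, seed=0):
            rng = np.random.default_rng(seed)
            tokens = [(d, w) for d, doc in enumerate(docs) for w in doc]
            mu = rng.dirichlet(np.ones(K), size=len(tokens))  # per-token messages
            D = len(docs)
            for _ in range(iters):
                ndk = np.zeros((D, K))          # expected doc-topic counts
                nwk = np.zeros((V, K))          # expected word-topic counts
                for (d, w), m in zip(tokens, mu):
                    ndk[d] += m
                    nwk[w] += m
                nk = nwk.sum(axis=0)
                for i, (d, w) in enumerate(tokens):
                    m = mu[i]
                    # Exclude this token's own message, then recompute it.
                    new = (ndk[d] - m + alpha) * (nwk[w] - m + beta) \
                          / (nk - m + V * beta)
                    mu[i] = new / new.sum()
            return ndk, nwk

        docs = [[0, 1, 1, 2], [2, 3, 3, 4]]     # word ids in range(V)
        ndk, nwk = lda_bp(docs, V=5, K=2)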

  20. Bare quantifier fronting as contrastive topicalization

    Directory of Open Access Journals (Sweden)

    Ion Giurgea

    2015-11-01

    I argue that indefinites (in particular bare quantifiers such as ‘something’, ‘somebody’, etc.), which are neither existentially presupposed nor in the restriction of a quantifier over situations, can undergo topicalization in a number of Romance languages (Catalan, Italian, Romanian, Spanish), but only if the sentence contains “verum” focus, i.e. focus on a high degree of certainty of the sentence. I analyze these indefinites as contrastive topics, using Büring’s (1999) theory (where the term ‘S-topic’ is used for what I call ‘contrastive topic’). I propose that the topic is evaluated in relation to a scalar set including generalized quantifiers such as {λP.∃x P(x), λP.MANYx P(x), λP.MOSTx P(x), λP.∀x P(x)} or {λP.∃x P(x), λP.P(a), λP.P(b), …}, and that the contrastive topic is the weakest generalized quantifier in this set. The verum focus, which is part of the “comment” that co-occurs with the “Topic”, introduces a set of alternatives including degrees of certainty of the assertion. The speaker asserts that his claim is certainly true or highly probable, contrasting it with stronger claims for which the degree of probability is unknown. This explains the observation that in downward entailing contexts, the fronted quantified DPs are headed by ‘all’ or ‘many’, whereas ‘some’, small numbers or ‘at least n’ appear in upward entailing contexts. Unlike other cases of non-specific topics, which are property topics, these are quantifier topics: the topic part is a generalized quantifier, the comment is a property of generalized quantifiers. This explains the narrow scope of the fronted quantified DP.

  2. Secondary School Students' Knowledge and Opinions on Astrobiology Topics and Related Social Issues.

    Science.gov (United States)

    Oreiro, Raquel; Solbes, Jordi

    2017-01-01

    Astrobiology is the study of the origin of life on Earth and the distribution of life in the Universe. Its multidisciplinary approach, social and philosophical implications, and appeal within the discipline and beyond make astrobiology a uniquely qualified subject for general science education. In this study, student knowledge and opinions on astrobiology topics were investigated. Eighty-nine students in their last year of compulsory education (age 15) completed a written questionnaire that consisted of 10 open questions on the topic of astrobiology. The results indicate that students have significant difficulties understanding the origin of life on Earth, despite exposure to the topic by way of the assigned textbooks. The students were often unaware of past or present achievements in the search for life within the Solar System and beyond, topics that are far less commonly seen in textbooks. Student questionnaire answers also indicated that students had problems in reasoning and critical thinking when asked for their opinions on issues such as the potential for life beyond Earth, the question of whether UFOs exist, or what our place is in the Universe. Astrobiology might help initiate student awareness as to current thinking on these matters and should be considered for general science education. Key Words: Astrobiology-Students' views-Science education. Astrobiology 17, 91-99.

  3. Resources for Topics in Architecture.

    Science.gov (United States)

    Van Noate, Judith, Comp.

    This guide for conducting library research on topics in architecture or on the work of a particular architect presents suggestions for utilizing four categories of resources: books, dictionaries and encyclopedias, indexes, and a periodicals and series list (PASL). Two topics are researched as examples: the contemporary architect Richard Meier, and…

  4. Web directories as topical context

    NARCIS (Netherlands)

    Kaptein, R.; Kamps, J.; Aly, R.; Hauff, C.; den Hamer, I.; Hiemstra, D.; Huibers, T.; de Jong, F.

    2009-01-01

    In this paper we explore whether the Open Directory (or DMOZ) can be used to classify queries into topical categories on different levels and whether we can use this topical context to improve retrieval performance. We have set up a user study to let test persons explicitly classify queries into

  5. Basic anatomic aspects of the lung (segmental, lobular and sublobular) considering radiological point of view (Part 1)

    International Nuclear Information System (INIS)

    Santos, Itazil Benicio dos

    1994-01-01

    A basic anatomic study of the lung considering radiological aspects is presented. After a short introduction, some topics are emphasized, such as the structures which originate the lungs and lung segment details. Some histological elements are also briefly presented. Several illustrations complement the presentation

  6. Unsupervised topic discovery by anomaly detection

    OpenAIRE

    Cheng, Leon

    2013-01-01

    Approved for public release; distribution is unlimited. With the vast amount of information and public comment available online, it is of increasing interest to understand what is being said and what topics are trending online. Government agencies, for example, want to know what policies concern the public without having to look through thousands of comments manually. Topic detection provides automatic identification of topics in documents based on the information content and enhances many ...

  7. Topical cyclosporine for atopic keratoconjunctivitis.

    Science.gov (United States)

    González-López, Julio J; López-Alcalde, Jesús; Morcillo Laiz, Rafael; Fernández Buenaga, Roberto; Rebolleda Fernández, Gema

    2012-09-12

    Atopic keratoconjunctivitis (AKC) is a chronic, non-infectious inflammatory condition of the ocular surface that atopic dermatitis patients may suffer at any time point in the course of their dermatologic disease, independently of its degree of severity. AKC is usually not self-resolving and it poses a higher risk of corneal injuries and severe sequelae. Management of AKC should prevent or treat corneal damage. Although topical corticosteroids remain the standard treatment for patients with AKC, prolonged use may lead to complications. Topical cyclosporine A (CsA) may improve AKC signs and symptoms, and be used as a corticosteroid-sparing agent. To determine the efficacy and gather evidence on safety from randomised controlled trials (RCTs) of topical CsA in patients with AKC. We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2012, Issue 6), MEDLINE (January 1946 to July 2012), EMBASE (January 1980 to July 2012), Latin American and Caribbean Literature on Health Sciences (LILACS) (January 1982 to July 2012), Cumulative Index to Nursing and Allied Health Literature (CINAHL) (January 1937 to July 2012), OpenGrey (System for Information on Grey Literature in Europe) (www.opengrey.eu/), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com), ClinicalTrials.gov (www.clinicaltrials.gov), the WHO International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en), the IFPMA Clinical Trials Portal (http://clinicaltrials.ifpma.org/no_cache/en/myportal/index.htm) and Web of Science Conference Proceedings Citation Index - Science (CPCI-S). We did not use any date or language restrictions in the electronic searches for trials. The electronic databases were last searched on 9 July 2012. We also handsearched the following conference proceedings: American Academy of Ophthalmology, Association for Research in Vision and Ophthalmology, International Council of Ophthalmology and Societas

  8. 21 CFR 868.5170 - Laryngotracheal topical anesthesia applicator.

    Science.gov (United States)

    2010-04-01

    § 868.5170 Laryngotracheal topical anesthesia applicator. (a) Identification. A laryngotracheal topical anesthesia applicator is a device used to apply topical anesthetics to a patient's laryngotracheal area. (b) Classification. Class...

  9. Topical minoxidil fortified with finasteride: An account of maintenance of hair density after replacing oral finasteride

    Directory of Open Access Journals (Sweden)

    B S Chandrashekar

    2015-01-01

    Background: Finasteride acts by reducing dihydrotestosterone levels, thereby inhibiting the miniaturization of hair follicles in patients with androgenetic alopecia (AGA). Oral finasteride is associated with side effects such as decreased libido, sexual dysfunction, and gynecomastia. Aim: The aim of the following study is to assess the efficacy of maintaining hair growth with 5% topical minoxidil fortified with 0.1% finasteride in patients with AGA after initial treatment with 5% topical minoxidil and oral finasteride for two years. Materials and Methods: A retrospective assessment was done in 50 male patients aged 20-40 years with AGA. All the patients had been initially treated with topical minoxidil and oral finasteride for a period of two years, after which the oral finasteride was replaced with topical minoxidil fortified with finasteride. Five of the 50 patients had discontinued the treatment for a period of 8-12 months and were then resumed on only topical minoxidil fortified with finasteride. The patients' case sheets and photographs were reviewed by independent observers and the efficacy of the minoxidil-finasteride combination was assessed. Results: Of the 45 patients who underwent continuous treatment for AGA, 84.44% maintained a good hair density with the topical minoxidil-finasteride combination. Of the five patients who discontinued oral finasteride for 8-12 months, four demonstrated good improvement in hair density when treatment was resumed with the topical minoxidil-finasteride combination. Conclusion: Topical finasteride can be considered for hair density maintenance after initial improvement with oral finasteride, thereby obviating the indefinite use of oral finasteride.

  10. Parallel adaptation of a vectorised quantumchemical program system

    International Nuclear Information System (INIS)

    Van Corler, L.C.H.; Van Lenthe, J.H.

    1987-01-01

    Supercomputers, like the CRAY 1 or the Cyber 205, have had, and still have, a marked influence on Quantum Chemistry. Vectorization has led to a considerable increase in the performance of Quantum Chemistry programs. However, clock-cycle times more than a factor of 10 smaller than those of the present supercomputers are not to be expected. Therefore future supercomputers will have to depend on parallel structures. Recently, the first examples of such supercomputers have been installed. To be prepared for this new generation of (parallel) supercomputers, one should consider the concepts one wants to use and the kind of problems one will encounter during the implementation of existing vectorized programs on those parallel systems. The authors implemented four important parts of a large quantumchemical program system (ATMOL), i.e. integrals, SCF, 4-index and Direct-CI, in the parallel environment at ECSEC (Rome, Italy). This system offers simulated parallelism on the host computer (IBM 4381) and real parallelism on at most 10 attached processors (FPS-164). Quantumchemical programs usually handle large amounts of data and very large, often sparse matrices. The transfer of that many data can cause problems concerning communication and overhead, in view of which shared memory and shared disks must be considered. The strategy and the tools that were used to parallelize the programs are shown. Also, some examples are presented to illustrate the effectiveness and performance of the system in Rome for these types of calculations

  11. Topical therapy of atopic dermatitis: controversies from Hippocrates to topical immunomodulators.

    Science.gov (United States)

    Tilles, Gérard; Wallach, Daniel; Taïeb, Alain

    2007-02-01

    Although atopic dermatitis can be treated efficiently, there is still much controversy about the risk/benefit ratio of both topical corticosteroids and topical immunomodulators. Conflicting data may be found about the usefulness of bathing, diet regulation, and other therapeutic interventions. These controversies result in part from the persistence of Hippocratic doctrines in modern medical thinking. Humoralist and diathetic doctrines, as they pertain to eczema, are reviewed. The paradoxical worsening of oozing and the deadly hazards of hospitalization before the era of antibiotics are brought to mind. We hope that this historical review will improve the understanding of current controversies and help dermatologists to manage patients with atopic dermatitis and other chronic skin diseases.

  12. Strategic mistakes (AVOIDABLE) The topicality of Michel Porter’s generic strategies

    OpenAIRE

    Viltard, Leandro Adolfo

    2017-01-01

    This article explores the topicality of Porter's generic strategies, assessing their applicability to two specific automotive industry projects: the Smart and the New Beetle. After performing a documentation analysis of these two projects, it was concluded that both of them may be considered avoidable strategic mistakes, as they show the risks of higher differentiation that is not paid for by the customer, no matter whether it involves recognized brands or icon products. Hazards a...

  13. Topic Time Series Analysis of Microblogs

    Science.gov (United States)

    2014-10-01

    ...may be distributed more globally. Tweets on a specific topic that cluster spatially, temporally or both might be of interest to analysts, marketers... of $ and @, with the latter only in the case that it is the only character in the token (the @ symbol is significant in its usage by Instagram in... is generated by Instagram. Example topic: Topic 80, Distance: 143.2101; top words: 1. rawr, 2. ˆ0ˆ, 3. kill, 4. jurassic, 5. dinosaur. Analysis: This topic is quite

  14. Topic modelling in the information warfare domain

    CSIR Research Space (South Africa)

    De Waal, A

    2013-11-01

    ...for interesting and relevant topics. The objective of this paper is to describe topic modelling, put it in context as a useful IW technique and illustrate its use with two examples. They discuss several applications of topic modelling in the safety and security...

  15. Australasian emergency physicians: a learning and educational needs analysis. Part Four: CPD topics desired by emergency physicians.

    Science.gov (United States)

    Dent, Andrew W; Weiland, Tracey J; Paltridge, Debbie

    2008-06-01

    To report the preferences of Fellows of the Australasian College for Emergency Medicine for topics they would desire for their continuing professional development (CPD). A mailed survey of Fellows of the Australasian College for Emergency Medicine asked for Likert-type responses on the desirability of CPD on 15 procedural skills, 13 management skills, 11 clinical emergency topics, 9 topics related to teaching, 7 related to diagnostics and 5 evidence-based practice topics. CPD in the procedural skills of advanced and surgical airways, ED ultrasound, ventilation skills, plastic procedures and regional anaesthesia was nominated as desirable by 85% of emergency physicians (EP). More than 90% desired CPD in ophthalmological, otorhinolaryngeal, neonatal and paediatric emergencies. Of diagnostic skills, more than 80% considered CPD on computerized tomography, electrocardiography and plain X-ray interpretation as desirable, as well as CPD about teaching in general, simulation and preparing candidates for fellowship exams. Of the 12 management skills, 11 were seen as desirable topics by more than 70%, with counter-disaster planning, giving feedback and dealing with complaints the most popular. All evidence-based practice related skills, including interpreting statistics and undertaking literature searches, were seen as desirable topics by more than 80% of EP. This information may assist in the planning of future educational interventions for emergency physicians. EP seek CPD on management, educational and other non-clinical skills, as well as topics relating directly to patient care.

  16. Alkemio: association of chemicals with biomedical topics by text and data mining.

    Science.gov (United States)

    Gijón-Correas, José A; Andrade-Navarro, Miguel A; Fontaine, Jean F

    2014-07-01

    The PubMed® database of biomedical citations allows the retrieval of scientific articles studying the function of chemicals in biology and medicine. Mining millions of available citations to search reported associations between chemicals and topics of interest would require substantial human time. We have implemented the Alkemio text mining web tool and SOAP web service to help in this task. The tool uses biomedical articles discussing chemicals (including drugs), predicts their relatedness to the query topic with a naïve Bayesian classifier and ranks all chemicals by P-values computed from random simulations. Benchmarks on seven human pathways showed good retrieval performance (areas under the receiver operating characteristic curves ranged from 73.6 to 94.5%). Comparison with existing tools to retrieve chemicals associated with eight diseases showed the higher precision and recall of Alkemio when considering the top 10 candidate chemicals. Alkemio is a high-performing web tool ranking chemicals for any biomedical topic, and it is free for non-commercial users. http://cbdm.mdc-berlin.de/~medlineranker/cms/alkemio. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
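
    The ranking step named above, P-values computed from random simulations, follows the standard empirical recipe: compare each chemical's observed topic-relatedness score with the scores obtained on randomly drawn article sets. A minimal sketch with a hypothetical scoring setup (Alkemio's own classifier and data are not reproduced here):

        # Empirical P-values for chemical-topic association scores, computed
        # against scores from random simulations (hypothetical scores/data).
        import numpy as np

        def empirical_pvalue(observed, null_scores):
            # P = (1 + #{null >= observed}) / (1 + N), the usual estimate.
            null_scores = np.asarray(null_scores)
            return (1 + (null_scores >= observed).sum()) / (1 + null_scores.size)

        rng = np.random.default_rng(0)
        null = rng.normal(0.0, 1.0, 10_000)         # scores of random article sets
        scores = {"aspirin": 3.2, "caffeine": 0.4}  # naive-Bayes relatedness
        ranked = sorted(scores, key=lambda c: empirical_pvalue(scores[c], null))
        for c in ranked:
            print(c, empirical_pvalue(scores[c], null))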

  17. Topical Session on Materials Management

    International Nuclear Information System (INIS)

    2002-01-01

    At its second meeting, in Paris, 5-7 December 2001, the WPDD held two topical sessions on the D and D Safety Case and on the Management of Materials from D and D, respectively. This report documents the topical session on the management of materials. Presentations during the topical session covered key aspects of the management of materials and were meant to provide an exchange of information and experience, including: Experience and lessons learnt from VLLW and non-radioactive material management in Spain and Germany, with special attention to recycling (How did specific solutions come about? Are there 'generic' examples for wider adoption?); Risk assessment of recycling and non-recycling: a CPD study; Waste acceptance issues within different national contexts (What constraints are there on the waste receiving body, and what flexibility can the latter have? What constraints does this impose on D and D implementers? What about wastes without a current solution? What needs to be done? What about large items and 'difficult' waste in general?); Radiological characterisation of materials during decommissioning, particularly difficult situations - large volumes, large items, wastes, heterogeneous streams (What examples of established practice are there? What are the approaches or aspects that set the regulatory requirements? How can the flow rates be large but the answers acceptable? How much needs to be known for later action, e.g., disposal, release, protection of workers, etc.?); Radiological characterisation of buildings as they stand, in order to allow conventional demolition (What are strategies for optimisation of characterisation? How much needs to be known to take action later, e.g. for storage, disposal, release, cost estimation and ALARA? What needs to be done in advance and after decommissioning/dismantling?). At the end of each presentation time was allotted for discussion of the paper. Integral to the Topical Session was a facilitated plenary discussion on the topical

  18. Topical steroid addiction in atopic dermatitis

    Directory of Open Access Journals (Sweden)

    Fukaya M

    2014-10-01

    Mototsugu Fukaya,1 Kenji Sato,2 Mitsuko Sato,3 Hajime Kimata,4 Shigeki Fujisawa,5 Haruhiko Dozono,6 Jun Yoshizawa,7 Satoko Minaguchi8 1Tsurumai Kouen Clinic, Nagoya, 2Department of Dermatology, Hannan Chuo Hospital, Osaka, 3Sato Pediatric Clinic, Osaka, 4Kimata Hajime Clinic, Osaka, 5Fujisawa Dermatology Clinic, Tokyo, 6Dozono Medical House, Kagoshima, 7Yoshizawa Dermatology Clinic, Yokohama, 8Department of Dermatology, Kounosu Kyousei Hospital, Saitama, Japan Abstract: The American Academy of Dermatology published a new guideline regarding topical therapy in atopic dermatitis in May 2014. Although topical steroid addiction or red burning skin syndrome had been mentioned as possible side effects of topical steroids in a 2006 review article in the Journal of the American Academy of Dermatology, no statement was made regarding this illness in the new guidelines. This suggests that there are still controversies regarding this illness. Here, we describe the clinical features of topical steroid addiction or red burning skin syndrome, based on the treatment of many cases of the illness. Because there have been few articles in the medical literature regarding this illness, the description in this article will be of some benefit to better understand the illness and to spur discussion regarding topical steroid addiction or red burning skin syndrome. Keywords: topical steroid addiction, atopic dermatitis, red burning skin syndrome, rebound, corticosteroid, eczema

  19. Considering resistance in systematic reviews of antibiotic treatment.

    Science.gov (United States)

    Leibovici, Leonard; Soares-Weiser, Karla; Paul, Mical; Goldberg, Elad; Herxheimer, Andrew; Garner, Paul

    2003-10-01

    Microorganisms resistant to antibiotic drugs are a threat to the health and chances of survival of patients. Systematic reviews on antibiotic drugs that ignore the topic of resistance present readers with a skewed view, emphasizing short-term efficacy or effectiveness while ignoring long-term consequences. To examine whether systematic reviews of antibiotic treatment consider resistance; if not, to find out whether data on resistance were reported in the original trials; and based on that, to offer a framework for taking resistance into account in systematic reviews. The Cochrane Database of Systematic Reviews (the Cochrane Library, 2001, issue 2); and MEDLINE, 1996-2000. (i) Systematic reviews or meta-analyses of antimicrobial therapy, published during 1996-2000. (ii) Randomized, controlled trials abstracted in systematic reviews that addressed a topic highly relevant to antibiotic resistance. We examined each systematic review, and each article, to see whether the implications of resistance were discussed; and whether data on resistance were collected. Out of 111 systematic reviews, only 44 (40%) discussed resistance. Ten reviews (9%) planned or performed collection of data on the response of patients with susceptible or resistant isolates. In 22 systematic reviews (20%), collection of data on induction of resistance was planned or performed. The topic of 41 reviews was judged highly relevant to resistance, and these reviews extracted data from 337 articles, out of which we retrieved 279 articles (83%). In 201 (72%) articles, resistance was discussed or data pertaining to it were collected. Ninety-seven articles (35%) gave actual data on resistance of pathogens to the study drugs, 71 articles (25%) data on efficacy of antibiotic drugs in patients with susceptible and resistant pathogens, and 55 articles (20%) provided data on infection or colonization with resistant strains during treatment. Most systematic reviews on antibiotic treatment ignored the issue of

  20. Selected topics in particle accelerators: Proceedings of the CAP meetings. Volume 3

    International Nuclear Information System (INIS)

    Parsa, Z.

    1995-01-01

    This Report includes copies of transparencies and notes from the presentations made at the Center for Accelerator Physics at Brookhaven National Laboratory. Editing and changes to authors' contributions in this Report were made only to fulfill the publication requirements. This volume includes notes and transparencies on eight presentations: ''Inverse Cherenkov Laser Acceleration of Electron Beams'', ''High Brightness Field Emission Cathodes'', ''QCD/Teraflop Collaboration: The Future of Supercomputing'', ''Report on Dipole R&D'', ''Reaching Maximum Luminosity in Hadron Colliders at 10-100 TeV'', ''STAR Collaboration Project Status Report: Quarks and Gluons'', ''PHENIX Collaboration Project Status Report'', and ''Update on Status of BNL Relativistic Heavy Ion Collider (RHIC) Project: RHIC Design Issues.''

  1. Eye Wear: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Refractive Errors.

  2. 21 CFR 524.1193 - Ivermectin topical solution.

    Science.gov (United States)

    2010-04-01

    § 524.1193 Ivermectin topical solution. (a) Specifications. Each milliliter (mL) of solution contains 5 milligrams of...

  3. Eosinophilic Esophagitis: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Eosinophilic Disorders, Esophagus Disorders.

  4. Female Infertility: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Assisted Reproductive Technology, Infertility, Male Infertility.

  5. Mobility Aids: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Assistive Devices.

  6. Genetic Testing: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Birth Defects, Genetic Counseling, Genetic Disorders, Newborn Screening.

  7. Folic Acid: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Vitamins.

  8. Pneumococcal Infections: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Meningitis, Pneumonia, Sepsis, Sinusitis, Streptococcal Infections.

  9. Wilms' Tumor: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Kidney Cancer.

  10. Child Safety: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Infant and Newborn Care, Internet Safety, Motor Vehicle ...

  11. Pneumocystis Infections: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: HIV/AIDS, HIV/AIDS and Infections, Pneumonia.

  12. Collapsed Lung: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Chest Injuries and Disorders, Lung Diseases, Pleural Disorders.

  13. Male Infertility: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Assisted Reproductive Technology, Female Infertility, Infertility.

  14. Healthy Aging: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Exercise for Seniors, Nutrition for Seniors, Seniors' Health.

  15. Psoriatic Arthritis: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Arthritis, Psoriasis.

  16. Hip Replacement: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Hip Injuries and Disorders.

  17. Platelet Disorders: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Bleeding Disorders, Blood Clots, Blood Count Tests, Blood ...

  18. Cardiac Rehabilitation: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Heart Attack, Heart Diseases, How to Prevent Heart ...

  19. Cardiac Arrest: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Arrhythmia, CPR, Pacemakers and Implantable Defibrillators.

  20. Kawasaki Disease: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Vasculitis.

  1. Diabetic Diet: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Blood Sugar, Diabetes, Diabetes in Children and Teens ...

  2. Infection Control: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Hepatitis, HIV/AIDS, MRSA.

  3. Hearing Aids: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Cochlear Implants, Hearing Disorders and Deafness.

  4. Kidney Tests: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Kidney Cancer, Kidney Diseases.

  5. Ischemic Stroke: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Hemorrhagic Stroke, Stroke, Stroke Rehabilitation.

  6. Pulmonary Rehabilitation: MedlinePlus Health Topic

    Science.gov (United States)

    Related health topics: Lung Diseases.

  7. KEY TOPICS IN SPORTS MEDICINE

    Directory of Open Access Journals (Sweden)

    Amir Ali Narvani

    2006-12-01

    Key Topics in Sports Medicine is a single quick reference source for sports and exercise medicine. It presents the essential information from across the relevant topic areas, and includes both the core and emerging issues in this rapidly developing field. It covers: (1) Sports injuries, rehabilitation and injury prevention; (2) Exercise physiology, fitness testing and training; (3) Drugs in sport; (4) Exercise and health promotion; (5) Sport and exercise for special and clinical populations; (6) The psychology of performance and injury. PURPOSE The Key Topics format provides extensive, concise information in an accessible, easy-to-follow manner. AUDIENCE The book is targeted at students and specialists in sports medicine and rehabilitation, athletic training, physiotherapy and orthopaedic surgery. The editors are authorities in their respective fields and this handbook depends on their extensive experience and knowledge accumulated over the years. FEATURES The book contains information for clinical guidance and rapid access to concise details and facts. It is composed of 99 topics which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Functional Anatomy, 2. Training Principles / Development of Strength and Power, 3. Biomechanical Principles, 4. Biomechanical Analysis, 5. Physiology of Training, 6. Monitoring of Training Progress, 7. Nutrition, 8. Hot and Cold Climates, 9. Altitude, 10. Sport and Travelling, 11. Principles of Sport Injury Diagnosis, 12. Principles of Sport and Soft Tissue Management, 13. Principles of Physical Therapy and Rehabilitation, 14. Principles of Sport Injury Prevention, 15. Sports Psychology, 16. Team Sports, 17. Psychological Aspects of Injury in Sport, 18. Injury Repair Process, 19. Basic Biomechanics of Tissue Injury, 20. Plain Film Radiography in Sport, 21. Nuclear Medicine, 22. Diagnostic Ultrasound, 23. MRI Scan, 24. Other Imaging, 25. Head Injury, 26. Eye

  8. Preservice Teachers' Perspectives on 'Appropriate' K-8 Climate Change and Environmental Science Topics

    Science.gov (United States)

    Ford, D. J.

    2013-12-01

    With the release of the Next Generation Science Standards (NRC, 2013), climate change and related environmental sciences will now receive greater emphasis within science curricula at all grade levels. In grades K-8, preparation in foundational content (e.g., weather and climate, natural resources, and human impacts on the environment) and the nature of scientific inquiry will set the groundwork for later learning of climate change in upper middle and high school. These rigorous standards increase pressure on elementary and middle school teachers to possess strong science content knowledge, as well as experience supporting children to develop scientific ideas through the practices of science. It also requires a set of beliefs - about children and the science that is appropriate for them - that is compatible with the goals set out in the standards. Elementary teachers in particular, who often have minimal preparation in the earth sciences (NSF, 2007), and entrenched beliefs about how particular topics ought to be taught (Holt-Reynolds, 1992; Pajares, 1992), including climate change (Bryce & Day, 2013; Lambert & Bleicher, 2013), may face unique challenges in adjusting to the new standards. If teachers hold beliefs about climate change as controversial, for example, they may not consider it an appropriate topic for children, despite its inclusion in the standards. On the other hand, those who see a role for children in efforts to mitigate human impacts on the environment may be more enthusiastic about the new standards. We report on a survey of preservice K-8 teachers' beliefs about the earth and environmental science topics that they consider to be appropriate and inappropriate for children in grades K-3, 4-5, and 6-8. Participants were surveyed on a variety of standards-based topics using terminology that signals publicly and scientifically neutral (e.g. weather, ecosystems) to overtly controversial (evolution, global warming) science. Results from pilot data

  9. Topical tar: Back to the future

    Energy Technology Data Exchange (ETDEWEB)

    Paghdal, K.V.; Schwartz, R.A. [University of Medicine & Dentistry of New Jersey, Newark, NJ (United States)]

    2009-08-15

    The use of medicinal tar for dermatologic disorders dates back to ancient times. Although coal tar is utilized more frequently in modern dermatology, wood tars have also been widely employed. Tar is used mainly in the treatment of chronic stable plaque psoriasis, scalp psoriasis, atopic dermatitis, and seborrheic dermatitis, either alone or in combination therapy with other medications, phototherapy, or both. Many modifications have been made to tar preparations to increase their acceptability, as some patients dislike its odor, messy application, and staining of clothing. One should consider a tried and true treatment with tar that has led to clearing of lesions and prolonged remission times. Occupational studies have demonstrated the carcinogenicity of tar; however, epidemiologic studies do not confirm similar outcomes when it is used topically. This article will review the pharmacology, formulations, efficacy, and adverse effects of crude coal tar and other tars in the treatment of selected dermatologic conditions.

  10. Antibiotic Resistance: MedlinePlus Health Topic

    Science.gov (United States)

    URL of this page: https://medlineplus.gov/antibioticresistance.

  11. Comparative efficacy and patient preference of topical anaesthetics in dermatological laser treatments and skin microneedling

    Directory of Open Access Journals (Sweden)

    Yi Zhen Chiang

    2015-01-01

    Background: Topical anaesthetics are effective for patients undergoing superficial dermatological and laser procedures. Our objective was to compare the efficacy and patient preference of three commonly used topical anaesthetics: 2.5% lidocaine/2.5% prilocaine cream (EMLA®), 4% tetracaine gel (Ametop™) and 4% liposomal lidocaine gel (LMX4®) in patients undergoing laser procedures and skin microneedling. Settings and Design: This was a prospective, double-blind study of patients undergoing laser and skin microneedling procedures at a laser unit in a tertiary referral dermatology centre. Materials and Methods: All 29 patients had the three topical anaesthetics applied under occlusion for 1 hour prior to the procedure, at different treatment sites within the same anatomical zone. A self-assessment numerical pain rating scale was given to each patient to rate the pain during the procedure, and each patient was asked to specify their preferred choice of topical anaesthetic at the end of the procedure. Statistical Analysis: Parametric data (mean pain scores and frequency of topical anaesthetic agent of choice) were compared using the paired-samples t-test. A P-value of ≤0.05 was considered statistically significant. Results and Conclusions: Patients reported a mean (±SD; 95% confidence interval) pain score of 5 (±2.58; 3.66-6.46) with Ametop™, 4.38 (±2.53; 2.64-4.89) with EMLA® and 3.91 (±1.95; 2.65-4.76) with LMX4®. There was no statistically significant difference in pain scores between the different topical anaesthetics. The majority of patients preferred LMX4® as their choice of topical anaesthetic for dermatological laser and skin microneedling procedures.
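
    Since every patient received all three anaesthetics at different sites, the comparison above is within-subject, and a paired-samples t-test on per-patient score pairs is the natural analysis. A sketch with simulated scores (the study's raw data are not reproduced here):

        # Paired-samples t-test on within-patient pain scores, as in the record
        # above (simulated scores standing in for the study's raw data).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        ametop = rng.normal(5.00, 2.58, 29)      # one score per patient
        emla = rng.normal(4.38, 2.53, 29)        # same 29 patients, another site
        t_stat, p = stats.ttest_rel(ametop, emla)
        print(f"t = {t_stat:.2f}, p = {p:.3f}")  # p <= 0.05 deemed significant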

  13. Epidemic spread in bipartite network by considering risk awareness

    Science.gov (United States)

    Han, She; Sun, Mei; Ampimah, Benjamin Chris; Han, Dun

    2018-02-01

    Human awareness plays an important role in the spread of infectious diseases and the control of propagation patterns. Exploring the interplay between human awareness and epidemic spreading is a topic that has been receiving increasing attention. Considering the fact that some well-known diseases only spread between different species, we propose a theoretical analysis of Susceptible-Infected-Susceptible (SIS) epidemic spread from the perspective of a bipartite network and risk aversion. Using mean-field theory, the epidemic threshold is calculated theoretically. Simulation results are consistent with the proposed analytic model. The results show that the final infection density decreases linearly with the value of individuals' risk awareness. Therefore, the epidemic spread could be effectively suppressed by improving individuals' risk awareness.
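
    A generic mean-field version of such a model is easy to write down: infection only crosses between the two classes of the bipartite network, and awareness damps the transmission rate. The sketch below uses textbook-style two-population SIS equations with a linear awareness factor; it illustrates the mechanism the abstract describes, not the authors' exact equations or parameter values.

        # Two-population mean-field SIS on a bipartite network; awareness 'a'
        # linearly damps transmission (illustrative, not the paper's model).
        def sis_bipartite(lam, mu, kA, kB, a, steps=20000, dt=0.001):
            rhoA, rhoB = 0.01, 0.01          # infected fractions per class
            lam_eff = lam * (1 - a)          # risk awareness reduces spreading
            for _ in range(steps):
                dA = -mu * rhoA + lam_eff * kA * rhoB * (1 - rhoA)
                dB = -mu * rhoB + lam_eff * kB * rhoA * (1 - rhoB)
                rhoA += dA * dt
                rhoB += dB * dt
            return rhoA, rhoB

        # Endemic infection shrinks as awareness grows; it vanishes once
        # lam_eff * sqrt(kA * kB) < mu (the mean-field threshold).
        for a in (0.0, 0.25, 0.5):
            print(a, sis_bipartite(lam=0.4, mu=1.0, kA=4, kB=4, a=a))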

  14. Mapping the Infoscape of LIS Courses for Intersections of Health-Gender and Health-Sexual Orientation Topics

    Science.gov (United States)

    Mehra, Bharat; Tidwell, William Travis

    2014-01-01

    The article explores the information landscape (i.e., infoscape) of library and information science (LIS) courses for intersections of health-gender and health-sexual orientation topics, concerns, and issues. This research was considered important because health information support services essential in today's society must include marginalized…

  15. Large-Scale Topic Detection and Language Model Adaptation

    National Research Council Canada - National Science Library

    Seymore, Kristie

    1997-01-01

    .... We have developed a language model adaptation scheme that takes a piece of text, chooses the most similar topic clusters from a set of over 5000 elemental topics, and uses topic specific language...

  16. Quantum mechanics II advanced topics

    CERN Document Server

    Rajasekar, S

    2015-01-01

    Quantum Mechanics II: Advanced Topics uses more than a decade of research and the authors’ own teaching experience to expound on some of the more advanced topics and current research in quantum mechanics. A follow-up to the authors’ introductory book Quantum Mechanics I: The Fundamentals, this book begins with a chapter on quantum field theory, and goes on to present basic principles, key features, and applications. It outlines recent quantum technologies and phenomena, and introduces growing topics of interest in quantum mechanics. The authors describe promising applications that include ghost imaging, detection of weak amplitude objects, entangled two-photon microscopy, detection of small displacements, lithography, metrology, and teleportation of optical images. They also present worked-out examples and provide numerous problems at the end of each chapter.

  17. Topical application of hemostatic paste

    Directory of Open Access Journals (Sweden)

    Mohammad Mizanur Rahman

    2017-02-01

    As a measure to control minor surgical bleeding, surgeons usually depend on a number of hemostatic aids. Topical use of bovine thrombin is a widely used procedure to arrest such minor bleeding. A 35-year-old male sergeant of the Bangladesh Air Force presented with repeated development of hematoma in his left thigh without any history of trauma or previous history of bleeding. Critical analysis of the patient’s history and routine and sophisticated hematological investigations revealed that the patient had developed anti-thrombin antibody following the application of hemostatic paste in a tooth socket five years earlier, during a minor dental procedure to stop otherwise ignorable bleeding. Therefore, topical use of hemostatic glue/paste or bovine thrombin to control minor bleeding should be avoided, as recombinant human thrombin is now available for topical use.

  18. What do parents want to know when considering autopsy for their child with cancer?

    Science.gov (United States)

    Wiener, Lori; Sweeney, Corinne; Baird, Kristin; Merchant, Melinda S; Warren, Katherine E; Corner, Geoffrey W; Roberts, Kailey E; Lichtenthal, Wendy G

    2014-08-01

    Research has suggested that autopsy in pediatrics is a valued way for parents to better understand and process their child's death, yet physicians often express hesitancy in discussing this topic with parents. To better assist clinicians with initiating discussion about this often sensitive topic, the current study examined bereaved parents' preferences about the timing and content of the autopsy discussion as well as reasons for considering autopsy. This study explored the views of 30 parents who lost a child to a variety of malignancies between 6 months and 6 years ago. Results showed that 36.7% of parents recalled having a discussion about autopsy, and the vast majority of those who did not recall a discussion (89.5%) would have considered an autopsy if it had been discussed. The majority of participants in this study indicated their preference to have the first conversation about autopsy when it becomes clear that cure is no longer possible. Findings suggest that educating parents about the clinical, emotional, and potential research benefits of autopsy and tissue procurement will ultimately help them make informed decisions and understand the importance of autopsy in medical progress. The future research and clinical implications of these findings are discussed.

  19. Aerodynamics of wind turbines emerging topics

    CERN Document Server

    Amano, R S

    2014-01-01

    Focusing on the aerodynamics of wind turbines, with topics ranging from the fundamentals to applications of horizontal axis wind turbines, this book presents advanced topics including basic theory for wind turbine blade aerodynamics, computational methods, and special structural reinforcement techniques for wind turbine blades.

  20. SETI: A good introductory physics topic

    Science.gov (United States)

    Hobson, Art

    1997-04-01

    If America is to achieve the science literacy that is essential to industrialized democracy, all students must study such topics as scientific methodology, pseudoscience, ozone depletion, and global warming. My large-enrollment liberal-arts physics course covers the great principles of physics along with several such philosophical and societal topics. It is easy to include the interdisciplinary context of physics in courses for non-scientists, because these courses are flexible, conceptual, and taught to students whose interests span a broad range. Students find these topics relevant and fascinating, leading to large enrollments by non-scientists even in courses labeled “physics.” I will discuss my approach to teaching the search for extra-terrestrial intelligence (SETI), a topic with lots of good physics and with connections to scientific methodology and pseudoscience. A textbook for this kind of course has been published, Physics: Concepts and Connections (Prentice-Hall, 1995).

  1. Affinity between information retrieval system and search topic

    International Nuclear Information System (INIS)

    Ebinuma, Yukio

    1979-01-01

    Ten search profiles are tested on the INIS system at the Japan Atomic Energy Research Institute. The results are plotted on a recall-precision chart ranging from 100% recall to 100% precision. The curves are neither purely system-dependent nor purely search-dependent; they are determined substantially by the “affinity” between the system and the search topic. The curves are named “affinity curves of search topics with information retrieval systems”, and retrieval affinity factors are derived from them. They are obtained not only for individual search topics but also as averages over the system. Such a quantitative examination allows a reasonable comparison of the difference in affinity among search topics in a given system, of the same search topic among various systems, and of different systems with respect to the same group of search topics. (author)
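
    The recall-precision trade-off behind such affinity curves can be illustrated by sweeping a retrieval cutoff over a ranked result list and recording (recall, precision) pairs; the relevance judgments below are invented for illustration:

        relevant = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]  # 1 = relevant document at this rank
        total_relevant = sum(relevant)

        for cutoff in range(1, len(relevant) + 1):
            hits = sum(relevant[:cutoff])
            recall = hits / total_relevant
            precision = hits / cutoff
            print(f"top-{cutoff:2d}: recall={recall:.2f} precision={precision:.2f}")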

  2. Analyzing the history of Cognition using Topic Models.

    Science.gov (United States)

    Cohen Priva, Uriel; Austerweil, Joseph L

    2015-02-01

    Very few articles have analyzed how cognitive science as a field has changed over the last six decades. We explore how Cognition changed over the last four decades using Topic Models. Topic Models assume that every word in every document is generated by one of a limited number of topics. Words that are likely to co-occur are likely to be generated by a single topic. We find a number of significant historical trends: the rise of moral cognition, eyetracking methods, and action, the fall of sentence processing, and the stability of development. We introduce the notion of framing topics, which frame content, rather than present the content itself. These framing topics suggest that over time Cognition turned from abstract theorizing to more experimental approaches.
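
    A minimal sketch of the analysis style described: fit a Latent Dirichlet Allocation model to abstracts, then average per-document topic weights by year to obtain topic trends. The corpus and years below are placeholders, not the journal's abstracts:

        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        abstracts = ["eye movements during sentence reading",    # placeholder corpus
                     "moral judgment and cognition in children",
                     "syntactic parsing and sentence processing"]
        years = np.array([1975, 1995, 2010])

        counts = CountVectorizer(stop_words="english").fit_transform(abstracts)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
        doc_topics = lda.transform(counts)            # per-document topic weights

        for year in np.unique(years):
            print(year, doc_topics[years == year].mean(axis=0))  # trend per year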

  3. Topics of Bioengineering in Wikipedia

    Directory of Open Access Journals (Sweden)

    Vassia Atanassova

    2009-10-01

    The present report aims to give a snapshot of how topics from the field of bioengineering (bioinformatics, bioprocess systems, biomedical engineering, biotechnology, etc.) are currently covered in the free electronic encyclopedia Wikipedia. It also offers insights and information about what Wikipedia is, how it functions, and how and when to cite Wikipedian articles, if necessary. Several external wikis devoted to topics of bioengineering are also listed and reviewed.

  4. Updates of Topical and Local Anesthesia Agents.

    Science.gov (United States)

    Boyce, Ricardo A; Kirpalani, Tarun; Mohan, Naveen

    2016-04-01

    As described in this article, there are many advances in topical and local anesthesia. Topical and local anesthetics have played a great role in dentistry in alleviating the fears of patients, eliminating pain, and providing pain control. Many invasive procedures would not be performed without the use and advances of topical/local anesthetics. The modern-day dentist has the responsibility of knowing the variety of products on the market and should have at least references to access before, during, and after treatment. This practice ensures proper care with topical and local anesthetics for the masses of patients entering dental offices worldwide.

  5. Anesthesia: A Topic for Interdisciplinary Study.

    Science.gov (United States)

    Labianca, Dominick A.; Reeves, William J.

    1977-01-01

    Describes an interdisciplinary approach for teaching the topic of anesthesia as one aspect of a chemistry-oriented course for nonscience majors which focuses on timely topics such as the energy crisis and drugs. Historical treatment with the examination of literature is emphasized in teaching. (HM)

  6. FINDING POTENTIALLY UNSAFE NUTRITIONAL SUPPLEMENTS FROM USER REVIEWS WITH TOPIC MODELING.

    Science.gov (United States)

    Sullivan, Ryan; Sarker, Abeed; O'Connor, Karen; Goodin, Amanda; Karlsrud, Mark; Gonzalez, Graciela

    2016-01-01

    Although dietary supplements are widely used and generally are considered safe, some supplements have been identified as causative agents for adverse reactions, some of which may even be fatal. The Food and Drug Administration (FDA) is responsible for monitoring supplements and ensuring that supplements are safe. However, current surveillance protocols are not always effective. Leveraging user-generated textual data, in the form of Amazon.com reviews for nutritional supplements, we use natural language processing techniques to develop a system for the monitoring of dietary supplements. We use topic modeling techniques, specifically a variation of Latent Dirichlet Allocation (LDA), and background knowledge in the form of an adverse reaction dictionary to score products based on their potential danger to the public. Our approach generates topics that semantically capture adverse reactions from a document set consisting of reviews posted by users of specific products, and based on these topics, we propose a scoring mechanism to categorize products as "high potential danger", "average potential danger" and "low potential danger". We evaluate our system by comparing the system categorization with human annotators, and we find that our system agrees with the annotators 69.4% of the time. With these results, we demonstrate that our methods show promise and that our system represents a proof of concept as a viable low-cost, active approach for dietary supplement monitoring.
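
    A hedged sketch of the scoring idea: measure how strongly a product's discovered topics overlap an adverse-reaction dictionary and bin the result into the three danger categories. The topics, dictionary, and thresholds are illustrative assumptions, not the paper's trained model:

        adverse_terms = {"nausea", "rash", "dizziness", "palpitations", "headache"}

        # Top words of each discovered topic for one product's reviews (invented)
        product_topics = [
            ["great", "energy", "taste", "price", "daily"],
            ["nausea", "dizziness", "stopped", "headache", "week"],
        ]

        overlap = sum(len(adverse_terms & set(t)) for t in product_topics)
        score = overlap / sum(len(t) for t in product_topics)

        if score > 0.2:                      # thresholds are arbitrary assumptions
            label = "high potential danger"
        elif score > 0.05:
            label = "average potential danger"
        else:
            label = "low potential danger"
        print(label, round(score, 2))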

  7. Efficacy, tolerability and consumer acceptability of terbinafine topical spray versus terbinafine topical solution: a phase IIa, randomised, observer-blind, comparative study.

    Science.gov (United States)

    Brown, Marc; Evans, Charles; Muddle, Andrew; Turner, Rob; Lim, Sian; Reed, Jessica; Traynor, Matt

    2013-10-01

    Tinea pedis is one of the world's most prevalent dermatophyte infections. MedSpray™ tinea pedis 1% w/w (topical spray) is a novel, easy-to-use propellant-based spray formulation containing 1% w/w terbinafine, requiring no manipulation at the site of infection. This is in contrast to the only formulation currently approved in Europe for single application (none are approved in the USA for single use), which is Lamisil® Once 1% w/w (topical solution), containing 1% w/w terbinafine hydrochloride, which requires manipulation on the affected area. The aim of this study was to evaluate the efficacy, tolerability and consumer acceptability of a topical spray versus a topical solution in the treatment of tinea pedis. This study is a phase IIa, randomised, observer-blind, non-inferiority comparative study of the topical spray compared with the topical solution over a 12-week study period. The study was conducted at Bioskin GmbH, Hamburg and Berlin. Patients (n = 120) who presented with interdigital tinea pedis caused by dermatophytes on one or both feet were enrolled in the study. Patients were randomly assigned between the two treatment groups. Either the topical spray or the topical solution was administered by the study nurse and consisted of a single application (equivalent to 20 mg of terbinafine per foot) on day 1 of the study. No further applications were made for the duration of the study. The hypothesis formulated before commencement of the study was that the topical spray would prove to be non-inferior to the topical solution. Efficacy assessments, including clinical signs and symptoms, mycology and microscopy, were performed at baseline and 1, 6 and 12 weeks after treatment. The rate of mycological cure at week 1 was statistically equivalent for both treatments. There was a significant reduction in the overall clinical score as assessed by the Physician's Global Assessment of signs and symptoms for both treatment groups. The topical…

  8. 2nd International Workshop on Eigenvalue Problems : Algorithms, Software and Applications in Petascale Computing

    CERN Document Server

    Zhang, Shao-Liang; Imamura, Toshiyuki; Yamamoto, Yusaku; Kuramashi, Yoshinobu; Hoshi, Takeo

    2017-01-01

    This book provides state-of-the-art and interdisciplinary topics on solving matrix eigenvalue problems, particularly by using recent petascale and upcoming post-petascale supercomputers. It gathers selected topics presented at the International Workshops on Eigenvalue Problems: Algorithms, Software and Applications in Petascale Computing (EPASA2014 and EPASA2015), which brought together leading researchers working on the numerical solution of matrix eigenvalue problems to discuss and exchange ideas – and in so doing helped to create a community for researchers in eigenvalue problems. The topics presented in the book, including novel numerical algorithms, high-performance implementation techniques, software developments and sample applications, will contribute to various fields that involve solving large-scale eigenvalue problems.

  9. Topical thrombin-related corneal calcification.

    Science.gov (United States)

    Kiratli, Hayyam; Irkeç, Murat; Alaçal, Sibel; Söylemezoğlu, Figen

    2006-09-01

    To report a highly unusual case of corneal calcification after brief intraoperative use of topical thrombin. A 44-year-old man underwent sclerouvectomy for ciliochoroidal leiomyoma, during which 35 UNIH/mL lyophilized bovine thrombin mixed with 9 mL of diluent containing 1500 mmol/mL calcium chloride was used. From the first postoperative day, corneal and anterior lenticular capsule calcifications developed, and corneal involvement slightly enlarged thereafter. A year later, 2 corneal punch biopsies confirmed calcification mainly in the Bowman layer. Topical treatment with 1.5% ethylenediaminetetraacetic acid significantly restored corneal clarity. Six months later, a standard extracapsular cataract extraction with intraocular lens placement improved visual acuity to 20/60. This case suggests that topical thrombin drops with elevated calcium concentrations may cause acute corneal calcification in Bowman layer and on the anterior lens capsule.

  10. Summaries of research and development activities by using supercomputer system of JAEA in FY2015. April 1, 2015 - March 31, 2016

    International Nuclear Information System (INIS)

    2017-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As shown by the fact that about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2015, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2015, as well as user support, operational records and overviews of the system, and so on. (author)

  11. Summaries of research and development activities by using supercomputer system of JAEA in FY2014. April 1, 2014 - March 31, 2015

    International Nuclear Information System (INIS)

    2016-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As shown by the fact that about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2014, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2014, as well as user support, operational records and overviews of the system, and so on. (author)

  12. Summaries of research and development activities by using supercomputer system of JAEA in FY2013. April 1, 2013 - March 31, 2014

    International Nuclear Information System (INIS)

    2015-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2013, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2013, as well as user support, operational records and overviews of the system, and so on. (author)

  13. Summaries of research and development activities by using supercomputer system of JAEA in FY2012. April 1, 2012 - March 31, 2013

    International Nuclear Information System (INIS)

    2014-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2012, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2012, as well as user support, operational records and overviews of the system, and so on. (author)

  14. Summaries of research and development activities by using supercomputer system of JAEA in FY2011. April 1, 2011 - March 31, 2012

    International Nuclear Information System (INIS)

    2013-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2011, the system was used for analyses of the accident at the Fukushima Daiichi Nuclear Power Station and establishment of radioactive decontamination plan, as well as the JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great amount of R and D results accomplished by using the system in FY2011, as well as user support structure, operational records and overviews of the system, and so on. (author)

  15. Topics in the Journal of Counseling Psychology, 1963-2015.

    Science.gov (United States)

    Oh, JungSu; Stewart, Alan E; Phelps, Rosemary E

    2017-11-01

    Historical trends in a scientific field should be apparent in the changing content of journal articles over time. Using a topic modeling approach, a statistical method for quantifying the thematic content of text, 70 topics were extracted from the abstracts of 3,603 articles published in the Journal of Counseling Psychology from 1963 to 2015. After examining the interpretability of the 70 topics derived from the model, 64 meaningful topics were named and their trends traced. In addition, the authors classified some of the related topics into 4 categories: counseling process and outcome, multiculturalism, research methodology, and vocational psychology. Topics related to counseling process and outcome have decreased recently, while topics relating to multiculturalism and diversity have shown increasing trends. The authors also discussed the trends that were observed and tried to account for the changing frequencies of some important research topics within these categories.

  16. Influence of number of topics, topic duration, and curricular focus on biology achievement of Population 3 TIMSS countries

    Science.gov (United States)

    Hodges, Eddie Louis

    The purposes of this study were to determine if a relationship exists between biology achievement and (1) number of topics, (2) topic duration, (3) curricular focus, and (4) science achievement, using TIMSS data from Population 3, the final year of secondary school. Students included in this study were subsets of the 55,675 students from 22 countries who participated in the science literacy portion of TIMSS at the Population 3 level (IEA, 1997). The samples included in this study for the four research questions comprised between (1) 17,769 and 37,794 students from 15 countries, (2) 17,769 and 37,794 students from 15 countries, (3) 21,715 and 46,458 students from 18 countries, and (4) 19,518 and 46,458 students from 18 countries, respectively. A Pearson's product moment correlation was used to determine whether a relationship exists between the number of biology topics addressed by intended national science curricula and mean student achievement on selected items by country. No statistically significant correlation was found by country between biology achievement and number of topics. To determine whether a relationship exists between the topic duration of biology topics addressed by released, biology-oriented, Population 3 Science Literacy items and student achievement on those items, a Pearson's product moment correlation was also used. A statistically significant correlation between biology achievement and topic duration was found for only one topic (Interdependence of Life) at the .05 level of significance. A possible relationship between the degree of curricular focus of Population 3 TIMSS countries and student achievement on selected items by country was evaluated using a Pearson's product moment correlation. No statistically significant correlation by country between biology achievement and degree of curricular focus was found. A Pearson's product moment correlation was used to determine whether a relationship exists between science literacy…
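
    A minimal sketch of the Pearson product-moment correlation used throughout the study; the country-level values below are invented for illustration:

        from scipy import stats

        num_topics = [12, 18, 9, 22, 15, 11]          # topics per country (invented)
        achievement = [510, 495, 530, 488, 502, 517]  # mean biology score (invented)

        r, p = stats.pearsonr(num_topics, achievement)
        print(f"r = {r:.2f}, p = {p:.3f}")  # significant at the .05 level if p < .05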

  17. Getting To Exascale: Applying Novel Parallel Programming Models To Lab Applications For The Next Generation Of Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Dube, Evi [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shereda, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Nau, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Harris, Lance [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-09-27

    As supercomputing moves toward exascale, node architectures will change significantly. CPU core counts on nodes will increase by an order of magnitude or more. Heterogeneous architectures will become more commonplace, with GPUs or FPGAs providing additional computational power. Novel programming models may make better use of on-node parallelism in these new architectures than do current models. In this paper we examine several of these novel models – UPC, CUDA, and OpenCL – to determine their suitability to LLNL scientific application codes. Our study consisted of several phases: we conducted interviews with code teams and selected two codes to port; we learned how to program in the new models and ported the codes; we debugged and tuned the ported applications; and we measured results and documented our findings. We conclude that UPC is a challenge for porting code, Berkeley UPC is not very robust, and UPC is not suitable as a general alternative to OpenMP for a number of reasons. CUDA is well supported and robust but is a proprietary NVIDIA standard, while OpenCL is an open standard. Both are well suited to a specific set of application problems that can be run on GPUs, but some problems are not suited to GPUs. Further study of the landscape of novel models is recommended.

  18. Most common dermatologic topics published in five high-impact general medical journals, 1970-2012: melanoma, psoriasis, herpes simplex, herpes zoster, and acne.

    Science.gov (United States)

    Choi, Young M; Namavar, Aram A; Wu, Jashin J

    2014-01-01

    General practitioners frequently encounter skin diseases and are accustomed to diagnosing the most common dermatologic conditions. We sought to determine the most common dermatologic topics published in five high-impact general medical journals (New England Journal of Medicine, The Lancet, the Journal of the American Medical Association, British Medical Journal (now The BMJ), and Annals of Internal Medicine). We conducted an independent search of the Thomson Reuters’ Science Citation Index for common dermatologic topics, limited to the period 1970 to 2012. The outcome measure was the total number of publications dealing with each dermatologic topic considered. The five most common dermatologic topics published were melanoma, psoriasis, herpes simplex, herpes zoster, and acne. Melanoma and psoriasis were the top two dermatologic topics published in each journal except for Annals of Internal Medicine. Internists frequently diagnose herpes simplex, herpes zoster, and acne, which are also common dermatologic topics published. Although internists infrequently diagnose melanoma and psoriasis, they are major topics for general medical journals because of their increased community awareness, major advancements in therapeutic research, and their nondermatologic manifestations.

  19. Extracting Hot spots of Topics from Time Stamped Documents

    Science.gov (United States)

    Chen, Wei; Chundi, Parvathi

    2011-01-01

    Identifying time periods with a burst of activities related to a topic has been an important problem in analyzing time-stamped documents. In this paper, we propose an approach to extract a hot spot of a given topic in a time-stamped document set. Topics can be basic, containing a simple list of keywords, or complex. Logical relationships such as and, or, and not are used to build complex topics from basic topics. A concept of presence measure of a topic based on fuzzy set theory is introduced to compute the amount of information related to the topic in the document set. Each interval in the time period of the document set is associated with a numeric value which we call the discrepancy score. A high discrepancy score indicates that the documents in the time interval are more focused on the topic than those outside of the time interval. A hot spot of a given topic is defined as a time interval with the highest discrepancy score. We first describe a naive implementation for extracting hot spots. We then construct an algorithm called EHE (Efficient Hot Spot Extraction) using several efficient strategies to improve performance. We also introduce the notion of a topic DAG to facilitate an efficient computation of presence measures of complex topics. The proposed approach is illustrated by several experiments on a subset of the TDT-Pilot Corpus and DBLP conference data set. The experiments show that the proposed EHE algorithm significantly outperforms the naive one, and the extracted hot spots of given topics are meaningful. PMID:21765568
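
    A naive sketch of the idea (the paper's EHE algorithm and fuzzy presence measure are more elaborate): score every time interval by how much more topic presence it contains than the rest of the period, and return the maximiser. The presence series is invented:

        presence = [0.1, 0.2, 0.1, 0.7, 0.9, 0.8, 0.2, 0.1]  # topic presence per unit

        def discrepancy(series, i, j):
            inside = series[i:j + 1]
            outside = series[:i] + series[j + 1:]
            out_mean = sum(outside) / len(outside) if outside else 0.0
            return sum(inside) / len(inside) - out_mean

        intervals = [(i, j) for i in range(len(presence))
                     for j in range(i, len(presence))]
        hot_spot = max(intervals, key=lambda ij: discrepancy(presence, *ij))
        print("hot spot:", hot_spot)  # the burst at indices (3, 5)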

  20. Comparing the ocular surface effects of topical vancomycin and linezolid for treating bacterial keratitis.

    Science.gov (United States)

    Akova Budak, Berna; Baykara, Mehmet; Kıvanç, Sertaç Argun; Yilmaz, Hakan; Cicek, Serhat

    2016-01-01

    Vancomycin is the gold standard in combination therapy for severe and resistant gram-positive keratitis, and in particular for methicillin-resistant Staphylococcus aureus (MRSA) infections. The aim of this study was to report the ocular surface toxicity and scoring in patients whose treatment shifted from topical vancomycin/ceftazidime to topical linezolid/ceftazidime due to vancomycin intolerance. This was a retrospective, interventional case series of bacterial keratitis treated with topical linezolid (one drop of 0.2% solution per eye), administered hourly until epithelization and then gradually decreased. The number and extent of punctate epithelial erosions were noted across the entire surface of the cornea. Ocular discomfort was assessed by means of (a) patient-reported pain upon instillation of the medication (vancomycin/linezolid), (b) reported burning sensation between doses and (c) reported foreign-body sensation. No ocular surface toxicity related to linezolid use was noted. Patients were followed for at least 2 months after treatment between April and December 2013. Of the seven patients included in the study (age range: 2-88 years; five females, two males), complete epithelization and resolution was achieved in five patients. One patient was treated with linezolid after penetrating keratoplasty. The second culture of another patient with impending perforation despite linezolid/ceftazidime therapy yielded Fusarium spp., so he underwent tectonic keratoplasty. The mean ocular surface score was 9.4 ± 1.6 during vancomycin treatment and 5.9 ± 1.3 during linezolid treatment after discontinuation of vancomycin; the topical linezolid score was significantly lower (p = 0.027). According to the mean ocular surface score, topical linezolid may be better tolerated than topical vancomycin by some patients and can be considered an alternative for patients who do not tolerate vancomycin well.

  1. Topics in supergravity and string theory

    International Nuclear Information System (INIS)

    Eastaugh, A.G.

    1987-01-01

    The first topic covered in this dissertation concerns the harmonic expansion technique and its application to the dimensional compactification of higher dimensional supergravity. A simple example is given to explain the method and then the method is applied to the problem of obtaining the mass spectrum of the squashed seven-sphere compactification of eleven dimensional supergravity. The second topic concerns the application of Fujikawa's method of anomaly calculation to the calculation of the critical dimension of various string models. The third topic is a study and explicit calculation of the Fock space representation of the vertex in Witten's formulation of the interacting open bosonic string field theory

  2. Topical nonsteroidal anti-inflammatory drugs for the treatment of pain due to soft tissue injury: diclofenac epolamine topical patch

    Directory of Open Access Journals (Sweden)

    David R Lionberger

    2010-11-01

    David R Lionberger (Southwest Orthopedic Group, Houston, TX, USA) and Michael J Brennan (Department of Medicine, Bridgeport Hospital, Bridgeport, CT, USA). Abstract: The objective of this article is to review published clinical data on diclofenac epolamine topical patch 1.3% (DETP) in the treatment of acute soft tissue injuries, such as strains, sprains, and contusions. Review of published literature on topical nonsteroidal anti-inflammatory drugs (NSAIDs), diclofenac, and DETP in patients with acute soft tissue injuries was included. Relevant literature was identified on MEDLINE using the search terms topical NSAIDs, diclofenac, diclofenac epolamine, acute pain, sports injury, soft tissue injury, strain, sprain, and contusion, and from citations in retrieved articles covering the years 1978–2008. Review of published, randomized clinical trials and meta-analyses shows that topical NSAIDs are significantly more effective than placebo in relieving acute pain; the pooled average relative benefit was 1.7 (95% confidence interval, 1.5–1.9). In a limited number of comparisons, topical and oral NSAIDs provided comparable pain relief, but the use of topical agents produced lower plasma drug concentrations and fewer systemic adverse events (AEs). The physical–chemical properties of diclofenac epolamine make it well suited for topical use. In patients with acute soft tissue injuries treated with DETP, clinical data report an analgesic benefit within hours of the first application, and significant pain relief relative to placebo within 3 days. Moreover, DETP displayed tolerability comparable with placebo; the most common AEs were pruritus and other application site reactions. Review of published literature suggests that DETP is generally safe and well tolerated, clinically efficacious, and a rational treatment option for patients experiencing acute pain associated with strains, sprains, and contusions, and other localized painful conditions.

  3. Topics in modern differential geometry

    CERN Document Server

    Verstraelen, Leopold

    2017-01-01

    A variety of introductory articles is provided on a wide range of topics, including variational problems on curves and surfaces with anisotropic curvature. Experts in the fields of Riemannian, Lorentzian and contact geometry present state-of-the-art reviews of their topics. The contributions are written on a graduate level and contain extended bibliographies. The ten chapters are the result of various doctoral courses which were held in 2009 and 2010 at universities in Leuven, Serbia, Romania and Spain.

  4. Key Topics in Sports Medicine

    OpenAIRE

    2006-01-01

    Key Topics in Sports Medicine is a single quick reference source for sports and exercise medicine. It presents the essential information from across relevant topic areas, and includes both the core and emerging issues in this rapidly developing field. It covers: 1) Sports injuries, rehabilitation and injury prevention, 2) Exercise physiology, fitness testing and training, 3) Drugs in sport, 4) Exercise and health promotion, 5) Sport and exercise for special and clinical populations, 6) The ps...

  5. Therapeutic Effects of Topical Minoxidil or Rosemary and the Combination of Both on the treatment of Alopecia areata

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Lohrasb

    2015-02-01

    Background & Objectives: Considering the prevalence of Alopecia areata, the failure of treatment, and the unknown pathogenesis of this illness, a comparative study was performed using topical Minoxidil 2% and topical rosemary solution, alone and in combination, to treat this disease. Materials & Methods: This study is a clinical trial performed on 200 patients with Alopecia areata referring to the Hamzeh clinic of Fasa during the years 2012 and 2013. They were divided into four groups by random permutation; each group contained 50 patients. Group one received the combination of topical Minoxidil 2% and topical rosemary, group two received only topical Minoxidil 2% solution, group three received only topical rosemary solution, and the fourth group, the control group, did not receive any medication and was just advised to rub the site of the disease for the same period of time. The patients were under observation for one year. Results: The results of this investigation showed that the best remissions after treatment were as follows (respectively): combination of topical Minoxidil 2% and topical rosemary (27 patients = 54%), Minoxidil 2% solution (23 patients = 46%), rosemary solution (21 patients = 42%), and control group (9 patients = 18%). These results showed that despite a better response to the combination of rosemary and Minoxidil solutions in comparison to the two other treated groups, the changes were minimal and statistically insignificant (P-value = 0.0411). Conclusion: Using the combination of both rosemary and Minoxidil is more effective than either one individually in the treatment of Alopecia areata.

  6. Salicylic Acid Topical

    Science.gov (United States)

    ... the package label for more information. Apply a small amount of the salicylic acid product to one or two small areas you want to treat for 3 days ... know that children and teenagers who have chicken pox or the flu should not use topical salicylic ...

  7. Topical Medical Cannabis: A New Treatment for Wound Pain-Three Cases of Pyoderma Gangrenosum.

    Science.gov (United States)

    Maida, Vincent; Corban, Jason

    2017-11-01

    Pain associated with integumentary wounds is highly prevalent, yet it remains an area of significant unmet need within health care. Currently, systemically administered opioids are the mainstay of treatment. However, recent publications are casting opioids in a negative light given their high side effect profile, inhibition of wound healing, and association with accidental overdose, incidents that are frequently fatal. Thus, novel analgesic strategies for wound-related pain need to be investigated. The ideal methods of pain relief for wound patients are modalities that are topical, lack systemic side effects, are noninvasive and self-administered, and display rapid onset of analgesia. Extracts derived from the cannabis plant have been applied to wounds for thousands of years. The discovery of the human endocannabinoid system and its dominant presence throughout the integumentary system provides a valid and logical scientific platform to consider the use of topical cannabinoids for wounds. We report a prospective case series of three patients with pyoderma gangrenosum who were treated with topical medical cannabis compounded in non-genetically modified organic sunflower oil. Clinically significant analgesia associated with reduced opioid utilization was noted in all three cases. Topical medical cannabis has the potential to improve pain management in patients suffering from wounds of all classes.

  8. Successful Treatment of Cutaneous Botryomycosis with a Combination of Minocycline and Topical Heat Therapy

    Directory of Open Access Journals (Sweden)

    Masaya Ishibashi

    2012-05-01

    Cutaneous botryomycosis is a chronic focal infection characterized by a granulomatous inflammatory response to bacterial pathogens such as Staphylococcus aureus. Treatment requires antibiotic therapy and may also require surgical debridement. We employed topical heat therapy and oral minocycline. The lesions became flattened and pigmented after 1 month. We consider that this simple treatment can be an effective and harmless complementary therapy for cutaneous botryomycosis.

  9. Topical report review status. Volume 9, No. 1

    International Nuclear Information System (INIS)

    1995-02-01

    This report provides industry with procedures for submitting topical reports, guidance on how the US Nuclear Regulatory Commission (NRC) processes and responds to topical report submittals, and an accounting, with review schedules, of all topical reports currently accepted for review by the NRC. This report is published semiannually.

  10. Topical nonsteroidal anti-inflammatory drugs for the treatment of pain due to soft tissue injury: diclofenac epolamine topical patch.

    Science.gov (United States)

    Lionberger, David R; Brennan, Michael J

    2010-11-10

    The objective of this article is to review published clinical data on diclofenac epolamine topical patch 1.3% (DETP) in the treatment of acute soft tissue injuries, such as strains, sprains, and contusions. Review of published literature on topical nonsteroidal anti-inflammatory drugs (NSAIDs), diclofenac, and DETP in patients with acute soft tissue injuries was included. Relevant literature was identified on MEDLINE using the search terms topical NSAIDs, diclofenac, diclofenac epolamine, acute pain, sports injury, soft tissue injury, strain, sprain, and contusion, and from citations in retrieved articles covering the years 1978-2008. Review of published, randomized clinical trials and meta-analyses shows that topical NSAIDs are significantly more effective than placebo in relieving acute pain; the pooled average relative benefit was 1.7 (95% confidence interval, 1.5-1.9). In a limited number of comparisons, topical and oral NSAIDs provided comparable pain relief, but the use of topical agents produced lower plasma drug concentrations and fewer systemic adverse events (AEs). The physical-chemical properties of diclofenac epolamine make it well suited for topical use. In patients with acute soft tissue injuries treated with DETP, clinical data report an analgesic benefit within hours of the first application, and significant pain relief relative to placebo within 3 days. Moreover, DETP displayed tolerability comparable with placebo; the most common AEs were pruritus and other application site reactions. Review of published literature suggests that DETP is generally safe and well tolerated, clinically efficacious, and a rational treatment option for patients experiencing acute pain associated with strains, sprains, and contusions, and other localized painful conditions.

  11. Probabilistic topic modeling for the analysis and classification of genomic sequences

    Science.gov (United States)

    2015-01-01

    Background: Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on k-mer representation and text mining techniques. Methods: The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied to DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions: We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra-short sequences and exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
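
    A hedged sketch of the pipeline described: turn each DNA sequence into overlapping k-mer "words", fit LDA, and use the resulting topic mixtures as classification features. The sequences and the choice of k are placeholders:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        def kmers(seq, k=4):
            """Rewrite a sequence as space-separated overlapping k-mer words."""
            return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

        seqs = ["ACGTACGTGGCA", "TTGACGTACGTA", "GGCATTGACGGT"]  # placeholder barcodes
        docs = [kmers(s) for s in seqs]

        counts = CountVectorizer().fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
        features = lda.transform(counts)  # topic mixtures, usable by any classifier
        print(features.round(2))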

  12. Hypercalcaemia-induced kidney injury caused by the vitamin D analogue calcitriol for psoriasis: a note of caution when prescribing topical treatment.

    Science.gov (United States)

    Corden, E; Higgins, E; Smith, C

    2016-12-01

    A 55-year-old man with severe plaque psoriasis presented with a 2-week history of feeling generally unwell with lethargy and thirst. His symptoms had developed 6 weeks after commencement of the topical vitamin D3 analogue calcitriol. Investigations revealed hypercalcaemia and acute-on-chronic kidney injury, probably directly induced by systemic absorption of vitamin D3 following extensive topical use. Topical calcitriol had been started as a steroid-sparing agent to reduce the patient's liberal potent corticosteroid usage during anti-tumour necrosis factor-alfa therapy. Topical vitamin D analogues are commonly prescribed in dermatological and general practice, with hypercalcaemia being a rare but potentially serious adverse effect. This case serves to outline key factors that may predispose to hypercalcaemia, such as disease extent, quantity of drug applied, comorbidities and concurrent medications, and it highlights the importance of considering these factors when prescribing topical therapies.

  13. Topical treatment of psoriasis: questionnaire results on topical therapy accessibility and influence of body surface area on usage

    NARCIS (Netherlands)

    Iversen, L.; Lange, M.M. De; Bissonette, R.; Carvalho, A.V.E.; Kerkhof, P.C.M. van de; Kirby, B.; Kleyn, C.E.; Lynde, C.W.; Walt, J.M. van der; Wu, J.J.

    2017-01-01

    BACKGROUND: Topical treatment of mild to moderate psoriasis is first-line treatment and exhibits varying degrees of success across patient groups. Key factors influencing treatment success are physician topical treatment choice (high efficacy, low adverse events) and strict patient adherence.

  14. Hot topics: Signal processing in acoustics

    Science.gov (United States)

    Gaumond, Charles F.

    2005-09-01

    Signal processing in acoustics is a multidisciplinary group of people that work in many areas of acoustics. We have chosen two areas that have shown exciting new applications of signal processing to acoustics or have shown exciting and important results from the use of signal processing. In this session, two hot topics are shown: the use of noiselike acoustic fields to determine sound propagation structure and the use of localization to determine animal behaviors. The first topic shows the application of correlation on geo-acoustic fields to determine the Green's function for propagation through the Earth. These results can then be further used to solve geo-acoustic inverse problems. The first topic also shows the application of correlation using oceanic noise fields to determine the Green's function through the ocean. These results also have utility for oceanic inverse problems. The second topic shows exciting results from the detection, localization, and tracking of marine mammals by two different groups. Results from detection and localization of bullfrogs are shown, too. Each of these studies contributed to the knowledge of animal behavior. [Work supported by ONR.

  15. Topical Valproate Solution for Hair Growth

    Directory of Open Access Journals (Sweden)

    Anil Kakunje

    2018-05-01

    Valproate is used regularly in the treatment of various seizure disorders, bipolar disorder, and migraine prophylaxis, and off-label in many other conditions. Alopecia, or hair loss, is a cosmetic side effect of oral valproate administration; hair loss with valproate is diffuse, non-scarring and dose-related. A large number of drugs may interfere with the hair cycle and produce hair loss, yet we have only a few drugs, such as minoxidil and finasteride, used for hair regeneration, and both have their own side effects and limitations. In contrast to oral ingestion of valproate, which causes hair loss, early experiments with topical valproic acid cream showed hair regeneration. Valproic acid cream is currently unavailable in the market; alternatively, valproate and divalproex solutions are available in various strengths and have the potential to be used topically for hair regeneration. The side effects and cost of topical valproate solution could be much lower than those of the available options in the market. Topical valproate solution thus has the potential to be used for hair growth.

  16. Evaluating topic model interpretability from a primary care physician perspective.

    Science.gov (United States)

    Arnold, Corey W; Oh, Andrea; Chen, Shawn; Speier, William

    2016-02-01

    Probabilistic topic models provide an unsupervised method for analyzing unstructured text. These models discover semantically coherent combinations of words (topics) that could be integrated in a clinical automatic summarization system for primary care physicians performing chart review. However, the human interpretability of topics discovered from clinical reports is unknown. Our objective is to assess the coherence of topics and their ability to represent the contents of clinical reports from a primary care physician's point of view. Three latent Dirichlet allocation models (50 topics, 100 topics, and 150 topics) were fit to a large collection of clinical reports. Topics were manually evaluated by primary care physicians and graduate students. Wilcoxon Signed-Rank Tests for Paired Samples were used to evaluate differences between different topic models, while differences in performance between students and primary care physicians (PCPs) were tested using Mann-Whitney U tests for each of the tasks. While the 150-topic model produced the best log likelihood, participants were most accurate at identifying words that did not belong in topics learned by the 100-topic model, suggesting that 100 topics provides better relative granularity of discovered semantic themes for the data set used in this study. Models were comparable in their ability to represent the contents of documents. Primary care physicians significantly outperformed students in both tasks. This work establishes a baseline of interpretability for topic models trained with clinical reports, and provides insights on the appropriateness of using topic models for informatics applications. Our results indicate that PCPs find discovered topics more coherent and representative of clinical reports relative to students, warranting further research into their use for automatic summarization.
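
    One common way to run the "identifying words that did not belong" task above is a word-intrusion check: show a topic's top words plus one intruder and score whether the rater spots it. A minimal sketch with invented words:

        import random

        topic_words = ["infarction", "troponin", "angina", "stent", "ischemia"]
        intruder = "appendix"              # drawn from a different topic

        shown = topic_words + [intruder]
        random.shuffle(shown)
        print("Which word does not belong?", shown)

        rater_pick = "appendix"            # stand-in for a physician's answer
        print("correct" if rater_pick == intruder else "incorrect")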

  17. Topics on continua

    CERN Document Server

    Macias, Sergio

    2005-01-01

    Specialized as it might be, continuum theory is one of the most intriguing areas in mathematics. However, despite being popular journal fare, few books have thoroughly explored this interesting aspect of topology. In Topics on Continua, Sergio Macías, one of the field's leading scholars, presents four of his favorite continuum topics: inverse limits, Jones's set function T, homogeneous continua, and n-fold hyperspaces, and in doing so presents the most complete set of theorems and proofs ever contained in a single topology volume. Many of the results presented have previously appeared only in research papers, and some appear here for the first time. After building the requisite background and exploring the inverse limits of continua, the discussions focus on Professor Jones's set function T and continua for which T is continuous. An introduction to topological groups and group actions leads to a proof of Effros's Theorem, followed by a presentation of two decomposition theorems. The author then offers an...

  18. Encapsulation of cosmetic active ingredients for topical application--a review.

    Science.gov (United States)

    Casanova, Francisca; Santos, Lúcia

    2016-02-01

    Microencapsulation is finding increasing applications in cosmetics and personal care markets. This article provides an overall discussion on encapsulation of cosmetically active ingredients and encapsulation techniques for cosmetic and personal care products for topical applications. Some of the challenges are identified and critical aspects and future perspectives are addressed. Many cosmetics and personal care products contain biologically active substances that require encapsulation for increased stability of the active materials. The topical and transdermal delivery of active cosmetic ingredients requires effective, controlled and safe means of reaching the target site within the skin. Preservation of the active ingredients is also essential during formulation, storage and application of the final cosmetic product. Microencapsulation offers an ideal and unique carrier system for cosmetic active ingredients, as it has the potential to respond to all these requirements. The encapsulated agent can be released by several mechanisms, such as mechanical action, heat, diffusion, pH, biodegradation and dissolution. The selection of the encapsulation technique and shell material depends on the final application of the product, considering physical and chemical stability, concentration, required particle size, release mechanism and manufacturing costs.

  19. Including plasma and fusion topics in the science education in school

    International Nuclear Information System (INIS)

    Kado, Shinichiro

    2015-01-01

    Yutori education (a more relaxed education policy) started with the revision of the Courses of Study to introduce the 'five-day week system' in 1989, continued with the reduction of the content of school lessons by 30% in 1998, and ended with the introduction of the New Courses of Study in 2011. Focusing on science education, especially on the topics of plasma and nuclear fusion, the modality of the education system in Japan is discussed, considering the transition of academic performance based on the Program for International Student Assessment (PISA) and in comparison with examples from other countries. In particular, issues with high school textbooks are pointed out based on an assessment of current textbooks, and the significance of and need for including the topic of 'plasma' in them are stated. Lastly, in order to acquaint the general public with plasma and nuclear fusion, it is suggested to include them also in junior high school textbooks, by briefly mentioning the terms related to plasma, solar wind, the aurora phenomenon, and nuclear fusion energy. (S.K.)

  20. Osteoarthritis guidelines: a progressive role for topical nonsteroidal anti-inflammatory drugs.

    Science.gov (United States)

    Stanos, Steven P

    2013-01-01

    Current treatment guidelines for chronic pain associated with osteoarthritis reflect the collective clinical knowledge of international experts in weighing the benefits of pharmacologic therapy options while striving to minimize the negative effects associated with them. Disease progression, pattern of flares, level of functional impairment or disability, response to treatment, coexisting conditions such as cardiovascular disease or gastrointestinal disorders, and concomitant prescription medication use should all be considered when creating a therapeutic plan for a patient with osteoarthritis. Although topical nonsteroidal anti-inflammatory drugs historically have not been prevalent in many of the guidelines for osteoarthritis treatment, recent evidence-based medicine and new guidelines now support their use as a viable option for the clinician seeking alternatives to typical oral formulations. This article provides a qualitative review of these treatment guidelines and the emerging role of topical nonsteroidal anti-inflammatory drugs as a therapy option for patients with localized symptoms of osteoarthritis who may be at risk for oral nonsteroidal anti-inflammatory drug-related serious adverse events.

  1. Topical cholesterol in clofazimine induced ichthyosis

    Directory of Open Access Journals (Sweden)

    Pandey S

    1994-01-01

    Full Text Available Topical application of 10% cholesterol in petrolatum significantly (P < 0.05) controlled the development of ichthyosis in 62 patients taking 100 mg clofazimine daily for a period of 3 months. However, topical cholesterol application did not affect the lowering of serum cholesterol induced by oral clofazimine. The probable mechanism of action is discussed.

  2. Topical antibiotic monotherapy prescribing practices in acne vulgaris.

    Science.gov (United States)

    Hoover, William D; Davis, Scott A; Fleischer, Alan B; Feldman, Steven R

    2014-04-01

    The aim of this study is to evaluate the frequency with which topical antibiotics are prescribed as monotherapy in the treatment of acne vulgaris, and the physician specialties prescribing these medications. This study is a retrospective review of all visits with a sole diagnosis of acne vulgaris (ICD-9-CM code 706.1) found in the National Ambulatory Medical Care Survey (NAMCS) in 1993-2010. We recorded the number of visits surveyed where acne vulgaris was the sole diagnosis, the number of visits where topical antibiotics were the only treatment prescribed, and the specialty of the physician in each encounter. Topical erythromycin or clindamycin was the sole medication prescribed in 0.81% of the visits recorded, with 60% of these prescriptions arising from dermatologists and 40% from non-dermatologists. The trend of prescribing topical antibiotic monotherapy is declining, and increasing resistance of Propionibacterium acnes to topical antibiotic regimens has led to the need to re-evaluate the use of topical antibiotics in the treatment of acne vulgaris. While the rate of topical antibiotic monotherapy is declining, their use should be reserved for situations where the direct need for antibiotics arises. If a clinician feels that antibiotics are a necessary component of acne therapy, they should be used as part of a combination regimen.

  3. Most Common Dermatologic Topics Published in Five High-Impact General Medical Journals, 1970–2012: Melanoma, Psoriasis, Herpes Simplex, Herpes Zoster, and Acne

    Science.gov (United States)

    Choi, Young M; Namavar, Aram A; Wu, Jashin J

    2014-01-01

    Context: General practitioners frequently encounter skin diseases and are accustomed to diagnosing the most common dermatologic conditions. Objective: We sought to determine the most common dermatologic topics published in five high-impact general medical journals (New England Journal of Medicine, The Lancet, the Journal of the American Medical Association, British Medical Journal (now The BMJ), and Annals of Internal Medicine). Design: We conducted an independent search of the Thomson Reuters’ Science Citation Index for common dermatologic topics, limited to the period 1970 to 2012. Main Outcome Measure: Total number of publications dealing with each dermatologic topic considered. Results: The five most common dermatologic topics published were melanoma, psoriasis, herpes simplex, herpes zoster, and acne. Melanoma and psoriasis were the top two dermatologic topics published in each journal except for Annals of Internal Medicine. Conclusions: Internists frequently diagnose herpes simplex, herpes zoster, and acne, which are also common dermatologic topics published. Although internists infrequently diagnose melanoma and psoriasis, they are major topics for general medical journals because of their increased community awareness, major advancements in therapeutic research, and their nondermatologic manifestations. PMID:25662523

  4. Baby Health Checkup: MedlinePlus Health Topic

    Science.gov (United States)

  5. Laser Eye Surgery: MedlinePlus Health Topic

    Science.gov (United States)

  6. Child Mental Health: MedlinePlus Health Topic

    Science.gov (United States)

  7. Bone Marrow Transplantation: MedlinePlus Health Topic

    Science.gov (United States)

  8. Nutrition for Seniors: MedlinePlus Health Topic

    Science.gov (United States)

  9. Blood Count Tests: MedlinePlus Health Topic

    Science.gov (United States)

  10. Hormone Replacement Therapy: MedlinePlus Health Topic

    Science.gov (United States)

  11. Topics in millimeter wave technology

    CERN Document Server

    Button, Kenneth

    1988-01-01

    Topics in Millimeter Wave Technology, Volume 1 presents topics related to millimeter wave technology, including fin-lines and passive components realized in fin-lines, suspended striplines, suspended substrate microstrips, and modal power exchange in multimode fibers. A miniaturized monopulse assembly constructed in planar waveguide with multimode scalar horn feeds is also described. This volume comprises five chapters, the first of which deals with the analysis and synthesis techniques for fin-lines as well as the various passive components realized in fin-line. Tapers, discontinuities,

  12. Topics in nuclear and radiochemistry for college curricula and high school science programs

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    The concern with the current status and trends of nuclear chemistry and radiochemistry education in academic institutions was addressed in a recent workshop. The 1988 workshop considered the important contributions that scientists with nuclear and radiochemistry backgrounds have made and are continuing to make to other sciences and to various applied fields. Among the areas discussed were environmental studies, life sciences, materials science, separation technology, hot atom chemistry, cosmochemistry, and the rapidly growing field of nuclear medicine. It is the intent of the organizers and participants of this symposium, entitled ''Topics in Nuclear and Radiochemistry for College Curricula and High School Science Program'', to provide lecture material on topics related to nuclear and radiochemistry to educators. It is our hope that teachers, who may or may not be familiar with the field, will find this collection of articles useful and incorporate some of them into their lectures.

  13. Topics in nuclear and radiochemistry for college curricula and high school science programs

    International Nuclear Information System (INIS)

    1990-01-01

    The concern with the current status and trends of nuclear chemistry and radiochemistry education in academic institutions was addressed in a recent workshop. The 1988 workshop considered the important contributions that scientists with nuclear and radiochemistry backgrounds have made and are continuing to make to other sciences and to various applied fields. Among the areas discussed were environmental studies, life sciences, materials science, separation technology, hot atom chemistry, cosmochemistry, and the rapidly growing field of nuclear medicine. It is the intent of the organizers and participants of this symposium, entitled ''Topics in Nuclear and Radiochemistry for College Curricula and High School Science Program'', to provide lecture material on topics related to nuclear and radiochemistry to educators. It is our hope that teachers, who may or may not be familiar with the field, will find this collection of articles useful and incorporate some of them into their lectures.

  14. Topical report review status. Volume 7, No. 2

    International Nuclear Information System (INIS)

    1984-11-01

    The purpose of this document is to provide periodic progress reports of ongoing topical report reviews, to identify those topical reports for which the NRC staff review has been completed and those which are under review, and to provide NRC management with sufficient information regarding the conduct of the topical report program to permit taking whatever actions are deemed necessary or appropriate. This document is also intended to be a source of information to NRC Licensing Project Managers and other NRC personnel regarding the status of topical reports which may be referenced in applications for which they have responsibility.

  15. Treatment of multiple-level tracheobronchial stenosis secondary to endobronchial tuberculosis using bronchoscopic balloon dilatation with topical mitomycin-C.

    Science.gov (United States)

    Faisal, Mohamed; Harun, Hafaruzi; Hassan, Tidi M; Ban, Andrea Y L; Chotirmall, Sanjay H; Abdul Rahaman, Jamalul Azizi

    2016-04-14

    Tracheobronchial stenosis is a known complication of endobronchial tuberculosis. Despite antituberculous and steroid therapy, the development of bronchial stenosis is usually irreversible and requires airway patency to be restored by either bronchoscopic or surgical interventions. We report the use of balloon dilatation and topical mitomycin-C to successfully restore airway patency. We present a 24-year-old woman with previous pulmonary tuberculosis and laryngeal tuberculosis in 2007 and 2013, respectively, who presented with worsening dyspnoea and stridor. She had total left lung collapse with stenosis of both the upper trachea and left main bronchus. She underwent successful bronchoscopic balloon and manual rigid tube dilatation with topical mitomycin-C application over the stenotic tracheal segment. A second bronchoscopic intervention was performed after 20 weeks for the left main bronchus stenosis, with serial balloon dilatation and topical mitomycin-C application. These interventions led to significant clinical and radiographic improvements. This case highlights that balloon dilatation and topical mitomycin-C application should be considered in selected patients with tracheobronchial stenosis following endobronchial tuberculosis, avoiding airway stenting and invasive surgical intervention.

  16. Effect of psoriasis activity and topical treatment on serum lipocalin-2 levels.

    Science.gov (United States)

    Baran, A; Świderska, M; Myśliwiec, H; Flisiak, I

    2017-03-01

    Psoriasis has been considered a systemic disorder. Lipocalin-2 might be a link between psoriasis and its comorbidities. The aim of the study was to investigate the associations between serum lipocalin-2 levels and disease activity, markers of inflammation or metabolic disturbances, and changes after topical treatment in psoriatic patients. Thirty-seven individuals with active plaque-type psoriasis and 15 healthy controls were recruited. Blood samples were collected before and after 14 days of therapy. Serum lipocalin-2 concentrations were examined by enzyme-linked immunosorbent assay. The results were correlated with the Psoriasis Area and Severity Index (PASI), body mass index (BMI), inflammatory and biochemical markers, lipid profile, and the effectiveness of topical treatment. Lipocalin-2 serum levels were significantly increased in psoriatic patients in comparison to the controls (p = 0.023). No significant correlations with indicators of inflammation, BMI, or PASI were noted. A statistical association between lipocalin-2 and low-density lipoprotein cholesterol was shown. After topical treatment, serum lipocalin-2 levels did not change significantly (p = 0.9), remaining higher than in the controls despite clinical improvement. Lipocalin-2 might be a marker of psoriasis and convey cardiovascular or metabolic risk in psoriatic patients, but it may not be a reliable indicator of inflammation, severity of psoriasis, or efficacy of antipsoriatic treatment.

  17. Development and Evaluation of Topical Gabapentin Formulations

    Science.gov (United States)

    Alcock, Natalie; Hiom, Sarah; Birchall, James C.

    2017-01-01

    Topical delivery of gabapentin is desirable to treat peripheral neuropathic pain conditions whilst avoiding systemic side effects. To date, reports of topical gabapentin delivery in vitro have been variable and dependent on the skin model employed, primarily involving rodent and porcine models. In this study a variety of topical gabapentin formulations were investigated, including Carbopol® hydrogels containing various permeation enhancers, and a range of proprietary bases including a compounded Lipoderm® formulation; furthermore microneedle facilitated delivery was used as a positive control. Critically, permeation of gabapentin across a human epidermal membrane in vitro was assessed using Franz-type diffusion cells. Subsequently this data was contextualised within the wider scope of the literature. Although reports of topical gabapentin delivery have been shown to vary, largely dependent upon the skin model used, this study demonstrated that 6% (w/w) gabapentin 0.75% (w/w) Carbopol® hydrogels containing 5% (w/w) DMSO or 70% (w/w) ethanol and a compounded 10% (w/w) gabapentin Lipoderm® formulation were able to facilitate permeation of the molecule across human skin. Further pre-clinical and clinical studies are required to investigate the topical delivery performance and pharmacodynamic actions of prospective formulations. PMID:28867811

  18. Selected topics in nuclear structure

    Energy Technology Data Exchange (ETDEWEB)

    Solov'ev, V.G.; Gromov, K.Ya.; Malov, L.A.; Shilov, V.M.

    1994-12-31

    The Fourth International Conference on selected topics in nuclear structure was held at Dubna in July 1994 on recent experimental and theoretical investigations in nuclear structure. Topics discussed were the following: nuclear structure at low-energy excitations (collective quasiparticle phenomena, proton-neutron interactions, microscopic and phenomenological theories of nuclear structure); nuclear structure studies with charged particles, heavy ions, neutrons and photons; nuclei at high angular momenta and superdeformation; structure and decay properties of giant resonances, charge-exchange resonances and {beta}-decay; semiclassical approach to large amplitude collective motion and structure of hot nuclei.

  19. Topics in current aerosol research

    CERN Document Server

    Hidy, G M

    1971-01-01

    Topics in Current Aerosol Research deals with the fundamental aspects of aerosol science, with emphasis on experiment and theory describing highly dispersed aerosols (HDAs) as well as the dynamics of charged suspensions. Topics covered range from the basic properties of HDAs to their formation and methods of generation; sources of electric charges; interactions between fluid and aerosol particles; and one-dimensional motion of charged cloud of particles. This volume is comprised of 13 chapters and begins with an introduction to the basic properties of HDAs, followed by a discussion on the form

  20. Selected topics in nuclear structure

    International Nuclear Information System (INIS)

    Solov'ev, V.G.; Gromov, K.Ya.; Malov, L.A.; Shilov, V.M.

    1994-01-01

    The Fourth International Conference on selected topics in nuclear structure was held at Dubna in July 1994 on recent experimental and theoretical investigations in nuclear structure. Topics discussed were the following: nuclear structure at low-energy excitations (collective quasiparticle phenomena, proton-neutron interactions, microscopic and phenomenological theories of nuclear structure); nuclear structure studies with charged particles, heavy ions, neutrons and photons; nuclei at high angular momenta and superdeformation; structure and decay properties of giant resonances, charge-exchange resonances and β-decay; semiclassical approach to large amplitude collective motion and structure of hot nuclei.

  1. Topical Antibacterials and Global Challenges on Resistance ...

    African Journals Online (AJOL)

    skin infections can be easily treated with topical antibacterial medication that is available over the counter or by ... infection in minor cut or burn, eyes and ear infection [5]. .... Sensitive/dry skin ... includes both oral and topical antibiotics, but.

  2. Topics in Finance Part VII--Dividend Policy

    Science.gov (United States)

    Laux, Judy

    2011-01-01

    This series inspects the major topics in finance, reviewing the roles of stockholder wealth maximization, the risk-return tradeoff, and agency conflicts. The current article, devoted to dividend policy, also reviews the topic as presented in textbooks and the literature.

  3. Topics in Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Chung, K.C.

    1982-01-01

    Some topics in nuclear astrophysics are discussed, e.g. highly evolved stellar cores, stellar evolution (through temperature analysis of stellar surfaces), nucleosynthesis, and finally the solar neutrino problem. (L.C.) [pt]

  4. Case Of Iatrogenic Cushing's Syndrome By Topical Triamcinolone.

    Science.gov (United States)

    Zil-E-Ali, Ahsan; Janjua, Omer Hanif; Latif, Aiza; Aadil, Muhammad

    2018-01-01

    Cushing's syndrome is a collection of signs and symptoms due to hypercortisolism. Prolonged use of topical steroids may cause this syndrome and suppression of hypothalamic and pituitary function; however, such events are more common with the oral and parenteral routes. There are very few cases of Cushing's syndrome from a topical application, amongst which triamcinolone is the rarest causative drug. We report the case of an 11-year-old boy who developed Cushing's syndrome through topical application. The child had body rashes for which the caregiver consulted a local quack, and a topical cream of triamcinolone was prescribed. After application for three months, the patient became obese and developed a moon-like face. A thorough biochemical workup and diagnostic testing for Cushing's disease were done to confirm the diagnosis. The case is a dramatic example of development of the syndrome from chronic topical application of the least potent corticosteroid.

  5. Interpretable Topic Features for Post-ICU Mortality Prediction.

    Science.gov (United States)

    Luo, Yen-Fu; Rumshisky, Anna

    2016-01-01

    Electronic health records provide valuable resources for understanding the correlation between various diseases and mortality. The analysis of post-discharge mortality is critical for healthcare professionals to follow up on potential causes of death after a patient is discharged from the hospital and give prompt treatment. Moreover, it may reduce the cost derived from readmissions and improve the quality of healthcare. Our work focused on post-discharge ICU mortality prediction. In addition to features derived from physiological measurements, we incorporated the ICD-9-CM hierarchy into Bayesian topic model learning and extracted topic features from medical notes. We achieved the highest AUCs of 0.835 and 0.829 for 30-day and 6-month post-discharge mortality prediction using baseline features and topic proportions derived from Labeled-LDA. Moreover, our work emphasized the interpretability of the topic features derived from the topic model, which may facilitate the understanding and investigation of the complex relationship between mortality and diseases.
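
    The approach the abstract describes lends itself to a short illustration. The sketch below is a minimal, hypothetical analogue using plain LDA from scikit-learn rather than the Labeled-LDA variant the authors used; the toy notes, physiological features, and outcome labels are all invented.

    ```python
    # Hypothetical sketch: derive document-topic proportions from clinical
    # notes and combine them with physiological features for mortality
    # prediction. Plain LDA stands in for the Labeled-LDA of the abstract.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    notes = ["chf exacerbation acute renal failure dialysis",
             "copd exacerbation intubated weaned extubated",
             "sepsis resolved discharged to rehab stable"]      # one note per stay
    physio = np.array([[88.0, 1.9], [72.0, 1.1], [95.0, 2.4]])  # e.g. HR, creatinine
    died_6mo = np.array([1, 0, 1])                              # outcome labels

    counts = CountVectorizer().fit_transform(notes)
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    topic_props = lda.fit_transform(counts)        # rows are topic proportions

    X = np.hstack([physio, topic_props])           # baseline + topic features
    clf = LogisticRegression().fit(X, died_6mo)
    print(roc_auc_score(died_6mo, clf.predict_proba(X)[:, 1]))
    ```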

  6. Exploring Topic Structure: Coherence, Diversity and Relatedness

    NARCIS (Netherlands)

    J. He (Jiyin)

    2011-01-01

    The use of topical information has long been studied in the context of information retrieval. For example, grouping search results into topical categories enables more effective information presentation to users, while grouping documents in a collection can lead to efficient information

  7. topicmodels: An R Package for Fitting Topic Models

    Directory of Open Access Journals (Sweden)

    Bettina Grün

    2011-05-01

    Full Text Available Topic models allow the probabilistic modeling of term frequency occurrences in documents. The fitted model can be used to estimate the similarity between documents as well as between a set of specified keywords using an additional layer of latent variables which are referred to as topics. The R package topicmodels provides basic infrastructure for fitting topic models based on data structures from the text mining package tm. The package includes interfaces to two algorithms for fitting topic models: the variational expectation-maximization algorithm provided by David M. Blei and co-authors and an algorithm using Gibbs sampling by Xuan-Hieu Phan and co-authors.
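
    The package itself is in R, so as a rough Python analogue of the same workflow (fit a topic model to a document-term matrix, then compare documents through their inferred topic distributions) the sketch below uses scikit-learn, whose LDA estimator plays a role similar to the variational EM option in topicmodels; the Gibbs-sampling option has no counterpart here, and the toy documents are invented.

    ```python
    # Sketch of the topicmodels-style workflow in Python:
    # term counts -> LDA -> document similarity via topic distributions.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.metrics.pairwise import cosine_similarity

    docs = ["plasma fusion reactor energy",
            "topic model text corpus inference",
            "fusion plasma confinement energy"]
    tf = CountVectorizer().fit_transform(docs)        # tm-style term-frequency matrix
    theta = LatentDirichletAllocation(n_components=2,
                                      random_state=1).fit_transform(tf)
    print(cosine_similarity(theta))                   # document similarity via topics
    ```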

  8. Topical 0.25% desoximetasone spray efficacy for moderate to severe plaque psoriasis: a randomized clinical trial.

    Science.gov (United States)

    Saleem, Mohammed D; Negus, Deborah; Feldman, Steven R

    2018-02-01

    Traditionally, ointments were the vehicle of choice for psoriasis. Poor adherence to traditional vehicles limits the use of topical corticosteroids. Alternative formulations have gained popularity due to their ease of application, improved adherence, and efficacy. To evaluate the efficacy of a topical desoximetasone 0.25% spray formulation in extensive psoriasis, this multicenter, double-blinded, randomized trial compared twice-daily topical 0.25% desoximetasone spray to placebo in subjects aged ≥18 years with moderate to severe plaque psoriasis. The primary outcome of the study was the proportion of subjects in each group that achieved clinical success (Physician Global Assessment [PGA] of 0 or 1) and/or treatment success (target lesion score of 0 or 1) at day 28. One hundred and twenty subjects were enrolled. At baseline, 75.0% and 73.3% of the treatment and placebo groups, respectively, had at least moderate PGA. Clinical success in the intention-to-treat and placebo groups was 30% and 5% (p = .0003), respectively; treatment success was 39% and 7%, respectively. The absence of an active comparator among psoriasis treatments limits the ability to compare these results with other treatments. Topical desoximetasone spray provides rapid control of moderate to severe psoriasis lesions and may be considered for patients awaiting approval of biologicals. The trial was registered at clinicaltrials.gov: NCT01206387.

  9. Topical Research: Africa.

    Science.gov (United States)

    Lynn, Karen

    This lesson plan can be used in social studies, language arts, or library research. The instructional objective is for students to select a topic of study relating to Africa, write a thesis statement, collect information from media sources, and develop a conclusion. The teacher may assign the lesson for written or oral evaluation. The teacher…

  10. Topic structure affects semantic integration: evidence from event-related potentials.

    Science.gov (United States)

    Yang, Xiaohong; Chen, Xuhai; Chen, Shuang; Xu, Xiaoying; Yang, Yufang

    2013-01-01

    This study investigated whether semantic integration in discourse context could be influenced by topic structure using event-related brain potentials. Participants read discourses in which the last sentence contained a critical word that was either congruent or incongruent with the topic established in the first sentence. The intervening sentences between the first and the last sentence of the discourse either maintained or shifted the original topic. Results showed that incongruent words in topic-maintained discourses elicited an N400 effect that was broadly distributed over the scalp while those in topic-shifted discourses elicited an N400 effect that was lateralized to the right hemisphere and localized over central and posterior areas. Moreover, a late positivity effect was only elicited by incongruent words in topic-shifted discourses, but not in topic-maintained discourses. This suggests an important role for discourse structure in semantic integration, such that compared with topic-maintained discourses, the complexity of discourse structure in topic-shifted condition reduces the initial stage of semantic integration and enhances the later stage in which a mental representation is updated.

  11. Diclofenac Topical (osteoarthritis pain)

    Science.gov (United States)

    Diclofenac topical gel (Voltaren) is used to relieve pain from osteoarthritis (arthritis caused by a breakdown of the lining of the joints). Diclofenac topical liquid (Pennsaid) is used to relieve osteoarthritis pain in the knees. Diclofenac is in a class of medications called nonsteroidal anti-inflammatory drugs (NSAIDs).

  12. Diclofenac Topical (actinic keratosis)

    Science.gov (United States)

    Diclofenac topical gel (Solaraze) is used to treat actinic keratosis (flat, scaly growths on the skin caused by too much sun exposure). The way diclofenac gel works to treat actinic keratosis is not known. Diclofenac is also available as ...

  13. Proceedings of the international topical meeting on advances in human factors in nuclear power systems

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    This book presents the papers given at a conference on the human factors engineering of nuclear power plants. Topics considered at the conference included human modeling, artificial intelligence, expert systems, robotics and teleoperations, organizational issues, innovative applications, testing and evaluation, training systems technology, a modeling framework for crew decisions during reactor accident sequences, intelligent operator support systems, control algorithms for robot navigation, and personnel management

  14. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  15. Digital Social Network Mining for Topic Discovery

    Science.gov (United States)

    Moradianzadeh, Pooya; Mohi, Maryam; Sadighi Moshkenani, Mohsen

    Networked computers are expanding more and more around the world, and digital social networks are becoming of great importance for many people's work and leisure. This paper mainly focuses on discovering the topic of the information exchanged in a digital social network. In brief, our method is to use a hierarchical dictionary of related topics and words that is mapped to a graph. Then, by comparing keywords extracted from the social network context with the graph nodes, the probability of a relation between the context and the desired topics is computed. This model can be used in many applications such as advertising, viral marketing, and high-risk group detection.
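
    A minimal sketch of the scoring idea just described, simplified to a flat rather than hierarchical topic-word dictionary; the dictionary, message, and function name are invented for illustration.

    ```python
    # Hypothetical sketch: score how strongly a message relates to each
    # topic by matching its tokens against a topic-word dictionary.
    topic_words = {
        "sports":  {"match", "goal", "team", "league"},
        "finance": {"stock", "market", "price", "trade"},
    }

    def topic_probabilities(message):
        tokens = message.lower().split()
        # count keyword hits per topic
        hits = {t: sum(tok in words for tok in tokens)
                for t, words in topic_words.items()}
        total = sum(hits.values()) or 1               # avoid division by zero
        return {t: n / total for t, n in hits.items()}

    print(topic_probabilities("The team scored a late goal to win the match"))
    ```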

  16. Development and Evaluation of Topical Gabapentin Formulations

    Directory of Open Access Journals (Sweden)

    Christopher J. Martin

    2017-08-01

    Full Text Available Topical delivery of gabapentin is desirable to treat peripheral neuropathic pain conditions whilst avoiding systemic side effects. To date, reports of topical gabapentin delivery in vitro have been variable and dependent on the skin model employed, primarily involving rodent and porcine models. In this study a variety of topical gabapentin formulations were investigated, including Carbopol® hydrogels containing various permeation enhancers, and a range of proprietary bases including a compounded Lipoderm® formulation; furthermore microneedle facilitated delivery was used as a positive control. Critically, permeation of gabapentin across a human epidermal membrane in vitro was assessed using Franz-type diffusion cells. Subsequently this data was contextualised within the wider scope of the literature. Although reports of topical gabapentin delivery have been shown to vary, largely dependent upon the skin model used, this study demonstrated that 6% (w/w gabapentin 0.75% (w/w Carbopol® hydrogels containing 5% (w/w DMSO or 70% (w/w ethanol and a compounded 10% (w/w gabapentin Lipoderm® formulation were able to facilitate permeation of the molecule across human skin. Further pre-clinical and clinical studies are required to investigate the topical delivery performance and pharmacodynamic actions of prospective formulations.

  17. Antimicrobial topical agents used in the vagina.

    Science.gov (United States)

    Frey Tirri, Brigitte

    2011-01-01

    Vaginally applied antimicrobial agents are widely used in women with lower genital tract infections. 'Antimicrobial' is a general term that refers to a group of drugs that are effective against bacteria, fungi, viruses and protozoa. Topical treatments can be prescribed for a wide variety of vaginal infections. Many bacterial infections, such as bacterial vaginosis, desquamative inflammatory vaginitis or, as some European authors call it, aerobic vaginitis, as well as infection with Staphylococcus aureus or group A streptococci, may be treated in this way. Candida vulvovaginitis is a fungal infection that is very amenable to topical treatment. The most common viral infections which can be treated with topical medications are condylomata acuminata and herpes simplex. The most often encountered protozoal vaginitis, which is caused by Trichomonas vaginalis, may be susceptible to topical medications, although this infection is treated systemically. This chapter covers the wide variety of commonly used topical antimicrobial agents for these diseases and focuses on the individual therapeutic agents and their clinical efficacy. In addition, potential difficulties that can occur in practice, as well as the use of these medications in the special setting of pregnancy, are described in this chapter. Copyright © 2011 S. Karger AG, Basel.

  18. Influence of input matrix representation on topic modelling performance

    CSIR Research Space (South Africa)

    De Waal, A

    2010-11-01

    Full Text Available Topic models explain a collection of documents with a small set of distributions over terms. These distributions over terms define the topics. Topic models ignore the structure of documents and use a bag-of-words approach which relies solely...
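
    The choice of input matrix that this abstract studies can be made concrete: the sketch below builds the two most common representations, raw bag-of-words counts and tf-idf weights, over invented toy documents; either matrix could then be handed to a topic model.

    ```python
    # Two candidate input-matrix representations for a topic model.
    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

    docs = ["topic models explain documents",
            "documents are bags of words",
            "words define the topics"]

    counts = CountVectorizer().fit_transform(docs)   # raw bag-of-words counts
    tfidf = TfidfVectorizer().fit_transform(docs)    # reweighted alternative
    print(counts.toarray())
    print(tfidf.toarray().round(2))
    ```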

  19. Liquid crystals: a new topic in physics for undergraduates

    International Nuclear Information System (INIS)

    Pavlin, Jerneja; Čepič, Mojca; Vaupotič, Nataša

    2013-01-01

    This paper presents a teaching module about liquid crystals. Since liquid crystals are linked to everyday student experiences and are also a topic of current scientific research, they are an excellent candidate for a modern topic to be introduced into education. We show that liquid crystals can provide a pathway through several fields of physics such as thermodynamics, optics and electromagnetism. We discuss what students should learn about liquid crystals and what physical concepts they should know before considering them. In the presentation of the teaching module, which consists of a lecture and experimental work in a chemistry and physics laboratory, we focus on experiments on phase transitions, polarization of light, double refraction and colours. A pilot evaluation of the module was performed among pre-service primary school teachers who have no special preference for natural sciences. The evaluation shows that the module is very efficient in transferring knowledge. A prior study showed that the informally obtained pre-knowledge on liquid crystals of the first-year students from several different fields of study was negligible. Since social science students are the least interested in natural sciences, it can be expected that students in any study programme will on average achieve at least as good qualitative knowledge of phenomena related to liquid crystals as the group involved in the pilot study. (paper)

  20. Topical methotrexate pretreatment enhances the therapeutic effect of topical 5-aminolevulinic acid-mediated photodynamic therapy on hamster buccal pouch precancers

    Directory of Open Access Journals (Sweden)

    Deng-Fu Yang

    2014-09-01

    Conclusion: We conclude that topical MTX pretreatment can increase intracellular PpIX production in hamster buccal pouch precancerous lesions and significantly improve the outcomes of the precancerous lesions treated with topical ALA-PDT.

  1. A Data-Based Approach to Discovering Multi-Topic Influential Leaders.

    Directory of Open Access Journals (Sweden)

    Xing Tang

    Full Text Available Recently, increasing numbers of users have adopted microblogging services as their main information source. However, most of them find themselves drowning in the millions of posts produced by other users every day. To cope with this, identifying a set of the most influential people is paramount. Moreover, finding a set of related influential users to expand the coverage of one particular topic is required in real world scenarios. Most of the existing algorithms in this area focus on topology-related methods such as PageRank. These methods mine link structures to find the expected influential rank of users. However, because they ignore the interaction data, these methods turn out to be less effective in social networks. In reality, a variety of topics exist within the information diffusing through the network. Because they have different interests, users play different roles in the diffusion of information related to different topics. As a result, distinguishing influential leaders according to different topics is also worthy of research. In this paper, we propose a multi-topic influence diffusion model (MTID) based on traces acquired from historic information. We decompose the influential scores of users into two parts: the direct influence determined by information propagation along the link structure and indirect influence that extends beyond the restrictions of direct follower relationships. To model the network from a multi-topical viewpoint, we introduce topic pools, each of which represents a particular topic information source. Then, we extract the topic distributions from the traces of tweets, determining the influence propagation probability and content generation probability. In the network, we adopt multiple ground nodes representing topic pools to connect every user through bidirectional links. Based on this multi-topical view of the network, we further introduce the topic-dependent rank (TD-Rank) algorithm to identify the multi-topic

  2. A Data-Based Approach to Discovering Multi-Topic Influential Leaders.

    Science.gov (United States)

    Tang, Xing; Miao, Qiguang; Yu, Shangshang; Quan, Yining

    2016-01-01

    Recently, increasing numbers of users have adopted microblogging services as their main information source. However, most of them find themselves drowning in the millions of posts produced by other users every day. To cope with this, identifying a set of the most influential people is paramount. Moreover, finding a set of related influential users to expand the coverage of one particular topic is required in real world scenarios. Most of the existing algorithms in this area focus on topology-related methods such as PageRank. These methods mine link structures to find the expected influential rank of users. However, because they ignore the interaction data, these methods turn out to be less effective in social networks. In reality, a variety of topics exist within the information diffusing through the network. Because they have different interests, users play different roles in the diffusion of information related to different topics. As a result, distinguishing influential leaders according to different topics is also worthy of research. In this paper, we propose a multi-topic influence diffusion model (MTID) based on traces acquired from historic information. We decompose the influential scores of users into two parts: the direct influence determined by information propagation along the link structure and indirect influence that extends beyond the restrictions of direct follower relationships. To model the network from a multi-topical viewpoint, we introduce topic pools, each of which represents a particular topic information source. Then, we extract the topic distributions from the traces of tweets, determining the influence propagation probability and content generation probability. In the network, we adopt multiple ground nodes representing topic pools to connect every user through bidirectional links. Based on this multi-topical view of the network, we further introduce the topic-dependent rank (TD-Rank) algorithm to identify the multi-topic influential users
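
    The TD-Rank algorithm itself is not reproduced here; as an illustrative analogue of topic-dependent influence ranking, the sketch below runs personalized PageRank on a tiny follower graph, biasing the ranking by an assumed per-user content generation probability for one topic. The graph, weights, and user names are all invented.

    ```python
    # Topic-biased PageRank as a stand-in for topic-dependent ranking.
    import networkx as nx

    # edge u -> v: user u follows (and so passes attention to) user v
    G = nx.DiGraph([("alice", "bob"), ("carol", "bob"),
                    ("bob", "dave"), ("carol", "dave")])
    # assumed probability that each user generates content on the topic
    topic_weight = {"alice": 0.7, "bob": 0.1, "carol": 0.2, "dave": 0.0}

    rank = nx.pagerank(G, alpha=0.85, personalization=topic_weight)
    print(sorted(rank.items(), key=lambda kv: -kv[1]))  # topic-specific leaders
    ```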

  3. Interactional Organization and Topic Control in Conciliation Hearings

    Directory of Open Access Journals (Sweden)

    Wânia Terezinha Ladeira

    2011-07-01

    Full Text Available We analyse discursive topics in talk-in-interaction within the institutional setting of three conciliation hearings held in a kind of small claims court for consumption conflict resolution. This research is based on Interactional Sociolinguistics and Conversation Analysis theories. The analysis shows that the participants of those meetings have asymmetric rights regarding the choice of discussion topics. Thus, the mediator is the one who has the right to suggest and control the discursive topics of the conversation. This topic control is the most important institutional procedure that can cause a reduction in accusations and adjacent replies. Consequently, the chance of mediators achieving their institutional task of reaching an agreement between parties in conflict is increased.

  4. Topics in low-temperature Fermi liquid theory

    International Nuclear Information System (INIS)

    Hess, D.W.

    1987-01-01

    Several topics in quantum liquids are discussed, including the elementary excitation spectrum of ³He under pressure, spin-polarized ³He, and an early attempt to formulate a Fermi liquid theory to describe the low-temperature thermodynamic and transport properties of the heavy-electron system UPt₃. The elementary excitation spectrum of ordinary liquid ³He is calculated at several pressures using the polarization potential theory of Aldrich and Pines together with a simple model to describe the effect of multipair excitations. The effective interactions between quasiparticles in fully spin-polarized ³He are obtained from physical arguments and sum rules. The interactions between two down-spin impurities and that between an up and a down spin are also deduced. The regime of small polarization is considered next. Using the phenomenological model of Bedell and Sanchez-Castro together with an ansatz form for the spin-flip interaction, a large increase in the singlet scattering rate as a function of polarization is obtained.

  5. The use of compound topical anesthetics: a review.

    Science.gov (United States)

    Kravitz, Neal D

    2007-10-01

    The author reviewed the history of, federal regulations regarding, risks of and adverse drug reactions of five compound topical anesthetics: tetracaine, adrenaline/epinephrine and cocaine (TAC); lidocaine, adrenaline/epinephrine and tetracaine (LET); lidocaine, tetracaine and phenylephrine (TAC 20 percent Alternate); lidocaine, prilocaine and tetracaine (Profound); and lidocaine, prilocaine, tetracaine and phenylephrine with thickeners (Profound PET). The author reviewed clinical trials, case reports, descriptive articles, and U.S. Food and Drug Administration (FDA) regulations and recent public advisory warnings regarding the federal approval of and risks associated with the use of compound topical anesthetics. Compound topical anesthetics are neither FDA-regulated nor -unregulated. Some compounding pharmacies bypass the new FDA drug approval process, which is based on reliable scientific data and ensures that a marketed drug is safe, effective, properly manufactured and accurately labeled. Two deaths have been attributed to the lay use of compound topical anesthetics. In response, the FDA has announced the strengthening of its efforts against unapproved drug products. Compound topical anesthetics may be an effective alternative to local infiltration for some minimally invasive dental procedures; however, legitimate concerns exist in regard to their safety. Until they become federally regulated, compound topical anesthetics remain unapproved drug products whose benefits may not outweigh their risks for dental patients.

  6. Topical Apigenin Alleviates Cutaneous Inflammation in Murine Models

    Directory of Open Access Journals (Sweden)

    Mao-Qiang Man

    2012-01-01

    Full Text Available Herbal medicines have been used in preventing and treating skin disorders for centuries. It has been demonstrated that systemic administration of chrysanthemum extract exhibits anti-inflammatory properties. However, whether topical application of apigenin, a constituent of chrysanthemum extract, influences cutaneous inflammation is still unclear. In the present study, we first tested whether topical applications of apigenin alleviate cutaneous inflammation in murine models of acute dermatitis. The murine models of acute allergic contact dermatitis and acute irritant contact dermatitis were established by topical application of oxazolone and phorbol 12-myristate 13-acetate (TPA), respectively. Inflammation was assessed in both dermatitis models by measuring ear thickness. Additionally, the effect of apigenin on stratum corneum function in a murine subacute allergic contact dermatitis model was assessed with an MPA5 physiology monitor. Our results demonstrate that topical applications of apigenin exhibit therapeutic effects in both acute irritant contact dermatitis and allergic contact dermatitis models. Moreover, in comparison with the vehicle treatment, topical apigenin treatment significantly reduced transepidermal water loss, lowered skin surface pH, and increased stratum corneum hydration in a subacute murine allergic contact dermatitis model. Together, these results suggest that topical application of apigenin could provide an alternative regimen for the treatment of dermatitis.

  7. Knowledge-Based Topic Model for Unsupervised Object Discovery and Localization.

    Science.gov (United States)

    Niu, Zhenxing; Hua, Gang; Wang, Le; Gao, Xinbo

    Unsupervised object discovery and localization aims to discover dominant object classes and localize all object instances in a given image collection without any supervision. Previous work has attempted to tackle this problem with vanilla topic models, such as latent Dirichlet allocation (LDA). However, in those methods no prior knowledge of the given image collection is exploited to facilitate object discovery. On the other hand, the topic models used in those methods suffer from the topic coherence issue: some inferred topics do not have clear meaning, which limits the final performance of object discovery. In this paper, prior knowledge in terms of so-called must-links is exploited from Web images on the Internet. Furthermore, a novel knowledge-based topic model, called LDA with mixture of Dirichlet trees, is proposed to incorporate the must-links into topic modeling for object discovery. In particular, to better deal with the polysemy phenomenon of visual words, the must-link is redefined so that one must-link constrains only one or some topic(s) instead of all topics, which leads to significantly improved topic coherence. Moreover, the must-links are built and grouped with respect to specific object classes; thus the must-links in our approach are semantic-specific, which allows discriminative prior knowledge from Web images to be exploited more efficiently. Extensive experiments validated the efficiency of our proposed approach on several data sets. It is shown that our method significantly improves topic coherence and outperforms the unsupervised methods for object discovery and localization. In addition, compared with discriminative methods, the naturally existing object classes in the given image collection can be subtly discovered, which makes our approach well suited for realistic applications of unsupervised object discovery.

  8. Topic Prominence in Chinese EFL Learners' Interlanguage

    Science.gov (United States)

    Li, Shaopeng; Yang, Lianrui

    2014-01-01

    The present study aims to investigate the general characteristics of topic-prominent typological interlanguage development of Chinese learners of English in terms of acquiring subject-prominent English structures from a discourse perspective. Topic structures mainly appear in Chinese discourse in the form of topic chains (Wang, 2002; 2004). The…

  9. A Discourse Perspective of Topic-prominence in Chinese EFL Learners’ Interlanguage

    Directory of Open Access Journals (Sweden)

    Shaopeng Li

    2014-07-01

    Full Text Available The present study aims to investigate the general characteristics of topic-prominent typological interlanguage development of Chinese learners of English in terms of acquiring subject-prominent English structures from the discourse perspective. We have selected as the research target the “topic chain”, which is the main topic-prominent structure in Chinese discourse, and “zero anaphora”, which is the most common topic anaphor of the topic chain. Topic structures mainly appear in Chinese discourse in the form of the “topic chain” (Wang, 2002; 2004). Actually, in the event of a topic chain, research on topic structures should go into the typical range of discourse. Two important findings were yielded by the present study. First, the characteristics of the Chinese topic chain are transferrable to the interlanguage of Chinese EFL learners, thus resulting in overgeneralization of zero anaphora; second, the interlanguage discourse of Chinese EFL learners reflects the characteristics of a second language acquisition process from topic-prominence to subject-prominence, thus lending support to the discourse transfer hypothesis.

  10. Topical reports on Louisiana salt domes

    International Nuclear Information System (INIS)

    1983-09-01

    The Institute for Environmental Studies at Louisiana State University conducted research into the potential use of Louisiana salt domes for disposal of nuclear waste material. Topical reports generated in 1981 and 1982 related to Vacherie and Rayburn's domes are compiled and presented, which address palynological studies, tiltmeter monitoring, precise releveling, saline springs, and surface hydrology. The latter two are basically a compilation of references related to these topics. Individual reports are abstracted

  11. Data Mining Thesis Topics in Finland

    OpenAIRE

    Bajo Rouvinen, Ari

    2017-01-01

    The Theseus open repository contains metadata about more than 100,000 thesis publications from the different universities of applied sciences in Finland. Different data mining techniques were applied to the Theseus dataset to build a web application to explore thesis topics and degree programmes using different libraries in Python and JavaScript. Thesis topics were extracted from manually annotated keywords by the authors and curated subjects by the librarians. During the project, the quality...

  12. Selected topics in e+e- physics

    International Nuclear Information System (INIS)

    Sau Lan Wu

    1981-01-01

    Selected topics from recent experimental results at the high-energy electron-positron storage rings are presented. The topics include some of the tau and charm physics from SPEAR, the upsilon physics from DORIS and CESR, and the γγ physics and quark and gluon physics from the PLUTO and TASSO Collaborations at PETRA. Related data from the JADE and MARK J Collaborations at PETRA are discussed in separate papers at this school. (orig.)

  13. An overview of topic modeling and its current applications in bioinformatics.

    Science.gov (United States)

    Liu, Lin; Tang, Lin; Dong, Wen; Yao, Shaowen; Zhou, Wei

    2016-01-01

    With the rapid accumulation of biological datasets, machine learning methods designed to automate data analysis are urgently needed. In recent years, so-called topic models that originated from the field of natural language processing have been receiving much attention in bioinformatics because of their interpretability. Our aim was to review the application and development of topic models for bioinformatics. This paper starts with the description of a topic model, with a focus on the understanding of topic modeling. A general outline is provided on how to build an application in a topic model and how to develop a topic model. Meanwhile, the literature on application of topic models to biological data was searched and analyzed in depth. According to the types of models and the analogy between the concept of document-topic-word and a biological object (as well as the tasks of a topic model), we categorized the related studies and provided an outlook on the use of topic models for the development of bioinformatics applications. Topic modeling is a useful method (in contrast to the traditional means of data reduction in bioinformatics) and enhances researchers' ability to interpret biological information. Nevertheless, due to the lack of topic models optimized for specific biological data, the studies on topic modeling in biological data still have a long and challenging road ahead. We believe that topic models are a promising method for various applications in bioinformatics research.
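
    The document-topic-word analogy that the review builds on can be made concrete with the generative story that LDA-style topic models assume; the sketch below samples one toy document, with the vocabulary, number of topics, and Dirichlet hyperparameters all chosen arbitrarily.

    ```python
    # Minimal generative process of an LDA-style topic model.
    import numpy as np

    rng = np.random.default_rng(0)
    vocab = ["gene", "protein", "cell", "model", "topic", "corpus"]
    K, alpha, beta = 2, 0.5, 0.5                       # topics and Dirichlet priors
    phi = rng.dirichlet([beta] * len(vocab), size=K)   # topic-word distributions

    def generate_document(n_words=8):
        theta = rng.dirichlet([alpha] * K)             # document-topic mixture
        words = []
        for _ in range(n_words):
            z = rng.choice(K, p=theta)                 # pick a topic
            words.append(vocab[rng.choice(len(vocab), p=phi[z])])
        return words

    print(generate_document())
    ```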

  14. Control of pain with topical plant medicines

    Directory of Open Access Journals (Sweden)

    James David Adams Jr.

    2015-04-01

    Full Text Available Pain is normally treated with oral nonsteroidal anti-inflammatory agents and opioids. These drugs are dangerous and are responsible for many hospitalizations and deaths. It is much safer to use topical preparations made from plants to treat pain, even severe pain. To relieve pain, topical preparations must contain compounds that penetrate the skin and inhibit pain receptors such as transient receptor potential cation channels and cyclooxygenase-2. Inhibition of pain in the skin disrupts the pain cycle and avoids exposure of internal organs to large amounts of toxic compounds. Use of topical pain relievers has the potential to save many lives, decrease medical costs and improve therapy.

  15. Topics in Finance Part VI--Capital Budgeting

    Science.gov (United States)

    Laux, Judy

    2011-01-01

    This series on the theory of financial management offers insight into the roles of stockholder wealth maximization, the risk-return tradeoff, and agency conflicts as they apply to major topics in finance. The current article investigates capital budgeting. Much literature addresses this topic, with a number of articles challenging mainstream…

  16. Fostering Topic Knowledge: Essential for Academic Writing

    Science.gov (United States)

    Proske, Antje; Kapp, Felix

    2013-01-01

    Several researchers emphasize the role of the writer's topic knowledge for writing. In academic writing topic knowledge is often constructed by studying source texts. One possibility to support that essential phase of the writing process is to provide interactive learning questions which facilitate the construction of an adequate situation…

  17. Ego Involvement and Topic Controversiality as Related to Attitude Change.

    Science.gov (United States)

    Sledden, Elizabeth A.; Fernandez, Katherine A.

    Attitude change was measured on four different topics before and immediately after a persuasion was presented in order to compare the degree of change with the level of ego involvement as it relates to topic controversiality. Ego involvement was based on self-ratings of concern for each topic. Objective topic controversiality was based on the…

  18. EvoRiver: Visual Analysis of Topic Coopetition on Social Media.

    Science.gov (United States)

    Sun, Guodao; Wu, Yingcai; Liu, Shixia; Peng, Tai-Quan; Zhu, Jonathan J H; Liang, Ronghua

    2014-12-01

    Cooperation and competition (jointly called "coopetition") are two modes of interactions among a set of concurrent topics on social media. How do topics cooperate or compete with each other to gain public attention? Which topics tend to cooperate or compete with one another? Who plays the key role in coopetition-related interactions? We answer these intricate questions by proposing a visual analytics system that facilitates the in-depth analysis of topic coopetition on social media. We model the complex interactions among topics as a combination of carry-over, coopetition recruitment, and coopetition distraction effects. This model provides a close functional approximation of the coopetition process by depicting how different groups of influential users (i.e., "topic leaders") affect coopetition. We also design EvoRiver, a time-based visualization, that allows users to explore coopetition-related interactions and to detect dynamically evolving patterns, as well as their major causes. We test our model and demonstrate the usefulness of our system based on two Twitter data sets (social topics data and business topics data).

  19. Emerging topics on the hip: Ligamentum teres and hip microinstability

    International Nuclear Information System (INIS)

    Cerezal, Luis; Arnaiz, Javier; Canga, Ana; Piedra, Tatiana; Altónaga, José R.; Munafo, Ricardo; Pérez-Carro, Luis

    2012-01-01

    Microinstability and ligamentum teres lesions are emerging topics in hip pathology. These entities are an increasingly recognized cause of persistent hip pain and should be considered in the differential diagnosis of the patient with hip pain. Conventional (non-arthrographic) CT and MR have a very limited role in the evaluation of these entities. CT arthrography (CTa) and MR arthrography (MRa) have emerged as the modalities of choice for pre-operative imaging of ligamentum teres injuries and microinstability. To date, pre-operative imaging detection of these pathologies is not widespread, but with appropriate imaging and a high index of suspicion, preoperative detection should improve. This article discusses current concepts regarding the anatomy, biomechanics, clinical findings, diagnosis and treatment of ligamentum teres lesions and microinstability.

  20. Acute Effect of Topical Menthol on Chronic Pain in Slaughterhouse Workers with Carpal Tunnel Syndrome: Triple-Blind, Randomized Placebo-Controlled Trial

    Directory of Open Access Journals (Sweden)

    Emil Sundstrup

    2014-01-01

    Full Text Available Topical menthol gels are classified as “topical analgesics” and are claimed to relieve minor aches and pains of the musculoskeletal system. In this study we investigate the acute effect of topical menthol on carpal tunnel syndrome (CTS). We screened 645 slaughterhouse workers and recruited 10 participants with CTS and chronic pain of the arm/hand, who were randomly distributed into two groups to receive topical menthol (Biofreeze) or placebo (gel with a menthol scent) during the working day and, 48 hours later, the other treatment (crossover design). Participants rated arm/hand pain intensity during the last hour of work (scale 0–10) immediately before and 1, 2, and 3 hours after application. Furthermore, global rating of change (GROC) in arm/hand pain was assessed 3 hours after application. Compared with placebo, pain intensity and GROC improved more following application of topical menthol (P=0.026 and P=0.044, respectively). Pain intensity of the arm/hand decreased by −1.2 (95% CI: −1.7 to −0.6) following topical menthol compared with placebo, corresponding to a moderate effect size of 0.63. In conclusion, topical menthol acutely reduces pain intensity during the working day in slaughterhouse workers with CTS and should be considered as an effective nonsystemic alternative to regular analgesics in the workplace management of chronic and neuropathic pain.