WorldWideScience

Sample records for extreme-scale computing workshop

  1. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist that can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building, and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  2. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER), in partnership with the Office of Advanced Scientific Computing Research (ASCR), held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  3. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Peisert, Sean [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Davis, CA (United States); Potok, Thomas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jones, Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-03

At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and roadmaps facing future high-performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the

  4. Extreme Scale Computing for First-Principles Plasma Physics Research

    Energy Technology Data Exchange (ETDEWEB)

Chang, Choong-Seock [Princeton University

    2011-10-12

World superpowers are in the middle of the “Computnik” race. The U.S. Department of Energy (and the National Nuclear Security Administration) aims to launch exascale computer systems into the scientific (and national security) world by 2018. The objective is to solve important scientific problems and to predict the outcomes using the most fundamental scientific laws, which would not be possible otherwise. Being chosen into the next “frontier” group can be of great benefit to a scientific discipline. An extreme-scale computer system requires different types of algorithms and a different programming philosophy from those we have been accustomed to. Only a handful of scientific codes are capable of scalable use of today’s largest computers in operation at petascale (using more than 100,000 cores concurrently). Fortunately, a few magnetic fusion codes are competing well in this race using the “first principles” gyrokinetic equations. These codes are beginning to study fusion plasma dynamics in full-scale, realistic diverted device geometry in their natural nonlinear multiscale form, including large-scale neoclassical and small-scale turbulence physics, but excluding some ultra-fast dynamics. In this talk, most of the above-mentioned topics will be introduced at an executive level. Representative properties of extreme-scale computers, modern programming exercises to take advantage of them, and different philosophies in data flows and analyses will be presented. Examples of the multi-scale, multi-physics scientific discoveries made possible by solving the gyrokinetic equations on extreme-scale computers will be described. Future directions toward “virtual tokamak experiments” will also be discussed.

  5. Extreme Weather and Climate: Workshop Report

    Science.gov (United States)

Sobel, Adam; Camargo, Suzana; Debucquoy, Wim; Deodatis, George; Gerrard, Michael; Hall, Timothy; Hallman, Robert; Keenan, Jesse; Lall, Upmanu; Levy, Marc

    2016-01-01

Extreme events are the aspects of climate to which human society is most sensitive. Due to both their severity and their rarity, extreme events can challenge the capacity of physical, social, economic and political infrastructures, turning natural events into human disasters. Yet, because they are low-frequency events, the science of extreme events is very challenging. Among the challenges is the difficulty of connecting extreme events to longer-term, large-scale variability and trends in the climate system, including anthropogenic climate change. How can we best quantify the risks posed by extreme weather events, both in the current climate and in the warmer and different climates to come? How can we better predict them? What can we do to reduce the harm done by such events? In response to these questions, the Initiative on Extreme Weather and Climate has been created at Columbia University in New York City (extremeweather.columbia.edu). This Initiative is a University-wide activity focused on understanding the risks to human life, property, infrastructure, communities, institutions, ecosystems, and landscapes from extreme weather events, both in the present and future climates, and on developing solutions to mitigate those risks. In May 2015, the Initiative held its first science workshop, entitled Extreme Weather and Climate: Hazards, Impacts, Actions. The purpose of the workshop was to define the scope of the Initiative; the tremendously broad intellectual footprint of the topic is indicated by the titles of the presentations (see Table 1). The intent of the workshop was to stimulate thought across disciplinary lines by juxtaposing talks whose subjects differed dramatically. Each session concluded with question and answer panel sessions. Approximately 150 people were in attendance throughout the day. Below is a brief synopsis of each presentation. The synopses collectively reflect the variety and richness of the emerging extreme event research agenda.

  6. Report from the 4th Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Jacek Becla

    2011-02-01

Full Text Available Academic and industrial users are increasingly facing the challenge of petabytes of data, but managing and analyzing such large data sets still remains a daunting task. The 4th Extremely Large Databases workshop was organized to examine the needs of communities facing these issues that were under-represented at the past workshops. Approaches to big-data statistical analytics, as well as opportunities related to emerging hardware technologies, were also debated. Writable extreme-scale databases and the science benchmark were discussed. This paper is the final report of the discussions and activities at this workshop.

  7. Report from the 3rd Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Jacek Becla

    2010-02-01

Full Text Available Academic and industrial users are increasingly facing the challenge of petabytes of data, but managing and analyzing such large data sets still remains a daunting task. Both the database and the map/reduce communities worldwide are working on addressing these issues. The 3rd Extremely Large Databases workshop was organized to examine the needs of scientific communities beginning to face these issues, to reach out to European communities working on extremely large scale data challenges, and to brainstorm possible solutions. The science benchmark that emerged from the 2nd workshop in this series was also debated. This paper is the final report of the discussions and activities at this workshop.

  8. Report from the 6th Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Daniel Liwei Wang

    2013-05-01

    Full Text Available Petascale data management and analysis remain one of the main unresolved challenges in today's computing. The 6th Extremely Large Databases workshop was convened alongside the XLDB conference to discuss the challenges in the health care, biology, and natural resources communities. The role of cloud computing, the dominance of file-based solutions in science applications, in-situ and predictive analysis, and commercial software use in academic environments were discussed in depth as well. This paper summarizes the discussions of this workshop.

  9. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  10. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Galli, Giulia [Univ. of California, Davis, CA (United States). Workshop Chair; Dunning, Thom [Univ. of Illinois, Urbana, IL (United States). Workshop Chair

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  11. Extreme Conditions Modeling Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

Coe, Ryan Geoffrey [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Neary, Vincent Sinclair [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Lawson, Michael J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Lab. (NREL), Golden, CO (United States); Weber, Jochem [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-07-01

Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) hosted the Wave Energy Converter (WEC) Extreme Conditions Modeling (ECM) Workshop in Albuquerque, New Mexico on May 13–14, 2014. The objective of the workshop was to review the current state of knowledge on how to numerically and experimentally model WECs in extreme conditions (e.g. large ocean storms) and to suggest how national laboratory resources could be used to improve ECM methods for the benefit of the wave energy industry. More than 30 U.S. and European WEC experts from industry, academia, and national research institutes attended the workshop, which consisted of presentations from WEC developers, invited keynote presentations from subject matter experts, breakout sessions, and a final plenary session.

  12. ISC High Performance 2017 International Workshops, DRBSD, ExaComm, HCPM, HPC-IODC, IWOPH, IXPUG, P^3MA, VHPC, Visualization at Scale, WOPSSS

    CERN Document Server

    Yokota, Rio; Taufer, Michela; Shalf, John

    2017-01-01

This book constitutes revised selected papers from 10 workshops that were held at the ISC High Performance 2017 conference in Frankfurt, Germany, in June 2017. The 59 papers presented in this volume were carefully reviewed and selected for inclusion in this book. They stem from the following workshops: Workshop on Virtualization in High-Performance Cloud Computing (VHPC); Visualization at Scale: Deployment Case Studies and Experience Reports; International Workshop on Performance Portable Programming Models for Accelerators (P^3MA); OpenPOWER for HPC (IWOPH); International Workshop on Data Reduction for Big Scientific Data (DRBSD); International Workshop on Communication Architectures for HPC, Big Data, Deep Learning and Clouds at Extreme Scale; Workshop on HPC Computing in a Post Moore's Law World (HCPM); HPC I/O in the Data Center (HPC-IODC); Workshop on Performance and Scalability of Storage Systems (WOPSSS); IXPUG: Experiences on Intel Knights Landing at the One Year Mark; International Workshop on Communicati...

  13. Extreme Conditions Modeling Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Coe, R. G.; Neary, V. S.; Lawson, M. J.; Yu, Y.; Weber, J.

    2014-07-01

    Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) hosted the Wave Energy Converter (WEC) Extreme Conditions Modeling (ECM) Workshop in Albuquerque, NM on May 13th-14th, 2014. The objective of the workshop was to review the current state of knowledge on how to model WECs in extreme conditions (e.g. hurricanes and other large storms) and to suggest how U.S. Department of Energy (DOE) and national laboratory resources could be used to improve ECM methods for the benefit of the wave energy industry.

  14. Report from the 2nd Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Jacek Becla

    2009-03-01

Full Text Available The complexity and sophistication of large-scale analytics in science and industry have advanced dramatically in recent years. Analysts are struggling to use complex techniques such as time series analysis and classification algorithms because their familiar, powerful tools are not scalable and cannot effectively use scalable database systems. The 2nd Extremely Large Databases (XLDB) workshop was organized to understand these issues, examine their implications, and brainstorm possible solutions. The design of a new open-source science database, SciDB, that emerged from the first workshop in this series was also debated. This paper is the final report of the discussions and activities at this workshop.

  15. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  16. A Portable Computer Security Workshop

    Science.gov (United States)

    Wagner, Paul J.; Phillips, Andrew T.

    2006-01-01

    We have developed a computer security workshop designed to instruct post-secondary instructors who want to start a course or laboratory exercise sequence in computer security. This workshop has also been used to provide computer security education to IT professionals and students. It is effective in communicating basic computer security principles…

  17. 8th International Symposium on Intelligent Distributed Computing & Workshop on Cyber Security and Resilience of Large-Scale Systems & 6th International Workshop on Multi-Agent Systems Technology and Semantics

    CERN Document Server

    Braubach, Lars; Venticinque, Salvatore; Badica, Costin

    2015-01-01

This book represents the combined peer-reviewed proceedings of the Eighth International Symposium on Intelligent Distributed Computing - IDC'2014, of the Workshop on Cyber Security and Resilience of Large-Scale Systems - WSRL-2014, and of the Sixth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2014. All the events were held in Madrid, Spain, during September 3-5, 2014. The 47 contributions published in this book address several topics related to the theory and applications of intelligent distributed computing and multi-agent systems, including agent-based data processing, ambient intelligence, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, knowledge management, big data and ontologies, social networks, swarm intelligence, and videogames, among others.

  18. Report from the 5th Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Jacek Becla

    2012-03-01

    Full Text Available The 5th XLDB workshop brought together scientific and industrial users, developers, and researchers of extremely large data and focused on emerging challenges in the healthcare and genomics communities, spreadsheet-based large scale analysis, and challenges in applying statistics to large scale analysis, including machine learning. Major problems discussed were the lack of scalable applications, the lack of expertise in developing solutions, the lack of respect for or attention to big data problems, data volume growth exceeding Moore's Law, poorly scaling algorithms, and poor data quality and integration. More communication between users, developers, and researchers is sorely needed. A variety of future work to help all three groups was discussed, ranging from collecting challenge problems to connecting with particular industrial or academic sectors.

  19. Recovery Act - CAREER: Sustainable Silicon -- Energy-Efficient VLSI Interconnect for Extreme-Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Patrick [Oregon State Univ., Corvallis, OR (United States)

    2014-01-31

The research goal of this CAREER proposal is to develop energy-efficient VLSI interconnect circuits and systems that will facilitate future massively parallel, high-performance computing. Extreme-scale computing will exhibit massive parallelism on multiple vertical levels, from thousands of computational units on a single processor to thousands of processors in a single data center. Unfortunately, the energy required to communicate between these units at every level (on-chip, off-chip, off-rack) will be the critical limitation to energy efficiency. Therefore, the PI's career goal is to become a leading researcher in the design of energy-efficient VLSI interconnect for future computing systems.

  20. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Maynard, Robert [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-27

The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from the predominant DOE projects for visualization on accelerators, combining their respective features into a new visualization toolkit called VTK-m.

  1. Waterloo Workshop on Computer Algebra

    CERN Document Server

Zima, Eugene; WWCA-2016; Advances in computer algebra: in honour of Sergei Abramov's 70th birthday

    2018-01-01

    This book discusses the latest advances in algorithms for symbolic summation, factorization, symbolic-numeric linear algebra and linear functional equations. It presents a collection of papers on original research topics from the Waterloo Workshop on Computer Algebra (WWCA-2016), a satellite workshop of the International Symposium on Symbolic and Algebraic Computation (ISSAC’2016), which was held at Wilfrid Laurier University (Waterloo, Ontario, Canada) on July 23–24, 2016.   This workshop and the resulting book celebrate the 70th birthday of Sergei Abramov (Dorodnicyn Computing Centre of the Russian Academy of Sciences, Moscow), whose highly regarded and inspirational contributions to symbolic methods have become a crucial benchmark of computer algebra and have been broadly adopted by many Computer Algebra systems.

  2. H2@Scale Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Pivovar, Bryan

    2017-03-31

    Final report from the H2@Scale Workshop held November 16-17, 2016, at the National Renewable Energy Laboratory in Golden, Colorado. The U.S. Department of Energy's National Renewable Energy Laboratory hosted a technology workshop to identify the current barriers and research needs of the H2@Scale concept. H2@Scale is a concept regarding the potential for wide-scale impact of hydrogen produced from diverse domestic resources to enhance U.S. energy security and enable growth of innovative technologies and domestic industries. Feedback received from a diverse set of stakeholders at the workshop will guide the development of an H2@Scale roadmap for research, development, and early stage demonstration activities that can enable hydrogen as an energy carrier at a national scale.

  3. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  4. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Runnels, Scott Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bachrach, Harrison Ian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carlson, Nils [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Collier, Angela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dumas, William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fankell, Douglas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferris, Natalie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gonzalez, Francisco [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Griffith, Alec [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Guston, Brandon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kenyon, Connor [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Li, Benson [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mookerjee, Adaleena [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parkinson, Christian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Peck, Hailee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Peters, Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Poondla, Yasvanth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rogers, Brandon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shaffer, Nathaniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Trettel, Andrew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Valaitis, Sonata Mae [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Venzke, Joel Aaron [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Black, Mason [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Demircan, Samet [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Holladay, Robert Tyler [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-22

The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high-temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it.

  5. The 3rd International Workshop on Computational Electronics

    Science.gov (United States)

    Goodnick, Stephen M.

    1994-09-01

The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research, and the Army Research Office, with local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of which more than one quarter represented research groups outside of the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. A total of 81 papers were presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the Chemistry, Computer Science, Mathematics, Engineering, and Physics communities participating in the workshop.

  6. Multi-level programming paradigm for extreme computing

    International Nuclear Information System (INIS)

    Petiton, S.; Sato, M.; Emad, N.; Calvin, C.; Tsuji, M.; Dandouna, M.

    2013-01-01

In order to propose a framework and programming paradigms for post-petascale computing, on the road to exascale computing and beyond, we introduced new languages, associated with a hierarchical multi-level programming paradigm, that allow scientific end-users and developers to program highly hierarchical architectures designed for extreme computing. In this paper, we explain the interest of such a hierarchical multi-level programming paradigm for extreme computing and how well it adapts to several large computational science applications, such as linear algebra solvers used for reactor core physics. We describe the YML language and framework for describing graphs of parallel components, which may be developed using a PGAS-like language such as XMP, then scheduled and computed on supercomputers. We then present experiments on supercomputers (such as the 'K' and 'Hopper' machines) with the hybrid method MERAM (Multiple Explicitly Restarted Arnoldi Method) as a case study for iterative methods manipulating sparse matrices, and with the block Gauss-Jordan method as a case study for direct methods manipulating dense matrices. We conclude by proposing evolutions of this programming paradigm. (authors)
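
    As a concrete point of reference for the iterative case study named above, the following is a minimal single-node sketch (an assumption for illustration, not the authors' YML/XMP implementation) of one explicitly restarted Arnoldi process; MERAM combines several such processes that exchange restart information. The subspace size, tolerance, and toy test matrix are illustrative choices.

```python
import numpy as np

def arnoldi(A, v0, m):
    """Build an m-step Arnoldi factorization of A starting from v0."""
    n = len(v0)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):              # modified Gram-Schmidt orthogonalization
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:             # invariant subspace found early
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V[:, :m], H[:m, :m]

def eram(A, m=20, max_restarts=50, tol=1e-8):
    """Explicitly restarted Arnoldi for the dominant (largest-magnitude) eigenpair."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    for _ in range(max_restarts):
        V, H = arnoldi(A, v, m)
        evals, evecs = np.linalg.eig(H)
        k = int(np.argmax(np.abs(evals)))
        ritz_val = evals[k].real            # assumes a real dominant eigenvalue (e.g. symmetric A)
        ritz_vec = (V @ evecs[:, k]).real
        ritz_vec /= np.linalg.norm(ritz_vec)
        residual = np.linalg.norm(A @ ritz_vec - ritz_val * ritz_vec)
        if residual < tol * max(abs(ritz_val), 1.0):
            break
        v = ritz_vec                        # restart from the current Ritz vector
    return ritz_val, ritz_vec

if __name__ == "__main__":
    A = np.diag(np.arange(1.0, 101.0))      # toy matrix; dominant eigenvalue is 100
    print(eram(A)[0])
```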

  7. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Runnels, Scott Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Caldwell, Wendy [Arizona State Univ., Mesa, AZ (United States); Brown, Barton Jed [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pederson, Clark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Justin [Univ. of California, Santa Cruz, CA (United States); Burrill, Daniel [Univ. of Vermont, Burlington, VT (United States); Feinblum, David [Univ. of California, Irvine, CA (United States); Hyde, David [SLAC National Accelerator Lab., Menlo Park, CA (United States). Stanford Institute for Materials and Energy Science (SIMES); Levick, Nathan [Univ. of New Mexico, Albuquerque, NM (United States); Lyngaas, Isaac [Florida State Univ., Tallahassee, FL (United States); Maeng, Brad [Univ. of Michigan, Ann Arbor, MI (United States); Reed, Richard LeRoy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarno-Smith, Lois [Univ. of Michigan, Ann Arbor, MI (United States); Shohet, Gil [Univ. of Illinois, Urbana-Champaign, IL (United States); Skarda, Jinhie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stevens, Josey [Missouri Univ. of Science and Technology, Rolla, MO (United States); Zeppetello, Lucas [Columbia Univ., New York, NY (United States); Grossman-Ponemon, Benjamin [Stanford Univ., CA (United States); Bottini, Joseph Larkin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Loudon, Tyson Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); VanGessel, Francis Gilbert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nagaraj, Sriram [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Price, Jacob [Univ. of Washington, Seattle, WA (United States)

    2015-10-15

The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high-temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it. This report includes both the background for the program and the reports from the students.

  8. Workshop report on large-scale matrix diagonalization methods in chemistry theory institute

    Energy Technology Data Exchange (ETDEWEB)

    Bischof, C.H.; Shepard, R.L.; Huss-Lederman, S. [eds.

    1996-10-01

The Large-Scale Matrix Diagonalization Methods in Chemistry theory institute brought together 41 computational chemists and numerical analysts. The goal was to understand the needs of the computational chemistry community in problems that utilize matrix diagonalization techniques. This was accomplished by reviewing the current state of the art and looking toward future directions in matrix diagonalization techniques. This institute occurred about 20 years after a related meeting of similar size. During those 20 years, the Davidson method continued to dominate the problem of finding a few extremal eigenvalues for many computational chemistry problems. Work on non-diagonally dominant and non-Hermitian problems as well as parallel computing has also brought new methods to bear. The changes and similarities in problems and methods over the past two decades offered an interesting perspective on the progress in this area. One important area covered by the talks was an overview of the source and nature of the chemistry problems. The numerical analysts were uniformly grateful for the efforts to convey a better understanding of the problems and issues faced in computational chemistry. An important outcome was an understanding of the wide range of eigenproblems encountered in computational chemistry. The workshop covered problems involving self-consistent-field (SCF), configuration interaction (CI), intramolecular vibrational relaxation (IVR), and scattering problems. In atomic structure calculations using the Hartree-Fock method (SCF), the symmetric matrices can range from order hundreds to thousands. These matrices often include large clusters of eigenvalues which can be as much as 25% of the spectrum. However, if CI methods are also used, the matrix size can be between 10^4 and 10^9, where only one or a few extremal eigenvalues and eigenvectors are needed. Working with very large matrices has led to the development of
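
    For readers unfamiliar with the method the report says has long dominated this problem, here is a minimal, textbook-style sketch of the classic Davidson iteration for the lowest eigenpair of a large symmetric, diagonally dominant matrix. It is an illustrative assumption of the standard algorithm, not code from the workshop; the function name, parameters, and the preconditioner cutoff are made-up choices.

```python
import numpy as np

def davidson_lowest(A, max_subspace=20, max_iter=200, tol=1e-8):
    """Davidson iteration for the lowest eigenpair of a symmetric, diagonally dominant A."""
    n = A.shape[0]
    diag = np.diag(A).copy()
    V = np.zeros((n, max_subspace))
    V[np.argmin(diag), 0] = 1.0          # start from the unit vector at the smallest diagonal
    nvec = 1
    theta, u = diag.min(), V[:, 0]
    for _ in range(max_iter):
        W = V[:, :nvec]
        T = W.T @ (A @ W)                # Rayleigh-Ritz projection onto the current subspace
        vals, vecs = np.linalg.eigh(T)
        theta = vals[0]
        u = W @ vecs[:, 0]               # Ritz vector for the lowest Ritz value
        r = A @ u - theta * u            # residual
        if np.linalg.norm(r) < tol:
            break
        denom = diag - theta             # diagonal (Davidson) preconditioner
        denom[np.abs(denom) < 1e-8] = 1e-8
        t = r / denom
        t -= W @ (W.T @ t)               # orthogonalize the correction against the subspace
        t_norm = np.linalg.norm(t)
        if t_norm < 1e-12 or nvec == max_subspace:
            V[:, 0] = u                  # collapse (restart) the subspace at the Ritz vector
            nvec = 1
            continue
        V[:, nvec] = t / t_norm
        nvec += 1
    return theta, u
```

    On a toy diagonally dominant test matrix (for example, a large diagonal matrix plus a small symmetric perturbation), this kind of iteration typically recovers the lowest eigenpair in a modest number of matrix-vector products, which is what makes it attractive for the SCF/CI matrix sizes quoted above.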

  9. ASCR Workshop on Quantum Computing for Science

    Energy Technology Data Exchange (ETDEWEB)

    Aspuru-Guzik, Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Van Dam, Wim [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Farhi, Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gaitan, Frank [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Humble, Travis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Landahl, Andrew J [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lucas, Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Preskill, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Svore, Krysta [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wiebe, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Williams, Carl [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  10. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sewell, Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Meredith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  11. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware Inc., Clifton Park, NY (United States)

    2017-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  12. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D.; Sewell, Christopher (LANL); Childs, Hank (U of Oregon); Ma, Kwan-Liu (UC Davis); Geveci, Berk (Kitware); Meredith, Jeremy (ORNL)

    2016-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  13. Software challenges in extreme scale systems

    International Nuclear Information System (INIS)

    Sarkar, Vivek; Harrod, William; Snavely, Allan E

    2009-01-01

Computer systems anticipated in the 2015-2020 timeframe are referred to as Extreme Scale because they will be built using massive multi-core processors with hundreds of cores per chip. The largest capability Extreme Scale system is expected to deliver exascale performance on the order of 10^18 operations per second. These systems pose new critical challenges for software in the areas of concurrency, energy efficiency, and resiliency. In this paper, we discuss the implications of the concurrency and energy efficiency challenges on future software for Extreme Scale Systems. From an application viewpoint, the concurrency and energy challenges boil down to the ability to express and manage parallelism and locality by exploring a range of strong scaling and new-era weak scaling techniques. For expressing parallelism and locality, the key challenges are the ability to expose all of the intrinsic parallelism and locality in a programming model, while ensuring that this expression of parallelism and locality is portable across a range of systems. For managing parallelism and locality, the OS-related challenges include parallel scalability, spatial partitioning of OS and application functionality, direct hardware access for inter-processor communication, and asynchronous rather than interrupt-driven events, which are accompanied by runtime system challenges for scheduling, synchronization, memory management, communication, performance monitoring, and power management. We conclude by discussing the importance of software-hardware co-design in addressing the fundamental challenges for application enablement on Extreme Scale systems.
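
    To make the strong- versus weak-scaling distinction mentioned above concrete, the short sketch below evaluates the standard Amdahl and Gustafson speedup models. This is illustrative arithmetic only, not from the paper; the serial fraction and core counts are made-up example values.

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Strong scaling: fixed total problem size (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

def gustafson_speedup(serial_fraction: float, cores: int) -> float:
    """Weak scaling: problem size grows with the core count (Gustafson's law)."""
    return serial_fraction + (1.0 - serial_fraction) * cores

if __name__ == "__main__":
    # Even a 0.1% serial fraction caps strong-scaling speedup near 1000x,
    # while the weak-scaling speedup keeps growing with the machine size.
    for cores in (1_000, 100_000, 10_000_000):
        print(f"{cores:>10} cores: strong {amdahl_speedup(0.001, cores):10.1f}x, "
              f"weak {gustafson_speedup(0.001, cores):14.1f}x")
```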

  14. Extreme-scale Algorithms and Solver Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States)

    2016-12-10

A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.

  15. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR

    Science.gov (United States)

Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  16. Workshop on extremely high energy density plasmas and their diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Ishii, Shozo (ed.)

    2001-09-01

Compiled are the papers presented at the workshop on 'Extremely High Energy Density Plasmas and Their Diagnostics' held at the National Institute for Fusion Science. The papers cover the physics and applications of extremely high-energy density plasmas such as dense z-pinch, plasma focus, and intense pulsed charged beams. Separate abstracts were presented for 7 of the papers in this report. The remaining 25 were considered outside the subject scope of INIS. (author)

  17. Workshop on extremely high energy density plasmas and their diagnostics

    International Nuclear Information System (INIS)

    Ishii, Shozo

    2001-09-01

Compiled are the papers presented at the workshop on 'Extremely High Energy Density Plasmas and Their Diagnostics' held at the National Institute for Fusion Science. The papers cover the physics and applications of extremely high-energy density plasmas such as dense z-pinch, plasma focus, and intense pulsed charged beams. Separate abstracts were presented for 7 of the papers in this report. The remaining 25 were considered outside the subject scope of INIS. (author)

  18. The Efficient Use of Vector Computers with Emphasis on Computational Fluid Dynamics : a GAMM-Workshop

    CERN Document Server

    Gentzsch, Wolfgang

    1986-01-01

The GAMM Committee for Numerical Methods in Fluid Mechanics organizes workshops that bring together experts in a narrow field of computational fluid dynamics (CFD) to exchange ideas and experiences in order to speed up development in this field. In this sense it was suggested that a workshop should treat the solution of CFD problems on vector computers. Thus we organized a workshop with the title "The efficient use of vector computers with emphasis on computational fluid dynamics". The workshop took place at the Computing Centre of the University of Karlsruhe, March 13-15, 1985. Participation was restricted to 22 people from 7 countries, and 18 papers were presented. In the announcement of the workshop we wrote: "Fluid mechanics has actively stimulated the development of superfast vector computers like the CRAY's or CYBER 205. Now these computers on their turn stimulate the development of new algorithms which result in a high degree of vectorization (scalar/vectorized execution-time). But w...

  19. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  20. 77 FR 26509 - Notice of Public Meeting-Cloud Computing Forum & Workshop V

    Science.gov (United States)

    2012-05-04

    ...--Cloud Computing Forum & Workshop V AGENCY: National Institute of Standards & Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop V to be held on Tuesday... workshop. This workshop will provide information on the U.S. Government (USG) Cloud Computing Technology...

  1. Faster Parallel Traversal of Scale Free Graphs at Extreme Scale with Vertex Delegates

    KAUST Repository

    Pearce, Roger

    2014-11-01

© 2014 IEEE. At extreme scale, irregularities in the structure of scale-free graphs such as social network graphs limit our ability to analyze these important and growing datasets. A key challenge is the presence of high-degree vertices (hubs), which leads to parallel workload and storage imbalances. The imbalances occur because existing partitioning techniques are not able to effectively partition high-degree vertices. We present techniques to distribute storage, computation, and communication of hubs for extreme-scale graphs in distributed-memory supercomputers. To balance the hub processing workload, we distribute hub data structures and related computation among a set of delegates. The delegates coordinate using highly optimized, yet portable, asynchronous broadcast and reduction operations. We demonstrate the scalability of our new algorithmic technique using Breadth-First Search (BFS), Single Source Shortest Path (SSSP), K-Core Decomposition, and Page-Rank on synthetically generated scale-free graphs. Our results show excellent scalability on large scale-free graphs up to 131K cores of the IBM BG/P, and outperform the best known Graph500 performance on BG/P Intrepid by 15%.
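
    The following is a much-simplified, single-process sketch of the vertex-delegate idea described above: edges of low-degree vertices are assigned to a single owner rank, while a hub's edges are spread across all ranks, each of which acts as a delegate for that hub. The degree threshold, hash-based ownership, and round-robin assignment are illustrative assumptions; the paper's implementation is distributed-memory and coordinates delegates with asynchronous broadcast and reduction operations.

```python
from collections import defaultdict

def partition_with_delegates(edges, num_ranks, hub_threshold):
    """Split a directed edge list into per-rank owned edges and per-rank hub shares."""
    degree = defaultdict(int)
    for u, _ in edges:
        degree[u] += 1
    hubs = {u for u, d in degree.items() if d >= hub_threshold}

    local = [[] for _ in range(num_ranks)]                    # edges owned by exactly one rank
    delegate = [defaultdict(list) for _ in range(num_ranks)]  # each rank's share of hub edges
    for i, (u, v) in enumerate(edges):
        if u in hubs:
            delegate[i % num_ranks][u].append(v)              # round-robin the hub's adjacency
        else:
            local[hash(u) % num_ranks].append((u, v))
    return local, delegate, hubs

# In a BFS, when a hub becomes active its owner notifies all delegates (a broadcast);
# each delegate relaxes only its share of the hub's edges, and the partial results are
# combined with a reduction, so no single rank holds all of a hub's work or storage.
```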

  2. Faster Parallel Traversal of Scale Free Graphs at Extreme Scale with Vertex Delegates

    KAUST Repository

    Pearce, Roger; Gokhale, Maya; Amato, Nancy M.

    2014-01-01

© 2014 IEEE. At extreme scale, irregularities in the structure of scale-free graphs such as social network graphs limit our ability to analyze these important and growing datasets. A key challenge is the presence of high-degree vertices (hubs), which leads to parallel workload and storage imbalances. The imbalances occur because existing partitioning techniques are not able to effectively partition high-degree vertices. We present techniques to distribute storage, computation, and communication of hubs for extreme-scale graphs in distributed-memory supercomputers. To balance the hub processing workload, we distribute hub data structures and related computation among a set of delegates. The delegates coordinate using highly optimized, yet portable, asynchronous broadcast and reduction operations. We demonstrate the scalability of our new algorithmic technique using Breadth-First Search (BFS), Single Source Shortest Path (SSSP), K-Core Decomposition, and Page-Rank on synthetically generated scale-free graphs. Our results show excellent scalability on large scale-free graphs up to 131K cores of the IBM BG/P, and outperform the best known Graph500 performance on BG/P Intrepid by 15%.

  3. Summer 1994 Computational Science Workshop. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

This report documents the work performed by the University of New Mexico Principal Investigators and Research Assistants while hosting the highly successful Summer 1994 Computational Sciences Workshop in Albuquerque on August 6-11, 1994. Included in this report is a final budget for the workshop, along with a summary of the participants' evaluation of the workshop. The workshop proceedings have been delivered under separate cover. In order to assist in the organization of future workshops, we have also included in this report detailed documentation of the pre- and post-workshop activities associated with this contract. Specifically, we have included a section that documents the advertising performed, along with the manner in which applications were handled. A complete list of the workshop participants is included in this section. Sample letters that were generated while dealing with various commercial entities and departments at the University are also included in a section dealing with workshop logistics. Finally, we have included a section in this report that deals with suggestions for future workshops.

  4. Gravo-Aeroelastic Scaling for Extreme-Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Fingersh, Lee J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Loth, Eric [University of Virginia; Kaminski, Meghan [University of Virginia; Qin, Chao [University of Virginia; Griffith, D. Todd [Sandia National Laboratories

    2017-06-09

    A scaling methodology is described in the present paper for extreme-scale wind turbines (rated at 10 MW or more) that allows their sub-scale turbines to capture the key blade dynamics and aeroelastic deflections of the full-scale machines. For extreme-scale turbines, such deflections and dynamics can be substantial and are primarily driven by centrifugal, thrust and gravity forces as well as the net torque. Each of these is in turn a function of various wind conditions, including turbulence levels that cause shear, veer, and gust loads. The 13.2 MW rated SNL100-03 rotor design, having a blade length of 100 meters, is herein scaled to the CART3 wind turbine at NREL using 25% geometric scaling, with blade mass and wind speed scaled by gravo-aeroelastic constraints. In order to mimic the ultralight structure of the advanced-concept extreme-scale design, the scaling results indicate that the gravo-aeroelastically scaled blades for the CART3 would be three times lighter and 25% longer than the current CART3 blades. A benefit of this scaling approach is that the scaled wind speeds needed for testing are reduced (in this case by a factor of two), allowing testing under extreme gust conditions to be much more easily achieved. Most importantly, this scaling approach can investigate extreme-scale concepts, including dynamic behaviors and aeroelastic deflections (including flutter), at an extremely small fraction of the full-scale cost.
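    For orientation, the factor-of-two reduction in test wind speed is consistent with a Froude-type (gravity-preserving) scaling in which velocity scales with the square root of the length ratio, so that at 25% geometric scale the speed ratio is sqrt(0.25) = 0.5. The short sketch below works through that arithmetic under this assumed scaling law; the full-scale wind speed used is an illustrative placeholder, and the actual gravo-aeroelastic constraints for the SNL100-03/CART3 pairing are those derived in the paper.

```python
# Rough Froude-type scaling arithmetic, for orientation only.
# Assumption: length scale s, velocity ~ sqrt(s); mass is further adjusted in the
# paper to mimic an ultralight full-scale design, which is not reproduced here.
import math

s = 0.25                       # 25% geometric scale (100 m blade -> 25 m)
full_blade_length_m = 100.0
full_rated_wind_mps = 11.3     # illustrative placeholder, not the SNL100-03 value

scaled_length_m = s * full_blade_length_m
velocity_ratio = math.sqrt(s)  # Froude scaling: V_model / V_full
scaled_wind_mps = velocity_ratio * full_rated_wind_mps

print(f"blade length: {full_blade_length_m:.0f} m -> {scaled_length_m:.0f} m")
print(f"wind-speed ratio: {velocity_ratio:.2f}  (i.e. test speeds halved)")
print(f"example wind speed: {full_rated_wind_mps:.1f} m/s -> {scaled_wind_mps:.1f} m/s")
```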

  5. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  6. 16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT)

    CERN Document Server

    Lokajicek, M; Tumova, N

    2015-01-01

    16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT). The ACAT workshop series, formerly AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), was created back in 1990. Its main purpose is to bring together researchers working on computing in physics research, from both the physics and computer science sides, and give them a chance to communicate with each other. It has established bridges between physics and computer science research, facilitating the advances in our understanding of the Universe at its smallest and largest scales. With the Large Hadron Collider and many astronomy and astrophysics experiments collecting larger and larger amounts of data, such bridges are needed now more than ever. The 16th edition of ACAT aims to bring related researchers together, once more, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation technologies. It will create a forum for exchanging ideas among the fields an...

  7. Extreme Scale Computing to Secure the Nation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

    2009-11-10

    absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT) together with the U.S. administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence of the safety and reliability without reliance upon calibration with past or future test data is a long-term goal of the ASC program. This

  8. Large Scale Computing and Storage Requirements for Basic Energy Sciences Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Wasserman, Harvey

    2011-03-31

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility supporting research within the Department of Energy's Office of Science. NERSC provides high-performance computing (HPC) resources to approximately 4,000 researchers working on about 400 projects. In addition to hosting large-scale computing facilities, NERSC provides the support and expertise scientists need to effectively and efficiently use HPC systems. In February 2010, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR) and DOE's Office of Basic Energy Sciences (BES) held a workshop to characterize HPC requirements for BES research through 2013. The workshop was part of NERSC's legacy of anticipating users' future needs and deploying the necessary resources to meet these demands. Workshop participants reached a consensus on several key findings, in addition to achieving the workshop's goal of collecting and characterizing computing requirements. The key requirements for scientists conducting research in BES are: (1) Larger allocations of computational resources; (2) Continued support for standard application software packages; (3) Adequate job turnaround time and throughput; and (4) Guidance and support for using future computer architectures. This report expands upon these key points and presents others. Several 'case studies' are included as significant representative samples of the needs of science teams within BES. Research teams' scientific goals, computational methods of solution, current and 2013 computing requirements, and special software and support needs are summarized in these case studies. Also included are researchers' strategies for computing in the highly parallel, 'multi-core' environment that is expected to dominate HPC architectures over the next few years. NERSC has strategic plans and initiatives already underway that address key workshop findings. This report includes a

  9. Extreme-Scale De Novo Genome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Georganas, Evangelos [Intel Corporation, Santa Clara, CA (United States); Hofmeyr, Steven [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.; Egan, Rob [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Buluc, Aydin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.; Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.; Rokhsar, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Yelick, Katherine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Joint Genome Inst.

    2017-09-26

    De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different components of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
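    The core distributed data structure in assemblers of this kind is a hash table of k-mers in which each (canonicalized) k-mer is assigned to an owning process by a hash of its sequence. The fragment below sketches that ownership-and-counting step in plain Python for a handful of reads; it illustrates the data-distribution idea only and is not the HipMer code, which aggregates these updates and communicates them in bulk across many nodes.

```python
# Toy sketch of the distributed k-mer hash table idea behind de novo assemblers:
# every k-mer is hashed to an "owner" rank, which accumulates its count.
# Illustration only; real assemblers batch updates and use message passing or
# one-sided communication across thousands of processes.
from collections import Counter
import hashlib

K = 5
NUM_RANKS = 4

def canonical(kmer):
    """Use the lexicographically smaller of a k-mer and its reverse complement."""
    comp = str.maketrans("ACGT", "TGCA")
    rc = kmer.translate(comp)[::-1]
    return min(kmer, rc)

def owner(kmer):
    digest = hashlib.md5(kmer.encode()).digest()
    return digest[0] % NUM_RANKS

def count_kmers(reads):
    per_rank = [Counter() for _ in range(NUM_RANKS)]
    for read in reads:
        for i in range(len(read) - K + 1):
            kmer = canonical(read[i:i + K])
            per_rank[owner(kmer)][kmer] += 1   # "send" the update to its owner
    return per_rank

reads = ["ACGTACGTGGT", "CGTACGTGGTA", "TTACGTACGTG"]
for rank, counts in enumerate(count_kmers(reads)):
    print(f"rank {rank}: {dict(counts)}")
```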

  10. The December 2006 ATLAS Computing & Software Workshop

    CERN Multimedia

    Fred Luehring

    The 29th ATLAS Computing & Software Workshop was held on December 11-15 at CERN. With the rapidly approaching onset of data taking, the workshop participants had an air of urgency about them. There was considerable discussion on hot topics such as physics validation of the software, data analysis, actual software production on the GRID, and the schedule of work for 2007 including the Final Dress Rehearsal (FDR). However, don't be fooled, the workshop was not all work - there were also two social events which were greatly enjoyed by the attendees. The workshop welcomed Wouter Verkerke as the new Physics Validation Coordinator (replacing Davide Costanzo). Most recent validation work has centered on the 12.0.X release series that will be used for the Computing System Commissioning (CSC) exercise. The validation is now a big job because it needs to be done over a variety of conditions (magnetic field on/off, aligned/misaligned geometry) for every candidate release. Luckily there have been a large number of pe...

  11. Third Workshop on Affective Brain-Computer Interfaces: introduction

    NARCIS (Netherlands)

    Mühl, C.; Chanel, G.; Allison, B.; Nijholt, Antinus

    2013-01-01

    Following the first and second workshop on affective brain-computer interfaces, held in conjunction with ACII in Amsterdam (2009) and Memphis (2011), the third workshop explores the advantages and limitations of using neurophysiological signals for the automatic recognition of affective and

  12. DIKU-LASMEA Workshop on Computer Vision, Copenhagen, March, 2009

    DEFF Research Database (Denmark)

    Fihl, Preben

    This report will cover the participation in the DIKU-LASMEA Workshop on Computer Vision held at the Department of Computer Science, University of Copenhagen, in March 2009. The report will give a concise description of the topics presented at the workshop, and briefly discuss how the work relates to the HERMES project and human motion and action recognition.

  13. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held in Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems like parameter settings for controlling processes in a bioreactor and other processes, resource-constrained project scheduling, infection distribution, molecule distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, localization of the abrupt atmospheric contamination source and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  14. Measuring activity limitations in walking : Development of a hierarchical scale for patients with lower-extremity disorders who live at home

    NARCIS (Netherlands)

    Roorda, LD; Roebroeck, ME; van Tilburg, T; Molenaar, IW; Lankhorst, GJ; Bouter, LM

    2005-01-01

    Objective: To develop a hierarchical scale that measures activity limitations in walking in patients with lower-extremity disorders who live at home. Design: Cross-sectional study. Setting: Orthopedic workshops and outpatient clinics of secondary and tertiary care centers. Participants: Patients

  15. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  16. Third Workshop on Teaching Computational Science (WTCS 2009)

    NARCIS (Netherlands)

    Tirado-Ramos, A.; Shiflet, A.

    2009-01-01

    The Third Workshop on Teaching Computational Science, within the International Conference on Computational Science, provides a platform for discussing innovations in teaching computational sciences at all levels and contexts of higher education. This editorial provides an introduction to the work

  17. Second Workshop on Teaching Computational Science WTCS 2008

    NARCIS (Netherlands)

    Tirado-Ramos, A.

    2008-01-01

    The Second Workshop on Teaching Computational Science, within the International Conference on Computational Science, provides a platform for discussing innovations in teaching computational sciences at all levels and contexts of higher education. This editorial provides an introduction to the work

  18. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running an HPC application with millions of concurrent execution threads while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim that eliminates the shortcomings of the existing network modeling capabilities. The approach takes a different path to implementing network contention and bandwidth capacity modeling, using a less synchronous but sufficiently accurate model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.
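    As a rough illustration of what a contention-aware model adds over a pure latency/bandwidth model, the sketch below charges each message a latency term plus a size-over-bandwidth term, and serializes messages that share a link so that later messages see the link busy. The link parameters and the serialization rule are generic assumptions for illustration; this is not the model implemented in xSim.

```python
# Generic link-contention sketch: latency + size/bandwidth, with messages that
# share a link serialized on that link. Illustration only, not the xSim model.

LATENCY_S = 1e-6          # per-message latency (assumed)
BANDWIDTH_BPS = 10e9      # per-link bandwidth (assumed)

def simulate(messages):
    """messages: list of (submit_time_s, link_id, size_bytes). Returns finish times."""
    link_free_at = {}                       # link_id -> time the link becomes free
    finish = []
    for submit, link, size in sorted(messages):
        start = max(submit, link_free_at.get(link, 0.0))   # wait if link is busy
        done = start + LATENCY_S + size / BANDWIDTH_BPS
        link_free_at[link] = done
        finish.append(done)
    return finish

# Two messages contend for link 0; the third uses an idle link and is unaffected.
msgs = [(0.0, 0, 1_000_000), (0.0, 0, 1_000_000), (0.0, 1, 1_000_000)]
for t in simulate(msgs):
    print(f"{t * 1e6:.1f} us")
```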

  19. ISC High Performance 2016 International Workshops, ExaComm, E-MuCoCoS, HPC-IODC, IXPUG, IWOPH, P^3MA, VHPC, WOPSSS

    CERN Document Server

    Mohr, Bernd; Kunkel, Julian M

    2016-01-01

    This book constitutes revised selected papers from 7 workshops that were held in conjunction with the ISC High Performance 2016 conference in Frankfurt, Germany, in June 2016. The 45 papers presented in this volume were carefully reviewed and selected for inclusion in this book. They stem from the following workshops: Workshop on Exascale Multi/Many Core Computing Systems, E-MuCoCoS; Second International Workshop on Communication Architectures at Extreme Scale, ExaComm; HPC I/O in the Data Center Workshop, HPC-IODC; International Workshop on OpenPOWER for HPC, IWOPH; Workshop on the Application Performance on Intel Xeon Phi – Being Prepared for KNL and Beyond, IXPUG; Workshop on Performance and Scalability of Storage Systems, WOPSSS; and International Workshop on Performance Portable Programming Models for Accelerators, P3MA.

  20. Workshop on Computational Optimization

    CERN Document Server

    2015-01-01

    Our everyday life is unthinkable without optimization. We try to minimize our effort and to maximize the achieved profit. Many real-world and industrial problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2013. It presents recent advances in computational optimization. The volume includes important real-life problems like parameter settings for controlling processes in a bioreactor, resource-constrained project scheduling, problems arising in transport services, error correcting codes, optimal system performance and energy consumption and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others.

  1. Large Scale Computing and Storage Requirements for High Energy Physics

    International Nuclear Information System (INIS)

    Gerber, Richard A.; Wasserman, Harvey

    2010-01-01

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes

  2. 'Cloud computing' and clinical trials: report from an ECRIN workshop.

    Science.gov (United States)

    Ohmann, Christian; Canham, Steve; Danielyan, Edgar; Robertshaw, Steve; Legré, Yannick; Clivio, Luca; Demotes, Jacques

    2015-07-29

    Growing use of cloud computing in clinical trials prompted the European Clinical Research Infrastructures Network, a European non-profit organisation established to support multinational clinical research, to organise a one-day workshop on the topic to clarify potential benefits and risks. The issues that arose in that workshop are summarised and include the following: the nature of cloud computing and the cloud computing industry; the risks in using cloud computing services now; the lack of explicit guidance on this subject, both generally and with reference to clinical trials; and some possible ways of reducing risks. There was particular interest in developing and using a European 'community cloud' specifically for academic clinical trial data. It was recognised that the day-long workshop was only the start of an ongoing process. Future discussion needs to include clarification of trial-specific regulatory requirements for cloud computing and involve representatives from the relevant regulatory bodies.

  3. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Science.gov (United States)

    2011-10-07

    ...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...

  4. Application of Global Extreme Programming to a Workshop, Seminar and Training Information System at an Educational Institution (PENERAPAN GLOBAL EXTREME PROGRAMMING PADA SISTEM INFORMASI WORKSHOP, SEMINAR DAN PELATIHAN DI LEMBAGA EDUKASI)

    Directory of Open Access Journals (Sweden)

    Baginda Oloan Lubis

    2016-09-01

    Full Text Available Abstract: Educational institutions that offer workshops, seminars and training but still rely on manual information systems for printing ID cards, preparing attendance lists, printing certificates and generating reports often have difficulty retrieving data, along with the other drawbacks of manual systems. Designing an information system, implemented as a desktop-based application, is therefore a solution that makes data processing, data retrieval and report preparation easier for the user. The systems development model used in designing this workshop, seminar and training information system is Agile Software Development, as developed by Robert Cecil Martin, with the Global Extreme Programming process model, a development of the XP (Extreme Programming) life cycle. The tools used are UML (Unified Modeling Language) and ERD (Entity Relationship Diagram). It is hoped that this system design will make ID card printing, attendance preparation, certificate printing and end-of-activity reporting easier for its users.

  5. 7th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2015-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing and educational technology. It presents extended versions of the best papers selected from the symposium “7th International Workshop on Natural Computing” (IWNC7), held in Tokyo, Japan, in 2013. The target audience is not limited to researchers working in natural computing but also those active in biological engineering, fine/media art design, aesthetics and philosophy.

  6. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  7. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Science.gov (United States)

    2012-12-18

    ...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...

  8. 8th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2016-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing, and educational technology. It presents extended versions of the best papers selected from the “8th International Workshop on Natural Computing” (IWNC8), a symposium held in Hiroshima, Japan, in 2014. The target audience is not limited to researchers working in natural computing but also includes those active in biological engineering, fine/media art design, aesthetics, and philosophy.

  9. Providing a computing environment for a high energy physics workshop

    International Nuclear Information System (INIS)

    Nicholls, J.

    1991-03-01

    Although computing facilities have been provided at conferences and workshops remote from the host institution for some years, the equipment provided has rarely been capable of providing for much more than simple editing and electronic mail over leased lines. This presentation describes the pioneering effort undertaken by the Computing Department/Division at Fermilab in providing a local computing facility with world-wide networking capability for the Physics at Fermilab in the 1990's workshop held in Breckenridge, Colorado, in August 1989, as well as the enhanced facilities provided for the 1990 Summer Study on High Energy Physics at Snowmass, Colorado, in June/July 1990. Issues discussed include type and sizing of the facilities, advance preparations, shipping, on-site support, as well as an evaluation of the value of the facility to the workshop participants

  10. Sixth Computational Biomechanics for Medicine Workshop

    CERN Document Server

    Nielsen, Poul MF; Miller, Karol; Computational Biomechanics for Medicine : Deformation and Flow

    2012-01-01

    One of the greatest challenges for mechanical engineers is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, biomedical sciences, and medicine. This book is an opportunity for computational biomechanics specialists to present and exchange opinions on the opportunities of applying their techniques to computer-integrated medicine. Computational Biomechanics for Medicine: Deformation and Flow collects the papers from the Sixth Computational Biomechanics for Medicine Workshop held in Toronto in conjunction with the Medical Image Computing and Computer Assisted Intervention conference. The topics covered include: medical image analysis, image-guided surgery, surgical simulation, surgical intervention planning, disease prognosis and diagnostics, injury mechanism analysis, implant and prostheses design, and medical robotics.

  11. Large Scale Computing and Storage Requirements for High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years

  12. Proceedings of workshop on 'future in HEP computing'

    International Nuclear Information System (INIS)

    Karita, Yukio; Amako, Katsuya; Watase, Yoshiyuki

    1993-12-01

    The workshop was held on March 11 and 12, 1993, at the National Laboratory for High Energy Physics (KEK). A large-scale shift is under way from conventional systems centered on large general-purpose computers toward downsizing based on distributed processing systems, but its final destination is not yet clear. As the concrete themes of 'future in HEP computing', the problems involved in downsizing and approaches to them, the future perspective of networks, and the adoption of software engineering and object orientation were taken up. At the workshop, lectures were given on requirements in HEP computing, possible solutions from Hitachi and Fujitsu, and network computing with workstations regarding downsizing and HEP computing; approaches at INS and KEK regarding future computing systems in HEP laboratories; user requirements for future networks, the network services available in 1995-2005, multi-media communication and network protocols regarding future networks; the object-oriented approach for software development and OOP for real-time data acquisition and accelerator control; and ProdiG activities, the future of FORTRAN, F90 and HPF regarding OOP and physics, and trends in software development methodology. (K.I.)

  13. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  14. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  15. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Directory of Open Access Journals (Sweden)

    Jakob Jordan

    2018-02-01

    Full Text Available State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
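    The gain from exploiting sparsity can be illustrated schematically: instead of every compute node broadcasting its spikes to all other nodes, each node keeps the set of nodes that actually host targets of its neurons and sends only to those. The sketch below contrasts the two communication volumes for a toy random network with made-up parameters; it illustrates the directed-communication argument only and does not reflect the NEST data structures or the parameters of brain-scale simulations.

```python
# Toy comparison of all-to-all spike exchange vs. directed (target-aware) exchange.
# Parameters are chosen so that per-node fan-out is much smaller than the machine;
# illustration of the sparsity argument only, not the NEST implementation.
import random

random.seed(0)
NUM_NODES = 5000
NEURONS_PER_NODE = 10
TARGETS_PER_NEURON = 20

def target_nodes_per_source_node():
    """For each node, the set of nodes hosting at least one target of its neurons."""
    targets = []
    for _ in range(NUM_NODES):
        nodes = set()
        for _ in range(NEURONS_PER_NODE):
            nodes.update(random.randrange(NUM_NODES) for _ in range(TARGETS_PER_NEURON))
        targets.append(nodes)
    return targets

targets = target_nodes_per_source_node()
all_to_all_msgs = NUM_NODES * (NUM_NODES - 1)
directed_msgs = sum(len(t - {i}) for i, t in enumerate(targets))
print(f"all-to-all messages per step: {all_to_all_msgs}")
print(f"directed messages per step:   {directed_msgs}")
```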

  16. Research directions in computer engineering. Report of a workshop

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, H

    1982-09-01

    The results of a workshop held in November 1981 in Washington, DC, to outline research directions for computer engineering are reported upon. The purpose of the workshop was to provide guidance to government research funding agencies, as well as to universities and industry, as to the directions which computer engineering research should take for the next five to ten years. A select group of computer engineers was assembled, drawn from all over the United States and with expertise in virtually every aspect of today's computer technology. Industrial organisations and universities were represented in roughly equal numbers. The panel proceeded to provide a sharper definition of computer engineering than had been in popular use previously, to identify the social and national needs which provide the basis for encouraging research, to probe for obstacles to research and seek means of overcoming them, and to delineate high-priority areas in which computer engineering research should be fostered. These included experimental software engineering, architectures in support of programming style, computer graphics, pattern recognition, VLSI design tools, machine intelligence, programmable automation, architectures for speech and signal processing, computer architecture and robotics. 13 references.

  17. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    systems that would fall under the Exascale rubric. In this chapter, we first discuss the attributes by which achievement of the label "Exascale" may be... Carrington, and E. Strohmaier. A Genetic Algorithms Approach to Modeling the Performance of Memory-bound Computations. Reno, NV, November 2007. ACM/IEEE... genetic stochasticity (random mating, mutation, etc.). Outcomes are thus stochastic as well, and ecologists wish to ask questions like, "What is the

  18. 16th UK Workshop on Computational Intelligence

    CERN Document Server

    Gegov, Alexander; Jayne, Chrisina; Shen, Qiang

    2017-01-01

    The book is a timely report on advanced methods and applications of computational intelligence systems. It covers a long list of interconnected research areas, such as fuzzy systems, neural networks, evolutionary computation, evolving systems and machine learning. The individual chapters are based on peer-reviewed contributions presented at the 16th Annual UK Workshop on Computational Intelligence, held on September 7-9, 2016, in Lancaster, UK. The book puts a special emphasis on novel methods and reports on their use in a wide range of application areas, thus providing both academics and professionals with a comprehensive and timely overview of new trends in computational intelligence.

  19. Computer-Assisted Language Learning : proceedings of the seventh Twente Workshop on Language Technology

    NARCIS (Netherlands)

    Appelo, L.; de Jong, Franciska M.G.

    1994-01-01

    TWLT is an acronym of Twente Workshop(s) on Language Technology. These workshops on natural language theory and technology are organised by Project Parlevink (sometimes with the help of others), a language theory and technology project conducted at the Department of Computer Science of the

  20. Computational Diffusion MRI : MICCAI Workshop

    CERN Document Server

    Grussu, Francesco; Ning, Lipeng; Tax, Chantal; Veraart, Jelle

    2018-01-01

    This volume presents the latest developments in the highly active and rapidly growing field of diffusion MRI. The reader will find numerous contributions covering a broad range of topics, from the mathematical foundations of the diffusion process and signal generation, to new computational methods and estimation techniques for the in-vivo recovery of microstructural and connectivity features, as well as frontline applications in neuroscience research and clinical practice. These proceedings contain the papers presented at the 2017 MICCAI Workshop on Computational Diffusion MRI (CDMRI’17) held in Québec, Canada on September 10, 2017, sharing new perspectives on the most recent research challenges for those currently working in the field, but also offering a valuable starting point for anyone interested in learning computational techniques in diffusion MRI. This book includes rigorous mathematical derivations, a large number of rich, full-colour visualisations and clinically relevant results. As such, it wil...

  1. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane-segmentation intelligent workshop is a new concept, with no prior research in related fields either domestically or abroad. The mode of production must be transformed from the existing Industry 2.0, or partly Industry 3.0, pattern of "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many tasks need to be settled on both the management and technology sides, such as the evolution of the workshop structure, the development of intelligent equipment and changes to the business model; together these amount to a reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the plane-segmentation intelligent workshop transformation.

  2. 8th Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2015. It presents recent advances in computational optimization. The volume includes important real-life problems like parameter settings for controlling processes in a bioreactor, control of ethanol production, minimal convex hull with application in routing algorithms, graph coloring, flow design in photonic data transport systems, predicting indoor temperature, crisis control center monitoring, fuel consumption of helicopters, portfolio selection, GPS surveying and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization problems.

  3. 10th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Hilbrich, Tobias; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2017-01-01

    This book presents the proceedings of the 10th International Parallel Tools Workshop, held October 4-5, 2016 in Stuttgart, Germany – a forum to discuss the latest advances in parallel tools. High-performance computing plays an increasingly important role in numerical simulation and modelling in academic and industrial research. At the same time, using large-scale parallel systems efficiently is becoming more difficult. A number of tools addressing parallel program development and analysis have emerged from the high-performance computing community over the last decade, and what may have started as a collection of small helper scripts has now matured into production-grade frameworks. Powerful user interfaces and an extensive body of documentation allow easy usage by non-specialists.

  4. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M. [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-06-06

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.
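    As a concrete, if minimal, instance of a non-intrusive spectral representation of the kind referred to above, the sketch below builds a one-dimensional Hermite polynomial-chaos expansion of a model output by Gauss-Hermite quadrature: the model is only sampled at quadrature nodes, never modified. It is a textbook example under standard assumptions (a smooth scalar model of a standard normal input), not the QUEST software, whose adaptive and sparse constructions extend this idea to many dimensions.

```python
# Minimal non-intrusive pseudo-spectral projection: a 1-D Hermite polynomial-chaos
# expansion of y = f(xi) with xi ~ N(0, 1), built purely from model evaluations at
# quadrature nodes. Textbook sketch, not the QUEST tools.
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def model(xi):
    return np.exp(0.3 * xi)            # stand-in "black box" model

ORDER = 6                              # highest retained polynomial degree
nodes, weights = He.hermegauss(20)     # Gauss-Hermite (probabilists') quadrature
values = model(nodes)

# c_k = E[f(xi) He_k(xi)] / k!, using E[He_k(xi)^2] = k! and sum(weights) = sqrt(2*pi)
coeffs = []
for k in range(ORDER + 1):
    basis_k = He.hermeval(nodes, [0.0] * k + [1.0])
    ck = float(np.sum(weights * values * basis_k)) / (math.sqrt(2.0 * math.pi) * math.factorial(k))
    coeffs.append(ck)

mean = coeffs[0]
variance = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, ORDER + 1))
print(f"PCE mean     = {mean:.6f}   (exact {math.exp(0.045):.6f})")
print(f"PCE variance = {variance:.6f}   (exact {math.exp(0.09) * (math.exp(0.09) - 1):.6f})")
```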

  5. Computational discovery of extremal microstructure families

    Science.gov (United States)

    Chen, Desai; Skouras, Mélina; Zhu, Bo; Matusik, Wojciech

    2018-01-01

    Modern fabrication techniques, such as additive manufacturing, can be used to create materials with complex custom internal structures. These engineered materials exhibit a much broader range of bulk properties than their base materials and are typically referred to as metamaterials or microstructures. Although metamaterials with extraordinary properties have many applications, designing them is very difficult and is generally done by hand. We propose a computational approach to discover families of microstructures with extremal macroscale properties automatically. Using efficient simulation and sampling techniques, we compute the space of mechanical properties covered by physically realizable microstructures. Our system then clusters microstructures with common topologies into families. Parameterized templates are eventually extracted from families to generate new microstructure designs. We demonstrate these capabilities on the computational design of mechanical metamaterials and present five auxetic microstructure families with extremal elastic material properties. Our study opens the way for the completely automated discovery of extremal microstructures across multiple domains of physics, including applications reliant on thermal, electrical, and magnetic properties. PMID:29376124

  6. Risk Management Techniques and Practice Workshop Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, T; Zosel, M

    2008-12-02

    At the request of the Department of Energy (DOE) Office of Science (SC), Lawrence Livermore National Laboratory (LLNL) hosted a two-day Risk Management Techniques and Practice (RMTAP) workshop held September 18-19 at the Hotel Nikko in San Francisco. The purpose of the workshop, which was sponsored by the SC/Advanced Scientific Computing Research (ASCR) program and the National Nuclear Security Administration (NNSA)/Advanced Simulation and Computing (ASC) program, was to assess current and emerging techniques, practices, and lessons learned for effectively identifying, understanding, managing, and mitigating the risks associated with acquiring leading-edge computing systems at high-performance computing centers (HPCCs). Representatives from fifteen high-performance computing (HPC) organizations, four HPC vendor partners, and three government agencies attended the workshop. The overall workshop findings were: (1) Standard risk management techniques and tools are in the aggregate applicable to projects at HPCCs and are commonly employed by the HPC community; (2) HPC projects have characteristics that necessitate a tailoring of the standard risk management practices; (3) All HPCC acquisition projects can benefit by employing risk management, but the specific choice of risk management processes and tools is less important to the success of the project; (4) The special relationship between the HPCCs and HPC vendors must be reflected in the risk management strategy; (5) Best practices findings include developing a prioritized risk register with special attention to the top risks, establishing a practice of regular meetings and status updates with the platform partner, supporting regular and open reviews that engage the interests and expertise of a wide range of staff and stakeholders, and documenting and sharing the acquisition/build/deployment experience; and (6) Top risk categories include system scaling issues, request for proposal/contract and acceptance testing, and

  7. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia in conjunction with Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, The General Association of Engineers in Romania - Arad Section, Institute of Computer Science, Iasi Branch of the Romanian Academy and IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  8. Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia

    Science.gov (United States)

    Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep

    2014-05-01

    Extreme Weather Events (EWEs) cause negative impacts socially, economically, and environmentally. Considering these facts, forecasting EWEs is crucial work. Indonesia has been identified as being among the countries most vulnerable to the risk of natural disasters, such as floods, heat waves, and droughts. Current forecasting of extreme events in Indonesia is carried out by interpreting synoptic maps for several fields without taking into account the link between the observed events in the 'target' area and remote conditions. This situation may cause misidentification of the event, leading to an inaccurate prediction. Grotjahn and Faure (2008) compute composite maps from extreme events (including heat waves and intense rainfall) to help forecasters identify such events in model output. The composite maps show large scale meteorological patterns (LSMP) that occurred during historical EWEs. Some vital information about the EWEs can be acquired from studying such maps, in addition to providing forecaster guidance. Such maps have robust mid-latitude meteorological patterns (for Sacramento and California Central Valley, USA EWEs). We study the performance of the composite approach for tropical weather conditions such as Indonesia's. Initially, the composite maps are developed to identify and forecast the extreme weather events in Indramayu district, West Java, the main producer of rice in Indonesia, which contributes about 60% of the national total rice production. Studying extreme weather events happening in Indramayu is important since EWEs there affect national agricultural and fisheries activities. During a recent EWE more than a thousand houses in Indramayu suffered from serious flooding with each home more than one meter underwater. The flood also destroyed a thousand hectares of rice plantings in 5 regencies. Identifying the dates of extreme events is one of the most important steps and has to be carried out carefully. An approach has been applied to identify the
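    The composite-map idea itself is simple to state: average the anomalies of the large-scale fields over the dates on which extreme events were observed, so that recurring patterns stand out against the climatology. The sketch below does exactly that for a synthetic daily field; the dates, field, and grid are made up for illustration and have nothing to do with the Indramayu data.

```python
# Composite map sketch: average a daily gridded field's anomalies over the dates
# of identified extreme events. Synthetic data; illustration of the method only.
import numpy as np

rng = np.random.default_rng(42)
n_days, n_lat, n_lon = 365, 20, 30
field = rng.normal(size=(n_days, n_lat, n_lon))       # e.g. daily geopotential-height values
extreme_days = [12, 47, 190, 233, 301]                 # dates flagged as extreme events

climatology = field.mean(axis=0)                       # long-term mean at each grid point
anomalies = field - climatology
composite = anomalies[extreme_days].mean(axis=0)       # the composite (LSMP) map

print("composite shape:", composite.shape)
print("largest positive anomaly in composite:", float(composite.max()))
```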

  9. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  10. Effects of ergonomic intervention on work-related upper extremity musculoskeletal disorders among computer workers: a randomized controlled trial.

    Science.gov (United States)

    Esmaeilzadeh, Sina; Ozcan, Emel; Capan, Nalan

    2014-01-01

    The aim of the study was to determine effects of ergonomic intervention on work-related upper extremity musculoskeletal disorders (WUEMSDs) among computer workers. Four hundred computer workers answered a questionnaire on work-related upper extremity musculoskeletal symptoms (WUEMSS). Ninety-four subjects with WUEMSS using computers at least 3 h a day participated in a prospective, randomized controlled 6-month intervention. Body posture and workstation layouts were assessed by the Ergonomic Questionnaire. We used the Visual Analogue Scale to assess the intensity of WUEMSS. The Upper Extremity Function Scale was used to evaluate functional limitations at the neck and upper extremities. Health-related quality of life was assessed with the Short Form-36. After baseline assessment, those in the intervention group participated in a multicomponent ergonomic intervention program including a comprehensive ergonomic training consisting of two interactive sessions, an ergonomic training brochure, and workplace visits with workstation adjustments. Follow-up assessment was conducted after 6 months. In the intervention group, body posture (p 0.05). Ergonomic intervention programs may be effective in reducing ergonomic risk factors among computer workers and consequently in the secondary prevention of WUEMSDs.

  11. Thirteenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology. Volume 2

    Science.gov (United States)

    Williams, R. W. (Compiler)

    1996-01-01

    This conference publication includes various abstracts and presentations given at the 13th Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology held at the George C. Marshall Space Flight Center, April 25-27, 1995. The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion and launch vehicles. The workshop was an open meeting for government, industry, and academia. A broad number of topics were discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  12. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Biros, George [Univ. of Texas, Austin, TX (United States)

    2018-01-12

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a
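
    As a rough illustration of the kind of Bayesian inversion described above (not the project's actual structure-exploiting solvers), the sketch below runs a plain random-walk Metropolis sampler on a toy parameter-to-observable map; the forward model, prior, and noise level are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameter-to-observable map G(m): a simple nonlinear forward model.
def forward(m):
    return np.array([m[0] + m[1] ** 2, np.sin(m[0]) * m[1]])

# Synthetic data from a "true" parameter plus Gaussian noise (assumed values).
m_true = np.array([1.0, 0.5])
sigma = 0.05
data = forward(m_true) + sigma * rng.standard_normal(2)

def log_posterior(m):
    # Gaussian likelihood plus a broad zero-mean Gaussian prior.
    misfit = np.sum((forward(m) - data) ** 2) / (2 * sigma ** 2)
    prior = np.sum(m ** 2) / (2 * 10.0 ** 2)
    return -(misfit + prior)

# Random-walk Metropolis: the simplest MCMC baseline that structure-exploiting
# methods such as randomized maximum likelihood aim to improve upon.
m = np.zeros(2)
lp = log_posterior(m)
samples = []
for _ in range(20000):
    prop = m + 0.1 * rng.standard_normal(2)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        m, lp = prop, lp_prop
    samples.append(m)

samples = np.array(samples[5000:])  # discard burn-in
print("posterior mean:", samples.mean(axis=0))
```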

  13. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  14. 6th International Workshop on Computer-Aided Scheduling of Public Transport

    CERN Document Server

    Branco, Isabel; Paixão, José

    1995-01-01

    This proceedings volume consists of papers presented at the Sixth International Workshop on Computer-Aided Scheduling of Public Transport, which was held at the Fundação Calouste Gulbenkian in Lisbon from July 6th to 9th, 1993. In the tradition of alternating Workshops between North America and Europe - Chicago (1975), Leeds (1980), Montreal (1983), Hamburg (1987) and again Montreal (1990), the European city of Lisbon was selected as the venue for the Workshop in 1993. As in earlier Workshops, the central theme dealt with vehicle and duty scheduling problems and the employment of operations-research-based software systems for operational planning in public transport. However, as was initiated in Hamburg in 1987, the scope of this Workshop was broadened to include topics in related fields. This fundamental alteration was an inevitable consequence of the growing demand over the last decade for solutions to the complete planning process in public transport through integrated systems. Therefore, the program of thi...

  15. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    International Nuclear Information System (INIS)

    Khaleel, Mohammad A.

    2009-01-01

    This report is an account of the deliberations and conclusions of the workshop on 'Forefront Questions in Nuclear Science and the Role of High Performance Computing' held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to (1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; (2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; (3) provide nuclear physicists the opportunity to influence the development of high performance computing; and (4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  16. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  17. Proceedings of the workshop on scattering experiments under extreme conditions

    International Nuclear Information System (INIS)

    Sakai, N.; Ikeda, H.; Ando, M.

    1991-10-01

    The National Laboratory for High Energy Physics (KEK) operates, among its research facilities, the Photon Factory, the booster utilization facility, and the University of Tokyo Meson Science Research Center. For research on physical properties, it is very important to perform structural analysis in a broad sense and to observe the behavior of quasiparticles in solids. The X-rays and pulsed neutrons required for such research are available within a single laboratory at KEK, which is rare worldwide. On the occasion of this workshop on scattering experiments under extreme conditions, it is hoped that active exchange between the PF and booster groups will take place. Research on magnetic substances using X-rays is one of the most noteworthy uses of synchrotron radiation. The discovery of X-ray resonance magnetic scattering by K. Namikawa is one of the remarkable achievements made with synchrotron radiation worldwide. When extreme conditions can be prepared around the samples, the quality of the signals obtained for research on physical properties is improved. In this report, research on physical properties under ultrahigh pressure and at ultralow temperature is reported. (K.I.)

  18. Scaling of Precipitation Extremes Modelled by Generalized Pareto Distribution

    Science.gov (United States)

    Rajulapati, C. R.; Mujumdar, P. P.

    2017-12-01

    Precipitation extremes are often modelled with data from annual maximum series or peaks over threshold series. The Generalized Pareto Distribution (GPD) is commonly used to fit the peaks over threshold series. Scaling of precipitation extremes from larger time scales to smaller time scales when the extremes are modelled with the GPD is burdened with difficulties arising from varying thresholds for different durations. In this study, the scale invariance theory is used to develop a disaggregation model for precipitation extremes exceeding specified thresholds. A scaling relationship is developed for a range of thresholds obtained from a set of quantiles of non-zero precipitation of different durations. The GPD parameters and exceedance rate parameters are modelled by the Bayesian approach and the uncertainty in scaling exponent is quantified. A quantile based modification in the scaling relationship is proposed for obtaining the varying thresholds and exceedance rate parameters for shorter durations. The disaggregation model is applied to precipitation datasets of Berlin City, Germany and Bangalore City, India. From both the applications, it is observed that the uncertainty in the scaling exponent has a considerable effect on uncertainty in scaled parameters and return levels of shorter durations.
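
    A minimal peaks-over-threshold fit of the kind the abstract describes, using SciPy's generalized Pareto distribution; the synthetic data, the 95th-percentile threshold, and the return-period calculation are illustrative assumptions rather than the study's settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic daily "precipitation": mostly dry days plus heavy-tailed wet days.
precip = np.where(rng.random(5000) < 0.7, 0.0,
                  rng.gamma(shape=0.8, scale=8.0, size=5000))

# Threshold taken as a quantile of non-zero precipitation, as in the abstract.
wet = precip[precip > 0]
threshold = np.quantile(wet, 0.95)
excesses = wet[wet > threshold] - threshold

# Fit the GPD to the threshold excesses (location fixed at 0).
shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

# Exceedance rate and a return level for an N-observation return period.
rate = excesses.size / precip.size
N = 10000
return_level = threshold + stats.genpareto.ppf(1 - 1.0 / (N * rate),
                                               shape, loc=0.0, scale=scale)
print(f"threshold={threshold:.2f}, shape={shape:.3f}, scale={scale:.3f}, "
      f"{N}-obs return level={return_level:.2f}")
```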

  19. Computational Humor 2012 : extended abstracts of the (3rd international) Workshop on Computational Humor

    NARCIS (Netherlands)

    Nijholt, Antinus; Unknown, [Unknown

    2012-01-01

    Like its predecessors in 1996 (University of Twente, the Netherlands) and 2002 (ITC-irst, Trento, Italy), this Third International Workshop on Computational Humor (IWCH 2012) focuses on the possibility of finding algorithms that allow understanding and generation of humor. There is the general aim of

  20. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  1. Large Scale Obscuration and Related Climate Effects Workshop: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Zak, B.D.; Russell, N.A.; Church, H.W.; Einfeld, W.; Yoon, D.; Behl, Y.K. [eds.]

    1994-05-01

    A Workshop on Large Scale Obscuration and Related Climate Effects was held 29-31 January 1992, in Albuquerque, New Mexico. The objectives of the workshop were: to determine through the use of expert judgement the current state of understanding of regional and global obscuration and related climate effects associated with nuclear weapons detonations; to estimate how large the uncertainties are in the parameters associated with these phenomena (given specific scenarios); to evaluate the impact of these uncertainties on obscuration predictions; and to develop an approach for the prioritization of further work on newly-available data sets to reduce the uncertainties. The workshop consisted of formal presentations by the 35 participants, and subsequent topical working sessions on: the source term; aerosol optical properties; atmospheric processes; and electro-optical systems performance and climatic impacts. Summaries of the conclusions reached in the working sessions are presented in the body of the report. Copies of the transparencies shown as part of each formal presentation are contained in the appendices (microfiche).

  2. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL]; Naughton III, Thomas J [ORNL]

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as by reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.
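
    For context, the overhead percentages quoted here are presumably the relative slowdown of running a benchmark inside the simulator versus natively; a sketch of that calculation, with made-up timings, is:

```python
def overhead_percent(t_simulated, t_native):
    """Relative slowdown introduced by the simulation layer, in percent."""
    return 100.0 * (t_simulated - t_native) / t_native

# Hypothetical wall-clock times in seconds; not measurements from the paper.
print(overhead_percent(t_simulated=33.8, t_native=10.0))  # -> 238.0
```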

  3. Proceedings of the workshop on molten salts technology and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, Hirokazu; Minato, Kazuo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]

    2001-12-01

    Applications of molten salt technology to the separation and synthesis of materials have been actively studied and are expected to open up new fields of materials science. The Research Group for Actinides Science, Department of Materials Science, Japan Atomic Energy Research Institute (JAERI), together with the Reprocessing and Recycle Technology Division, Atomic Energy Society of Japan, organized the Workshop on Molten Salts Technology and Computer Simulation at the Tokai Research Establishment, JAERI, on July 18, 2001. Eleven lectures were given at the workshop, with lively discussions on the fundamentals and applications of molten salt technology, covering the structure and basic properties of molten salts, pyrochemical reprocessing technology, and the relevant computer simulation. Ten of the presented papers are indexed individually. (J.P.N.)

  4. Extreme Scale FMM-Accelerated Boundary Integral Equation Solver for Wave Scattering

    KAUST Repository

    AbdulJabbar, Mustafa Abdulmajeed

    2018-03-27

    Algorithmic and architecture-oriented optimizations are essential for achieving performance worthy of anticipated energy-austere exascale systems. In this paper, we present an extreme scale FMM-accelerated boundary integral equation solver for wave scattering, which uses FMM as a matrix-vector multiplication inside the GMRES iterative method. Our FMM Helmholtz kernels treat nontrivial singular and near-field integration points. We implement highly optimized kernels for both shared and distributed memory, targeting emerging Intel extreme performance HPC architectures. We extract the potential thread- and data-level parallelism of the key Helmholtz kernels of FMM. Our application code is well optimized to exploit the AVX-512 SIMD units of Intel Skylake and Knights Landing architectures. We provide different performance models for tuning the task-based tree traversal implementation of FMM, and develop optimal architecture-specific and algorithm aware partitioning, load balancing, and communication reducing mechanisms to scale up to 6,144 compute nodes of a Cray XC40 with 196,608 hardware cores. With shared memory optimizations, we achieve roughly 77% of peak single precision floating point performance of a 56-core Skylake processor, and on average 60% of peak single precision floating point performance of a 72-core KNL. These numbers represent nearly 5.4x and 10x speedup on Skylake and KNL, respectively, compared to the baseline scalar code. With distributed memory optimizations, on the other hand, we report near-optimal efficiency in the weak scalability study with respect to both the logarithmic communication complexity as well as the theoretical scaling complexity of FMM. In addition, we exhibit up to 85% efficiency in strong scaling. We compute in excess of 2 billion DoF on the full-scale of the Cray XC40 supercomputer.
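
    The FMM-inside-GMRES structure described above can be mimicked in a few lines with a matrix-free operator; the sketch below uses SciPy's GMRES with a stand-in dense matvec where the FMM evaluation of the Helmholtz kernel would go (the operator and problem size are purely illustrative).

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(2)
n = 500

# Stand-in for the discretized boundary integral operator; in the paper this
# matrix-vector product is performed by the FMM rather than a stored matrix.
A = np.eye(n) + 0.01 * rng.standard_normal((n, n))

def fmm_matvec(x):
    # Placeholder: a real solver would evaluate near-field and far-field
    # Helmholtz interactions here instead of a dense product.
    return A @ x

op = LinearOperator((n, n), matvec=fmm_matvec, dtype=float)
b = rng.standard_normal(n)

x, info = gmres(op, b, restart=50, maxiter=200)
print("converged:", info == 0,
      "residual:", np.linalg.norm(fmm_matvec(x) - b))
```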

  5. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Musial, W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lawson, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Rooney, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-02-01

    The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9–10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community, and to collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop was comprised of plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways from the workshop and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts, supply discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest what the most pressing MHK technology needs are and how the U.S. Department of Energy (DOE) and national laboratory resources can be utilized to assist the marine energy industry in the most effective manner.

  6. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in computer, Computing in Earth Sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated new thinking on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Sciences Details of committees and sponsors are available in the PDF

  7. NCI Workshop Report: Clinical and Computational Requirements for Correlating Imaging Phenotypes with Genomics Signatures

    Directory of Open Access Journals (Sweden)

    Rivka Colen

    2014-10-01

    The National Cancer Institute (NCI) Cancer Imaging Program organized two related workshops on June 26–27, 2013, entitled “Correlating Imaging Phenotypes with Genomics Signatures Research” and “Scalable Computational Resources as Required for Imaging-Genomics Decision Support Systems.” The first workshop focused on clinical and scientific requirements, exploring our knowledge of phenotypic characteristics of cancer biological properties to determine whether the field is sufficiently advanced to correlate with imaging phenotypes that underpin genomics and clinical outcomes, and exploring new scientific methods to extract phenotypic features from medical images and relate them to genomics analyses. The second workshop focused on computational methods that explore informatics and computational requirements to extract phenotypic features from medical images and relate them to genomics analyses and improve the accessibility and speed of dissemination of existing NIH resources. These workshops linked clinical and scientific requirements of currently known phenotypic and genotypic cancer biology characteristics with imaging phenotypes that underpin genomics and clinical outcomes. The group generated a set of recommendations to NCI leadership and the research community that encourage and support development of the emerging radiogenomics research field to address short- and longer-term goals in cancer research.

  8. 24th & 25th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Gienger, Michael; Kobayashi, Hiroaki

    2017-01-01

    This book presents the state of the art in High Performance Computing on modern supercomputer architectures. It addresses trends in hardware and software development in general, as well as the future of High Performance Computing systems and heterogeneous architectures. The contributions cover a broad range of topics, from improved system management to Computational Fluid Dynamics, High Performance Data Analytics, and novel mathematical approaches for large-scale systems. In addition, they explore innovative fields like coupled multi-physics and multi-scale simulations. All contributions are based on selected papers presented at the 24th Workshop on Sustained Simulation Performance, held at the University of Stuttgart’s High Performance Computing Center in Stuttgart, Germany in December 2016 and the subsequent Workshop on Sustained Simulation Performance, held at the Cyberscience Center, Tohoku University, Japan in March 2017.

  9. Thirteenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology. Volume 1

    Science.gov (United States)

    Williams, R. W. (Compiler)

    1996-01-01

    The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion and launch vehicles. The workshop was an open meeting for government, industry, and academia. A broad range of topics was discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  10. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Musial, W.; Lawson, M.; Rooney, S.

    2013-02-01

    The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9-10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community and collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop was comprised of plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts and discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest how the U.S. Department of Energy and national laboratory resources can be utilized to most effectively assist the marine energy industry.

  11. FOREWORD: 5th International Workshop on New Computational Methods for Inverse Problems

    Science.gov (United States)

    Vourc'h, Eric; Rodet, Thomas

    2015-11-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific research presented during the 5th International Workshop on New Computational Methods for Inverse Problems, NCMIP 2015 (http://complement.farman.ens-cachan.fr/NCMIP_2015.html). This workshop took place at Ecole Normale Supérieure de Cachan, on May 29, 2015. The prior editions of NCMIP also took place in Cachan, France, firstly within the scope of ValueTools Conference, in May 2011, and secondly at the initiative of Institut Farman, in May 2012, May 2013 and May 2014. The New Computational Methods for Inverse Problems (NCMIP) workshop focused on recent advances in the resolution of inverse problems. Indeed, inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods
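
    As a toy instance of the inversion problems the workshop addresses (chosen for illustration, not drawn from any workshop paper), the following sketch estimates parameters from noisy indirect observations by Tikhonov-regularized least squares.

```python
import numpy as np

rng = np.random.default_rng(3)

# Linear observation model d = G m + noise, with a smoothing (ill-conditioned) kernel.
n = 50
x = np.linspace(0, 1, n)
G = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
m_true = np.sin(2 * np.pi * x)
d = G @ m_true + 0.01 * rng.standard_normal(n)

# Tikhonov regularization: minimize ||G m - d||^2 + alpha ||m||^2.
alpha = 1e-2
m_est = np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ d)

print("relative error:", np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true))
```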

  12. Computational data sciences for assessment and prediction of climate extremes

    Science.gov (United States)

    Ganguly, A. R.

    2011-12-01

    Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science which is relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations to information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.

  13. Current status and future perspectives of electron interactions with molecules, clusters, surfaces, and interfaces [Workshop on Fundamental challenges in electron-driven chemistry; Workshop on Electron-driven processes: Scientific challenges and technological opportunities]

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Kurt H.; McCurdy, C. William; Orlando, Thomas M.; Rescigno, Thomas N.

    2000-09-01

    This report is based largely on presentations and discussions at two workshops and contributions from workshop participants. The workshop on Fundamental Challenges in Electron-Driven Chemistry was held in Berkeley, October 9-10, 1998, and addressed questions regarding theory, computation, and simulation. The workshop on Electron-Driven Processes: Scientific Challenges and Technological Opportunities was held at Stevens Institute of Technology, March 16-17, 2000, and focused largely on experiments. Electron-molecule and electron-atom collisions initiate and drive almost all the relevant chemical processes associated with radiation chemistry, environmental chemistry, stability of waste repositories, plasma-enhanced chemical vapor deposition, plasma processing of materials for microelectronic devices and other applications, and novel light sources for research purposes (e.g. excimer lamps in the extreme ultraviolet) and in everyday lighting applications. The life sciences are a rapidly advancing field where the important role of electron-driven processes is only now beginning to be recognized. Many of the applications of electron-initiated chemical processes require results in the near term. A large-scale, multidisciplinary and collaborative effort should be mounted to solve these problems in a timely way so that their solution will have the needed impact on the urgent questions of understanding the physico-chemical processes initiated and driven by electron interactions.

  14. 17th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2016)

    International Nuclear Information System (INIS)

    2016-01-01

    Preface The 2016 version of the International Workshop on Advanced Computing and Analysis Techniques in Physics Research took place on January 18-22, 2016, at the Universidad Técnica Federico Santa Maria (UTFSM) in Valparaiso, Chile. The present volume of IOP Conference Series is devoted to the selected scientific contributions presented at the workshop. In order to guarantee the scientific quality of the Proceedings, all papers were thoroughly peer-reviewed by an ad-hoc Editorial Committee with the help of many careful reviewers. The ACAT Workshop series has a long tradition starting in 1990 (Lyon, France), and takes place in intervals of a year and a half. Formerly these workshops were known under the name AIHENP (Artificial Intelligence for High Energy and Nuclear Physics). Each edition brings together experimental and theoretical physicists and computer scientists/experts, from particle and nuclear physics, astronomy and astrophysics in order to exchange knowledge and experience in computing and data analysis in physics. Three tracks cover the main topics: Computing technology: languages and system architectures. Data analysis: algorithms and tools. Theoretical Physics: techniques and methods. Although most contributions and discussions are related to particle physics and computing, other fields such as condensed matter physics, earth physics, and biophysics are often addressed in the hope of sharing our approaches and visions. It created a forum for exchanging ideas among fields, exploring and promoting cutting-edge computing technologies and debating hot topics. (paper)

  15. 2015 MICCAI Workshop on Computational Diffusion MRI

    CERN Document Server

    Ghosh, Aurobrata; Kaden, Enrico; Rathi, Yogesh; Reisert, Marco

    2016-01-01

    These Proceedings of the 2015 MICCAI WorkshopComputational Diffusion MRI” offer a snapshot of the current state of the art on a broad range of topics within the highly active and growing field of diffusion MRI. The topics vary from fundamental theoretical work on mathematical modeling, to the development and evaluation of robust algorithms, new computational methods applied to diffusion magnetic resonance imaging data, and applications in neuroscientific studies and clinical practice. Over the last decade interest in diffusion MRI has exploded. The technique provides unique insights into the microstructure of living tissue and enables in-vivo connectivity mapping of the brain. Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into clinical practice. New processing methods are essential for addressing issues at each stage of the diffusion MRI pipeline: acquisition, reconstruction, modeling and model fitting, image processing, fiber t...

  16. Instructional Styles, Attitudes and Experiences of Seniors in Computer Workshops

    Science.gov (United States)

    Wood, Eileen; Lanuza, Catherine; Baciu, Iuliana; MacKenzie, Meagan; Nosko, Amanda

    2010-01-01

    Sixty-four seniors were introduced to computers through a series of five weekly workshops. Participants were given instruction followed by hands-on experience for topics related to social communication, information seeking, games, and word processing and were observed to determine their preferences for instructional support. Observations of…

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  18. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT
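
    A minimal example of the "forward propagation of uncertainty" item mentioned above: Monte Carlo sampling of uncertain inputs through a cheap model. The model and input distributions are invented for illustration; QUEST's tools target large-scale simulations rather than toy functions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy computational model with two uncertain inputs.
def model(k, q):
    return q / (1.0 + k)

# Assumed input uncertainties: lognormal "conductivity", normal "forcing".
k_samples = rng.lognormal(mean=0.0, sigma=0.3, size=100_000)
q_samples = rng.normal(loc=1.0, scale=0.1, size=100_000)

# Forward propagation: push samples through the model, summarize the output.
y = model(k_samples, q_samples)
print(f"mean={y.mean():.4f}  std={y.std():.4f}  "
      f"95% interval=({np.quantile(y, 0.025):.4f}, {np.quantile(y, 0.975):.4f})")
```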

  19. CNRA/CSNI workshop on licensing and operating experience of computer-based I and C systems - Summary and conclusions

    International Nuclear Information System (INIS)

    2002-01-01

    The OECD Workshop on Licensing and Operating Experience of Computer-Based I and C Systems, was sponsored by both the Committee on Nuclear Regulatory Activities (CNRA) and the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organised in collaboration with the Czech State Office for Nuclear Safety (SUJB), the Czech Power Board CEZ a.s., I and C Energo a.s. and the Nuclear Research Institute, Rez near Prague. The objectives of the Workshop were to exchange the experience gained by both the regulators and the industry in different countries in the licensing and operation of computer-based I and C systems, to discuss the existing differences in their licensing approaches in various countries, to consider the safety aspects of their practical use, and to discuss the ways of promoting future international co-operation in the given area. The scope of the Workshop included: - review of the progress made since the CNRA/CSNI workshop which was held in 1996 - current and future regulatory needs and/or requirements for the computer-based I and C systems - progress made in software life cycle activities, including verification and validation, and safety/hazards analysis - benefits of applying the computer-based I and C systems to improve plant performance and safety. The Technical Sessions and Discussion Sessions covered the following topics: Opening Session: Advances made in the use and planning of computer-based I and C systems; Topic 1: National and international standards and guides for computer-based safety systems; Topic 2: Regulatory aspects; Topic 3: Analysis and assessment of digital I and C systems; Topic 4: Software life cycle activities; Topic 4: Experience with applications, system aspects, potential limits and future trends and needs; Final Session: Workshop summary. The workshop provided a unique opportunity for people with experience in licensing, developing, manufacturing, implementing, maintaining or

  20. Proceedings of the second workshop of LHC Computing Grid, LCG-France

    International Nuclear Information System (INIS)

    Chollet, Frederique; Hernandez, Fabio; Malek, Fairouz; Gaelle, Shifrin

    2007-03-01

    The second LCG-France Workshop was held in Clermont-Ferrand on 14-15 March 2007. These sessions, organized by IN2P3 and DAPNIA, were attended by around 70 participants working with the LHC Computing Grid in France. The workshop was an opportunity for exchanges of information between French and foreign site representatives on one side and experiment delegates on the other. The event helped clarify the place of the LHC computing task within the worldwide W-LCG project, the ongoing actions, and the prospects for 2007 and beyond. The following communications were presented: 1. The current status of the LHC computation in France; 2.The LHC Grid infrastructure in France and associated resources; 3.Commissioning of Tier 1; 4.The sites of Tier-2s and Tier-3s; 5.Computing in ALICE experiment; 6.Computing in ATLAS experiment; 7.Computing in the CMS experiments; 8.Computing in the LHCb experiments; 9.Management and operation of computing grids; 10.'The VOs talk to sites'; 11.Peculiarities of ATLAS; 12.Peculiarities of CMS and ALICE; 13.Peculiarities of LHCb; 14.'The sites talk to VOs'; 15. Worldwide operation of Grid; 16.Following-up the Grid jobs; 17.Surveillance and managing the failures; 18. Job scheduling and tuning; 19.Managing the site infrastructure; 20.LCG-France communications; 21.Managing the Grid data; 22.Pointing the net infrastructure and site storage. 23.ALICE bulk transfers; 24.ATLAS bulk transfers; 25.CMS bulk transfers; 26. LHCb bulk transfers; 27.Access to LHCb data; 28.Access to CMS data; 29.Access to ATLAS data; 30.Access to ALICE data; 31.Data analysis centers; 32.D0 Analysis Farm; 33.Some CMS grid analyses; 34.PROOF; 35.Distributed analysis using GANGA; 36.T2 set-up for end-users. In their concluding remarks, Fairouz Malek and Dominique Pallin stressed that this workshop was closer to users, while the goal of tightening the links between the sites and the experiments was definitely achieved. The IN2P3 leadership expressed

  1. Workshops of the Sixth International Brain–Computer Interface Meeting : brain–computer interfaces past, present, and future

    NARCIS (Netherlands)

    Huggins, Jane E.; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O.; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K.R.; Ramsey, Nick F.; Nijholt, Anton; Müller-Putz, Gernot R.; McFarland, Dennis J.; Mattia, Donatella; Lance, Brent J.; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H.; Collinger, Jennifer L.; Chavarriaga, Ricardo; Chasey, Steven M.; Bleichner, Martin G.; Batista, Aaron; Anderson, Charles W.; Aarnoutse, Erik J.

    2017-01-01

    The Sixth International Brain–Computer Interface (BCI) Meeting was held 30 May–3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain–machine interface research. Topics included BCI for specific

  2. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  3. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  4. Proceedings of the Third International Workshop on Mathematical Foundations of Computational Anatomy

    DEFF Research Database (Denmark)

    the mathematical community around shapes and the MICCAI community in view of computational anatomy applications. It targets more particularly researchers investigating the combination of statistical and geometrical aspects in the modeling of the variability of biological shapes. The workshop is a forum ... of the workshop: statistics on manifolds and diffeomorphisms for surface or longitudinal registration. One session gathers papers exploring new mathematical structures beyond Riemannian geometry while the last oral session deals with the emerging theme of statistics on graphs and trees. Finally, a poster session ... Computational anatomy is an emerging discipline at the interface of geometry, statistics and image analysis which aims at modeling and analyzing the biological shape of tissues and organs. The goal is to estimate representative organ anatomies across diseases, populations, species or ages, to model...

  5. Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future.

    Science.gov (United States)

    Huggins, Jane E; Guger, Christoph; Allison, Brendan; Anderson, Charles W; Batista, Aaron; Brouwer, Anne-Marie A-M; Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward

    2014-01-01

    The Fifth International Brain-Computer Interface (BCI) Meeting met June 3-7, 2013 at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development.

  6. 2nd International Workshop on Eigenvalue Problems : Algorithms, Software and Applications in Petascale Computing

    CERN Document Server

    Zhang, Shao-Liang; Imamura, Toshiyuki; Yamamoto, Yusaku; Kuramashi, Yoshinobu; Hoshi, Takeo

    2017-01-01

    This book provides state-of-the-art and interdisciplinary topics on solving matrix eigenvalue problems, particularly by using recent petascale and upcoming post-petascale supercomputers. It gathers selected topics presented at the International Workshops on Eigenvalue Problems: Algorithms, Software and Applications in Petascale Computing (EPASA2014 and EPASA2015), which brought together leading researchers working on the numerical solution of matrix eigenvalue problems to discuss and exchange ideas – and in so doing helped to create a community for researchers in eigenvalue problems. The topics presented in the book, including novel numerical algorithms, high-performance implementation techniques, software developments and sample applications, will contribute to various fields that involve solving large-scale eigenvalue problems.

  7. MICCAI Workshops

    CERN Document Server

    Nedjati-Gilani, Gemma; Venkataraman, Archana; O'Donnell, Lauren; Panagiotaki, Eleftheria

    2014-01-01

    This volume contains the proceedings from two closely related workshops: Computational Diffusion MRI (CDMRI’13) and Mathematical Methods from Brain Connectivity (MMBC’13), held under the auspices of the 16th International Conference on Medical Image Computing and Computer Assisted Intervention, which took place in Nagoya, Japan, September 2013. Inside, readers will find contributions ranging from mathematical foundations and novel methods for the validation of inferring large-scale connectivity from neuroimaging data to the statistical analysis of the data, accelerated methods for data acquisition, and the most recent developments on mathematical diffusion modeling. This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity as well as offers new perspectives and insights on current research challenges for those currently in the field. It will be of interest to researchers and practitioners in computer science, ...

  8. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
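
    A self-organizing map of the kind mentioned can be sketched in a few lines of NumPy; here it clusters synthetic "daily circulation pattern" vectors into a small grid of nodes. The data, grid size, and training schedule are illustrative assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "large-scale pattern" vectors (e.g., flattened pressure anomaly fields).
n_days, n_features = 1000, 40
patterns = rng.standard_normal((n_days, n_features))

# 3x4 SOM grid of node weight vectors.
rows, cols = 3, 4
nodes = rng.standard_normal((rows * cols, n_features))
grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

n_iter, lr0, sigma0 = 5000, 0.5, 1.5
for t in range(n_iter):
    x = patterns[rng.integers(n_days)]
    bmu = np.argmin(np.sum((nodes - x) ** 2, axis=1))   # best-matching unit
    lr = lr0 * np.exp(-t / n_iter)                      # decaying learning rate
    sigma = sigma0 * np.exp(-t / n_iter)                # shrinking neighborhood
    dist2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))               # neighborhood function
    nodes += lr * h[:, None] * (x - nodes)

# Assign each day to its best-matching node (its "weather type").
assignments = np.argmin(((patterns[:, None, :] - nodes[None, :, :]) ** 2).sum(-1), axis=1)
print(np.bincount(assignments, minlength=rows * cols))
```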

  9. On the nonlinearity of spatial scales in extreme weather attribution statements

    Science.gov (United States)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos

    2018-04-01

    In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
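
    The "change in probability of occurrence" used in such attribution statements is commonly summarized as a probability ratio between a factual and a counterfactual ensemble; a minimal sketch of that calculation, with fabricated ensemble output and an arbitrary event threshold, follows.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical seasonal-maximum temperatures from two model ensembles (deg C):
# "actual" includes anthropogenic forcing, "natural" does not.
actual = rng.normal(loc=31.0, scale=2.0, size=5000)
natural = rng.normal(loc=30.0, scale=2.0, size=5000)

threshold = 35.0  # event definition: exceeding this value for the chosen region/season

p1 = np.mean(actual >= threshold)   # probability in the factual world
p0 = np.mean(natural >= threshold)  # probability in the counterfactual world

print(f"p1={p1:.4f}  p0={p0:.4f}  probability ratio={p1 / p0:.2f}")
```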

  10. 20th Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Patel, Nisarg; Kobayashi, Hiroaki

    2016-01-01

    The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It explores general trends in hardware and software development, and then focuses specifically on the future of high-performance systems and heterogeneous architectures. It also covers applications such as computational fluid dynamics, material science, medical applications and climate research and discusses innovative fields like coupled multi-physics or multi-scale simulations. The papers included were selected from the presentations given at the 20th Workshop on Sustained Simulation Performance at the HLRS, University of Stuttgart, Germany in December 2015, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2016.

  11. A Fault Oblivious Extreme-Scale Execution Environment

    Energy Technology Data Exchange (ETDEWEB)

    McKie, Jim

    2014-11-20

    The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to the data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today’s machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next generation exascale operating systems. Our OS work focused on adaptive, application tailored OS services optimized for multi → many core processors. We developed a new operating system NIX that supports role-based allocation of cores to processes which was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on distributed, fault tolerant key-value store and identified scaling issues. A second fault tolerant task parallel library was developed, based on the Linda tuple space model, that used low level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task parallel computations
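
    The abstract mentions a fault-tolerant task-parallel library built on the Linda tuple space model; the snippet below is a minimal single-process illustration of that coordination style (publishing task tuples, workers blocking on pattern matches), assuming a toy in-memory store rather than the distributed, fault-tolerant key-value store used in FOX.

# Minimal single-process illustration of Linda-style tuple-space coordination
# (out/take), the model the FOX task library is described as building on.
# Illustrative sketch only, not the FOX project's code.
import threading

class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):                      # publish a tuple
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _match(self, pattern):
        for t in self._tuples:
            if len(t) == len(pattern) and all(p is None or p == v
                                              for p, v in zip(pattern, t)):
                return t
        return None

    def take(self, pattern):                 # "in": block, then remove a matching tuple
        with self._cond:
            while (t := self._match(pattern)) is None:
                self._cond.wait()
            self._tuples.remove(t)
            return t

def worker(space, wid):
    while True:
        tag, x = space.take(("task", None))  # block until a task tuple is available
        if x is None:
            break                            # poison pill shuts the worker down
        space.out(("result", wid, x * x))

space = TupleSpace()
threads = [threading.Thread(target=worker, args=(space, w)) for w in range(4)]
for t in threads:
    t.start()
for i in range(10):
    space.out(("task", i))
results = [space.take(("result", None, None)) for _ in range(10)]
for _ in threads:
    space.out(("task", None))                # one poison pill per worker
for t in threads:
    t.join()
print(sorted(r[2] for r in results))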

  12. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  13. Climatic forecast: down-scaling and extremes

    International Nuclear Information System (INIS)

    Deque, M.; Li, L.

    2007-01-01

    There is a strong demand for specifying the future climate at local scale and about extreme events. New methods, allowing a better output from the climate models, are currently being developed and French laboratories involved in the Escrime project are actively participating. (authors)

  14. FOREWORD: 4th International Workshop on New Computational Methods for Inverse Problems (NCMIP2014)

    Science.gov (United States)

    2014-10-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 4th International Workshop on New Computational Methods for Inverse Problems, NCMIP 2014 (http://www.farman.ens-cachan.fr/NCMIP_2014.html). This workshop took place at Ecole Normale Supérieure de Cachan, on May 23, 2014. The prior editions of NCMIP also took place in Cachan, France, firstly within the scope of ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/), and secondly at the initiative of Institut Farman, in May 2012 and May 2013, (http://www.farman.ens-cachan.fr/NCMIP_2012.html), (http://www.farman.ens-cachan.fr/NCMIP_2013.html). The New Computational Methods for Inverse Problems (NCMIP) Workshop focused on recent advances in the resolution of inverse problems. Indeed, inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finances. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the

  15. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorization in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
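
    A schematic way to picture the tile low-rank idea: an off-diagonal tile of a data-sparse matrix is replaced by a truncated singular value decomposition, so storage drops from nb² entries to roughly 2·nb·k for rank k at a chosen accuracy. The sketch below uses a placeholder kernel and plain SVD; HiCMA's production kernels are tuned compressed-algebra routines, not this code.

# Schematic of tile low-rank (TLR) compression of one off-diagonal tile:
# keep only the singular vectors needed to meet an accuracy threshold.
import numpy as np

rng = np.random.default_rng(2)
nb = 512                                           # tile size

# Build a data-sparse off-diagonal tile, e.g. a smooth kernel evaluated
# between two well-separated point clusters (placeholder geometry).
x = rng.uniform(0.0, 1.0, nb)
y = rng.uniform(5.0, 6.0, nb)
tile = 1.0 / np.abs(x[:, None] - y[None, :])       # decays smoothly => low numerical rank

def compress(tile, tol=1e-8):
    """Return (U, V) with tile ~= U @ V, truncating singular values below tol * s[0]."""
    u, s, vt = np.linalg.svd(tile, full_matrices=False)
    k = max(1, int(np.sum(s > tol * s[0])))
    return u[:, :k] * s[:k], vt[:k, :]

U, V = compress(tile)
k = U.shape[1]
err = np.linalg.norm(tile - U @ V) / np.linalg.norm(tile)
dense_words, tlr_words = tile.size, U.size + V.size
print(f"rank {k}, rel. error {err:.1e}, storage {tlr_words}/{dense_words} words")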

  16. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorization in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  17. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    DOE Office of Science, Biological and Environmental Research Program Office (BER),

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  18. Second International workshop Geometry and Symbolic Computation

    CERN Document Server

    Walczak, Paweł; Geometry and its Applications

    2014-01-01

    This volume has been divided into two parts: Geometry and Applications. The geometry portion of the book relates primarily to geometric flows, laminations, integral formulae, geometry of vector fields on Lie groups, and osculation; the articles in the applications portion concern some particular problems of the theory of dynamical systems, including mathematical problems of liquid flows and a study of cycles for non-dynamical systems. This Work is based on the second international workshop entitled "Geometry and Symbolic Computations," held on May 15-18, 2013 at the University of Haifa and is dedicated to modeling (using symbolic calculations) in differential geometry and its applications in fields such as computer science, tomography, and mechanics. It is intended to create a forum for students and researchers in pure and applied geometry to promote discussion of modern state-of-the-art in geometric modeling using symbolic programs such as Maple™ and Mathematica®, as well as presentation of new results. ...

  19. Multidisciplinary and participatory workshops with stakeholders in a community of extreme poverty in the Peruvian Amazon: development of priority concerns and potential health, nutrition and education interventions.

    Science.gov (United States)

    Casapia, Martin; Joseph, Serene A; Gyorkos, Theresa W

    2007-07-10

    Communities of extreme poverty suffer disproportionately from a wide range of adverse outcomes, but are often neglected or underserved by organized services and research attention. In order to target the first Millennium Development Goal of eradicating extreme poverty, thereby reducing health inequalities, participatory research in these communities is needed. Therefore, the purpose of this study was to determine the priority problems and respective potential cost-effective interventions in Belen, a community of extreme poverty in the Peruvian Amazon, using a multidisciplinary and participatory focus. Two multidisciplinary and participatory workshops were conducted with important stakeholders from government, non-government and community organizations, national institutes and academic institutions. In Workshop 1, participants prioritized the main health and health-related problems in the community of Belen. Problem trees were developed to show perceived causes and effects for the top six problems. In Workshop 2, following presentations describing data from recently completed field research in school and household populations of Belen, participants listed potential interventions for the priority problems, including associated barriers, enabling factors, costs and benefits. The top ten priority problems in Belen were identified as: 1) infant malnutrition; 2) adolescent pregnancy; 3) diarrhoea; 4) anaemia; 5) parasites; 6) lack of basic sanitation; 7) low level of education; 8) sexually transmitted diseases; 9) domestic violence; and 10) delayed school entry. Causes and effects for the top six problems, proposed interventions, and factors relating to the implementation of interventions were multidisciplinary in nature and included health, nutrition, education, social and environmental issues. The two workshops provided valuable insight into the main health and health-related problems facing the community of Belen. The participatory focus of the workshops ensured the

  20. 78 FR 54453 - Notice of Public Meeting-Intersection of Cloud Computing and Mobility Forum and Workshop

    Science.gov (United States)

    2013-09-04

    ...--Intersection of Cloud Computing and Mobility Forum and Workshop AGENCY: National Institute of Standards and.../intersection-of-cloud-and-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum... interoperability, portability, and security, discuss the Federal Government's experience with cloud computing...

  1. Computational Humor 2012: extended abstracts of the (3rd international) Workshop on Computational Humor

    OpenAIRE

    Nijholt, Antinus; Unknown, [Unknown

    2012-01-01

    Like its predecessors in 1996 (University of Twente, the Netherlands) and 2002 (ITC-irst, Trento, Italy), this Third International Workshop on Computational Humor (IWCH 2012) focusses on the possibility to find algorithms that allow understanding and generation of humor. There is the general aim of modeling humor, and if we can do that, it will provide us with lots of information about our cognitive abilities in general, such as reasoning, remembering, understanding situations, and understand...

  2. First Joint Workshop on Energy Management for Large-Scale Research Infrastructures

    CERN Document Server

    2011-01-01

      CERN, ERF (European Association of National Research Facilities) and ESS (European Spallation Source) announce the first Joint Workshop on Energy Management for Large-Scale Research Infrastructures. The event will take place on 13-14 October 2011 at the ESS office in Sparta - Lund, Sweden.   The workshop will bring together international experts on energy and representatives from laboratories and future projects all over the world in order to identify the challenges and best practice in respect of energy efficiency and optimization, solutions and implementation as well as to review the challenges represented by potential future technical solutions and the tools for effective collaboration. Further information at: http://ess-scandinavia.eu/general-information

  3. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), this year the motto was ''bridging disciplines''. The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciations go to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  4. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  5. 2014 National Workshop on Advances in Communication and Computing

    CERN Document Server

    Prasanna, S; Sarma, Kandarpa; Saikia, Navajit

    2015-01-01

    The present volume is a compilation of research work in computation, communication, vision sciences, device design, fabrication, upcoming materials and related process design, etc. It is derived from selected manuscripts submitted to the 2014 National Workshop on Advances in Communication and Computing (WACC 2014), Assam Engineering College, Guwahati, Assam, India, which is emerging as a premier platform for discussion and dissemination of knowhow in this part of the world. The papers included in the volume are indicative of the recent thrust in computation, communications and emerging technologies. Certain recent advances in ZnO nanostructures for alternate energy generation provide emerging insights into an area that holds promise for the energy sector, including conservation and green technology. Similarly, scholarly contributions have focused on malware detection and related issues. Several contributions have focused on biomedical aspects including contributions related to cancer detection using act...

  6. 6th International Workshop on New Computational Methods for Inverse Problems

    International Nuclear Information System (INIS)

    2016-01-01

    Foreword This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 6th International Workshop on New Computational Methods for Inverse Problems, NCMIP 2016 (http://complement.farman.ens-cachan.fr/NCMIP 2016.html). This workshop took place at Ecole Normale Supérieure de Cachan, on May 20, 2016. The prior editions of NCMIP also took place in Cachan, France, firstly within the scope of ValueTools Conference, in May 2011, and secondly at the initiative of Institut Farman, in May 2012, May 2013, May 2014 and May 2015. The New Computational Methods for Inverse Problems (NCMIP) workshop focused on recent advances in the resolution of inverse problems. Indeed, inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finances. The resolution of inverse problems consists in estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, Kernel

  7. Human Performance under Extreme Conditions with Respect to a Resilient Organisation. Proceedings of a CSNI International Workshop, 24-26 February 2015, Brugg, Switzerland

    International Nuclear Information System (INIS)

    2015-01-01

    After the Fukushima Daiichi accident a number of initiatives have been undertaken internationally to learn from the accident and to implement lessons learned to improve nuclear safety. The accident has shown in particular the challenges in supporting reliable human performance under extreme conditions. Acknowledging that further work is needed to be better prepared for the HOF (Human and Organisational Factors) challenges of the extreme conditions that may be present in severe accidents, the NEA's Working Group on Human and Organisational Factors (WGHOF), one of the working groups for the Committee on the Safety of Nuclear Installations (CSNI) initiated a new task with the objectives to: - share experiences and knowledge of human and organisational performance under extreme conditions, - identify specific currently applied HOF principles in nuclear and other high risk industries and compare them with the available knowledge, - provide a basis for improvements and necessary research taking into account HOF issues in the design and use of measures, and - make recommendations with the aim to achieve the best level of human and organisational performance as possible under extreme conditions. In order to move those issues forward WGHOF hosted together with the Swiss Federal Nuclear Safety Inspectorate ENSI a workshop entitled 'Human Performance under Extreme Conditions with respect to a Resilient Organization'. The workshop was conducted with participation of a number of invited key speakers from academic research and a range of industries, including nuclear. Thirty-four experts from 12 countries, the IAEA and OECD/Halden participated. Experts came from nuclear authorities, research centres, technical support organisations, training simulator centres, utilities and from non-nuclear field (aircraft accident investigation, fire fighting, military, design of resilient organisations). From the discussions at the workshop, it is clear that the accident at Fukushima has

  8. 18th and 19th Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Kobayashi, Hiroaki; Patel, Nisarg

    2015-01-01

    This book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and the future of high-performance systems and heterogeneous architectures in particular. The application-related contributions cover computational fluid dynamics, material science, medical applications and climate research; innovative fields such as coupled multi-physics and multi-scale simulations are highlighted. All papers were chosen from presentations given at the 18th Workshop on Sustained Simulation Performance held at the HLRS, University of Stuttgart, Germany in October 2013 and subsequent Workshop of the same name held at Tohoku University in March 2014.  

  9. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when
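
    The coupling described above can be pictured with a toy sketch: an Aspen-style closed-form model gives per-rank compute durations, which are emitted as timed events for a network simulator to consume. The model, parameters, and event format below are illustrative assumptions, not Durango's or CODES's actual APIs.

# Toy sketch of analytical-model-driven workload generation: per-rank compute
# durations come from an Aspen-like closed-form model and are emitted as timed
# events for a network simulator to consume. Placeholder model, not Durango's API.
from dataclasses import dataclass

@dataclass
class Event:
    rank: int
    time_s: float        # when the rank finishes computing and starts communicating
    msg_bytes: int        # size of the message it then sends

def compute_time(problem_size, flops_per_cell, flop_rate):
    """Aspen-style analytical kernel model: work / machine rate."""
    return problem_size * flops_per_cell / flop_rate

def generate_phase(n_ranks, global_cells, flops_per_cell=120.0, flop_rate=5e10):
    cells_per_rank = global_cells // n_ranks
    halo_bytes = 8 * int(round(cells_per_rank ** (2 / 3))) * 6   # 6 faces of a cube, doubles
    t = compute_time(cells_per_rank, flops_per_cell, flop_rate)
    return [Event(rank=r, time_s=t, msg_bytes=halo_bytes) for r in range(n_ranks)]

events = generate_phase(n_ranks=8, global_cells=48**3)
for ev in events[:3]:
    print(ev)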

  10. Multidisciplinary and participatory workshops with stakeholders in a community of extreme poverty in the Peruvian Amazon: Development of priority concerns and potential health, nutrition and education interventions

    Directory of Open Access Journals (Sweden)

    Gyorkos Theresa W

    2007-07-01

    Background: Communities of extreme poverty suffer disproportionately from a wide range of adverse outcomes, but are often neglected or underserved by organized services and research attention. In order to target the first Millennium Development Goal of eradicating extreme poverty, thereby reducing health inequalities, participatory research in these communities is needed. Therefore, the purpose of this study was to determine the priority problems and respective potential cost-effective interventions in Belen, a community of extreme poverty in the Peruvian Amazon, using a multidisciplinary and participatory focus. Methods: Two multidisciplinary and participatory workshops were conducted with important stakeholders from government, non-government and community organizations, national institutes and academic institutions. In Workshop 1, participants prioritized the main health and health-related problems in the community of Belen. Problem trees were developed to show perceived causes and effects for the top six problems. In Workshop 2, following presentations describing data from recently completed field research in school and household populations of Belen, participants listed potential interventions for the priority problems, including associated barriers, enabling factors, costs and benefits. Results: The top ten priority problems in Belen were identified as: 1) infant malnutrition; 2) adolescent pregnancy; 3) diarrhoea; 4) anaemia; 5) parasites; 6) lack of basic sanitation; 7) low level of education; 8) sexually transmitted diseases; 9) domestic violence; and 10) delayed school entry. Causes and effects for the top six problems, proposed interventions, and factors relating to the implementation of interventions were multidisciplinary in nature and included health, nutrition, education, social and environmental issues. Conclusion: The two workshops provided valuable insight into the main health and health-related problems facing the community of

  11. Extreme Physics and Informational/Computational Limits

    Energy Technology Data Exchange (ETDEWEB)

    Di Sia, Paolo, E-mail: paolo.disia@univr.it, E-mail: 10alla33@virgilio.it [Department of Computer Science, Faculty of Science, Verona University, Strada Le Grazie 15, I-37134 Verona (Italy) and Faculty of Computer Science, Free University of Bozen, Piazza Domenicani 3, I-39100 Bozen-Bolzano (Italy)

    2011-07-08

    A sector of current theoretical physics, sometimes called 'extreme physics', deals with topics concerning superstring theories, multiverse, quantum teleportation, negative energy, and more, that only a few years ago were considered scientific imagination or purely speculative physics. Present experimental lines of evidence and implications of cosmological observations seem, on the contrary, to support such theories. These new physical developments lead to informational limits, such as the quantity of information that a physical system can record, and computational limits, resulting from considerations regarding black holes and space-time fluctuations. In this paper I consider important limits for information and computation resulting in particular from string theories and their foundations.

  12. Extreme Physics and Informational/Computational Limits

    International Nuclear Information System (INIS)

    Di Sia, Paolo

    2011-01-01

    A sector of current theoretical physics, sometimes called 'extreme physics', deals with topics concerning superstring theories, multiverse, quantum teleportation, negative energy, and more, that only a few years ago were considered scientific imagination or purely speculative physics. Present experimental lines of evidence and implications of cosmological observations seem, on the contrary, to support such theories. These new physical developments lead to informational limits, such as the quantity of information that a physical system can record, and computational limits, resulting from considerations regarding black holes and space-time fluctuations. In this paper I consider important limits for information and computation resulting in particular from string theories and their foundations.

  13. A Large-Scale Multi-Hop Localization Algorithm Based on Regularized Extreme Learning for Wireless Networks.

    Science.gov (United States)

    Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan

    2017-12-20

    A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike other similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms, which are only applicable to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating the hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
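
    A minimal sketch of the regularized extreme learning step, under the usual ELM formulation (random hidden layer, ridge-regularized output weights) and with synthetic hop-count data standing in for the paper's networks:

# Minimal sketch of regularized extreme learning machine (ELM) regression that
# maps hop-count vectors to physical distances, in the spirit of the algorithm
# described above. Synthetic placeholder data, not the paper's datasets.
import numpy as np

rng = np.random.default_rng(3)

# Training data: hop counts from each node to a set of anchors, plus the
# corresponding physical distance to be learned (synthetic here).
n_train, n_anchors = 200, 10
hops = rng.integers(1, 15, size=(n_train, n_anchors)).astype(float)
dist = hops @ rng.uniform(20, 40, n_anchors) / n_anchors + rng.normal(0, 5, n_train)

n_hidden, lam = 100, 1e-2                      # hidden neurons and ridge penalty
W = rng.normal(size=(n_anchors, n_hidden))     # random, untrained input weights
b = rng.normal(size=n_hidden)

def hidden(X):
    return np.tanh(X @ W + b)                  # random feature map

H = hidden(hops)
# Regularized least squares for the output weights: beta = (H'H + lam*I)^-1 H'y
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ dist)

# Estimate distances for unseen nodes from their hop counts alone.
hops_new = rng.integers(1, 15, size=(5, n_anchors)).astype(float)
print(hidden(hops_new) @ beta)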

  14. ExM:System Support for Extreme-Scale, Many-Task Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S

    2011-05-31

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem solving methods and application classes. In this document, we report on combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at University of Chicago, over the first 8 months (through April 30, 2011)

  15. PARA'04 Workshop on State-of-the-art in Scientific Computing, June 20-23, 2004: Complementary Proceedings

    DEFF Research Database (Denmark)

    Dongarra, Jack; Madsen, Kaj; Wasniewski, Jerzy

    2004-01-01

    The PARA workshops in the past have been devoted to parallel computing methods in science and technology. There have been seven PARA meetings to date: PARA'94, PARA'95 and PARA'96 in Lyngby, Denmark, PARA'98 in Umeå, Sweden, PARA'2000 in Bergen, Norway, PARA'02 in Espoo, Finland, and PARA'04 again in Lyngby, Denmark, held June 20-23, 2004. The PARA'04 Workshop was organized by Jack Dongarra from the University of Tennessee and Oak Ridge National Laboratory, and Kaj Madsen and Jerzy Wasniewski from the Technical University of Denmark. The emphasis here was shifted to High-Performance Computing (HPC). The ongoing development of ever more advanced computers provides the potential for solving increasingly difficult computational problems. However, given the complexity of modern computer architectures, the task of realizing this potential needs careful attention. For example, the failure...

  16. Workshops of the Sixth International Brain–Computer Interface Meeting: brain–computer interfaces past, present, and future

    Science.gov (United States)

    Huggins, Jane E.; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O.; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K. R.; Ramsey, Nick F.; Nijholt, Anton; Müller-Putz, Gernot; McFarland, Dennis J.; Mattia, Donatella; Lance, Brent J.; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H.; Collinger, Jennifer L.; Chavarriaga, Ricardo; Chase, Steven M.; Bleichner, Martin G.; Batista, Aaron; Anderson, Charles W.; Aarnoutse, Erik J.

    2017-01-01

    The Sixth International Brain–Computer Interface (BCI) Meeting was held 30 May–3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain–machine interface research. Topics included BCI for specific populations or applications, advancing BCI research through use of specific signals or technological advances, and translational and commercial issues to bring both implanted and non-invasive BCIs to market. BCI research is growing and expanding in the breadth of its applications, the depth of knowledge it can produce, and the practical benefit it can provide both for those with physical impairments and the general public. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues and calls for action to support future research and development. PMID:29152523

  17. Workshops of the Sixth International Brain-Computer Interface Meeting: brain-computer interfaces past, present, and future.

    Science.gov (United States)

    Huggins, Jane E; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K R; Ramsey, Nick F; Nijholt, Anton; Müller-Putz, Gernot; McFarland, Dennis J; Mattia, Donatella; Lance, Brent J; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H; Collinger, Jennifer L; Chavarriaga, Ricardo; Chase, Steven M; Bleichner, Martin G; Batista, Aaron; Anderson, Charles W; Aarnoutse, Erik J

    2017-01-01

    The Sixth International Brain-Computer Interface (BCI) Meeting was held 30 May-3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain-machine interface research. Topics included BCI for specific populations or applications, advancing BCI research through use of specific signals or technological advances, and translational and commercial issues to bring both implanted and non-invasive BCIs to market. BCI research is growing and expanding in the breadth of its applications, the depth of knowledge it can produce, and the practical benefit it can provide both for those with physical impairments and the general public. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues and calls for action to support future research and development.

  18. Temporal and spatial scaling impacts on extreme precipitation

    Science.gov (United States)

    Eggert, B.; Berg, P.; Haerter, J. O.; Jacob, D.; Moseley, C.

    2015-01-01

    Both in the current climate and in the light of climate change, understanding of the causes and risk of precipitation extremes is essential for protection of human life and adequate design of infrastructure. Precipitation extreme events depend qualitatively on the temporal and spatial scales at which they are measured, in part due to the distinct types of rain formation processes that dominate extremes at different scales. To capture these differences, we first filter large datasets of high-resolution radar measurements over Germany (5 min temporally and 1 km spatially) using synoptic cloud observations, to distinguish convective and stratiform rain events. In a second step, for each precipitation type, the observed data are aggregated over a sequence of time intervals and spatial areas. The resulting matrix allows a detailed investigation of the resolutions at which convective or stratiform events are expected to contribute most to the extremes. We analyze where the statistics of the two types differ and discuss at which resolutions transitions occur between dominance of either of the two precipitation types. We characterize the scales at which the convective or stratiform events will dominate the statistics. For both types, we further develop a mapping between pairs of spatially and temporally aggregated statistics. The resulting curve is relevant when deciding on data resolutions where statistical information in space and time is balanced. Our study may hence also serve as a practical guide for modelers, and for planning the space-time layout of measurement campaigns. We also describe a mapping between different pairs of resolutions, possibly relevant when working with mismatched model and observational resolutions, such as in statistical bias correction.
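
    The space-time aggregation matrix described above can be sketched as follows: a high-resolution rain field is block-averaged over several temporal windows and spatial box sizes, and a high percentile is taken at each resolution pair. The synthetic gamma-distributed field below is a stand-in for the radar data and ignores the convective/stratiform split.

# Sketch of the space-time aggregation matrix: aggregate a high-resolution rain
# field over several temporal windows and spatial box sizes, then take a high
# percentile at each resolution pair. Synthetic data, illustrative only.
import numpy as np

rng = np.random.default_rng(4)

# Placeholder "radar" field: (time, y, x) at 5-minute / 1 km resolution.
nt, ny, nx = 288, 64, 64                      # one day of 5-min steps on a 64x64 km grid
rain = rng.gamma(shape=0.1, scale=2.0, size=(nt, ny, nx))   # heavy-tailed toy intensities

def aggregate(field, t_steps, s_cells):
    """Average over non-overlapping blocks of t_steps x s_cells x s_cells."""
    nt, ny, nx = field.shape
    nt2, ny2, nx2 = (nt // t_steps) * t_steps, (ny // s_cells) * s_cells, (nx // s_cells) * s_cells
    f = field[:nt2, :ny2, :nx2]
    f = f.reshape(nt2 // t_steps, t_steps, ny2 // s_cells, s_cells, nx2 // s_cells, s_cells)
    return f.mean(axis=(1, 3, 5))

time_windows = [1, 3, 12, 48]                 # 5 min, 15 min, 1 h, 4 h
box_sizes = [1, 4, 16]                        # 1 km, 4 km, 16 km

print("p99.9 of aggregated intensity")
for t_steps in time_windows:
    row = [np.percentile(aggregate(rain, t_steps, s), 99.9) for s in box_sizes]
    print(f"{5 * t_steps:>4} min:", np.round(row, 2))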

  19. 20th and 21st Joint Workshop on Sustained Simulation Performance

    CERN Document Server

    Bez, Wolfgang; Focht, Erich; Kobayashi, Hiroaki; Qi, Jiaxing; Roller, Sabine

    2015-01-01

    The book presents the state of the art in high-performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general, and the future of high-performance systems and heterogeneous architectures specifically. The application contributions cover computational fluid dynamics, material science, medical applications and climate research. Innovative fields like coupled multi-physics or multi-scale simulations are also discussed. All papers were chosen from presentations given at the 20th Workshop on Sustained Simulation Performance in December 2014 at the HLRS, University of Stuttgart, Germany, and the subsequent Workshop on Sustained Simulation Performance at Tohoku University in February 2015.  .

  20. Trends in Middle East climate extreme indices from 1950 to 2003

    Science.gov (United States)

    Zhang, Xuebin; Aguilar, Enric; Sensoy, Serhat; Melkonyan, Hamlet; Tagiyeva, Umayra; Ahmed, Nader; Kutaladze, Nato; Rahimzadeh, Fatemeh; Taghipour, Afsaneh; Hantosh, T. H.; Albert, Pinhas; Semawi, Mohammed; Karam Ali, Mohammad; Said Al-Shabibi, Mansoor Halal; Al-Oulan, Zaid; Zatari, Taha; Al Dean Khelet, Imad; Hamoud, Saleh; Sagir, Ramazan; Demircan, Mesut; Eken, Mehmet; Adiguzel, Mustafa; Alexander, Lisa; Peterson, Thomas C.; Wallis, Trevor

    2005-11-01

    A climate change workshop for the Middle East brought together scientists and data for the region to produce the first area-wide analysis of climate extremes for the region. This paper reports trends in extreme precipitation and temperature indices that were computed during the workshop and additional indices data that became available after the workshop. Trends in these indices were examined for 1950-2003 at 52 stations covering 15 countries, including Armenia, Azerbaijan, Bahrain, Cyprus, Georgia, Iran, Iraq, Israel, Jordan, Kuwait, Oman, Qatar, Saudi Arabia, Syria, and Turkey. Results indicate that there have been statistically significant, spatially coherent trends in temperature indices that are related to temperature increases in the region. Significant, increasing trends have been found in the annual maximum of daily maximum and minimum temperature, the annual minimum of daily maximum and minimum temperature, the number of summer nights, and the number of days where daily temperature has exceeded its 90th percentile. Significant negative trends have been found in the number of days when daily temperature is below its 10th percentile and daily temperature range. Trends in precipitation indices, including the number of days with precipitation, the average precipitation intensity, and maximum daily precipitation events, are weak in general and do not show spatial coherence. The workshop attendees have generously made the indices data available for the international research community.
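
    As a rough illustration of how such percentile-based indices and trends are computed, the sketch below derives a simplified warm-day index (yearly percentage of days above a base-period 90th percentile of daily maximum temperature) and fits a linear trend; it uses synthetic data and omits the calendar-day windows and bootstrapping of the standard index definitions.

# Sketch of one percentile-based index and its trend: the yearly fraction of
# days whose maximum temperature exceeds the 90th percentile of a base period
# (a simplified TX90p). Synthetic data, illustrative only.
import numpy as np

rng = np.random.default_rng(5)

years = np.arange(1950, 2004)
days_per_year = 365
# Toy daily maximum temperatures with a small imposed warming trend.
warming = 0.02 * (years - years[0])
tmax = (25 + 8 * rng.standard_normal((len(years), days_per_year))
        + warming[:, None])

base = slice(0, 1961 - 1950)                        # 1950-1960 base period (placeholder)
p90 = np.percentile(tmax[base], 90)

tx90p = (tmax > p90).mean(axis=1) * 100             # % of warm days per year

# Least-squares linear trend in % of days per decade.
slope, intercept = np.polyfit(years, tx90p, 1)
print(f"TX90p trend: {10 * slope:.2f} % of days per decade")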

  1. Carcinogenic potential of extremely low frequency magnetic fields: proceedings of a workshop

    International Nuclear Information System (INIS)

    Delpizzo, V.; Keam, D.W.

    1989-02-01

    The debate over the suspected link between Extremely Low Frequency (ELF) magnetic fields and cancer is entering its second decade, but the end is not in sight. The epidemiological evidence is now somewhat stronger, mainly due to the Savitz study of residential exposure and childhood cancer, but far from overwhelming. The results of in-vitro studies are fragmentary, sometimes contradictory and, overall, confusing. Well designed animal studies are virtually non-existent. A plausible biological model has not yet been established. Although scant, the present body of knowledge is very complex encompassing several disciplines and this workshop brought together researchers of vastly different backgrounds. The nine papers presented deal with an overview of ELF and cancer; the biochemistry of processes implicated in ELF carcinogenesis; possible mechanisms of cancer promotion; the status of in-vitro ELF cellular interactions; epidemiological studies, both occupational and residential, and the use of wire coding configurations as indicators of magnetic field exposures in such studies. Discussion follows each paper. Refs., figs., tabs

  2. Asynchronous schemes for CFD at extreme scales

    Science.gov (United States)

    Konduri, Aditya; Donzis, Diego

    2013-11-01

    Recent advances in computing hardware and software have made simulations an indispensable research tool in understanding fluid flow phenomena in complex conditions in great detail. Due to the nonlinear nature of the governing NS equations, simulations of high Re turbulent flows are computationally very expensive and demand extreme levels of parallelism. Current large simulations are being done on hundreds of thousands of processing elements (PEs). Benchmarks from these simulations show that communication between PEs takes a substantial amount of time, overwhelming the compute time, resulting in substantial waste in compute cycles as PEs remain idle. We investigate a novel approach based on widely used finite-difference schemes in which computations are carried out asynchronously, i.e. synchronization of data among PEs is not enforced and computations proceed regardless of the status of messages. This drastically reduces PE idle time and results in much larger computation rates. We show that while these schemes remain stable, their accuracy is significantly affected. We present new schemes that maintain accuracy under asynchronous conditions and provide a viable path towards exascale computing. Performance of these schemes will be shown for simple models like Burgers' equation.
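
    The idea can be illustrated on a single process: two subdomains advance a viscous Burgers' equation with an explicit finite-difference scheme, but each may read halo values from its neighbour that lag by a random number of steps, mimicking unsynchronized message arrival. This is an illustrative sketch of the concept, not the authors' proposed asynchrony-tolerant schemes.

# Single-process sketch of the asynchrony idea: two subdomains advance a viscous
# Burgers' equation, but each may read halo values from its neighbour that lag
# by a random number of steps (a stand-in for un-synchronized message arrival).
import numpy as np

rng = np.random.default_rng(6)

nx, nu = 200, 0.05
dx = 2 * np.pi / nx
dt = 0.2 * dx * dx / nu                     # conservative explicit time step
x = np.arange(nx) * dx
u = np.sin(x) + 1.5                         # smooth periodic initial condition

half = nx // 2
max_delay = 3                               # halo values may be up to 3 steps old
halo_hist = []                              # history of boundary values per step

def rhs(u_ext):
    """Central-difference RHS of u_t = -u u_x + nu u_xx on an extended array."""
    um, uc, up = u_ext[:-2], u_ext[1:-1], u_ext[2:]
    return -uc * (up - um) / (2 * dx) + nu * (up - 2 * uc + um) / dx**2

for step in range(2000):
    halo_hist.append((u[0], u[half - 1], u[half], u[-1]))
    lag = min(rng.integers(0, max_delay + 1), len(halo_hist) - 1)
    l0, l1, r0, r1 = halo_hist[-1 - lag]    # possibly stale neighbour values

    left = np.concatenate(([r1], u[:half], [r0]))    # subdomain 0 with (stale) halos
    right = np.concatenate(([l1], u[half:], [l0]))   # subdomain 1 with (stale) halos
    u[:half] += dt * rhs(left)
    u[half:] += dt * rhs(right)

print("solution bounds:", u.min(), u.max())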

  3. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following, the extension of underlying technology to low-cost metals was proposed with the following goals: (i) High deposition rates (approaching 100 lbs/h); (ii) Low cost (<$10/lbs) for steel, iron, aluminum, nickel, as well as, higher cost titanium, (iii) large components (major axis greater than 6 ft) and (iv) compliance of property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE’s newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including, (i) CAD to PART software, (ii) selection of energy source, (iii

  4. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
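
    The discrete-event mechanism underlying such simulators can be sketched in a few lines: a priority queue of timestamped events is drained in time order, and each handler may schedule further events. The link parameters and message pattern below are placeholders, and the sketch is not the SST macroscale components themselves.

# Toy discrete-event core: a priority queue of timestamped events, processed in
# order, with handlers that schedule future events. Illustrative of the general
# mechanism only, not the SST macroscale components.
import heapq

class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0                        # tie-breaker for identical timestamps

    def schedule(self, delay, handler, *args):
        heapq.heappush(self._queue, (self.now + delay, self._seq, handler, args))
        self._seq += 1

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, handler, args = heapq.heappop(self._queue)
            handler(*args)

sim = Simulator()

def send(src, dst, nbytes):
    latency, bandwidth = 1e-6, 10e9          # placeholder link parameters
    sim.schedule(latency + nbytes / bandwidth, recv, src, dst, nbytes)

def recv(src, dst, nbytes):
    print(f"t={sim.now * 1e6:8.3f} us: rank {dst} received {nbytes} B from rank {src}")

for rank in range(4):
    sim.schedule(0.0, send, rank, (rank + 1) % 4, 1 << 20)

sim.run()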

  5. Employability and Technical Skill Required to Establish a Small Scale Automobile Workshop

    Science.gov (United States)

    Olaitan, Olawale O.; Ikeh, Joshua O.

    2015-01-01

    The study focused on identifying the employability and technical skills needed to establish small-scale automobile workshop in Nsukka Urban of Enugu State. Five purposes of the study were stated to guide the study. Five research questions were stated and answered in line with the purpose of the study. The population for the study is 1,500…

  6. Forcings and feedbacks on convection in the 2010 Pakistan flood: Modeling extreme precipitation with interactive large-scale ascent

    Science.gov (United States)

    Nie, Ji; Shaevitz, Daniel A.; Sobel, Adam H.

    2016-09-01

    Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. The causal relationships between these factors are often not obvious, however, and the roles of different physical processes in producing the extreme precipitation event can be difficult to disentangle. Here we examine the large-scale forcings and convective heating feedback in the precipitation events that caused the 2010 Pakistan flood, within the Column Quasi-Geostrophic framework. A cloud-resolving model (CRM) is forced with large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation using input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. Numerical results show that the positive feedback of convective heating to large-scale dynamics is essential in amplifying the precipitation intensity to the observed values. Orographic lifting is the most important dynamic forcing in both events, while differential potential vorticity advection also contributes to the triggering of the first event. Horizontal moisture advection modulates the extreme events mainly by setting the environmental humidity, which modulates the amplitude of the convection's response to the dynamic forcings. When the CRM is replaced by either a single-column model (SCM) with parameterized convection or a dry model with a reduced effective static stability, the model results show substantial discrepancies compared with reanalysis data. The reasons for these discrepancies are examined, and the implications for global models and theoretical models are discussed.

  7. Workshop report

    African Journals Online (AJOL)

    abp

    2017-09-14

    Sep 14, 2017 ... health: report of first EQUIST training workshop in Nigeria .... The difference between the before and after measurements was ... After the administration of the pre-workshop questionnaire the ... represent Likert rating scale of 1-5 points, where 1point = grossly .... Procedures Manual for the "Evaluating.

  8. 29th Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics

    International Nuclear Information System (INIS)

    2016-01-01

    Thirty years ago, because of the dramatic increase in the power and utility of computer simulations, The University of Georgia formed the first institutional unit devoted to the application of simulations in research and teaching: The Center for Simulational Physics. Then, as the international simulations community expanded further, we sensed the need for a meeting place for both experienced simulators and newcomers to discuss inventive algorithms and recent results in an environment that promoted lively discussion. As a consequence, the Center for Simulational Physics established an annual workshop series on Recent Developments in Computer Simulation Studies in Condensed Matter Physics. This year's highly interactive workshop was the 29th in the series, marking our efforts to promote high-quality research in simulational physics. The continued interest shown by the scientific community amply demonstrates the useful purpose that these meetings have served. The latest workshop was held at The University of Georgia from February 22-26, 2016. It served to mark the 30th anniversary of the founding of the Center for Simulational Physics. In addition, during this workshop we celebrated the 60th birthday of our esteemed colleague Prof. H.-Bernd Schüttler. Bernd has not only contributed to the understanding of strongly correlated electron systems, but has also made seminal contributions to systems biology through the introduction of modern methods of computational physics. These Proceedings provide a “status report” on a number of important topics. This on-line “volume” is published with the goal of timely dissemination of the material to a wider audience. This program was supported in part by the President's Venture Fund through the generous gifts of the University of Georgia Partners and other donors. We also wish to offer thanks to the Office of the Vice-President for Research, the Franklin College of Arts and Sciences, and the IBM Corporation for partial

  9. [Upper extremities, neck and back symptoms in office employees working at computer stations].

    Science.gov (United States)

    Zejda, Jan E; Bugajska, Joanna; Kowalska, Małgorzata; Krzych, Lukasz; Mieszkowska, Marzena; Brozek, Grzegorz; Braczkowska, Bogumiła

    2009-01-01

    To obtain current data on the occurrence of work-related symptoms of office computer users in Poland, we implemented a questionnaire survey. Its goal was to assess the prevalence and intensity of symptoms of upper extremities, neck and back in office workers who use computers on a regular basis, and to find out if the occurrence of symptoms depends on the duration of computer use and other work-related factors. Office workers in two towns (Warszawa and Katowice), employed in large social services companies, were invited to fill in the Polish version of the Nordic Questionnaire. The questions included work history and history of last-week symptoms of pain of hand/wrist, elbow, arm, neck and upper and lower back (occurrence and intensity measured by visual scale). Altogether 477 men and women returned the completed questionnaires. Between-group symptom differences (chi-square test) were verified by multivariate analysis (GLM). The prevalence of symptoms in individual body parts was as follows: neck, 55.6%; arm, 26.9%; elbow, 13.3%; wrist/hand, 29.9%; upper back, 49.6%; and lower back, 50.1%. Multivariate analysis confirmed the effect of gender, age and years of computer use on the occurrence of symptoms. Among other determinants, forearm support explained pain of wrist/hand, wrist support explained elbow pain, and chair adjustment explained arm pain. Association was also found between low back pain and chair adjustment and keyboard position. The findings revealed frequent occurrence of symptoms of pain in upper extremities and neck in office workers who use computers on a regular basis. Seating position could also contribute to the frequent occurrence of back pain in the examined population.

  10. Soft Computing Applications : Proceedings of the 5th International Workshop Soft Computing Applications

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária; Dombi, Joszef; Jain, Lakhmi

    2013-01-01

    This volume contains the Proceedings of the 5th International Workshop on Soft Computing Applications (SOFA 2012). The book covers a broad spectrum of soft computing techniques, theoretical and practical applications employing knowledge and intelligence to find solutions for world industrial, economic and medical problems. The combination of such intelligent systems tools and a large number of applications introduces a need for a synergy of scientific and technological disciplines in order to show the great potential of Soft Computing in all domains. The conference papers included in these proceedings, published post conference, were grouped into the following areas of research: Soft Computing and Fusion Algorithms in Biometrics; Fuzzy Theory, Control and Applications; Modelling and Control Applications; Steps towa...

  11. Extreme-Scale Alignments Of Quasar Optical Polarizations And Galactic Dust Contamination

    Science.gov (United States)

    Pelgrims, Vincent

    2017-10-01

    Almost twenty years ago the optical polarization vectors from quasars were shown to be aligned over extreme scales. That evidence was later confirmed and enhanced thanks to additional optical data obtained with the ESO instrument FORS2 mounted on the VLT, in Chile. These observations suggest either Galactic foreground contamination of the data or, more interestingly, a cosmological origin. Using 353-GHz polarization data from the Planck satellite, I recently showed that the main features of the extreme-scale alignments of the quasar optical polarization vectors are unaffected by the Galactic thermal dust. This confirms previous studies based on optical starlight polarization and discards the scenario of Galactic contamination. In this talk, I shall briefly review the extreme-scale quasar polarization alignments, discuss the main results submitted in A&A and motivate forthcoming projects at the frontier between Galactic and extragalactic astrophysics.

  12. Censored rainfall modelling for estimation of fine-scale extremes

    Science.gov (United States)

    Cross, David; Onof, Christian; Winter, Hugo; Bernardara, Pietro

    2018-01-01

    Reliable estimation of rainfall extremes is essential for drainage system design, flood mitigation, and risk quantification. However, traditional techniques lack physical realism and extrapolation can be highly uncertain. In this study, we improve the physical basis for short-duration extreme rainfall estimation by simulating the heavy portion of the rainfall record mechanistically using the Bartlett-Lewis rectangular pulse (BLRP) model. Mechanistic rainfall models have had a tendency to underestimate rainfall extremes at fine temporal scales. Despite this, the simple process representation of rectangular pulse models is appealing in the context of extreme rainfall estimation because it emulates the known phenomenology of rainfall generation. A censored approach to Bartlett-Lewis model calibration is proposed and performed for single-site rainfall from two gauges in the UK and Germany. Extreme rainfall estimation is performed for each gauge at the 5, 15, and 60 min resolutions, and considerations for censor selection discussed.
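
    The rectangular-pulse idea behind the BLRP model can be sketched with a toy simulator: storms arrive as a Poisson process, each storm spawns a few rain cells, and each cell deposits a constant intensity over an exponentially distributed duration. The parameter values and the simplified storm/cell structure below are illustrative only, not the censored, calibrated model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_blrp(hours, lam=0.02, beta=0.5, gamma=0.8, eta=2.0, mu_x=4.0, dt=1/12):
    """Crude Bartlett-Lewis-style rectangular pulse rainfall simulation.

    lam   : storm arrival rate [1/h] (Poisson process)
    beta  : cell arrival rate within a storm [1/h]
    gamma : storm activity termination rate [1/h]
    eta   : cell duration decay rate [1/h] (exponential durations)
    mu_x  : mean cell intensity [mm/h] (exponential intensities)
    dt    : output time step [h] (5 minutes here)
    """
    t_axis = np.arange(0.0, hours, dt)
    rain = np.zeros_like(t_axis)

    # Storm origins follow a Poisson process over the simulated period.
    n_storms = rng.poisson(lam * hours)
    storm_starts = rng.uniform(0.0, hours, n_storms)

    for s0 in storm_starts:
        activity = rng.exponential(1.0 / gamma)          # how long the storm spawns cells
        n_cells = 1 + rng.poisson(beta * activity)       # at least one cell per storm
        cell_starts = s0 + rng.uniform(0.0, activity, n_cells)
        cell_durs = rng.exponential(1.0 / eta, n_cells)
        cell_ints = rng.exponential(mu_x, n_cells)
        for c0, cd, ci in zip(cell_starts, cell_durs, cell_ints):
            mask = (t_axis >= c0) & (t_axis < c0 + cd)   # rectangular pulse
            rain[mask] += ci

    return t_axis, rain

t, r = simulate_blrp(hours=24 * 365)          # one synthetic "year" at 5-min resolution
print("annual maximum 5-min intensity: %.1f mm/h" % r.max())
```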

  13. Data co-processing for extreme scale analysis level II ASC milestone (4745).

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, David; Moreland, Kenneth D.; Oldfield, Ron A.; Fabian, Nathan D.

    2013-03-01

    Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. A particularly pressing issue is gaining insight into the science behind the exascale computations. Power and I/O speed constraints will fundamentally change current visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later retrieving them for visualization and data analysis. However, at exascale, scientists and analysts will need a range of options for moving data to persistent storage, as the current offline or post-processing pipelines will not be able to capture the data necessary for data analysis of these extreme scale simulations. This Milestone explores two alternate workflows, characterized as in situ and in transit, and compares them. We find each to have its own merits and faults, and we provide information to help pick the best option for a particular use.
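
    The contrast between the traditional post-processing workflow and the in situ alternative can be caricatured in a few lines; the function names and the simple reduction chosen here are illustrative stand-ins, not the milestone's actual I/O or analysis software.

```python
import numpy as np

def simulation_steps(n_steps, n_cells):
    """Stand-in for a time-stepping solver producing one field per step."""
    state = np.zeros(n_cells)
    for step in range(n_steps):
        state = state + np.random.default_rng(step).normal(size=n_cells)
        yield step, state

# Traditional post-processing: write the full field every step, analyze later
# from disk. (Defined here for contrast; not invoked, to avoid writing files.)
def post_processing(n_steps=10, n_cells=1_000):
    for step, field in simulation_steps(n_steps, n_cells):
        np.save(f"field_{step:04d}.npy", field)

# In situ: analyze inside the simulation loop and keep only small reductions.
def in_situ(n_steps=10, n_cells=1_000):
    summaries = []
    for step, field in simulation_steps(n_steps, n_cells):
        summaries.append((step, field.min(), field.mean(), field.max()))
    return summaries

print(in_situ()[-1])
```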

  14. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States)]; Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States)]; Chen, M. [Yale Univ., New Haven, CT (United States)] [and others]

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  15. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
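
    A compact sketch of this embed-then-cluster structure, using scikit-learn, is given below. Scikit-learn's KernelPCA is the unsupervised variant (the study uses a supervised formulation), and the random fields stand in for the reanalysis moisture-flux data, so the snippet only illustrates the shape of the pipeline.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for vertically integrated moisture-flux divergence fields:
# one flattened lat-lon grid per flood event (200 events, 30x40 grid).
fields = rng.normal(size=(200, 30 * 40))

# Low-dimensional embedding of the circulation patterns.
embedding = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3).fit_transform(fields)

# Cluster events in the reduced space to separate, e.g., locally forced floods
# from those dominated by large-scale moisture transport.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(embedding)

for k in range(4):
    print(f"cluster {k}: {np.sum(labels == k)} events")
```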

  16. ISEE-magnetopause observations - workshop results

    International Nuclear Information System (INIS)

    Paschmann, G.

    1982-01-01

    A brief history of ISEE magnetopause workshops held during 1977-1981 is presented, and an assessment of the activity of these workshops is made. Workshop results are surveyed, with attention given to magnetopause thickness and speed, large-scale reconnection, small-scale reconnection, magnetic field topology, plasma waves, boundary layer structure, surface waves, plasma origin, and the relationship between magnetopause and particle boundaries. Finally, a few topics that require particular attention in the future are mentioned

  17. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and

  18. Eighteenth Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics

    CERN Document Server

    Landau, David P; Schüttler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVIII

    2006-01-01

    This volume represents a "status report" emanating from presentations made during the 18th Annual Workshop on Computer Simulations Studies in Condensed Matter Physics at the Center for Simulational Physics at the University of Georgia in March 2005. It provides a broad overview of the most recent advances in the field, spanning the range from statistical physics to soft condensed matter and biological systems. Results on nanostructures and materials are included, as are several descriptions of advances in quantum simulations and quantum computing, as well as methodological advances.

  19. European Workshop Industrial Computer Science Systems approach to design for safety

    Science.gov (United States)

    Zalewski, Janusz

    1992-01-01

    This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.

  20. Workshop Report On Sustainable Urban Development

    Science.gov (United States)

    Langhoff, Stephanie; Martin, Gary; Barone, Larry; Wagener, Wolfgang

    2010-01-01

    The key workshop goal was to explore and document how NASA technologies, such as remote sensing, climate modeling, and high-end computing and visualization along with NASA assets such as Earth Observing Satellites (EOS) and Unmanned Aerial Vehicles (UAVs) can contribute to creating and managing a sustainable urban environment. The focus was on the greater Bay Area, but many aspects of the workshop were applicable to urban management at the local, regional and global scales. A secondary goal was to help NASA better understand the problems facing urban managers and to make city leaders in the Bay Area more aware of NASA's capabilities. By bringing members of these two groups together we hope to see the beginnings of new collaborations between NASA and those faced with instituting sustainable urban management in Bay Area cities.

  1. Investigating the Scaling Properties of Extreme Rainfall Depth ...

    African Journals Online (AJOL)

    Investigating the Scaling Properties of Extreme Rainfall Depth Series in Oromia Regional State, Ethiopia. ... Science, Technology and Arts Research Journal ... for storm duration ranging from 0.5 to 24 hr observed at network of rain gauges sited in Oromia regional state were analyzed using an approach based on moments.

  2. Belowground Carbon Cycling Processes at the Molecular Scale: An EMSL Science Theme Advisory Panel Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Hess, Nancy J.; Brown, Gordon E.; Plata, Charity

    2014-02-21

    As part of the Belowground Carbon Cycling Processes at the Molecular Scale workshop, an EMSL Science Theme Advisory Panel meeting held in February 2013, attendees discussed critical biogeochemical processes that regulate carbon cycling in soil. The meeting attendees determined that as a national scientific user facility, EMSL can provide the tools and expertise needed to elucidate the molecular foundation that underlies mechanistic descriptions of biogeochemical processes that control carbon allocation and fluxes at the terrestrial/atmospheric interface in landscape and regional climate models. Consequently, the workshop's goal was to identify the science gaps that hinder either development of mechanistic description of critical processes or their accurate representation in climate models. In part, this report offers recommendations for future EMSL activities in this research area. The workshop was co-chaired by Dr. Nancy Hess (EMSL) and Dr. Gordon Brown (Stanford University).

  3. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – which is a forum to discuss the latest advancements in the parallel tools.

  4. Computer Games : 5th Workshop on Computer Games, CGW 2016, and 5th Workshop on General Intelligence in Game-Playing Agents, GIGA 2016, held in conjunction with the 25th International Conference on Artificial Intelligence, IJCAI 2016, New York, USA, July 9-10, 2016, Revised selected papers

    NARCIS (Netherlands)

    Cazenave, Tristan; Winands, Mark H. M; Edelkamp, Stefan; Schiffel, Stephan; Thielscher, Michael; Togelius, Julian

    2017-01-01

    This book constitutes the refereed proceedings of the 5th Computer Games Workshop, CGW 2016, and the 5th Workshop on General Intelligence in Game-Playing Agents, GIGA 2016, held in conjunction with the 25th International Conference on Artificial Intelligence, IJCAI 2016, in New York, USA, in July

  5. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    Science.gov (United States)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  6. Emergency response workers workshop

    International Nuclear Information System (INIS)

    Agapeev, S.A.; Glukhikh, E.N.; Tyurin, R.L.

    2012-01-01

    A training workshop entitled Current issues and potential improvements in Rosatom Corporation emergency prevention and response system was held in May-June, 2012. The workshop combined theoretical training with full-scale practical exercise that demonstrated the existing innovative capabilities for radiation reconnaissance, diving equipment and robotics, aircraft, emergency response and rescue hardware and machinery. This paper describes the activities carried out during the workshop [ru

  7. Plasma Science Contribution to the SCaLeS Report

    International Nuclear Information System (INIS)

    Jardin, S.C.

    2003-01-01

    In June of 2003, about 250 computational scientists and mathematicians being funded by the DOE Office of Science met in Arlington, VA, to attend a 2-day workshop on the Science Case for Large-scale Simulation (SCaLeS). This document was the output of the Plasma Science Section of that workshop. The conclusion is that exciting and important progress can be made in the field of Plasma Science if computer power continues to grow and algorithmic development continues to occur at the rate that it has in the past. Full simulations of burning plasma experiments could be possible in the 5-10 year time frame if an aggressive growth program is launched in this area

  8. Proceedings of the workshop. Recognition of DNA damage as onset of successful repair. Computational and experimental approaches

    International Nuclear Information System (INIS)

    Pinak, Miroslav

    2002-03-01

    This was held at the Tokai Research Establishment, Japan Atomic Energy Research Institute, on the 18th and 19th of December 2001. The Laboratory of Radiation Risk Analysis of JAERI organized the workshop. The main subject of the workshop was DNA damage and its repair. The presented works described the leading experimental as well as computational approaches, focusing mainly on the formation of DNA damage, its proliferation, enzymatic recognition and repair, and finally imaging and detection of lesions on a DNA molecule. The 19 presented papers are indexed individually. (J.P.N.)

  9. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning

    NARCIS (Netherlands)

    Retalis, Symeon; Sloep, Peter

    2009-01-01

    Retalis, S., & Sloep, P. B. (Eds.) (2009). Collection of 4 symposium papers at EC-TEL 2009. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning. September, 30, 2009, Nice,

  10. Understanding convective extreme precipitation scaling using observations and an entraining plume model

    NARCIS (Netherlands)

    Loriaux, J.M.; Lenderink, G.; De Roode, S.R.; Siebesma, A.P.

    2013-01-01

    Previously observed twice-Clausius–Clapeyron (2CC) scaling for extreme precipitation at hourly time scales has led to discussions about its origin. The robustness of this scaling is assessed by analyzing a subhourly dataset of 10-min resolution over the Netherlands. The results confirm the validity
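
    For readers unfamiliar with the terminology, the Clausius-Clapeyron (CC) reference against which this scaling is judged is the roughly 7% per kelvin increase of saturation specific humidity with temperature; this background relation is standard and is not taken from the abstract itself.

$$
\frac{1}{P_{\mathrm{ext}}}\frac{\mathrm{d}P_{\mathrm{ext}}}{\mathrm{d}T}\approx\alpha,
\qquad \alpha_{\mathrm{CC}}\approx 0.07~\mathrm{K^{-1}},
\qquad \alpha_{\mathrm{2CC}}\approx 0.14~\mathrm{K^{-1}}
$$

    "2CC scaling" therefore denotes hourly or sub-hourly precipitation extremes intensifying at roughly twice the thermodynamic expectation, i.e. about 14% per kelvin of warming.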

  11. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

    Perrussel, Ronan; Voyer, Damien; Nicolas, Laurent; Scorretti, Riccardo; Burais, Noël

    2011-01-01

    Computation of electromagnetic fields in high-resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom derived from the Visible Human.
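
    As a small illustration of the general approach named here, the snippet below accelerates GMRES with a one-level additive Schwarz (overlapping block) preconditioner on a 1-D Laplacian. The matrix, block size, and overlap are toy stand-ins for the large phantom systems; the code sketches the technique, not the authors' implementation.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 400
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")  # 1-D Laplacian
b = np.ones(n)

# One-level additive Schwarz: factorize overlapping diagonal blocks once.
block, overlap = 50, 5
subdomains = []
for start in range(0, n, block):
    lo, hi = max(0, start - overlap), min(n, start + block + overlap)
    idx = np.arange(lo, hi)
    sub = A[idx, :][:, idx].tocsc()
    subdomains.append((idx, spla.splu(sub)))

def apply_schwarz(r):
    """Sum of local subdomain solves, used as the preconditioner M^{-1} r."""
    z = np.zeros_like(r)
    for idx, lu in subdomains:
        z[idx] += lu.solve(r[idx])
    return z

M = spla.LinearOperator((n, n), matvec=apply_schwarz)

x, info = spla.gmres(A, b, M=M)
print("GMRES info:", info, " residual:", np.linalg.norm(b - A @ x))
```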

  12. Minicomputer and computations in chemistry

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    The introduction of multiple-precision hardware and longer word lengths has given the minicomputer a much more general potential for chemistry applications. It was the purpose of this workshop to address this potential, particularly as it is related to computations. The workshop brought together persons with minicomputer experience and those who are considering how the minicomputer might enhance their research activities. The workshop sessions were arranged in sequence to address the following questions: (1) Is the general-purpose minicomputer an appropriate tool to meet the computational requirements of a chemistry research laboratory? (2) What are the procedures for wisely designing a minicomputer configuration? (3) What special-purpose hardware is available to enhance the speed of a minicomputer? (4) How does one select the appropriate minicomputer and ensure that it can accomplish the tasks for which it was designed? (5) How can one network minicomputers for more efficient and flexible operation? (6) Can one do really large-scale computations on a minicomputer, and what modifications are necessary to convert existing programs and algorithms? (7) How can the minicomputer be used to access the maxicomputers at the NRCC? (8) How are computers likely to evolve in the future? (9) What should be the role of the NRCC in relation to minicomputers? This report of the workshop consists mainly of edited transcripts of introductory remarks. These were augmented by relevant bibliographies as an alternative to transcription of the entire workshop. There was no attempt in the workshop to give final answers to the questions that were raised, since the answers are determined in large part by each particular minicomputer environment.

  13. Proceedings of the second workshop of LHC Computing Grid, LCG-France; ACTES, 2e colloque LCG-France

    Energy Technology Data Exchange (ETDEWEB)

    Chollet, Frederique; Hernandez, Fabio; Malek, Fairouz; Gaelle, Shifrin (eds.) [Laboratoire de Physique Corpusculaire Clermont-Ferrand, Campus des Cezeaux, 24, avenue des Landais, Clermont-Ferrand (France)

    2007-03-15

    The second LCG-France Workshop was held in Clermont-Ferrand on 14-15 March 2007. These sessions, organized by IN2P3 and DAPNIA, were attended by around 70 participants working with the LHC Computing Grid in France. The workshop was an opportunity for exchange of information between the French and foreign site representatives on one side and delegates of the experiments on the other side. The event highlighted the place of the LHC computing task within the framework of the worldwide W-LCG project, the ongoing actions and the prospects for 2007 and beyond. The following communications were presented: 1. The current status of LHC computing in France; 2. The LHC Grid infrastructure in France and associated resources; 3. Commissioning of Tier-1; 4. The sites of Tier-2s and Tier-3s; 5. Computing in the ALICE experiment; 6. Computing in the ATLAS experiment; 7. Computing in the CMS experiment; 8. Computing in the LHCb experiment; 9. Management and operation of computing grids; 10. 'The VOs talk to sites'; 11. Peculiarities of ATLAS; 12. Peculiarities of CMS and ALICE; 13. Peculiarities of LHCb; 14. 'The sites talk to VOs'; 15. Worldwide operation of the Grid; 16. Following up the Grid jobs; 17. Surveillance and managing of failures; 18. Job scheduling and tuning; 19. Managing the site infrastructure; 20. LCG-France communications; 21. Managing the Grid data; 22. Pointing the net infrastructure and site storage; 23. ALICE bulk transfers; 24. ATLAS bulk transfers; 25. CMS bulk transfers; 26. LHCb bulk transfers; 27. Access to LHCb data; 28. Access to CMS data; 29. Access to ATLAS data; 30. Access to ALICE data; 31. Data analysis centers; 32. D0 Analysis Farm; 33. Some CMS grid analyses; 34. PROOF; 35. Distributed analysis using GANGA; 36. T2 set-up for end-users. In their concluding remarks, Fairouz Malek and Dominique Pallin stressed that the current workshop was closer to users, and that the goals of tightening the links between the sites and the experiments had been achieved. The IN2P3

  14. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2014-11-01

    The evolution of the computing world from teraflop to petaflop has been relatively effortless, with several of the existing programming models scaling effectively to the petascale. The migration to exascale, however, poses considerable challenges. All industry trends indicate that the exascale machine will be built using processors containing hundreds to thousands of cores per chip. It can be inferred that efficient concurrency on exascale machines requires a massive number of concurrent threads, each performing many operations on a localized piece of data. Currently, visualization libraries and applications are based on what is known as the visualization pipeline. In the pipeline model, algorithms are encapsulated as filters with inputs and outputs. These filters are connected by setting the output of one component to the input of another. Parallelism in the visualization pipeline is achieved by replicating the pipeline for each processing thread. This works well for today’s distributed memory parallel computers but cannot be sustained when operating on processors with thousands of cores. Our project investigates a new visualization framework designed to exhibit the pervasive parallelism necessary for extreme scale machines. Our framework achieves this by defining algorithms in terms of worklets, which are localized stateless operations. Worklets are atomic operations that execute when invoked, unlike filters, which execute when a pipeline request occurs. The worklet design allows execution on a massive number of lightweight threads with minimal overhead. Only with such fine-grained parallelism can we hope to fill the billions of threads we expect will be necessary for efficient computation on an exascale machine.
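
    The worklet idea can be conveyed with a toy dispatcher: the worklet is a stateless per-element function, and the dispatcher fans it out over the data. The names `magnitude_worklet` and `dispatch`, and the thread-pool backend, are illustrative inventions, not the project's API.

```python
from concurrent.futures import ThreadPoolExecutor
import math

# A "worklet" is a stateless per-element operation: it sees one input value
# and produces one output value, with no knowledge of the rest of the data.
def magnitude_worklet(vec):
    return math.sqrt(sum(c * c for c in vec))

def dispatch(worklet, field, max_workers=8):
    """Apply a worklet to every element of a field.

    In a real framework this fans out to huge numbers of lightweight GPU/CPU
    threads; here a modest thread pool stands in for the execution backend.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(worklet, field))

vectors = [(1.0, 2.0, 2.0), (3.0, 4.0, 0.0), (0.0, 0.0, 5.0)]
print(dispatch(magnitude_worklet, vectors))   # [3.0, 5.0, 5.0]
```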

  15. Applications of ion beam analysis workshop. Workshop handbook

    International Nuclear Information System (INIS)

    1995-01-01

    A workshop on applications of ion beam analysis was held at ANSTO, immediately prior to the IBMM-95 Conference in Canberra. Its aim was to review developments and the current status of the use of ion beams for analysis, emphasizing the following aspects: fundamental ion beam research and secondary effects of ion beams; materials science, geological, life sciences, environmental and industrial applications; computing codes for use in accelerator research; high energy heavy ion scattering and recoil; recent technological developments using ion beams. The handbook contains the workshop's program, 29 abstracts and a list of participants

  16. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  17. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Background: Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results: The workshop, "ESF Exploratory Workshop on Computational Disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion: During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  18. Changes and Attribution of Extreme Precipitation in Climate Models: Subdaily and Daily Scales

    Science.gov (United States)

    Zhang, W.; Villarini, G.; Scoccimarro, E.; Vecchi, G. A.

    2017-12-01

    Extreme precipitation events are responsible for numerous hazards, including flooding, soil erosion, and landslides. Because of their significant socio-economic impacts, the attribution and projection of these events are of crucial importance to improve our response, mitigation and adaptation strategies. Here we present results from our ongoing work. In terms of attribution, we use idealized experiments [pre-industrial control experiment (PI) and 1% per year increase (1%CO2) in atmospheric CO2] from ten general circulation models produced under the Coupled Model Intercomparison Project Phase 5 (CMIP5) and the fraction of attributable risk to examine the CO2 effects on extreme precipitation at the sub-daily and daily scales. We find that the increased CO2 concentration substantially increases the odds of the occurrence of sub-daily precipitation extremes compared to the daily scale in most areas of the world, with the exception of some regions in the sub-tropics, likely in relation to the subsidence of the Hadley Cell. These results point to the large role that atmospheric CO2 plays in extreme precipitation under an idealized framework. Furthermore, we investigate the changes in extreme precipitation events with the Community Earth System Model (CESM) climate experiments using the scenarios consistent with the 1.5°C and 2°C temperature targets. We find that the frequency of annual extreme precipitation at a global scale increases in both 1.5°C and 2°C scenarios until around 2070, after which the magnitudes of the trend become much weaker or even negative. Overall, the frequency of global annual extreme precipitation is similar between 1.5°C and 2°C for the period 2006-2035, and the changes in extreme precipitation in individual seasons are consistent with those for the entire year. The frequency of extreme precipitation in the 2°C experiments is higher than for the 1.5°C experiment after the late 2030s, particularly for the period 2071-2100.
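
    For reference, the fraction of attributable risk mentioned above is conventionally defined as follows (a standard definition, not quoted from this abstract):

$$
\mathrm{FAR} = 1 - \frac{P_0}{P_1}
$$

    where P₀ is the probability of exceeding the chosen precipitation threshold in the pre-industrial control run and P₁ the corresponding probability in the 1%CO2 experiment; FAR approaches 1 when the event becomes almost entirely attributable to the forcing.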

  19. Climate Change Extreme Events: Meeting the Information Needs of Water Resource Managers

    Science.gov (United States)

    Quay, R.; Garfin, G. M.; Dominguez, F.; Hirschboeck, K. K.; Woodhouse, C. A.; Guido, Z.; White, D. D.

    2013-12-01

    Information about climate has long been used by water managers to develop short-term and long-term plans and strategies for regional and local water resources. Inherent within longer term forecasts is an element of uncertainty, which is particularly evident in global climate model results for precipitation. For example, in the Southwest, estimates of the flow of the Colorado River based on GCM results indicate changes from 120% of current flow to 60%. Many water resource managers are now using downscaled estimates from global climate models as indications of potential climate change as part of that planning. They are addressing the uncertainty within these estimates by using an anticipatory planning approach that looks at a range of possible futures. One aspect of climate that is important for such planning is estimates of future extreme storm (short term) and drought (long term) events. However, the climate science of future possible changes in extreme events is less mature than general climate change science. At a recent workshop among climate scientists and water managers in the Southwest, it was concluded that the science of climate change extreme events is at least a decade away from being robust enough to be useful for water managers in their water resource management activities. However, it was proposed that there are existing estimates and records of past flooding and drought events that could be combined with general climate change science to create possible future events. These derived events could be of sufficient detail to be used by water resource managers until such time that the science of extreme events is able to provide more detailed estimates. Based on the results of this workshop and other work being done by the Decision Center for a Desert City at Arizona State University and the Climate Assessment for the Southwest center at the University of Arizona, this article will 1) review what are the extreme event data needs of Water Resource Managers in the

  20. Indonesian students' participation in an interprofessional learning workshop.

    Science.gov (United States)

    Ernawati, Desak Ketut; Lee, Ya Ping; Hughes, Jeffery

    2015-01-01

    Interprofessional learning activities, such as workshops allow students to learn from, with and about each other. This study assessed the impact on Indonesian health students' attitudes towards interprofessional education (IPE) from participating in a workshop on medication safety. The students attended a two-day IPE workshop on medication safety. Thirty-five (48.6%) students completed pre-/post-workshop surveys using a modified Readiness for Interprofessional Learning Scale (RIPLS) survey. The post-workshop survey also had a series of open-ended questions. Students' responses to each RIPLS statement pre-/post-workshop were compared, whilst their responses to open-ended questions in post-workshop survey were thematically analysed. Students reported positive attitudinal changes on statements of shared learning and teamwork sub-scale (Wilcoxon p value importance of teamwork and communication skills. This study found that learning with other health students through an IPE workshop improved medical, nursing and pharmacy students' attitudes towards the importance of shared learning, teamwork and communication in healthcare service.

  1. Visualization and parallel I/O at extreme scale

    International Nuclear Information System (INIS)

    Ross, R B; Peterka, T; Shen, H-W; Hong, Y; Ma, K-L; Yu, H; Moreland, K

    2008-01-01

    In our efforts to solve ever more challenging problems through computational techniques, the scale of our compute systems continues to grow. As we approach petascale, it becomes increasingly important that all the resources in the system be used as efficiently as possible, not just the floating-point units. Because of hardware, software, and usability challenges, storage resources are often one of the most poorly used and performing components of today's compute systems. This situation can be especially true in the case of the analysis phases of scientific workflows. In this paper we discuss the impact of large-scale data on visual analysis operations and examine a collection of approaches to I/O in the visual analysis process. First we examine the performance of volume rendering on a leadership-computing platform and assess the relative cost of I/O, rendering, and compositing operations. Next we analyze the performance implications of eliminating preprocessing from this example workflow. Then we describe a technique that uses data reorganization to improve access times for data-intensive volume rendering

  2. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  3. Exascale Co-design for Modeling Materials in Extreme Environments

    Energy Technology Data Exchange (ETDEWEB)

    Germann, Timothy C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.
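
    The "accumulate (or recompute) a distributed response database" strategy can be sketched as a cache of fine-scale evaluations keyed by a rounded coarse-scale state. The keying scheme, tolerance, and stand-in constitutive model below are invented for illustration and are not the ExMatEx proxy applications.

```python
import numpy as np

class ResponseDatabase:
    """Cache fine-scale results keyed by a (rounded) coarse-scale state."""

    def __init__(self, fine_model, tol=1e-2):
        self.fine_model = fine_model
        self.tol = tol
        self.db = {}

    def query(self, coarse_state):
        key = tuple(np.round(np.asarray(coarse_state) / self.tol).astype(int))
        if key not in self.db:            # recompute only when no nearby entry exists
            self.db[key] = self.fine_model(coarse_state)
        return self.db[key]

# Stand-in "fine-scale" constitutive response: expensive in reality, trivial here.
def fine_scale_stress(strain):
    return 200.0 * np.tanh(np.asarray(strain))

db = ResponseDatabase(fine_scale_stress)
for step in range(1000):                  # coarse-scale time loop
    strain = [1e-3 * step, 5e-4 * step]
    stress = db.query(strain)

print(f"fine-scale evaluations actually performed: {len(db.db)} / 1000 steps")
```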

  4. Identification of large-scale meteorological patterns associated with extreme precipitation in the US northeast

    Science.gov (United States)

    Agel, Laurie; Barlow, Mathew; Feldstein, Steven B.; Gutowski, William J.

    2018-03-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. The tropopause height provides a compact representation of the upper-tropospheric potential vorticity, which is closely related to the overall evolution and intensity of weather systems. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into the overall context of patterns for all days. Six tropopause patterns are identified through KMC for extreme day precipitation: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns show various degrees of anomalously strong moisture transport preceding, and upward motion during, extreme precipitation events.
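
    A bare-bones sketch of SOM training of the kind used here is shown below, with random fields standing in for the daily tropopause-height maps; a production study would normally use a dedicated SOM package and a tuned learning schedule.

```python
import numpy as np

def train_som(data, grid=(5, 6), n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map for daily fields.

    data : (n_days, n_gridpoints) array; each row is one flattened field.
    Returns a (rows*cols, n_gridpoints) array of node weight patterns.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    nodes = rng.normal(size=(rows * cols, data.shape[1]))
    # (row, col) coordinate of each node, used for the neighbourhood kernel.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)

    for it in range(n_iter):
        x = data[rng.integers(len(data))]                 # one random day
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
        frac = it / n_iter
        lr = lr0 * (1.0 - frac)                           # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5               # shrinking neighbourhood
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2.0 * sigma ** 2))           # neighbourhood weights
        nodes += lr * h[:, None] * (x - nodes)

    return nodes

# Synthetic stand-in for ~30 years of daily tropopause-height anomaly maps.
fields = np.random.default_rng(1).normal(size=(10950, 20 * 30))
patterns = train_som(fields)
print(patterns.shape)        # (30, 600): one characteristic pattern per SOM node
```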

  5. Extreme daily precipitation in Western Europe with climate change at appropriate spatial scales

    NARCIS (Netherlands)

    Booij, Martijn J.

    2002-01-01

    Extreme daily precipitation for the current and changed climate at appropriate spatial scales is assessed. This is done in the context of the impact of climate change on flooding in the river Meuse in Western Europe. The objective is achieved by determining and comparing extreme precipitation from

  6. 9th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Camacho, David; Analide, Cesar; Seghrouchni, Amal; Badica, Costin

    2016-01-01

    This book represents the combined peer-reviewed proceedings of the ninth International Symposium on Intelligent Distributed Computing – IDC’2015, of the Workshop on Cyber Security and Resilience of Large-Scale Systems – WSRL’2015, and of the International Workshop on Future Internet and Smart Networks – FI&SN’2015. All the events were held in Guimarães, Portugal during October 7th-9th, 2015. The 46 contributions published in this book address many topics related to theory and applications of intelligent distributed computing, including: Intelligent Distributed Agent-Based Systems, Ambient Intelligence and Social Networks, Computational Sustainability, Intelligent Distributed Knowledge Representation and Processing, Smart Networks, Networked Intelligence and Intelligent Distributed Applications, amongst others.

  7. Proceedings of the workshop of three large tokamak cooperation on energy confinement scaling under intensive auxiliary heating, May 18 ∼ 20, 1992, Naka

    International Nuclear Information System (INIS)

    1992-09-01

    The workshop of the three large tokamak cooperation, W22, on 'Energy confinement scaling under intensive auxiliary heating' was held 18-20 May, 1992, at the Naka Fusion Research Establishment. This proceedings compiles 14 synopses of contributions (5 from JET, 4 from JT-60, 3 from TFTR, and 1 each from DIII-D and JFT-2M) and the summary of the workshop. The topic sections are: (i) L-mode confinement and scaling; (ii) confinement in high βp regimes (supershots, the high-poloidal-beta enhanced confinement mode, etc.); (iii) confinement in various H-mode regimes and scaling (including the VH-mode); (iv) characteristic time scales for present tokamak regimes; and (v) theoretical comparison with experimental data. (author)

  8. FOREWORD: 3rd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2013)

    Science.gov (United States)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2013-10-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 3rd International Workshop on New Computational Methods for Inverse Problems, NCMIP 2013 (http://www.farman.ens-cachan.fr/NCMIP_2013.html). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 22 May 2013, at the initiative of Institut Farman. The prior editions of NCMIP also took place in Cachan, France, firstly within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/), and secondly at the initiative of Institut Farman, in May 2012 (http://www.farman.ens-cachan.fr/NCMIP_2012.html). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finances. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational

  9. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  10. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, D. P.; Asbury, M. L.; Proctor, A.

    2001-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is an interactive online astronomy resource developed and maintained at the University of Maryland, for use by students, educators and the general public. The Astronomy Workshop has been extensively tested and used successfully at many different levels, including High School and Junior High School science classes, University introductory astronomy courses, and University intermediate and advanced astronomy courses. Some topics currently covered in the Astronomy Workshop are: Animated Orbits of Planets and Moons: The orbits of the nine planets and 91 known planetary satellites are shown in animated, to-scale drawings. The orbiting bodies move at their correct relative speeds about their parent, which is rendered as an attractive, to-scale gif image. Solar System Collisions: This most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet. The program calculates many effects, including the country impacted (if Earth is the target), energy of the explosion, crater size, magnitude of the planetquake generated. It also displays a relevant image (e.g. terrestrial crater, lunar crater, etc.). Planetary and Satellite Data Calculators: These tools allow the user to easily calculate physical data for all of the planets or satellites simultaneously, making comparison very easy. Orbital Simulations: These tools allow the student to investigate different aspects of the three-body problem of celestial mechanics. Astronomy Workshop Bulletin Board: Get innovative teaching ideas and read about in-class experiences with the Astronomy Workshop. Share your ideas with other educators by posting on the Bulletin Board. Funding for the Astronomy Workshop is provided by the National Science Foundation.
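
    A rough, back-of-envelope version of the kind of calculation the Solar System Collisions tool performs might look like the following; the assumed density, the unit conversions, and especially the crude crater-diameter scaling are generic order-of-magnitude assumptions, not the Astronomy Workshop's actual formulas.

```python
import math

def impact_summary(diameter_m, speed_km_s, density=3000.0):
    """Back-of-envelope impact energy and crater size for a stony impactor."""
    radius = diameter_m / 2.0
    mass = density * (4.0 / 3.0) * math.pi * radius ** 3          # kg
    energy_j = 0.5 * mass * (speed_km_s * 1000.0) ** 2            # kinetic energy, J
    energy_mt = energy_j / 4.184e15                               # megatons of TNT
    # Very crude crater-diameter scaling: roughly E^(1/3.4), normalised so a
    # ~1 Mt impact gives a crater of order 1 km.
    crater_km = energy_mt ** (1.0 / 3.4)
    return energy_mt, crater_km

e, d = impact_summary(diameter_m=100.0, speed_km_s=20.0)
print(f"energy ~ {e:.0f} Mt TNT, crater ~ {d:.1f} km")
```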

  11. A Qualitative Case Study Comparing a Computer-Mediated Delivery System to a Face-to-Face Mediated Delivery System for Teaching Creative Writing Fiction Workshops

    Science.gov (United States)

    Daniels, Mindy A.

    2012-01-01

    The purpose of this case study was to compare the pedagogical and affective efficiency and efficacy of creative prose fiction writing workshops taught via asynchronous computer-mediated online distance education with creative prose fiction writing workshops taught face-to-face in order to better understand their operational pedagogy and…

  12. Workshop on Human Activity at Scale in Earth System Models

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Melissa R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Aziz, H. M. Abdul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Coletti, Mark A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kennedy, Joseph H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nair, Sujithkumar S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Omitaomu, Olufemi A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-26

    Changing human activity within a geographical location may have significant influence on the global climate, but that activity must be parameterized in such a way as to allow these high-resolution sub-grid processes to affect global climate within that modeling framework. Additionally, we must have tools that provide decision support and inform local and regional policies regarding mitigation of and adaptation to climate change. The development of next-generation earth system models, that can produce actionable results with minimum uncertainties, depends on understanding global climate change and human activity interactions at policy implementation scales. Unfortunately, at best we currently have only limited schemes for relating high-resolution sectoral emissions to real-time weather, ultimately to become part of larger regions and well-mixed atmosphere. Moreover, even our understanding of meteorological processes at these scales is imperfect. This workshop addresses these shortcomings by providing a forum for discussion of what we know about these processes, what we can model, where we have gaps in these areas and how we can rise to the challenge to fill these gaps.

  13. A stress management workshop improves residents' coping skills.

    Science.gov (United States)

    McCue, J D; Sachs, C L

    1991-11-01

    We describe the effectiveness of a stress management workshop designed for physicians. Of the 64 medicine, pediatrics, and medicine-pediatrics residents who agreed to participate in the workshop, the 43 who could be freed from clinical responsibilities constituted the intervention group; the 21 residents who could not be freed from clinical responsibilities were asked to be the nonintervention group. The ESSI Stress Systems Instrument and Maslach Burnout Inventory were administered to control subjects and workshop participants 2 weeks before and 6 weeks after the workshop. The half-day workshops taught management of the stresses of medical practice through: (1) learning and practicing interpersonal skills that increase the availability of social support; (2) prioritization of personal, work, and educational demands; (3) techniques to increase stamina and attend to self-care needs; (4) recognition and avoidance of maladaptive responses; and (5) positive outlook skills. Overall, the ESSI Stress Systems Instrument test scores for the workshop participants improved (+1.27), while the nonintervention group's mean scores declined (-0.65). All 21 individual ESSI Stress Systems Instrument scale items improved for the workshop, compared with eight of 21 items for the nonintervention group. The workshop group improved in the Maslach Burnout Inventory emotional exhaustion scale and deteriorated less than the nonintervention group in the depersonalization scale. We conclude that a modest, inexpensive stress management workshop was received positively, and can lead to significant short-term improvement in stress and burnout test scores for medicine and pediatrics residents.

  14. Grid and Entrepreneurship Workshop

    CERN Multimedia

    2006-01-01

    The CERN openlab is organising a special workshop about Grid opportunities for entrepreneurship. This one-day event will provide an overview of what is involved in spin-off technology, with a special reference to the context of computing and data Grids. Lectures by experienced entrepreneurs will introduce the key concepts of entrepreneurship and review, in particular, the industrial potential of EGEE (the EU co-funded Enabling Grids for E-sciencE project, led by CERN). Case studies will be given by CEOs of European start-ups already active in the Grid and computing cluster area, and regional experts will provide an overview of efforts in several European regions to stimulate entrepreneurship. This workshop is designed to encourage students and researchers involved or interested in Grid technology to consider the entrepreneurial opportunities that this technology may create in the coming years. This workshop is organized as part of the CERN openlab student programme, which is co-sponsored by CERN, HP, ...

  15. Scaling of precipitation extremes with temperature in the French Mediterranean region: What explains the hook shape?

    Science.gov (United States)

    Drobinski, P.; Alonzo, B.; Bastin, S.; Silva, N. Da; Muller, C.

    2016-04-01

    Expected changes to future extreme precipitation remain a key uncertainty associated with anthropogenic climate change. Extreme precipitation has been proposed to scale with the precipitable water content in the atmosphere. Assuming constant relative humidity, this implies an increase of precipitation extremes at a rate of about 7% per °C globally, as indicated by the Clausius-Clapeyron relationship. Increases faster and slower than Clausius-Clapeyron have also been reported. In this work, we examine the scaling between precipitation extremes and temperature in the present climate using simulations and measurements from surface weather stations collected within the framework of the HyMeX and MED-CORDEX programs in Southern France. Of particular interest are departures from the Clausius-Clapeyron thermodynamic expectation, their spatial and temporal distribution, and their origin. Looking at the scaling of precipitation extremes with temperature, two regimes emerge which form a hook shape: one at low temperatures (cooler than around 15°C) with rates of increase close to the Clausius-Clapeyron rate, and one at high temperatures (warmer than about 15°C) with sub-Clausius-Clapeyron and most often negative rates. On average, the region of focus does not seem to exhibit super-Clausius-Clapeyron behavior except at some stations, in contrast to earlier studies. Many factors can contribute to departures from Clausius-Clapeyron scaling: time and spatial averaging, the choice of scaling temperature (surface versus condensation level), and precipitation efficiency and vertical velocity in updrafts that are not necessarily constant with temperature. Most importantly, the dynamical contribution of orography to precipitation in the fall over this area, during the so-called "Cevenoles" events, explains the hook shape of the scaling of precipitation extremes.
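
    For reference, the roughly 7% per °C figure follows directly from the Clausius-Clapeyron relation for saturation vapour pressure. A minimal Python sketch of that calculation, assuming a constant latent heat of vaporization, is given below; the numerical constants are standard approximations, not values from the study.

        # Fractional change of saturation vapour pressure with temperature:
        # d ln(e_s)/dT = L_v / (R_v * T^2)   (Clausius-Clapeyron, constant L_v assumed)
        L_v = 2.5e6   # latent heat of vaporization, J/kg (approximate)
        R_v = 461.5   # specific gas constant for water vapour, J/(kg K)

        for T in (273.15, 288.15, 303.15):           # 0, 15 and 30 degrees C
            rate = L_v / (R_v * T**2) * 100.0         # percent per kelvin
            print(f"T = {T - 273.15:4.1f} C  ->  {rate:.1f} % per degree C")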

  16. PREFACE: Joint IPPP Durham/Cockcroft Institute/ICFA Workshop on Advanced QED methods for Future Accelerators

    Science.gov (United States)

    Bailey, I. R.; Barber, D. P.; Chattopadhyay, S.; Hartin, A.; Heinzl, T.; Hesselbach, S.; Moortgat-Pick, G. A.

    2009-11-01

    The joint IPPP Durham/Cockcroft Institute/ICFA workshop on advanced QED methods for future accelerators took place at the Cockcroft Institute in early March 2009. The motivation for the workshop was the need for a detailed consideration of the physics processes associated with beam-beam effects at the interaction points of future high-energy electron-positron colliders. There is a broad consensus within the particle physics community that the next international facility for experimental high-energy physics research beyond the Large Hadron Collider at CERN should be a high-luminosity electron-positron collider working at the TeV energy scale. One important feature of such a collider will be its ability to deliver polarised beams to the interaction point and to provide accurate measurements of the polarisation state during physics collisions. The physics collisions take place in very dense charge bunches in the presence of extremely strong electromagnetic fields, with field strengths of the order of the Schwinger critical field strength of 4.4×10¹³ Gauss. These intense fields lead to depolarisation processes which need to be thoroughly understood in order to reduce uncertainty in the polarisation state at collision. To that end, this workshop reviewed the formalisms for describing radiative processes and the methods of calculation in the future strong-field environments. These calculations are based on the Furry picture of organising the interaction term of the Lagrangian. The means of deriving the transition probability of the most important of the beam-beam processes - Beamstrahlung - was reviewed. The workshop was honoured by a presentation from V N Baier, one of the founders of the 'Operator method' - one means of performing these calculations. Other theoretical methods of performing calculations in the Furry picture, namely those due to A I Nikishov, V I Ritus et al, were reviewed and intense field quantum processes in fields of different form - namely those

  17. Computer work and musculoskeletal disorders of the neck and upper extremity: A systematic review

    Directory of Open Access Journals (Sweden)

    Veiersted Kaj Bo

    2010-04-01

    Full Text Available Abstract Background This review examines the evidence for an association between computer work and neck and upper extremity disorders (except carpal tunnel syndrome). Methods A systematic critical review of studies of computer work and musculoskeletal disorders verified by a physical examination was performed. Results A total of 22 studies (26 articles) fulfilled the inclusion criteria. Results show limited evidence for a causal relationship between computer work per se, computer mouse and keyboard time related to a diagnosis of wrist tendonitis, and for an association between computer mouse time and forearm disorders. Limited evidence was also found for a causal relationship between computer work per se and computer mouse time related to tension neck syndrome, but the evidence for keyboard time was insufficient. Insufficient evidence was found for an association between other musculoskeletal diagnoses of the neck and upper extremities, including shoulder tendonitis and epicondylitis, and any aspect of computer work. Conclusions There is limited epidemiological evidence for an association between aspects of computer work and some of the clinical diagnoses studied. None of the evidence was considered as moderate or strong and there is a need for more and better documentation.

  18. PREFACE: 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics & 38th National Conference on Theoretical Physics

    Science.gov (United States)

    2014-09-01

    This volume contains selected papers presented at the 38th National Conference on Theoretical Physics (NCTP-38) and the 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics (IWTCP-1). Both the conference and the workshop were held from 29 July to 1 August 2013 at the Pullman hotel, Da Nang, Vietnam. The IWTCP-1 was a new activity of the Vietnamese Theoretical Physics Society (VTPS) organized in association with the 38th National Conference on Theoretical Physics (NCTP-38), the most well-known annual scientific forum dedicated to the dissemination of the latest developments in the field of theoretical physics within the country. The IWTCP-1 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). The overriding goal of the IWTCP is to provide an international forum for scientists and engineers from academia to share ideas, problems and solutions relating to the recent advances in theoretical physics as well as in computational physics. The main IWTCP motivation is to foster scientific exchanges between the Vietnamese theoretical and computational physics community and world-wide scientists, as well as to promote a high standard of research and education activities for young physicists in the country. About 110 participants from 10 countries took part in the conference and the workshop. 4 invited talks, 18 oral contributions and 46 posters were presented at the conference. In the workshop we had one keynote lecture and 9 invited talks presented by international experts in the fields of theoretical and computational physics, together with 14 oral and 33 poster contributions. The proceedings were edited by Nguyen Tri Lan, Trinh Xuan Hoang, and Nguyen Ai Viet. We would like to thank all invited speakers, participants and sponsors for making the conference and the workshop successful. Nguyen Ai Viet Chair of NCTP-38 and IWTCP-1

  19. Frameworks for visualization at the extreme scale

    International Nuclear Information System (INIS)

    Joy, Kenneth I; Miller, Mark; Childs, Hank; Bethel, E Wes; Clyne, John; Ostrouchov, George; Ahern, Sean

    2007-01-01

    The challenges of visualization at the extreme scale involve issues of scale, complexity, temporal exploration and uncertainty. The Visualization and Analytics Center for Enabling Technologies (VACET) focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increased scientific discovery and insight. In this paper, we introduce new uses of visualization frameworks through the introduction of Equivalence Class Functions (ECFs). These functions give a new class of derived quantities designed to greatly expand the ability of the end user to explore and visualize data. ECFs are defined over equivalence classes (i.e., groupings) of elements from an original mesh, and produce summary values for the classes as output. ECFs can be used in the visualization process to directly analyze data, or can be used to synthesize new derived quantities on the original mesh. The design of ECFs enables a parallel implementation that allows the use of these techniques on massive data sets that require parallel processing.
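
    As a rough illustration of the Equivalence Class Function idea (not the VACET implementation), the Python sketch below partitions mesh elements into classes, reduces a field over each class, and maps the summaries back onto the mesh as a new derived quantity; all names and data are hypothetical.

        import numpy as np

        def ecf(class_of_element, field, reduce=np.mean):
            """Toy Equivalence Class Function: reduce `field` over each class.

            class_of_element : integer class label for every mesh element
            field            : one field value per mesh element
            Returns per-class summaries and the summary mapped back onto the
            original mesh (a new derived quantity).
            """
            summaries = {c: reduce(field[class_of_element == c])
                         for c in np.unique(class_of_element)}
            derived = np.array([summaries[c] for c in class_of_element])
            return summaries, derived

        # Example: mean temperature per material region, pushed back to the mesh.
        labels = np.array([0, 0, 1, 1, 1, 2])
        temp   = np.array([10., 12., 30., 31., 29., 55.])
        per_class, on_mesh = ecf(labels, temp)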

  20. Proceeding of 1998-workshop on MHD computations. Study on numerical methods related to plasma confinement

    International Nuclear Information System (INIS)

    Kako, T.; Watanabe, T.

    1999-04-01

    This is the proceeding of 'Study on Numerical Methods Related to Plasma Confinement' held in National Institute for Fusion Science. In this workshop, theoretical and numerical analyses of possible plasma equilibria with their stability properties are presented. These are also various talks on mathematical as well as numerical analyses related to the computational methods for fluid dynamics and plasma physics. The 14 papers are indexed individually. (J.P.N.)

  1. Proceeding of 1998-workshop on MHD computations. Study on numerical methods related to plasma confinement

    Energy Technology Data Exchange (ETDEWEB)

    Kako, T.; Watanabe, T. [eds.

    1999-04-01

    This is the proceeding of 'Study on Numerical Methods Related to Plasma Confinement' held in National Institute for Fusion Science. In this workshop, theoretical and numerical analyses of possible plasma equilibria with their stability properties are presented. These are also various talks on mathematical as well as numerical analyses related to the computational methods for fluid dynamics and plasma physics. The 14 papers are indexed individually. (J.P.N.)

  2. The Relationship between Spatial and Temporal Magnitude Estimation of Scientific Concepts at Extreme Scales

    Science.gov (United States)

    Price, Aaron; Lee, H.

    2010-01-01

    Many astronomical objects, processes, and events exist and occur at extreme scales of spatial and temporal magnitudes. Our research draws upon the psychological literature, replete with evidence of linguistic and metaphorical links between the spatial and temporal domains, to compare how students estimate spatial and temporal magnitudes associated with objects and processes typically taught in science class. We administered spatial and temporal scale estimation tests, with many astronomical items, to 417 students enrolled in 12 undergraduate science courses. Results show that while the temporal test was more difficult, students' overall performance patterns between the two tests were mostly similar. However, asymmetrical correlations between the two tests indicate that students think of the extreme ranges of spatial and temporal scales in different ways, which is likely influenced by their classroom experience. When making incorrect estimations, students tended to underestimate the difference between the everyday scale and the extreme scales on both tests. This suggests the use of a common logarithmic mental number line for both spatial and temporal magnitude estimation. However, there are differences between the two tests in the errors students make in the everyday range. Among the implications discussed is the use of spatio-temporal reference frames, instead of smooth bootstrapping, to help students maneuver between scales of magnitude and the use of logarithmic transformations between reference frames. Implications for astronomy range from learning about spectra to large scale galaxy structure.

  3. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)]

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  4. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    Science.gov (United States)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by whether and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. Where model skill in reproducing these patterns is high, it can be inferred that extremes are
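
    A minimal Python sketch of the composite-analysis step described above: average a gridded anomaly field over the days on which a station exceeds an extreme-temperature threshold. The variable names and the synthetic inputs are illustrative only.

        import numpy as np

        def composite_anomaly(field_anom, daily_tmax, quantile=0.95):
            """Composite a gridded anomaly field over extreme-temperature days.

            field_anom : array (time, lat, lon), e.g. 500 hPa height anomalies
            daily_tmax : array (time,) of station daily maximum temperature
            Returns the mean anomaly pattern over days above the given quantile.
            """
            threshold = np.quantile(daily_tmax, quantile)
            extreme_days = daily_tmax >= threshold
            return field_anom[extreme_days].mean(axis=0)

        # Hypothetical inputs: 3650 days of anomalies on a 73 x 144 grid.
        z500_anom = np.random.randn(3650, 73, 144)
        tmax = np.random.randn(3650)
        pattern = composite_anomaly(z500_anom, tmax)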

  5. First International Workshop on Grid Simulator Testing of Wind Turbine Drivetrains: Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Gevorgian, V.; Link, H.; McDade, M.; Mander, A.; Fox, J. C.; Rigas, N.

    2013-11-01

    This report summarizes the proceedings of the First International Workshop on Grid Simulator Testing of Wind Turbine Drivetrains, held from June 13 to 14, 2013, at the National Renewable Energy Laboratory's National Wind Technology Center, located south of Boulder, Colorado. The workshop was sponsored by the U.S. Department of Energy and cohosted by the National Renewable Energy Laboratory and Clemson University under ongoing collaboration via a cooperative research and development agreement. The purpose of the workshop was to provide a forum to discuss the research, testing needs, and state-of-the-art apparatuses involved in grid compliance testing of utility-scale wind turbine generators. This includes both dynamometer testing of wind turbine drivetrains ('ground testing') and field testing of grid-connected wind turbines. The workshop comprised four sessions, each followed by discussions in which all attendees were encouraged to participate.

  6. Quantum universe on extremely small space-time scales

    International Nuclear Information System (INIS)

    Kuzmichev, V.E.; Kuzmichev, V.V.

    2010-01-01

    The semiclassical approach to the quantum geometrodynamical model is used for the description of the properties of the Universe on extremely small space-time scales. Under this approach, the matter in the Universe has two components of a quantum nature which behave as antigravitating fluids. The first component does not vanish in the limit h → 0 and can be associated with dark energy. The second component is described by an extremely rigid equation of state and goes to zero after the transition to large space-time scales. On small space-time scales, this quantum correction turns out to be significant. It determines the geometry of the Universe near the initial cosmological singularity point. This geometry is conformal to a unit four-sphere embedded in a five-dimensional Euclidean flat space. During the subsequent expansion of the Universe, when reaching the post-Planck era, the geometry of the Universe changes into that conformal to a unit four-hyperboloid in a five-dimensional Lorentz-signatured flat space. This agrees with the hypothesis about the possible change of geometry after the origin of the expanding Universe from the region near the initial singularity point. The origin of the Universe can be interpreted as a quantum transition of the system from a region in the phase space forbidden for the classical motion, but where a trajectory in imaginary time exists, into a region where the equations of motion have a solution which describes the evolution of the Universe in real time. Near the boundary between the two regions, on the side of real time, the Universe undergoes an almost exponential expansion which passes smoothly into expansion under the action of radiation dominating over matter, as described by the standard cosmological model.

  7. European Workshop on High Order Nonlinear Numerical Schemes for Evolutionary PDEs

    CERN Document Server

    Beaugendre, Héloïse; Congedo, Pietro; Dobrzynski, Cécile; Perrier, Vincent; Ricchiuto, Mario

    2014-01-01

    This book collects papers presented during the European Workshop on High Order Nonlinear Numerical Methods for Evolutionary PDEs (HONOM 2013) that was held at INRIA Bordeaux Sud-Ouest, Talence, France in March, 2013. The central topic is high order methods for compressible fluid dynamics. In the workshop, and in this proceedings, greater emphasis is placed on the numerical than the theoretical aspects of this scientific field. The range of topics is broad, extending through algorithm design, accuracy, large scale computing, complex geometries, discontinuous Galerkin, finite element methods, Lagrangian hydrodynamics, finite difference methods and applications and uncertainty quantification. These techniques find practical applications in such fields as fluid mechanics, magnetohydrodynamics, nonlinear solid mechanics, and others for which genuinely nonlinear methods are needed.

  8. Computing the distribution of return levels of extreme warm temperatures for future climate projections

    Energy Technology Data Exchange (ETDEWEB)

    Pausader, M.; Parey, S.; Nogaj, M. [EDF/R and D, Chatou Cedex (France); Bernie, D. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-03-15

    In order to take into account uncertainties in future climate projections, there is a growing demand for probabilistic projections of climate change. This paper presents a methodology for producing such a probabilistic analysis of future temperature extremes. The 20- and 100-year return levels are obtained from those of the normalized variable and the changes in mean and standard deviation given by climate models for the desired future periods. Uncertainty in the future change of these extremes is quantified using a multi-model ensemble and a perturbed physics ensemble. The probability density functions of future return levels are computed at a representative location from the joint probability distribution of mean and standard deviation changes given by the two combined ensembles of models. For the studied location, the 100-year return level at the end of the century is lower than 41°C with 80% confidence. Then, as the number of model simulations is too low to compute a reliable distribution, two techniques proposed in the literature (local pattern scaling and ANOVA) have been used to infer the changes in mean and standard deviation for the combinations of RCM and GCM which have not been run. The ANOVA technique leads to better results for the reconstruction of the mean changes, whereas the two methods fail to correctly infer the changes in standard deviation. As standard deviation change has a major impact on return level change, there is a need to improve the models and the different techniques regarding the variance changes. (orig.)
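
    The central step - shifting a present-day return level with model-projected changes in the mean and standard deviation of the normalized variable - can be sketched as follows in Python. This is a simplified reading of the methodology with illustrative numbers, not the authors' implementation.

        import numpy as np

        def future_return_level(rl_present, mu_p, sigma_p, d_mu, d_sigma_ratio):
            """Shift a present-day return level using projected changes.

            The present-day return level is first normalized, then rescaled with
            the projected change in mean (d_mu, additive) and in standard
            deviation (d_sigma_ratio, multiplicative).
            """
            z = (rl_present - mu_p) / sigma_p          # normalized return level
            return (mu_p + d_mu) + sigma_p * d_sigma_ratio * z

        # Example: a 100-year level of 38 C, mean 22 C, s.d. 4 C, projected
        # +3 C mean change and 10 % wider variability (illustrative values).
        print(future_return_level(38.0, 22.0, 4.0, 3.0, 1.10))   # about 42.6 C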

  9. Architectures, Concepts and Technologies for Service Oriented Computing : proceedings of the 1st International Workshop - ACT4SOC 2007

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Unknown, [Unknown

    2007-01-01

    This volume contains the proceedings of the First International Workshop on Architectures, Concepts and Technologies for Service Oriented Computing (ACT4SOC 2007), held on July 22 in Barcelona, Spain, in conjunction with the Second International Conference on Software and Data Technologies (ICSOFT

  10. Computer-Administered Interviews and Rating Scales

    Science.gov (United States)

    Garb, Howard N.

    2007-01-01

    To evaluate the value of computer-administered interviews and rating scales, the following topics are reviewed in the present article: (a) strengths and weaknesses of structured and unstructured assessment instruments, (b) advantages and disadvantages of computer administration, and (c) the validity and utility of computer-administered interviews…

  11. FOREWORD: 2nd International Workshop on New Computational Methods for Inverse Problems (NCMIP 2012)

    Science.gov (United States)

    Blanc-Féraud, Laure; Joubert, Pierre-Yves

    2012-09-01

    Conference logo This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 2nd International Workshop on New Computational Methods for Inverse Problems, (NCMIP 2012). This workshop took place at Ecole Normale Supérieure de Cachan, in Cachan, France, on 15 May 2012, at the initiative of Institut Farman. The first edition of NCMIP also took place in Cachan, France, within the scope of the ValueTools Conference, in May 2011 (http://www.ncmip.org/2011/). The NCMIP Workshop focused on recent advances in the resolution of inverse problems. Indeed inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition

  12. Verifying a computational method for predicting extreme ground motion

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  13. How do the multiple large-scale climate oscillations trigger extreme precipitation?

    Science.gov (United States)

    Shi, Pengfei; Yang, Tao; Xu, Chong-Yu; Yong, Bin; Shao, Quanxi; Li, Zhenya; Wang, Xiaoyan; Zhou, Xudong; Li, Shu

    2017-10-01

    Identifying the links between variations in large-scale climate patterns and precipitation is of tremendous assistance in characterizing surplus or deficit of precipitation, which is especially important for evaluation of local water resources and ecosystems in semi-humid and semi-arid regions. Restricted by current limited knowledge of the underlying mechanisms, statistical correlation methods are often used rather than physically based models to characterize the connections. Nevertheless, available correlation methods are generally unable to reveal the interactions among a wide range of climate oscillations and associated effects on precipitation, especially on extreme precipitation. In this work, a probabilistic analysis approach by means of a state-of-the-art Copula-based joint probability distribution is developed to characterize the aggregated behaviors for large-scale climate patterns and their connections to precipitation. This method is employed to identify the complex connections between climate patterns (Atlantic Multidecadal Oscillation (AMO), El Niño-Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO)) and seasonal precipitation over a typical semi-humid and semi-arid region, the Haihe River Basin in China. Results show that the interactions among multiple climate oscillations are non-uniform in most seasons and phases. Certain joint extreme phases can significantly trigger extreme precipitation (flood and drought) owing to the amplification effect among climate oscillations.
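
    As an illustration of how a copula couples marginal probability levels of two climate indices into a joint probability, the Python sketch below uses a Gaussian copula; the copula family, the correlation value and the probability levels are assumptions for the example, not results from the study.

        import numpy as np
        from scipy import stats

        def gaussian_copula_joint_exceedance(u, v, rho):
            """P(U > u, V > v) under a Gaussian copula with correlation rho.

            u, v are marginal (uniform) probability levels of two climate
            indices, e.g. 0.9 for the upper decile of two oscillation indices.
            """
            z_u, z_v = stats.norm.ppf(u), stats.norm.ppf(v)
            cov = [[1.0, rho], [rho, 1.0]]
            # bivariate normal CDF at (z_u, z_v), then inclusion-exclusion
            joint_cdf = stats.multivariate_normal(mean=[0, 0], cov=cov).cdf([z_u, z_v])
            return 1.0 - u - v + joint_cdf

        # Joint probability that both indices sit in their upper decile,
        # assuming a copula correlation of 0.5.
        print(gaussian_copula_joint_exceedance(0.9, 0.9, 0.5))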

  14. Extreme value statistics and finite-size scaling at the ecological extinction/laminar-turbulence transition

    Science.gov (United States)

    Shih, Hong-Yan; Goldenfeld, Nigel

    Experiments on transitional turbulence in pipe flow seem to show that turbulence is a transient metastable state since the measured mean lifetime of turbulence puffs does not diverge asymptotically at a critical Reynolds number. Yet measurements reveal that the lifetime scales with Reynolds number in a super-exponential way reminiscent of extreme value statistics, and simulations and experiments in Couette and channel flow exhibit directed percolation type scaling phenomena near a well-defined transition. This universality class arises from the interplay between small-scale turbulence and a large-scale collective zonal flow, which exhibit predator-prey behavior. Why is asymptotically divergent behavior not observed? Using directed percolation and a stochastic individual level model of predator-prey dynamics related to transitional turbulence, we investigate the relation between extreme value statistics and power law critical behavior, and show that the paradox is resolved by carefully defining what is measured in the experiments. We theoretically derive the super-exponential scaling law, and using finite-size scaling, show how the same data can give both super-exponential behavior and power-law critical scaling.
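
    The super-exponential law referred to above has the form tau ~ exp(exp(a·Re + b)), so log(log(tau)) is linear in Reynolds number. The Python sketch below fits that form to synthetic lifetimes; the coefficients are illustrative, not experimental values.

        import numpy as np
        from scipy.optimize import curve_fit

        def log_log_lifetime(re, a, b):
            """log(log(tau)) is linear in Re for tau = exp(exp(a*Re + b))."""
            return a * re + b

        # Synthetic puff lifetimes consistent with the super-exponential form
        # (coefficients chosen for illustration only).
        re = np.linspace(1900, 2040, 8)
        tau = np.exp(np.exp(0.004 * re - 6.0))
        popt, _ = curve_fit(log_log_lifetime, re, np.log(np.log(tau)))
        print(popt)   # recovers approximately (0.004, -6.0)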

  15. Multiscale Computation. Needs and Opportunities for BER Science

    Energy Technology Data Exchange (ETDEWEB)

    Scheibe, Timothy D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Jeremy C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-01

    The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: to identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and to identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.

  16. Scale orientated analysis of river width changes due to extreme flood hazards

    Directory of Open Access Journals (Sweden)

    G. Krapesch

    2011-08-01

    Full Text Available This paper analyses the morphological effects of extreme floods (recurrence interval >100 years) and examines which parameters best describe the width changes due to erosion, based on 5 affected alpine gravel bed rivers in Austria. The research was based on vertical aerial photos of the rivers before and after extreme floods, hydrodynamic numerical models and cross-sectional measurements supported by LiDAR data of the rivers. Average width ratios (width after/before the flood) were calculated and correlated with different hydraulic parameters (specific stream power, shear stress, flow area, specific discharge). Depending on the geomorphological boundary conditions of the different rivers, a mean width ratio between 1.12 (Lech River) and 3.45 (Trisanna River) was determined on the reach scale. The specific stream power (SSP) best predicted the mean width ratios of the rivers, especially on the reach scale and sub-reach scale. On the local scale more parameters have to be considered to define the "minimum morphological spatial demand of rivers", which is a crucial parameter for addressing and managing flood hazards and should be used in hazard zone plans and spatial planning.
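
    The reach-scale analysis reduces to a width ratio and a specific stream power (SSP = ρ·g·Q·S/w) per reach, followed by a correlation between the two. A Python sketch with made-up reach data follows; all numbers are hypothetical.

        import numpy as np
        from scipy.stats import spearmanr

        def specific_stream_power(discharge_m3s, slope, width_m, rho=1000.0, g=9.81):
            """Specific stream power in W/m^2: rho * g * Q * S / w."""
            return rho * g * discharge_m3s * slope / width_m

        # Hypothetical reaches: pre-/post-flood widths, peak discharge and slope.
        w_before = np.array([25., 30., 18., 40., 22.])
        w_after  = np.array([70., 38., 60., 48., 55.])
        q_peak   = np.array([400., 250., 380., 300., 420.])
        slope    = np.array([0.012, 0.006, 0.015, 0.005, 0.014])

        width_ratio = w_after / w_before
        ssp = specific_stream_power(q_peak, slope, w_before)
        rho_s, p = spearmanr(ssp, width_ratio)
        print(width_ratio, ssp, rho_s)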

  17. Regional-Scale High-Latitude Extreme Geoelectric Fields Pertaining to Geomagnetically Induced Currents

    Science.gov (United States)

    Pulkkinen, Antti; Bernabeu, Emanuel; Eichner, Jan; Viljanen, Ari; Ngwira, Chigomezyo

    2015-01-01

    Motivated by the needs of the high-voltage power transmission industry, we use data from the high-latitude IMAGE magnetometer array to study characteristics of extreme geoelectric fields at regional scales. We use 10-s resolution data for the years 1993-2013, and the fields are characterized using average horizontal geoelectric field amplitudes taken over station groups that span about 500-km distance. We show that geoelectric field structures associated with localized extremes at single stations can be greatly different from structures associated with regionally uniform geoelectric fields, which are well represented by spatial averages over single stations. Visual extrapolation and rigorous extreme value analysis of spatially averaged fields indicate that the expected ranges for 1-in-100-year extreme events are 3-8 V/km and 3.4-7.1 V/km, respectively. The Quebec reference ground model is used in the calculations.
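
    A Python sketch of the two steps described above: spatially averaging horizontal field amplitudes over a station group, then fitting an extreme value distribution to block maxima to read off a 1-in-100-year level. The synthetic data and the choice of a GEV fit via scipy are illustrative assumptions, not the study's procedure.

        import numpy as np
        from scipy.stats import genextreme

        def horizontal_amplitude(ex, ey):
            """Horizontal geoelectric field amplitude per station, V/km."""
            return np.hypot(ex, ey)

        # ex, ey: arrays (time, station) for one station group (synthetic here).
        rng = np.random.default_rng(0)
        ex = rng.lognormal(-2.0, 1.0, size=(100000, 6))
        ey = rng.lognormal(-2.0, 1.0, size=(100000, 6))
        group_mean = horizontal_amplitude(ex, ey).mean(axis=1)   # spatial average

        # Fit a GEV to block maxima (20 "annual" blocks) and evaluate the
        # 1-in-100-year level as the 99th percentile of the fitted distribution.
        annual_max = group_mean.reshape(20, -1).max(axis=1)
        shape, loc, scale = genextreme.fit(annual_max)
        print(genextreme.ppf(0.99, shape, loc=loc, scale=scale))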

  18. A Fast SVD-Hidden-nodes based Extreme Learning Machine for Large-Scale Data Analytics.

    Science.gov (United States)

    Deng, Wan-Yu; Bai, Zuo; Huang, Guang-Bin; Zheng, Qing-Hua

    2016-05-01

    Big dimensional data is a growing trend that is emerging in many real world contexts, extending from web mining, gene expression analysis, protein-protein interaction to high-frequency financial data. Nowadays, there is a growing consensus that the increasing dimensionality poses impeding effects on the performances of classifiers, which is termed the "peaking phenomenon" in the field of machine intelligence. To address the issue, dimensionality reduction is commonly employed as a preprocessing step on the Big dimensional data before building the classifiers. In this paper, we propose an Extreme Learning Machine (ELM) approach for large-scale data analytics. In contrast to existing approaches, we embed hidden nodes that are designed using singular value decomposition (SVD) into the classical ELM. These SVD nodes in the hidden layer are shown to capture the underlying characteristics of the Big dimensional data well, exhibiting excellent generalization performances. The drawback of using SVD on the entire dataset, however, is the high computational complexity involved. To address this, a fast divide and conquer approximation scheme is introduced to maintain computational tractability on high volume data. The resultant algorithm proposed is labeled here as Fast Singular Value Decomposition-Hidden-nodes based Extreme Learning Machine or FSVD-H-ELM in short. In FSVD-H-ELM, instead of identifying the SVD hidden nodes directly from the entire dataset, SVD hidden nodes are derived from multiple random subsets of data sampled from the original dataset. Comprehensive experiments and comparisons are conducted to assess the FSVD-H-ELM against other state-of-the-art algorithms. The results obtained demonstrated the superior generalization performance and efficiency of the FSVD-H-ELM. Copyright © 2016 Elsevier Ltd. All rights reserved.
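
    A simplified reading of the FSVD-H-ELM idea - hidden weights taken from singular vectors of random data subsets, output weights solved by least squares - is sketched below in Python; it is not the authors' algorithm and omits their approximation and regularization details.

        import numpy as np

        def svd_elm_train(X, Y, n_hidden, n_subsets=4, seed=0):
            """Sketch: ELM whose hidden-layer weights come from SVDs of data subsets."""
            rng = np.random.default_rng(seed)
            pieces = []
            for idx in np.array_split(rng.permutation(len(X)), n_subsets):
                # right singular vectors of each random subset of the data
                _, _, vt = np.linalg.svd(X[idx], full_matrices=False)
                pieces.append(vt)
            W = np.vstack(pieces)[:n_hidden]   # assumes enough singular vectors collected
            H = np.tanh(X @ W.T)               # hidden-layer activations
            beta = np.linalg.pinv(H) @ Y       # output weights by least squares
            return W, beta

        def svd_elm_predict(X, W, beta):
            return np.tanh(X @ W.T) @ beta

        # Toy usage on random data (shapes only; not a benchmark).
        X = np.random.default_rng(1).normal(size=(200, 30))
        Y = X[:, 0] - 2.0 * X[:, 1]
        W, beta = svd_elm_train(X, Y, n_hidden=20)
        pred = svd_elm_predict(X, W, beta)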

  19. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  20. Lightweight computational steering of very large scale molecular dynamics simulations

    International Nuclear Information System (INIS)

    Beazley, D.M.

    1996-01-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages

  1. Application of the extreme value theory to beam loss estimates in the SPIRAL2 linac based on large scale Monte Carlo computations

    Directory of Open Access Journals (Sweden)

    R. Duperrier

    2006-04-01

    Full Text Available The influence of random perturbations of high intensity accelerator elements on the beam losses is considered. This paper presents the error sensitivity study which has been performed for the SPIRAL2 linac in order to define the tolerances for the construction. The proposed driver aims to accelerate a 5 mA deuteron beam up to 20 A MeV and a 1 mA ion beam with q/A = 1/3 up to 14.5 A MeV. It is a continuous wave regime linac, designed for a maximum efficiency in the transmission of intense beams and a tunable energy. It consists of an injector (two ECR sources + LEBTs, with the possibility to inject from several sources, + a radio frequency quadrupole) followed by a superconducting section based on an array of independently phased cavities where the transverse focalization is performed with warm quadrupoles. The correction scheme and the expected losses are described. The extreme value theory is used to estimate the expected beam losses. The described method couples large scale computations to obtain probability distribution functions. The bootstrap technique is used to provide confidence intervals associated with the beam loss predictions. With such a method, it is possible to measure the risk of losing a few watts in this high-power linac (up to 200 kW).
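
    The bootstrap step mentioned above can be sketched independently of the beam-dynamics code: resample the per-run loss maxima, refit an extreme-value model each time, and read confidence bounds from the resampled quantiles. The Python sketch below uses synthetic losses and a GEV fit purely for illustration.

        import numpy as np
        from scipy.stats import genextreme

        def bootstrap_extreme_loss(loss_maxima, prob=0.999, n_boot=500, seed=1):
            """Bootstrap confidence interval for an extreme quantile of beam losses.

            loss_maxima holds the maximum loss from each Monte Carlo run; each
            bootstrap resample is refitted with a GEV and the `prob` quantile kept.
            """
            rng = np.random.default_rng(seed)
            estimates = []
            for _ in range(n_boot):
                sample = rng.choice(loss_maxima, size=len(loss_maxima), replace=True)
                c, loc, scale = genextreme.fit(sample)
                estimates.append(genextreme.ppf(prob, c, loc=loc, scale=scale))
            return np.percentile(estimates, [2.5, 97.5])

        # Synthetic per-run loss maxima (in watts), used only to exercise the code.
        losses = np.random.default_rng(2).gumbel(loc=0.5, scale=0.2, size=300)
        print(bootstrap_extreme_loss(losses))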

  2. Analysis of the Extremely Low Frequency Magnetic Field Emission from Laptop Computers

    Directory of Open Access Journals (Sweden)

    Brodić Darko

    2016-03-01

    Full Text Available This study addresses the problem of magnetic field emission produced by laptop computers. Although the magnetic field is spread over the entire frequency spectrum, the most hazardous part for laptop users is the frequency range from 50 to 500 Hz, commonly called the extremely low frequency magnetic field. In this frequency region the magnetic field is characterized by high peak values. To examine the influence of a laptop's magnetic field emission in the office, a specific experiment is proposed. It includes the measurement of the magnetic field at six laptop positions that are in close contact with the user. The results obtained from ten different laptop computers show extremely high emission at some positions, which depend on power dissipation or poor ergonomics. Ultimately, the experiment identifies these dangerous positions of magnetic field emission and suggests possible solutions.
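
    Isolating the 50-500 Hz band from a measured magnetic-field record is a straightforward spectral computation; the Python sketch below shows one way to do it on a synthetic signal (the sampling rate and amplitudes are assumptions, not values from the study).

        import numpy as np

        def band_rms(signal, fs, f_lo=50.0, f_hi=500.0):
            """RMS of the 50-500 Hz (ELF) content of a magnetic-field record."""
            spectrum = np.fft.rfft(signal)
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            band = (freqs >= f_lo) & (freqs <= f_hi)
            spectrum[~band] = 0.0                       # zero out-of-band bins
            filtered = np.fft.irfft(spectrum, n=len(signal))
            return np.sqrt(np.mean(filtered**2))

        # Synthetic record: 0.4 uT at 150 Hz plus broadband noise, sampled at 4 kHz.
        fs = 4000
        t = np.arange(0, 2.0, 1.0 / fs)
        b = 0.4 * np.sin(2 * np.pi * 150 * t) \
            + 0.05 * np.random.default_rng(3).standard_normal(len(t))
        print(band_rms(b, fs))   # close to 0.4 / sqrt(2), i.e. about 0.28 uT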

  3. Proceedings of the 2011 New York Workshop on Computer, Earth and Space Science

    CERN Document Server

    Naud, Catherine; CESS2011

    2011-01-01

    The purpose of the New York Workshop on Computer, Earth and Space Sciences is to bring together the New York area's finest Astronomers, Statisticians, Computer Scientists, Space and Earth Scientists to explore potential synergies between their respective fields. The 2011 edition (CESS2011) was a great success, and we would like to thank all of the presenters and participants for attending. This year was also special as it included authors from the upcoming book titled "Advances in Machine Learning and Data Mining for Astronomy". Over two days, the latest advanced techniques used to analyze the vast amounts of information now available for the understanding of our universe and our planet were presented. These proceedings attempt to provide a small window into what the current state of research is in this vast interdisciplinary field and we'd like to thank the speakers who spent the time to contribute to this volume.

  4. Asia-Pacific POPIN workshop on Internet.

    Science.gov (United States)

    1996-01-01

    This brief article announces the accomplishments of the ESCAP Population Division of the Department of Economic and Social Information and Policy Analysis (DESIPA), through the Asia-Pacific POPIN Internet (Information Superhighway) Training Workshop, in popularizing useful new computer information technologies. A successful workshop was held in Bangkok in November 1996 for 18 people from 8 countries in the Asian and Pacific region, many of whom were from population information centers. Participants were taught some techniques for disseminating population data and information through use of the Internet. Participants learned 1) how to use Windows software in the ESCAP local area network (LAN), 2) about concepts such as HTML (hypertext mark-up language), and 3) detailed information about computer languages. Computer practices involved "surfing the Net (Internet)" and linking with the global POPIN site on the Internet. Participants learned about computer programs for information handling and learned how to prepare documents using HTML, how to mount information on the World Wide Web (WWW), how to convert existing documents into "HTML-style" files, and how to scan graphics, such as logos, photographs, and maps, for visual display on the Internet. The Workshop and the three training modules were funded by the UN Population Fund (UNFPA). The POPIN Coordinator was pleased that competency was achieved in such a short period of time.

  5. Topic 14+16: High-performance and scientific applications and extreme-scale computing (Introduction)

    KAUST Repository

    Downes, Turlough P.

    2013-01-01

    As our understanding of the world around us increases it becomes more challenging to make use of what we already know, and to increase our understanding still further. Computational modeling and simulation have become critical tools in addressing this challenge. The requirements of high-resolution, accurate modeling have outstripped the ability of desktop computers and even small clusters to provide the necessary compute power. Many applications in the scientific and engineering domains now need very large amounts of compute time, while other applications, particularly in the life sciences, frequently have large data I/O requirements. There is thus a growing need for a range of high performance applications which can utilize parallel compute systems effectively, which have efficient data handling strategies and which have the capacity to utilise current and future systems. The High Performance and Scientific Applications topic aims to highlight recent progress in the use of advanced computing and algorithms to address the varied, complex and increasing challenges of modern research throughout both the "hard" and "soft" sciences. This necessitates being able to use large numbers of compute nodes, many of which are equipped with accelerators, and to deal with difficult I/O requirements. © 2013 Springer-Verlag.

  6. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Hukerikar, Saurabh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Engelmann, Christian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Practical limits on power consumption in HPC systems will require future systems to embrace innovative architectures, increasing the levels of hardware and software complexities. The resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. These techniques must seek to improve resilience at reasonable overheads to power consumption and performance. While the HPC community has developed various solutions, application-level as well as system-based solutions, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance and power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software ecosystems, which are expected to be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience based on the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. The catalog of resilience design patterns provides designers with reusable design elements. We define a design framework that enhances our understanding of the important

  7. Moths produce extremely quiet ultrasonic courtship songs by rubbing specialized scales

    DEFF Research Database (Denmark)

    Nakano, Ryo; Skals, Niels; Takanashi, Takuma

    2008-01-01

    level at 1 cm) adapted for private sexual communication in the Asian corn borer moth, Ostrinia furnacalis. During courtship, the male rubs specialized scales on the wing against those on the thorax to produce the songs, with the wing membrane underlying the scales possibly acting as a sound resonator....... The male's song suppresses the escape behavior of the female, thereby increasing his mating success. Our discovery of extremely low-intensity ultrasonic communication may point to a whole undiscovered world of private communication, using "quiet" ultrasound....

  8. Physics of the 1 Teraflop RIKEN-BNL-Columbia QCD project. Proceedings of RIKEN BNL Research Center workshop: Volume 13

    International Nuclear Information System (INIS)

    1998-01-01

    A workshop was held at the RIKEN-BNL Research Center on October 16, 1998, as part of the first anniversary celebration for the center. This meeting brought together the physicists from RIKEN-BNL, BNL and Columbia who are using the QCDSP (Quantum Chromodynamics on Digital Signal Processors) computer at the RIKEN-BNL Research Center for studies of QCD. Many of the talks in the workshop were devoted to domain wall fermions, a discretization of the continuum description of fermions which preserves the global symmetries of the continuum, even at finite lattice spacing. This formulation has been the subject of analytic investigation for some time and has reached the stage where large-scale simulations in QCD seem very promising. With the computational power available from the QCDSP computers, scientists are looking forward to an exciting time for numerical simulations of QCD

  9. International workshop on multimodal virtual and augmented reality (workshop summary)

    NARCIS (Netherlands)

    Hürst, W.O.; Iwai, Daisuke; Balakrishnan, Prabhakaran

    2016-01-01

    Virtual reality (VR) and augmented reality (AR) are expected by many to become the next wave of computing with significant impacts on our daily lives. Motivated by this, we organized a workshop on “Multimodal Virtual and Augmented Reality (MVAR)” at the 18th ACM International Conference on

  10. Establishing the Turkish version of the SIGAM mobility scale, and determining its validity and reliability in lower extremity amputees.

    Science.gov (United States)

    Yilmaz, Hülya; Gafuroğlu, Ümit; Ryall, Nicola; Yüksel, Selcen

    2018-02-01

    The aim of this study is to adapt the Special Interest Group in Amputee Medicine (SIGAM) mobility scale to Turkish, and to test its validity and reliability in lower extremity amputees. Adaptation of the scale into Turkish was performed by following the steps in American Association of Orthopedic Surgeons (AAOS) guideline. Turkish version of the scale was tested twice on 109 patients who had lower extremity amputations, at hours 0 and 72. The reliability of the Turkish version was tested for internal consistency and test-retest reliability. Structural validity was tested using the "scale validity" method. For this purpose, the scores of the Short Form-36 (SF-36), Functional Ambulation Scale (FAS), Get Up and Go Test, and Satisfaction with the Prosthesis Questionnaire (SATPRO) were calculated, and analyzed using Spearman's correlation test. Cronbach's alpha coefficient was 0.67 for the Turkish version of the SIGAM mobility scale. Cohen's kappa coefficients were between 0.224 and 0.999. Repeatability according to the results of the SIGAM mobility scale (grades A-F) was 0.822. We found significant and strong positive correlations of the SIGAM mobility scale results with the FAS, Get Up and Go Test, SATPRO, and all of the SF-36 subscales. In our study, the Turkish version of the SIGAM mobility scale was found as a reliable, valid, and easy to use scale in everyday practice for measuring mobility in lower extremity amputees. Implications for Rehabilitation Amputation is the surgical removal of a severely injured and nonfunctional extremity, at a level of one or more bones proximal to the body. Loss of a lower extremity is one of the most important conditions that cause functional disability. The Special Interest Group in Amputee Medicine (SIGAM) mobility scale contains 21 questions that evaluate the mobility of lower extremity amputees. Lack of a specific Turkish scale that evaluates rehabilitation results and mobility of lower extremity amputees, and determines their
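
    The internal-consistency statistic reported above (Cronbach's alpha) can be computed directly from item-level scores. The Python sketch below applies the standard formula to synthetic item data; the data are illustrative, not the study's.

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for a (respondents x items) score matrix:
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
            """
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        # Synthetic 21-item binary responses for 109 respondents.
        rng = np.random.default_rng(4)
        ability = rng.normal(size=(109, 1))
        scores = (ability + rng.normal(scale=1.5, size=(109, 21)) > 0).astype(int)
        print(cronbach_alpha(scores))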

  11. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This workshop on the applications of new computer tools to thermal engineering has been organized by the French society of thermal engineers. Seven papers have been presented, from which two papers dealing with thermal diffusivity measurements in materials and with the optimization of dryers have been selected for ETDE. (J.S.)

  12. Proceeding of 1999-workshop on MHD computations 'study on numerical methods related to plasma confinement'

    International Nuclear Information System (INIS)

    Kako, T.; Watanabe, T.

    2000-06-01

    This is the proceeding of 'study on numerical methods related to plasma confinement' held in National Institute for Fusion Science. In this workshop, theoretical and numerical analyses of possible plasma equilibria with their stability properties are presented. There are also various lectures on mathematical as well as numerical analyses related to the computational methods for fluid dynamics and plasma physics. Separate abstracts were presented for 13 of the papers in this report. The remaining 6 were considered outside the subject scope of INIS. (J.P.N.)

  13. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding Large Scale System integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld; not research, not facilities, not procurement, not operations, not user services. Taken together, these issues along with the impact of sub-optimal integration technology mean that the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  14. The AAAI-13 Conference Workshops

    OpenAIRE

    Agrawal, Vikas; Archibald, Christopher; Bhatt, Mehul; Bui, Hung; Cook, Diane J.; Cortés, Juan; Geib, Christopher; Gogate, Vibhav; Guesgen, Hans W.; Jannach, Dietmar; Johanson, Michael; Kersting, Kristian; Konidaris, George; Kotthoff, Lars; Michalowski, Martin

    2013-01-01

    The AAAI-13 Workshop Program, a part of the 27th AAAI Conference on Artificial Intelligence, was held Sunday and Monday, July 14–15, 2013 at the Hyatt Regency Bellevue Hotel in Bellevue, Washington, USA. The program included 12 workshops covering a wide range of topics in artificial intelligence, including Activity Context-Aware System Architectures (WS-13-05); Artificial Intelligence and Robotics Methods in Computational Biology (WS-13-06); Combining Constraint Solving with Mining and Lear...

  15. Computational applications of DNA physical scales

    DEFF Research Database (Denmark)

    Baldi, Pierre; Chauvin, Yves; Brunak, Søren

    1998-01-01

    The authors study from a computational standpoint several different physical scales associated with structural features of DNA sequences, including dinucleotide scales such as base stacking energy and propeller twist, and trinucleotide scales such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example we construct a strand invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models.

  16. Computational applications of DNA structural scales

    DEFF Research Database (Denmark)

    Baldi, P.; Chauvin, Y.; Brunak, Søren

    1998-01-01

    Studies several different physical scales associated with the structural features of DNA sequences from a computational standpoint, including dinucleotide scales, such as base stacking energy and propeller twist, and trinucleotide scales, such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example, we construct a strand-invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models.
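
    As context for the scale-based representation described above, the sketch below maps a dinucleotide scale onto a sequence and averages it with the reverse-complement reading to obtain a strand-invariant profile. The numeric values are placeholders rather than the published base stacking energies, and the averaging step is only one plausible way to achieve strand invariance, not necessarily the construction used in these papers.

    # Minimal sketch: per-step dinucleotide scale profile of a DNA sequence, made
    # strand invariant by averaging with the reverse-complement reading.
    import numpy as np

    scale = {"AA": -1.0, "AC": -1.4, "AG": -1.3, "AT": -0.9,   # placeholder values,
             "CA": -1.2, "CC": -1.8, "CG": -2.0, "CT": -1.3,   # not published stacking
             "GA": -1.4, "GC": -2.2, "GG": -1.8, "GT": -1.4,   # energies
             "TA": -0.6, "TC": -1.4, "TG": -1.2, "TT": -1.0}

    comp = str.maketrans("ACGT", "TGCA")

    def revcomp(s):
        return s.translate(comp)[::-1]

    def profile(seq):
        """Scale value for each dinucleotide step along the sequence."""
        return np.array([scale[seq[i:i + 2]] for i in range(len(seq) - 1)])

    def strand_invariant_profile(seq):
        """Average the forward profile with the reverse-complement profile read back."""
        fwd = profile(seq)
        rev = profile(revcomp(seq))[::-1]
        return 0.5 * (fwd + rev)

    print(strand_invariant_profile("ACGTGCTTAA"))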

  17. Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings

    Science.gov (United States)

    1992-01-01

    The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.

  18. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  19. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to widespread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  20. CARS 2009. Computer assisted radiology and surgery. Proceedings

    International Nuclear Information System (INIS)

    Anon.

    2009-01-01

    The CARS 2009 proceedings include contributions and poster sessions concerning different conferences and workshops: computer assisted radiology, 23rd international congress and exhibition, CARS clinical day, 13th annual conference of the international society for computer aided surgery, 10th CARS/SPIE/EuroPACS joint workshop on surgical PACS and the digital operating room, 11th international workshop on computer-aided diagnosis, 15th computed maxillofacial imaging congress, CARS - computer assisted radiology and surgery, 1st EPMA/CARS workshop on personalized medicine and ICT, JICARS - Japanese institutes of CARS, 1st EuroNotes/CTAC/CARS workshop on NOTES: an interdisciplinary challenge, 13th annual conference for computer aided surgery, 27th international EuroPACS meeting.

  1. Using GRACE Satellite Gravimetry for Assessing Large-Scale Hydrologic Extremes

    Directory of Open Access Journals (Sweden)

    Alexander Y. Sun

    2017-12-01

    Global assessment of the spatiotemporal variability in terrestrial total water storage anomalies (TWSA) in response to hydrologic extremes is critical for water resources management. Using TWSA derived from the Gravity Recovery and Climate Experiment (GRACE) satellites, this study systematically assessed the skill of the TWSA-climatology (TC) approach and breakpoint (BP) detection method for identifying large-scale hydrologic extremes. The TC approach calculates standardized anomalies by using the mean and standard deviation of the GRACE TWSA corresponding to each month. In the BP detection method, the empirical mode decomposition (EMD) is first applied to identify the mean return period of TWSA extremes, and then a statistical procedure is used to identify the actual occurrence times of abrupt changes (i.e., BPs) in TWSA. Both detection methods were demonstrated on basin-averaged TWSA time series for the world’s 35 largest river basins. A nonlinear event coincidence analysis measure was applied to cross-examine abrupt changes detected by these methods with those detected by the Standardized Precipitation Index (SPI). Results show that our EMD-assisted BP procedure is a promising tool for identifying hydrologic extremes using GRACE TWSA data. Abrupt changes detected by the BP method coincide well with those of the SPI anomalies and with documented hydrologic extreme events. Event timings obtained by the TC method were ambiguous for a number of river basins studied, probably because the GRACE data length is too short to derive long-term climatology at this time. The BP approach demonstrates a robust wet-dry anomaly detection capability, which will be important for applications with the upcoming GRACE Follow-On mission.
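
    The TC approach described above standardizes each calendar month against its own climatology. A minimal sketch with synthetic data is given below; the threshold used to flag extremes is an assumption for illustration, not the paper's criterion.

    # Sketch of the TWSA-climatology (TC) idea: standardize a monthly basin-averaged
    # TWSA series by the mean and standard deviation of the same calendar month,
    # then flag large departures. The series here is synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    n_years = 14                                      # roughly the GRACE record length
    months = np.tile(np.arange(12), n_years)
    seasonal = 10 * np.sin(2 * np.pi * months / 12)   # seasonal cycle (cm equivalent water height)
    twsa = seasonal + rng.normal(0, 3, months.size)   # synthetic TWSA series

    z = np.empty_like(twsa)
    for m in range(12):
        sel = months == m
        z[sel] = (twsa[sel] - twsa[sel].mean()) / twsa[sel].std(ddof=1)

    extreme = np.abs(z) > 2.0                         # illustrative threshold for wet/dry extremes
    print("months flagged as extreme:", int(extreme.sum()))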

  2. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments to utilize the computing power of the millions of computers on the Internet, and use them towards running large scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models, and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational sizes. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large scale hydrological simulations and model runs in an open and integrated environment.
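
    The server-side queue idea described above can be sketched with a relational database, as below. Table and column names here are hypothetical, and the actual platform executes model code in visitors' browsers via JavaScript rather than in Python.

    # Minimal sketch of a work-unit queue for volunteer nodes, backed by SQLite.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE tasks (
                       id INTEGER PRIMARY KEY,
                       cell_id INTEGER,              -- small spatial subdomain to simulate
                       status TEXT DEFAULT 'queued',
                       result REAL)""")
    con.executemany("INSERT INTO tasks (cell_id) VALUES (?)", [(i,) for i in range(1000)])

    def checkout_task():
        """Hand the next queued subdomain to a volunteer node."""
        row = con.execute("SELECT id, cell_id FROM tasks WHERE status = 'queued' LIMIT 1").fetchone()
        if row:
            con.execute("UPDATE tasks SET status = 'running' WHERE id = ?", (row[0],))
        return row

    def submit_result(task_id, value):
        """Record a finished work unit returned by a volunteer."""
        con.execute("UPDATE tasks SET status = 'done', result = ? WHERE id = ?", (value, task_id))

    task = checkout_task()
    submit_result(task[0], 42.0)
    print(con.execute("SELECT COUNT(*) FROM tasks WHERE status = 'done'").fetchone()[0], "task(s) completed")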

  3. URBAN ATMOSPHERIC OBSERVATORY (UAO) FIRST PLANNING WORKSHOP, JANUARY 27-28, 2003. WORKSHOP SUMMARY.

    Energy Technology Data Exchange (ETDEWEB)

    REYNOLDS,R.M.; LEE,H.N.

    2003-03-27

    The Urban Atmospheric Observatory (UAO) First Planning Workshop was held on 27-28 January 2003 at the Environmental Measurements Laboratory (EML) in downtown Manhattan, New York City. The meeting was well attended by local, state, and national administrators, as well as scientists and engineers from the national laboratories and academia. The real-time intensive UAO is a necessary step toward the development and validation of new technologies in support of the New York City emergency management and anti-terrorism effort. The real-time intensive UAO will be a dense array of meteorological instrumentation, remote sensing and satellite products and model output, as well as radiation detection, gamma spectrometer and aerosol measurements focused onto a small area in the heart of Manhattan. Such a test-bed, developed in a somewhat homogeneous urban area, and with a well-developed communication and data collection backbone, will be of immense utility for understanding how models of all scales can be improved and how they can best be integrated into the city's emergency program. The goal of the First Planning Workshop was to bring together a small group of experts in the fields of urban meteorology, modeling from mesoscale to fine-mesh computational fluid dynamics, instrumentation, communications and visualization, in order to (1) establish the importance of the observational program, (2) define the most efficient and cost-effective design for the program, (3) define needed intensive observational efforts and establish a schedule, and (4) define the importance of the UAO in emergency operations. The workshop achieved its goals with the enthusiastic participation of over forty persons. There was a synthesis of ideas towards a world-class facility that would benefit both immediate emergency management activities and, over an extended time, the entire field of urban meteorology and contaminant dispersion modeling.

  4. The ATLAS Electromagnetic Calorimeter Calibration Workshop

    CERN Multimedia

    Hong Ma; Isabelle Wingerter

    The ATLAS Electromagnetic Calorimeter Calibration Workshop took place at LAPP-Annecy from the 1st to the 3rd of October; 45 people attended the workshop. A detailed program was set up before the workshop. The agenda was organised around very focused presentations where questions were raised to allow arguments to be exchanged and answers to be proposed. The main topics were: electronics calibration, handling of problematic channels, cluster level corrections for electrons and photons, absolute energy scale, streams for calibration samples, calibration constants processing, and learning from commissioning. The workshop was on the whole lively and fruitful. Based on years of experience with test beam analysis and Monte Carlo simulation, and the recent operation of the detector in the commissioning, the methods to calibrate the electromagnetic calorimeter are well known. Some of the procedures are being exercised in the commissioning, which have demonstrated the c...

  5. ARCSACC '99: Workshop Proceedings

    International Nuclear Information System (INIS)

    Nahir, M.; Biggar, K.

    1999-01-01

    The assessment and remediation of contaminated sites in cold and Arctic environments is an area of increasing concern, primarily because of the unique problems associated with northern regions. Of interest are not only the obvious effects of cold temperatures on the operation of many systems, but also the effectiveness of remedial measures under extreme cold conditions. Accordingly, this workshop was organized to provide a means of exchange of information among people responsible for cleaning up contaminated sites in cold and Arctic environments, researchers, and providers of remediation services with experience in dealing with such conditions. Speakers at the workshop addressed problems concerning risk assessment and site characterization, contaminant migration in permafrost, contamination caused by mining and associated clean-up problems, assessed bioremediation as a means of contaminant control, reviewed various remediation technologies and techniques, and presented a number of bioremediation case studies. refs., tabs., figs

  6. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  7. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)

    2017-02-01

    efficient computation on an exascale computer. This project concludes with a functional prototype containing pervasively parallel algorithms that perform demonstratively well on many-core processors. These algorithms are fundamental for performing data analysis and visualization at extreme scale.

  8. Proceedings of the High Performance Embedded Computing Workshop (HPEC 2006) (10th). Held in Lexington, Massachusetts on September 19-21, 2006 (CD-ROM)

    National Research Council Canada - National Science Library

    Kepner, Jeremy

    2007-01-01

    The High-Performance Embedded Computing (HPEC) technical committee announced the tenth annual HPEC Workshop held in September 2006 at MIT Lincoln Laboratory in Lexington, MA...

  9. Final Report National Laboratory Professional Development Workshop for Underrepresented Participants

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Valerie [Texas Engineering Experiment Station, College Station, TX (United States)

    2016-11-07

    The 2013 CMD-IT National Laboratories Professional Development Workshop for Underrepresented Participants (CMD-IT NLPDev 2013) was held at the Oak Ridge National Laboratory campus in Oak Ridge, TN, from June 13-14, 2013. Sponsored by the Department of Energy (DOE) Advanced Scientific Computing Research Program, the primary goal of these workshops is to provide information about career opportunities in computational science at the various national laboratories and to mentor the underrepresented participants through community building and expert presentations focused on career success. This second annual workshop offered sessions to facilitate career advancement and, in particular, the strategies and resources needed to be successful at the national laboratories.

  10. Proceedings of the TOUGH workshop

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K. [ed.

    1990-09-01

    A workshop on applications and enhancements of the TOUGH/MULKOM family of multiphase fluid and heat flow simulation programs was held at Lawrence Berkeley Laboratory on September 13--14, 1990. The workshop was attended by 62 scientists from seven countries with interests in geothermal reservoir engineering, nuclear waste isolation, unsaturated zone hydrology, environmental problems, and laboratory and field experimentation. The meeting featured 21 technical presentations, extended abstracts of which are reproduced in the present volume in unedited form. Simulator applications included processes on a broad range of space scales, from centimeters to kilometers, with transient times from seconds to geologic time scales. A number of code enhancements were reported that increased execution speeds for large 3-D problems by factors of order 20, reduced memory requirements, and improved user-friendliness. The workshop closed with an open discussion session that focussed on future needs and means for interaction in the TOUGH user community. Input from participants was gathered by means of a questionnaire that is reproduced in the appendix. 171 refs., 91 figs., 16 tabs.

  11. Proceedings of the TOUGH workshop

    International Nuclear Information System (INIS)

    Pruess, K.

    1990-09-01

    A workshop on applications and enhancements of the TOUGH/MULKOM family of multiphase fluid and heat flow simulation programs was held at Lawrence Berkeley Laboratory on September 13--14, 1990. The workshop was attended by 62 scientists from seven countries with interests in geothermal reservoir engineering, nuclear waste isolation, unsaturated zone hydrology, environmental problems, and laboratory and field experimentation. The meeting featured 21 technical presentations, extended abstracts of which are reproduced in the present volume in unedited form. Simulator applications included processes on a broad range of space scales, from centimeters to kilometers, with transient times from seconds to geologic time scales. A number of code enhancements were reported that increased execution speeds for large 3-D problems by factors of order 20, reduced memory requirements, and improved user-friendliness. The workshop closed with an open discussion session that focussed on future needs and means for interaction in the TOUGH user community. Input from participants was gathered by means of a questionnaire that is reproduced in the appendix. 171 refs., 91 figs., 16 tabs

  12. Boiling water reactor simulator. Workshop material

    International Nuclear Information System (INIS)

    2003-01-01

    The International Atomic Energy Agency (IAEA) has established an activity in nuclear reactor simulation computer programs to assist its Member States in education. The objective is to provide, for a variety of advanced reactor types, insight and practice in their operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the development and distribution of simulation programs and workshop material and sponsors workshops. The workshops are in two parts: techniques and tools for reactor simulator development; and the use of reactor simulators in education. Workshop material for the first part is covered in the IAEA publication: Training Course Series No. 12, 'Reactor Simulator Development' (2001). Course material for workshops using a WWER-1000 simulator from the Moscow Engineering and Physics Institute, Russian Federation is presented in the IAEA publication: Training Course Series No. 21 'WWER-1000 Reactor Simulator' (2002). Course material for workshops using a pressurized water reactor (PWR) simulator developed by Cassiopeia Technologies Incorporated, Canada, is presented in the IAEA publication: Training Course Series No. 22 'Pressurized Water Reactor Simulator' (2003). This report consists of course material for workshops using a boiling water reactor (BWR) simulator. Cassiopeia Technologies Incorporated developed the simulator and prepared this report for the IAEA.

  13. Ninth Thermal and Fluids Analysis Workshop Proceedings

    Science.gov (United States)

    Sakowski, Barbara (Compiler)

    1999-01-01

    The Ninth Thermal and Fluids Analysis Workshop (TFAWS 98) was held at the Ohio Aerospace Institute in Cleveland, Ohio from August 31 to September 4, 1998. The theme for the hands-on training workshop and conference was "Integrating Computational Fluid Dynamics and Heat Transfer into the Design Process." Highlights of the workshop (in addition to the papers published herein) included an address by the NASA Chief Engineer, Dr. Daniel Mulville; a CFD short course by Dr. John D. Anderson of the University of Maryland; and a short course by Dr. Robert Cochran of Sandia National Laboratories. In addition, lectures and hands-on training were offered in the use of several cutting-edge engineering design and analysis-oriented CFD and Heat Transfer tools. The workshop resulted in international participation of over 125 persons representing aerospace and automotive industries, academia, software providers, government agencies, and private corporations. The papers published herein address issues and solutions related to the integration of computational fluid dynamics and heat transfer into the engineering design process. Although the primary focus is aerospace, the topics and ideas presented are applicable to many other areas where these and other disciplines are interdependent.

  14. IPCC workshop on impacts of ocean acidification on marine biology and ecosystems. Workshop report

    Energy Technology Data Exchange (ETDEWEB)

    Field, C.B.; Barros, V.; Stocker, T.F.; Dahe, Q.; Mach, K.J.; Plattner, G.-K.; Mastrandrea, M.D.; Tignor, M.; Ebi, K.L.

    2011-09-15

    Understanding the effects of increasing atmospheric CO2 concentrations on ocean chemistry, commonly termed ocean acidification, as well as associated impacts on marine biology and ecosystems, is an important component of scientific knowledge about global change. The Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC) will include comprehensive coverage of ocean acidification and its impacts, including potential feedbacks to the climate system. To support ongoing AR5 assessment efforts, Working Group II and Working Group I (WGII and WGI) of the IPCC held a joint Workshop on Impacts of Ocean Acidification on Marine Biology and Ecosystems in Okinawa, Japan, from 17 to 19 January 2011. The workshop convened experts from the scientific community, including WGII and WGI AR5 authors and review editors, to synthesise scientific understanding of changes in ocean chemistry due to increased CO2 and of impacts of this changing chemistry on marine organisms, ecosystems, and ecosystem services. This workshop report summarises the scientific content and perspectives presented and discussed during the workshop. It provides syntheses of these perspectives for the workshop's core topics: (i) the changing chemistry of the oceans, (ii) impacts of ocean acidification for individual organisms, and (iii) scaling up responses from individual organisms to ecosystems. It also presents summaries of workshop discussions of key cross-cutting themes, ranging from detection and attribution of ocean acidification and its impacts to understanding ocean acidification in the context of other stressors on marine systems. Additionally, the workshop report includes extended abstracts for keynote and poster presentations at the workshop. (Author)

  15. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    Energy Technology Data Exchange (ETDEWEB)

    Hebner, Gregory A.

    2017-04-01

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.

  16. Analyzing extreme sea levels for broad-scale impact and adaptation studies

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.

    2017-12-01

    Coastal impact and adaptation assessments require detailed knowledge on extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios under the inherent assumption that we have perfect knowledge on the statistics of extremes. However, there is still a limited understanding of present-day ESL which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of storm surge water levels, and (2) statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtain such values for broad-scale flood risk assessments and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that we meet the Paris agreement, the projected SLR itself by the end of the century. Our results highlight the necessity to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of numerical and statistical models to simulate and analyze coastal water levels and (2) exploit the rich observational database and continue data archeology to obtain longer time series and remove model bias
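
    One common statistical model for present-day ESL exceedance probabilities is a generalized extreme value (GEV) fit to annual maxima; the sketch below illustrates that choice on synthetic data. It is only one of the modeling options whose uncertainties the study discusses, not the authors' specific method.

    # Sketch: fit a GEV distribution to synthetic annual maximum sea levels and read
    # off return levels for a few return periods.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(2)
    annual_maxima = genextreme.rvs(c=-0.1, loc=1.2, scale=0.15, size=60, random_state=rng)

    c, loc, scale = genextreme.fit(annual_maxima)     # maximum-likelihood parameter estimates
    for T in (10, 50, 100):
        level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
        print(f"{T:4d}-year return level: {level:.2f} m")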

  17. DOE JGI Quality Metrics; Approaches to Scaling and Improving Metagenome Assembly (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Copeland, Alex; Brown, C. Titus

    2011-10-13

    DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  18. Resilience Design Patterns: A Structured Approach to Resilience at Extreme Scale

    International Nuclear Information System (INIS)

    Engelmann, Christian; Hukerikar, Saurabh

    2017-01-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest the prevalence of very high fault rates in future systems. While the HPC community has developed various resilience solutions, application-level techniques as well as system-based solutions, the solution space remains fragmented. There are no formal methods and metrics to integrate the various HPC resilience techniques into composite solutions, nor are there methods to holistically evaluate the adequacy and efficacy of such solutions in terms of their protection coverage, and their performance and power efficiency characteristics. Additionally, few of the current approaches are portable to newer architectures and software environments that will be deployed on future systems. In this paper, we develop a structured approach to the design, evaluation and optimization of HPC resilience using the concept of design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the problems caused by various types of faults, errors and failures in HPC systems and the techniques used to deal with these events. Each well-known solution that addresses a specific HPC resilience challenge is described in the form of a pattern. We develop a complete catalog of such resilience design patterns, which may be used by system architects, system software and tools developers, application programmers, as well as users and operators as essential building blocks when designing and deploying resilience solutions. We also develop a design framework that enhances a designer's understanding of the opportunities for integrating multiple patterns across layers of the system stack and the important constraints during implementation of the individual patterns. It is also useful for defining mechanisms and interfaces to coordinate flexible fault management across
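
    As a concrete example of the kind of well-known technique such a pattern catalog documents, the sketch below shows a minimal checkpoint/restart loop. It is illustrative only and does not use the paper's formal pattern notation; the state layout and file name are arbitrary.

    # Minimal checkpoint/restart sketch: periodically persist state so the loop can
    # resume from the last checkpoint after a failure.
    import os
    import pickle

    CKPT = "state.ckpt"

    def save_checkpoint(state):
        with open(CKPT, "wb") as f:
            pickle.dump(state, f)

    def load_checkpoint():
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as f:
                return pickle.load(f)
        return {"step": 0, "total": 0.0}              # fresh start if no checkpoint exists

    state = load_checkpoint()
    for step in range(state["step"], 100):
        state["total"] += step                        # stand-in for one unit of real work
        state["step"] = step + 1
        if state["step"] % 10 == 0:                   # periodic checkpoint
            save_checkpoint(state)
    print("finished at step", state["step"], "total =", state["total"])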

  19. SIAM Workshop: Focus on Diversity 2001

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-01-01

    The Society for Industrial and Applied Mathematics (SIAM) held a workshop focused on underrepresented minorities--graduate and undergraduate students, postdocs, and recent Ph.D's--in the mathematical and computational sciences on July 11, 2001, as part of the SIAM Annual Meeting in San Diego, California. The workshop was intended to accomplish several goals: (1) to provide a workshop focused on careers for and retention of minority students in the mathematical and computational sciences; (2) to bring together a mixture of people from different levels of professional experience, ranging from undergraduate students to senior scientists in an informal setting in order to share career experiences and options; (3) to provide an opportunity for minority graduate students, postdocs, and recent Ph.D's to present their research at an international meeting; (4) to expose undergraduate students to the many professional opportunities resulting from graduate degrees in science and mathematics; and (5) to encourage undergraduate and graduate students to speak frankly with each other about personal issues and experiences associated with pursuing a scientific career.

  20. 4th Penn State Bioinorganic Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Krebs, Carsten [Pennsylvania State Univ., University Park, PA (United States)

    2017-08-22

    The research area of biological inorganic chemistry encompasses a wide variety of subfields, including molecular biology, biochemistry, biophysics, inorganic chemistry, analytical chemistry, physical chemistry, and theoretical chemistry, as well as many different methods, such as biochemical characterization of enzymes, reaction kinetics, a plethora of spectroscopic techniques, and computational methods. The above methods are combined to understand the formation, function, and regulation of the many metallo-cofactors found in Nature as well as to identify novel metallo-cofactors. Many metalloenzyme-catalyzed reactions are extremely complex, but are of fundamental importance to science and society. Examples include (i) the reduction of the chemically inert molecule, dinitrogen, to ammonia by the enzyme nitrogenase (this reaction is fundamental for the production of nitrogen fertilizers); (ii) the oxidation of water to dioxygen by the Mn4Ca cluster found in photosystem II; and (iii) myriad reactions in which aliphatic, inert C-H bonds are cleaved for subsequent functionalization of the carbon atoms (the latter reactions are important in the biosynthesis of many natural products). Because of the broad range of areas and techniques employed in this field, research in bioinorganic chemistry is typically carried out collaboratively between two or more research groups. It is of paramount importance that researchers working in this field have a good, basic, working knowledge of many methods and approaches employed in the field, in order to design and discuss experiments with collaborators. Therefore, the training of students working in bioinorganic chemistry is an important aspect of this field. Hugely successful “bioinorganic workshops” were offered in the 1990s at The University of Georgia. These workshops laid the foundation for many of the extant collaborative research efforts in this area today. The large and diverse group of bioinorganic chemists at The

  1. Computational approach on PEB process in EUV resist: multi-scale simulation

    Science.gov (United States)

    Kim, Muyoung; Moon, Junghwan; Choi, Joonmyung; Lee, Byunghoon; Jeong, Changyoung; Kim, Heebom; Cho, Maenghyo

    2017-03-01

    For decades, downsizing has been a key issue for high performance and low cost in semiconductors, and extreme ultraviolet lithography is one of the promising candidates to achieve this goal. Although post exposure bake is the predominant process in extreme ultraviolet lithography for determining resolution and sensitivity, it has mainly been studied experimentally, and photoresist development is at a critical point because the mechanisms at work during the process have not been revealed. Herein, we provide a theoretical approach to investigate the underlying mechanisms of the post exposure bake process in a chemically amplified resist, covering three important reactions during the process: acid generation by photo-acid generator dissociation, acid diffusion, and deprotection. Density functional theory calculations (quantum mechanical simulation) were conducted to quantitatively predict the activation energies and probabilities of the chemical reactions, and these were applied to a molecular dynamics simulation to construct a reliable computational model. The overall chemical reactions were then simulated in the molecular dynamics unit cell, and the final configuration of the photoresist was used to predict the line edge roughness. The presented multiscale model unifies quantum- and atomic-scale phenomena during the post exposure bake process, and it will be helpful for understanding the critical factors affecting the performance of the resulting photoresist and for designing next-generation materials.
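
    The hand-off from quantum to atomistic scales described above can be illustrated by converting an activation energy into an Arrhenius-type rate and a per-timestep reaction probability, as sketched below. All numerical values are placeholders, not the study's computed quantities.

    # Illustrative sketch: DFT-style activation energy -> Arrhenius rate -> per-step
    # reaction probability for use in a coarse-grained molecular dynamics model.
    import math

    kB = 8.617e-5          # Boltzmann constant, eV/K
    A = 1.0e13             # attempt frequency, 1/s (typical order of magnitude)
    Ea = 0.9               # hypothetical activation energy for deprotection, eV
    T = 383.0              # post exposure bake temperature, K (~110 C)
    dt = 1.0e-3            # coarse-grained time step, s

    rate = A * math.exp(-Ea / (kB * T))               # Arrhenius rate constant, 1/s
    p_react = 1.0 - math.exp(-rate * dt)              # probability of reacting within one step
    print(f"rate = {rate:.3e} 1/s, per-step reaction probability = {p_react:.3f}")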

  2. Complex Flow Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-05-01

    This report documents findings from a workshop on the impacts of complex wind flows in and out of wind turbine environments, the research needs, and the challenges of meteorological and engineering modeling at regional, wind plant, and wind turbine scales.

  3. Summary of SMIRT20 Preconference Topical Workshop - Identifying Structural Issues in Advanced Reactors

    International Nuclear Information System (INIS)

    Richins, William; Novascone, Stephen; O'Brien, Cheryl

    2009-01-01

    The Idaho National Laboratory (INL, USA) and IASMiRT sponsored an international forum Nov 5-6, 2008 in Porvoo, Finland for nuclear industry, academic, and regulatory representatives to identify structural issues in current and future advanced reactor design, especially for extreme conditions and external threats. The purpose of this Topical Workshop was to articulate research, engineering, and regulatory Code development needs. The topics addressed by the Workshop were selected to address critical industry needs specific to advanced reactor structures that have long lead times and can be the subject of future SMiRT technical sessions. The topics were; (1) structural/materials needs for extreme conditions and external threats in contemporary (Gen. III) and future (Gen. IV and NGNP) advanced reactors and (2) calibrating simulation software and methods that address topic 1. The workshop discussions and research needs identified are presented. The Workshop successfully produced interactive discussion on the two topics resulting in a list of research and technology needs. It is recommended that IASMiRT communicate the results of the discussion to industry and researchers to encourage new ideas and projects. In addition, opportunities exist to retrieve research reports and information that currently exists, and encourage more international cooperation and collaboration. It is recommended that IASMiRT continue with an off-year workshop series on select topics.

  4. The Grad Cohort Workshop: Evaluating an Intervention to Retain Women Graduate Students in Computing.

    Science.gov (United States)

    Stout, Jane G; Tamer, Burçin; Wright, Heather M; Clarke, Lori A; Dwarkadas, Sandhya; Howard, Ayanna M

    2016-01-01

    Women engaged in computing career tracks are vastly outnumbered by men and often must contend with negative stereotypes about their innate technical aptitude. Research suggests women's marginalized presence in computing may result in women psychologically disengaging, and ultimately dropping out, perpetuating women's underrepresentation in computing. To combat this vicious cycle, the Computing Research Association's Committee on the Status of Women in Computing Research (CRA-W) runs a multi-day mentorship workshop for women graduate students called Grad Cohort, which consists of a speaker series and networking opportunities. We studied the long-term impact of Grad Cohort on women Ph.D. students' (a) dedication to becoming well-known in one's field, and giving back to the community (professional goals), (b) the degree to which one feels computing is an important element of "who they are" (computing identity), and (c) beliefs that computing skills are innate (entity beliefs). Of note, entity beliefs are known to be demoralizing and can lead to disengagement from academic endeavors. We compared a propensity score matched sample of women and men Ph.D. students in computing programs who had never participated in Grad Cohort to a sample of past Grad Cohort participants. Grad Cohort participants reported interest in becoming well-known in their field to a greater degree than women non-participants, and to an equivalent degree as men. Also, Grad Cohort participants reported stronger interest in giving back to the community than their peers. Further, whereas women non-participants identified with computing to a lesser degree than men and held stronger entity beliefs than men, Grad Cohort participants' computing identity and entity beliefs were equivalent to men. Importantly, stronger entity beliefs predicted a weaker computing identity among students, with the exception of Grad Cohort participants. This latter finding suggests Grad Cohort may shield students
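
    The propensity score matching mentioned above is a standard technique; a minimal sketch with synthetic covariates is shown below (logistic-regression scores with nearest-neighbor matching). It is illustrative and does not reproduce the study's matching procedure.

    # Sketch: estimate participation probability with logistic regression, then pair
    # each participant with the nearest non-participant by propensity score.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    X = rng.normal(size=(400, 3))                     # synthetic covariates (e.g., year, field, prior experience)
    participated = (rng.random(400) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)

    scores = LogisticRegression().fit(X, participated).predict_proba(X)[:, 1]
    treated = np.where(participated == 1)[0]
    control = np.where(participated == 0)[0]

    matches = {}
    for i in treated:
        j = control[np.argmin(np.abs(scores[control] - scores[i]))]  # nearest-neighbor match
        matches[i] = j
    print("matched pairs:", len(matches))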

  5. Theory and modeling in nanoscience: Report of the May 10-11, 2002Workshop

    Energy Technology Data Exchange (ETDEWEB)

    McCurdy, C. William; Stechel, Ellen; Cummings, Peter; Hendrickson, Bruce; Keyes, David

    2002-06-28

    On May 10 and 11, 2002, a workshop entitled ''Theory and Modeling in Nanoscience'' was held in San Francisco, California, sponsored by the offices of Basic Energy Science and Advanced Scientific Computing Research of the Department of Energy. The Basic Energy Sciences Advisory Committee and the Advanced Scientific Computing Advisory Committee convened the workshop to identify challenges and opportunities for theory, modeling, and simulation in nanoscience and nanotechnology, and additionally to investigate the growing and promising role of applied mathematics and computer science in meeting those challenges. This report is the result of those contributions and the discussions at the workshop.

  6. 7th International Workshop on Meshfree Methods for Partial Differential Equations

    CERN Document Server

    Schweitzer, Marc

    2015-01-01

    Meshfree methods, particle methods, and generalized finite element methods have witnessed substantial development since the mid 1990s. The growing interest in these methods is due in part to the fact that they are extremely flexible numerical tools and can be interpreted in a number of ways. For instance, meshfree methods can be viewed as a natural extension of classical finite element and finite difference methods to scattered node configurations with no fixed connectivity. Furthermore, meshfree methods offer a number of advantageous features which are especially attractive when dealing with multiscale phenomena: a priori knowledge about particular local behavior of the solution can easily be introduced in the meshfree approximation space, and coarse-scale approximations can be seamlessly refined with fine-scale information. This volume collects selected papers presented at the Seventh International Workshop on Meshfree Methods, held in Bonn, Germany in September 2013. They address various aspec...

  7. Data and Visualization Corridors: Report on the 1998 DVC Workshop Series

    International Nuclear Information System (INIS)

    Smith, Paul H.; van Rosendale, John

    1998-01-01

    The Department of Energy and the National Science Foundation sponsored a series of workshops on data manipulation and visualization of large-scale scientific datasets. Three workshops were held in 1998, bringing together experts in high-performance computing, scientific visualization, emerging computer technologies, physics, chemistry, materials science, and engineering. These workshops were followed by two writing and review sessions, as well as numerous electronic collaborations, to synthesize the results. The results of these efforts are reported here. Across the government, mission agencies are charged with understanding scientific and engineering problems of unprecedented complexity. The DOE Accelerated Strategic Computing Initiative, for example, will soon be faced with the problem of understanding the enormous datasets created by teraops simulations, while NASA already has a severe problem in coping with the flood of data captured by earth observation satellites. Unfortunately, scientific visualization algorithms, and high-performance display hardware and software on which they depend, have not kept pace with the sheer size of emerging datasets, which threaten to overwhelm our ability to conduct research. Our capability to manipulate and explore large datasets is growing only slowly, while human cognitive and visual perception are an absolutely fixed resource. Thus, there is a pressing need for new methods of handling truly massive datasets, of exploring and visualizing them, and of communicating them over geographic distances. This report, written by representatives from academia, industry, national laboratories, and the government, is intended as a first step toward the timely creation of a comprehensive federal program in data manipulation and scientific visualization. There is, at this time, an exciting confluence of ideas on data handling, compression, telepresence, and scientific visualization. The combination of these new ideas, which we refer to as

  8. Synchronization and Causality Across Time-scales: Complex Dynamics and Extremes in El Niño/Southern Oscillation

    Science.gov (United States)

    Jajcay, N.; Kravtsov, S.; Tsonis, A.; Palus, M.

    2017-12-01

    A better understanding of dynamics in complex systems, such as the Earth's climate is one of the key challenges for contemporary science and society. A large amount of experimental data requires new mathematical and computational approaches. Natural complex systems vary on many temporal and spatial scales, often exhibiting recurring patterns and quasi-oscillatory phenomena. The statistical inference of causal interactions and synchronization between dynamical phenomena evolving on different temporal scales is of vital importance for better understanding of underlying mechanisms and a key for modeling and prediction of such systems. This study introduces and applies information theory diagnostics to phase and amplitude time series of different wavelet components of the observed data that characterizes El Niño. A suite of significant interactions between processes operating on different time scales was detected, and intermittent synchronization among different time scales has been associated with the extreme El Niño events. The mechanisms of these nonlinear interactions were further studied in conceptual low-order and state-of-the-art dynamical, as well as statistical climate models. Observed and simulated interactions exhibit substantial discrepancies, whose understanding may be the key to an improved prediction. Moreover, the statistical framework which we apply here is suitable for direct usage of inferring cross-scale interactions in nonlinear time series from complex systems such as the terrestrial magnetosphere, solar-terrestrial interactions, seismic activity or even human brain dynamics.
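
    A minimal sketch of the kind of cross-scale diagnostic described above is given below: phases and amplitudes of two time scales are extracted with a complex Morlet wavelet, and a histogram estimator of mutual information links the slow phase to the fast amplitude. The input series is synthetic, and the estimator is far simpler than the authors' surrogate-tested diagnostics.

    # Sketch: complex Morlet wavelet components at two periods, then a histogram
    # mutual-information estimate between slow phase and fast amplitude.
    import numpy as np

    def morlet_component(x, period, w0=6.0):
        """Complex wavelet coefficients of x at one target period (in samples)."""
        s = period * w0 / (2 * np.pi)                 # approximate scale-period relation
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-t**2 / (2 * s**2))
        return np.convolve(x, wavelet / np.sqrt(s), mode="same")

    def mutual_information(a, b, bins=8):
        """Histogram-based mutual information estimate (in nats)."""
        pxy, _, _ = np.histogram2d(a, b, bins=bins)
        pxy = pxy / pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

    rng = np.random.default_rng(4)
    x = rng.normal(size=12 * 100)                     # 100 years of synthetic monthly data

    slow = morlet_component(x, period=60)             # low-frequency (~5-year) component
    fast = morlet_component(x, period=12)             # annual component
    print("MI(slow phase, fast amplitude):",
          round(mutual_information(np.angle(slow), np.abs(fast)), 4))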

  9. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  10. History of the RSIC seminar-workshops

    International Nuclear Information System (INIS)

    Maskewitz, B.F.

    1992-01-01

    The RSIC concept of the 'seminar-workshop' as a means to review the state-of-the-art of specific computing technology and to transmit a great deal of information to a large number of people in a short period of time evolved over a 30-year period. This paper presents the background leading to the development of the concept and details the history of the seminars and workshops organized by RSIC staff members through the years 1965-1992.

  11. Spatial Scaling of Global Rainfall and Flood Extremes

    Science.gov (United States)

    Devineni, Naresh; Lall, Upmanu; Xi, Chen; Ward, Philip

    2014-05-01

    Floods associated with severe storms are a significant source of risk for property, life and supply chains. These property losses tend to be determined as much by the duration and spatial extent of flooding as by the depth and velocity of inundation. High-duration floods are typically induced by persistent rainfall (up to 30-day duration) as seen recently in Thailand, Pakistan, the Ohio and the Mississippi Rivers, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global, historical data fields, and also from climate models that project future conditions. In this paper, we investigate the statistical properties of the spatial manifestation of the rainfall exceedances and floods. We present the first ever results on a global analysis of the scaling characteristics of extreme rainfall and flood event duration, volumes and contiguous flooded areas as a result of large scale organization of long duration rainfall events. Results are organized by latitude and with reference to the phases of ENSO, and reveal surprising invariance across latitude. Speculation as to the potential relation to the dynamical factors is presented.

  12. Auroral Tomography Workshop, Proceedings

    International Nuclear Information System (INIS)

    Steen, Aa.

    1993-08-01

    In ionospheric and atmospheric physics the importance of multi-station imaging has grown as a consequence of the availability of scientific grade CCD cameras with digital output and affordable massive computing power. Tomographic inversion techniques are used in many different areas, e.g. medicine, plasma research and space physics. The tomography workshop was announced to gather a limited group of people interested in auroral tomography or tomographic inversion methods in general. ALIS (Auroral Large Imaging System) is a multi-station ground-based system developed primarily for three-dimensional auroral imaging, however other non-auroral objects can be studied with ALIS, e.g. stratospheric clouds. Several of the contributions in the workshop dealt with problems related to geometries similar to the ALIS-configuration. The Proceedings contain written contributions received either in abstract form or as full papers. The Proceedings also contain contributions intended for the Workshop but not presented due to the absence of the speaker. Separate abstracts have been prepared for 15 of the 17 papers

  13. Computational biology in the cloud: methods and new insights from computing at scale.

    Science.gov (United States)

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.

  14. Imaging sciences workshop

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.

    1994-11-15

    This workshop on the imaging sciences, sponsored by Lawrence Livermore National Laboratory, contains short abstracts/articles submitted by speakers. The topic areas covered include: astronomical imaging; biomedical imaging; vision/image display; imaging hardware; imaging software; acoustic/oceanic imaging; microwave/acoustic imaging; computed tomography; physical imaging; and imaging algorithms. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  15. Pressurized water reactor simulator. Workshop material

    International Nuclear Information System (INIS)

    2003-01-01

    The International Atomic Energy Agency (IAEA) has established an activity in nuclear reactor simulation computer programs to assist its Member States in education. The objective is to provide, for a variety of advanced reactor types, insight and practice in their operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the development and distribution of simulation programs and educational material and sponsors courses and workshops. The workshops are in two parts: techniques and tools for reactor simulator development; and the use of reactor simulators in education. Workshop material for the first part is covered in the IAEA Training Course Series No. 12, 'Reactor Simulator Development' (2001). Course material for workshops using a WWER-1000 reactor department simulator from the Moscow Engineering and Physics Institute, Russian Federation, is presented in the IAEA Training Course Series No. 21, 'WWER-1000 Reactor Simulator' (2002). Course material for workshops using a boiling water reactor simulator developed for the IAEA by Cassiopeia Technologies Incorporated of Canada (CTI) is presented in the IAEA publication Training Course Series No. 23, 'Boiling Water Reactor Simulator' (2003). This report consists of course material for workshops using a pressurized water reactor simulator.

  16. Challenges in scaling NLO generators to leadership computers

    Science.gov (United States)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even when using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

  17. More scalability, less pain: A simple programming model and its implementation for extreme computing

    International Nuclear Information System (INIS)

    Lusk, E.L.; Pieper, S.C.; Butler, R.M.

    2010-01-01

    This is the story of a simple programming model, its implementation for extreme computing, and a breakthrough in nuclear physics. A critical issue for the future of high-performance computing is the programming model to use on next-generation architectures. Described here is a promising approach: program very large machines by combining a simplified programming model with a scalable library implementation. The presentation takes the form of a case study in nuclear physics. The chosen application addresses fundamental issues in the origins of our Universe, while the library developed to enable this application on the largest computers may have applications beyond this one.
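
    The abstract does not give code, but the idea of many workers pulling independent tasks from a shared pool, which is what such simplified programming models expose to the application, can be sketched as follows. This is a single-node stand-in using Python's multiprocessing, not the actual scalable library developed for the nuclear-physics application.

        from multiprocessing import Pool
        import random

        def evaluate_task(task):
            # Placeholder for one expensive, independent piece of work
            # (e.g., one Monte Carlo sample or one matrix element).
            seed, n_samples = task
            rng = random.Random(seed)
            return sum(rng.random() for _ in range(n_samples))

        if __name__ == "__main__":
            tasks = [(seed, 100_000) for seed in range(64)]
            with Pool(processes=8) as pool:
                # Workers repeatedly pull tasks; completion order does not matter.
                partial_results = pool.map(evaluate_task, tasks)
            print("combined result:", sum(partial_results))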

  18. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso, a distribution of software for atomistic simulations in condensed-matter physics, chemical physics and materials science, and to its use in large-scale parallel computing.

  19. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  20. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.1)

    Energy Technology Data Exchange (ETDEWEB)

    Hukerikar, Saurabh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Engelmann, Christian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-01

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest the prevalence of very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Therefore, the resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. Also, due to practical limits on power consumption in HPC systems, future systems are likely to embrace innovative architectures, increasing the levels of hardware and software complexity. As a result, the techniques that seek to improve resilience must navigate the complex trade-off space between resilience and the overheads to power consumption and performance. While the HPC community has developed various resilience solutions, application-level techniques as well as system-based solutions, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance and power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software environments that will be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience using the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. Each established solution is described in the form of a pattern that
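
    As a minimal, hypothetical illustration of one recurring solution of this kind, the checkpoint/restart (roll-back recovery) pattern, the sketch below periodically saves solver state and resumes from the last checkpoint after a failure; the file name, interval and stand-in solver step are placeholders and are not taken from the report.

        import os
        import pickle

        CHECKPOINT = "state.ckpt"   # hypothetical checkpoint file name

        def save_checkpoint(step, state):
            with open(CHECKPOINT, "wb") as f:
                pickle.dump({"step": step, "state": state}, f)

        def load_checkpoint():
            if os.path.exists(CHECKPOINT):
                with open(CHECKPOINT, "rb") as f:
                    ckpt = pickle.load(f)
                return ckpt["step"], ckpt["state"]
            return 0, 0.0               # cold start

        def run(n_steps=1000, interval=100):
            step, state = load_checkpoint()   # resume after a crash, if possible
            while step < n_steps:
                state = state + 1.0           # stand-in for one solver iteration
                step += 1
                if step % interval == 0:
                    save_checkpoint(step, state)
            return state

        if __name__ == "__main__":
            print(run())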

  1. STREAM2016: Streaming Requirements, Experience, Applications and Middleware Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Fox, Geoffrey [Indiana Univ., Bloomington, IN (United States); Jha, Shantenu [Rutgers Univ., New Brunswick, NJ (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-10-01

    The Department of Energy (DOE) Office of Science (SC) facilities, including accelerators, light sources, neutron sources, and sensors that study the environment and the atmosphere, are producing streaming data that need to be analyzed for next-generation scientific discoveries. There has been an explosion of new research and technologies for stream analytics arising from the academic and private sectors. However, there has been no corresponding effort in either documenting the critical research opportunities or building a community that can create and foster productive collaborations. The two-part workshop series, STREAM: Streaming Requirements, Experience, Applications and Middleware Workshop (STREAM2015 and STREAM2016), was conducted to bring the community together and identify gaps and future efforts needed by both NSF and DOE. This report describes the discussions, outcomes and conclusions from STREAM2016: Streaming Requirements, Experience, Applications and Middleware Workshop, the second of these workshops, held on March 22-23, 2016 in Tysons, VA. STREAM2016 focused on Department of Energy (DOE) applications, computational and experimental facilities, as well as software systems. Thus, the role of "streaming and steering" as a critical mode of connecting the experimental and computing facilities was pervasive throughout the workshop. Given the overlap in interests and challenges with industry, the workshop had a significant presence from several innovative companies and major contributors. The requirements that drive the proposed research directions identified in this report show an important opportunity for building a competitive research and development program around streaming data. These findings and recommendations are consistent with the vision outlined in NRC Frontiers of Data and the National Strategic Computing Initiative (NSCI) [1, 2]. The discussions from the workshop are captured as topic areas covered in this report's sections. The report

  2. Extreme-Scale Stochastic Particle Tracing for Uncertain Unsteady Flow Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Hanqi; He, Wenbin; Seo, Sangmin; Shen, Han-Wei; Peterka, Tom

    2016-11-13

    We present an efficient and scalable solution to estimate uncertain transport behaviors using stochastic flow maps (SFMs) for visualizing and analyzing uncertain unsteady flows. SFM computation is extremely expensive because it requires many Monte Carlo runs to trace densely seeded particles in the flow. We alleviate the computational cost by decoupling the time dependencies in SFMs so that we can process adjacent time steps independently and then compose them together for longer time periods. Adaptive refinement is also used to reduce the number of runs for each location. We then parallelize over tasks (packets of particles in our design) to achieve high efficiency in MPI/thread hybrid programming. Such a task model also enables CPU/GPU coprocessing. We show the scalability on two supercomputers, Mira (up to 1M Blue Gene/Q cores) and Titan (up to 128K Opteron cores and 8K GPUs), that can trace billions of particles in seconds.
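
    A heavily simplified sketch of the underlying idea: trace an ensemble of particles through an uncertain velocity field over one time interval, then continue from the sampled end positions over the next interval so that long-period behavior is composed from per-interval flow maps. The one-dimensional velocity model, noise level and parameters below are invented for illustration and are unrelated to the paper's data sets.

        import numpy as np

        rng = np.random.default_rng(0)

        def trace_interval(x0, t0, t1, n_runs=100, dt=0.01):
            """Monte Carlo particle tracing over one time interval.
            Returns end positions of all runs (samples of the stochastic flow map)."""
            ends = np.empty(n_runs)
            for r in range(n_runs):
                x, t = x0, t0
                while t < t1:
                    v = np.sin(x + t) + 0.1 * rng.normal()   # one sample of the uncertain velocity
                    x += v * dt                              # forward advection step
                    t += dt
                ends[r] = x
            return ends

        # Decouple time dependencies: sample the map on [0,1], then continue on [1,2]
        samples_01 = trace_interval(x0=0.5, t0=0.0, t1=1.0)
        samples_02 = np.concatenate(
            [trace_interval(x, 1.0, 2.0, n_runs=10) for x in samples_01])
        print("mean, std of end position over [0,2]:", samples_02.mean(), samples_02.std())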

  3. Improving plot- and regional-scale crop models for simulating impacts of climate variability and extremes

    Science.gov (United States)

    Tao, F.; Rötter, R.

    2013-12-01

    Many studies on global climate report that climate variability is increasing, with more frequent and intense extreme events [1]. There are quite large uncertainties from both the plot- and regional-scale models in simulating impacts of climate variability and extremes on crop development, growth and productivity [2,3]. One key to reducing the uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance [4,5]. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches, the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST [6], and a regional-scale crop model, MCWLA [7], were calibrated and validated, and applied to simulate wheat growth and yield variability from 1971 to 2012. Next, the estimated impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity based on the statistical approaches and on the crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST and MCWLA models were identified, and subsequently the various algorithms and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in the WOFOST and MCWLA models were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress, for location-specific and large-area climate impact assessments. Our studies provide a good example of how to improve, in parallel, the plot- and regional-scale models for simulating impacts of climate variability and extremes, as needed for

  4. t4 Workshop Report*

    Science.gov (United States)

    Kleensang, Andre; Maertens, Alexandra; Rosenberg, Michael; Fitzpatrick, Suzanne; Lamb, Justin; Auerbach, Scott; Brennan, Richard; Crofton, Kevin M.; Gordon, Ben; Fornace, Albert J.; Gaido, Kevin; Gerhold, David; Haw, Robin; Henney, Adriano; Ma’ayan, Avi; McBride, Mary; Monti, Stefano; Ochs, Michael F.; Pandey, Akhilesh; Sharan, Roded; Stierum, Rob; Tugendreich, Stuart; Willett, Catherine; Wittwehr, Clemens; Xia, Jianguo; Patton, Geoffrey W.; Arvidson, Kirk; Bouhifd, Mounir; Hogberg, Helena T.; Luechtefeld, Thomas; Smirnova, Lena; Zhao, Liang; Adeleye, Yeyejide; Kanehisa, Minoru; Carmichael, Paul; Andersen, Melvin E.; Hartung, Thomas

    2014-01-01

    Summary: Despite widespread consensus on the need to transform toxicology and risk assessment in order to keep pace with technological and computational changes that have revolutionized the life sciences, there remains much work to be done to achieve the vision of toxicology based on a mechanistic foundation. A workshop was organized to explore one key aspect of this transformation, the development of Pathways of Toxicity (PoT) as a key tool for hazard identification based on systems biology. Several issues were discussed in depth in the workshop. The first was the challenge of formally defining the concept of a PoT as distinct from, but complementary to, other toxicological pathway concepts such as mode of action (MoA). The workshop arrived at a preliminary definition of a PoT as "A molecular definition of cellular processes shown to mediate adverse outcomes of toxicants". It is further recognized that normal physiological pathways exist that maintain homeostasis and that these, when sufficiently perturbed, can become PoT. Second, the workshop sought to define adequate public and commercial resources for PoT information, including data, visualization, analyses, tools, and use-cases, as well as the kinds of efforts that will be necessary to enable the creation of such a resource. Third, the workshop explored ways in which systems biology approaches could inform pathway annotation, and which resources are needed and available that can provide relevant PoT information to the diverse user communities. PMID:24127042

  5. ES12; The 24th Annual Workshop on Recent Developments in Electronic Structure Theory

    Energy Technology Data Exchange (ETDEWEB)

    Holzwarth, Natalie [Wake Forest Univ., Winston-Salem, NC (United States); Thonhauser, Timo [Wake Forest Univ., Winston-Salem, NC (United States); Salam, Akbar [Wake Forest Univ., Winston-Salem, NC (United States)

    2012-06-29

    ES12: The 24th Annual Workshop on Recent Developments in Electronic Structure Theory was held June 5-8, 2012 at Wake Forest University in Winston-Salem, NC 27109. The program consisted of 24 oral presentations, 70 posters, and 2 panel discussions. The attendance of the Workshop was comparable to or larger than that of previous workshops, and participation was impressively diverse. The 136 participants came from all over the world and included undergraduate students, graduate students, postdoctoral researchers, and senior scientists. The general assessment of the Workshop was extremely positive in terms of the high level of scientific presentations and discussions, and in terms of the schedule, accommodations, and affordability of the meeting.

  6. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the
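
    A hedged sketch of the kind of combination described: a Haar multi-resolution decomposition of a synthetic temperature series, followed by a Shannon-entropy measure of the detail coefficients at each timescale. The series, bin count and entropy choice are placeholders and do not reproduce the project's actual metrics.

        import numpy as np

        def haar_details(x):
            """Return detail coefficients per decomposition level (Haar multi-resolution analysis)."""
            levels = []
            approx = np.asarray(x, dtype=float)
            while len(approx) >= 2:
                even, odd = approx[0::2], approx[1::2]
                n = min(len(even), len(odd))
                levels.append((even[:n] - odd[:n]) / np.sqrt(2.0))
                approx = (even[:n] + odd[:n]) / np.sqrt(2.0)
            return levels

        def shannon_entropy(values, bins=16):
            hist, _ = np.histogram(values, bins=bins)
            p = hist[hist > 0] / hist.sum()
            return float(-(p * np.log2(p)).sum())

        rng = np.random.default_rng(1)
        series = rng.normal(size=1024) + 2.0 * np.sin(np.arange(1024) / 50.0)  # toy daily series
        for level, detail in enumerate(haar_details(series), start=1):
            print(f"scale 2^{level}: entropy = {shannon_entropy(detail):.2f}")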

  7. Extreme-scale alignments of quasar optical polarizations and Galactic dust contamination

    OpenAIRE

    Pelgrims, Vincent

    2017-01-01

    Almost twenty years ago the optical polarization vectors from quasars were shown to be aligned over extreme scales. That evidence was later confirmed and enhanced thanks to additional optical data obtained with the ESO instrument FORS2 mounted on the VLT, in Chile. These observations suggest either Galactic foreground contamination of the data or, more interestingly, a cosmological origin. Using 353-GHz polarization data from the Planck satellite, I recently showed that the main features of t...

  8. DOE planning workshop on rf theory and computations

    International Nuclear Information System (INIS)

    1984-01-01

    The purpose of the two-day workshop meeting was to review the status of rf heating in magnetic fusion plasmas and to determine the outstanding problems in this area. The term rf heating was understood to encompass not only bulk plasma heating by externally applied electromagnetic power but also current generation in toroidal plasmas and the generation of thermal barriers in tandem mirror plasmas.

  9. Reactor simulator development. Workshop material

    International Nuclear Information System (INIS)

    2001-01-01

    The International Atomic Energy Agency (IAEA) has established a programme in nuclear reactor simulation computer programs to assist its Member States in education and training. The objective is to provide, for a variety of advanced reactor types, insight and practice in reactor operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the supply or development of simulation programs and training material, sponsors training courses and workshops, and distributes documentation and computer programs. This publication consists of course material for workshops on the development of such reactor simulators. Participants in the workshops are provided with instruction and practice in the development of reactor simulation computer codes using a model development system that assembles integrated codes from a selection of pre-programmed and tested sub-components. This provides insight and understanding into the construction and assumptions of the codes that model the design and operational characteristics of various power reactor systems. The main objective is to demonstrate simple nuclear reactor dynamics with hands-on simulation experience. Using one of the modular development systems, CASSIM(TM), a simple point-kinetic reactor model is developed, followed by a model that simulates the xenon/iodine concentration response to changes in reactor power. Lastly, absorber and adjuster control rods and a liquid zone model are developed to control reactivity. The resulting model is used to demonstrate reactor behavior in sub-critical, critical and supercritical states, and to observe the impact of malfunctions of various reactivity control mechanisms on reactor dynamics. Using a PHWR simulator, participants practice typical procedures for a reactor startup and approach to criticality. This workshop material consists of an introduction to systems used for developing reactor simulators, an overview of the dynamic simulation
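
    The point-kinetics model referred to above can be written, for a single group of delayed-neutron precursors, as dn/dt = ((rho - beta)/Lambda) n + lambda C and dC/dt = (beta/Lambda) n - lambda C. The sketch below integrates these equations for a small step reactivity insertion; the numerical values are illustrative only and are not taken from the workshop material.

        beta, lam, Lambda = 0.0065, 0.08, 1.0e-4   # delayed fraction, decay constant (1/s), generation time (s)

        def point_kinetics(rho, n0=1.0, t_end=10.0, dt=1.0e-4):
            n = n0
            c = beta * n0 / (lam * Lambda)          # equilibrium precursor concentration
            for _ in range(int(t_end / dt)):
                dn = ((rho - beta) / Lambda) * n + lam * c
                dc = (beta / Lambda) * n - lam * c
                n += dn * dt                        # explicit Euler is adequate for a demonstration
                c += dc * dt
            return n

        print("relative power after 10 s for a +0.1 beta step:", point_kinetics(rho=0.1 * beta))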

  10. Workshop on nuclear structure and decay data evaluation. Summary report

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Nichols, A.L.

    2003-01-01

    A summary is given of the aims and contents of the Workshop on Nuclear Structure and Decay Data (NSDD) Evaluation, including the agenda, lists of participants and their presentations, general comments and recommendations. The one-week workshop was organized by the IAEA Nuclear Data Section and held in Vienna, Austria, from 18 to 22 November 2002. Workshop material, including participants' presentations, computer codes, manuals and other materials for NSDD evaluators, is freely available on CD-ROM on request. (author)

  11. Compact Toroid Theory Planning Workshop. A panel report to the Director, Division of Applied Plasma Physics, Office of Fusion Energy

    International Nuclear Information System (INIS)

    1980-07-01

    The purpose of the Workshop was to identify the most important physics issues that need to be addressed in the near term in order to assure the optimal design and timely interpretation of Compact Toroid (CT) experiments. The Panel was also asked to assess the levels of effort required to obtain priority information on appropriate time scales compatible with DOE plans to design a CT proof-of-principle experiment. The fiscal year cost anticipated for the effort recommended by the Workshop Panel (excluding costs for computing) is estimated to be approximately $5.7M. CT theory is currently funded at a level of approximately $2.0M per year.

  12. Brief Assessment of Motor Function: Content Validity and Reliability of the Upper Extremity Gross Motor Scale

    Science.gov (United States)

    Cintas, Holly Lea; Parks, Rebecca; Don, Sarah; Gerber, Lynn

    2011-01-01

    Content validity and reliability of the Brief Assessment of Motor Function (BAMF) Upper Extremity Gross Motor Scale (UEGMS) were evaluated in this prospective, descriptive study. The UEGMS is one of five BAMF ordinal scales designed for quick documentation of gross, fine, and oral motor skill levels. Designed to be independent of age and…

  13. Proceeding of 1999-workshop on MHD computations 'study on numerical methods related to plasma confinement'

    Energy Technology Data Exchange (ETDEWEB)

    Kako, T.; Watanabe, T. [eds.

    2000-06-01

    These are the proceedings of the workshop 'Study on numerical methods related to plasma confinement', held at the National Institute for Fusion Science. In this workshop, theoretical and numerical analyses of possible plasma equilibria and their stability properties are presented. There are also various lectures on mathematical as well as numerical analyses related to the computational methods for fluid dynamics and plasma physics. Separate abstracts were presented for 13 of the papers in this report. The remaining 6 were considered outside the subject scope of INIS. (J.P.N.)

  14. Developing a Framework for Seamless Prediction of Sub-Seasonal to Seasonal Extreme Precipitation Events in the United States.

    Science.gov (United States)

    Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.

    2017-12-01

    Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on the sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), but adequate forecasting tools for prediction do not exist. Therefore, the goal of this newly funded project is an enhancement in understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models and workshops that engage stakeholders (water resource managers, emergency managers and tribal environmental professionals) in co-production of knowledge. The overarching result of this project will be predictive models to reduce of

  15. A Computer-Based Visual Analog Scale,

    Science.gov (United States)

    1992-06-01

    ... keys on the computer keyboard or other input device. The initial position of the arrow is always in the center of the scale to prevent biasing the... References: 1. Gift, A.G., "Visual Analogue Scales: Measurement of Subjective Phenomena," Nursing Research, Vol. 38, pp. 286-288, 1989. 2. Lundberg... 3. Menkes, D.B., Howard, R.C., Spears, G.F., and Cairns, E.R., "Salivary THC Following Cannabis Smoking Correlates With Subjective Intoxication and...

  16. Sensitivity of extreme precipitation to temperature: the variability of scaling factors from a regional to local perspective

    Science.gov (United States)

    Schroeer, K.; Kirchengast, G.

    2018-06-01

    Potential increases in extreme-rainfall-induced hazards in a warming climate have motivated studies to link precipitation intensities to temperature. Increases exceeding the Clausius-Clapeyron (CC) rate of 6-7% per °C are seen in short-duration, convective, high-percentile rainfall at midlatitudes, but the rates of change cease or revert at regionally variable threshold temperatures due to moisture limitations. It is unclear, however, what these findings mean in terms of the actual risk of extreme precipitation on a regional to local scale. When conditioning precipitation intensities on local temperatures, key influences on the scaling relationship, such as those from the annual cycle and regional weather patterns, need better understanding. Here we analyze these influences, using sub-hourly to daily precipitation data from a dense network of 189 stations in south-eastern Austria. We find that the temperature sensitivities in the mountainous western region are lower than in the eastern lowlands. This is due to the different weather patterns that cause extreme precipitation in these regions. Sub-hourly and hourly intensities intensify at super-CC and CC rates, respectively, up to temperatures of about 17 °C. However, we also find that, because of the regional and seasonal variability of the precipitation intensities, a smaller scaling factor can imply a larger absolute change in intensity. Our insights underline that temperature-precipitation scaling requires careful interpretation of the intent and setting of the study. When this is considered, conditional scaling factors can help to better understand which influences control the intensification of rainfall with temperature on a regional scale.
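
    A hedged sketch of the standard binning approach behind such scaling estimates: pair each event's intensity with the local temperature, compute a high percentile per temperature bin, and fit an exponential rate that can be compared with the 6-7% per °C CC rate. The synthetic data, bin width and percentile below are placeholders, not the station data used in the study.

        import numpy as np

        def scaling_factor(temps, intensities, q=0.99, bin_width=2.0):
            """Estimate the percent change of the q-quantile intensity per degree Celsius."""
            edges = np.arange(temps.min(), temps.max() + bin_width, bin_width)
            centers, quantiles = [], []
            for lo, hi in zip(edges[:-1], edges[1:]):
                sel = (temps >= lo) & (temps < hi)
                if sel.sum() >= 30:                       # require enough events per bin
                    centers.append(0.5 * (lo + hi))
                    quantiles.append(np.quantile(intensities[sel], q))
            slope, _ = np.polyfit(centers, np.log(quantiles), 1)   # log-linear fit
            return 100.0 * (np.exp(slope) - 1.0)                   # percent per degree C

        # Synthetic demo: intensities that grow by roughly 7% per degree C plus noise
        rng = np.random.default_rng(2)
        T = rng.uniform(0.0, 20.0, 20000)
        I = np.exp(0.068 * T) * rng.gamma(shape=0.5, scale=2.0, size=T.size)
        print(f"estimated scaling: {scaling_factor(T, I):.1f} % per degree C")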

  17. Proceedings of the Toronto TEAM/ACES workshop

    International Nuclear Information System (INIS)

    Turner, L.R.

    1991-03-01

    The third TEAM Workshop of the third round was held at Ontario Hydro in Toronto on 25-26 October 1990, immediately following the Conference on Electromagnetic Field Computation. This was the first joint workshop with ACES (Applied Computational Electromagnetics Society), whose goals are similar to TEAM's, but whose members tend to work at higher frequencies (antennas, propagation, and scattering). A fusion problem, the eddy current heating of the case of the Euratom Large Coil Project coil, was adapted as Problem 14 at the Oxford Workshop, and a solution to that problem was presented at Toronto by Oskar Biro of the Graz (Austria) University of Technology. Individual solutions were also presented for Problems 8 (Flaw in a Plate) and 9 (Moving Coil inside a Pipe). Five new solutions were presented to Problem 13 (DC Coil in a Ferromagnetic Yoke), and Koji Fujiwara of Okayama University summarized these solutions along with a similar number presented at Oxford. The solutions agreed well in the air but disagreed in the steel. Codes with a formulation in magnetic field strength or scalar potential underestimated the flux density in the steel, and codes based on flux density or vector potential overestimated it. Codes with edge elements appeared to do better than codes with nodal elements. These results stimulated considerable discussion; in my view, that was the most valuable result of the workshop.

  18. dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science

    Energy Technology Data Exchange (ETDEWEB)

    Livny, Miron [Univ. of Wisconsin, Madison, WI (United States)

    2018-01-22

    This report introduces publications presenting the results of a project that aimed to design a computational framework that enables computational experimentation at scale while supporting the model of "submit locally, compute globally". The project focused on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during the run.

  19. Contribution of large-scale midlatitude disturbances to hourly precipitation extremes in the United States

    Science.gov (United States)

    Barbero, Renaud; Abatzoglou, John T.; Fowler, Hayley J.

    2018-02-01

    Midlatitude synoptic weather regimes account for a substantial portion of annual precipitation accumulation as well as multi-day precipitation extremes across parts of the United States (US). However, little attention has been devoted to understanding how synoptic-scale patterns contribute to hourly precipitation extremes. A majority of 1-h annual maximum precipitation (AMP) events across the western US were found to be linked to two coherent midlatitude synoptic patterns: disturbances propagating along the jet stream, and cutoff upper-level lows. The influence of these two patterns on 1-h AMP varies geographically. Over 95% of 1-h AMP along the western coastal US were coincident with progressive midlatitude waves embedded within the jet stream, while over 30% of 1-h AMP across the interior western US were coincident with cutoff lows. Between 30% and 60% of 1-h AMP were coincident with the jet stream across the Ohio River Valley and southeastern US, whereas a majority of 1-h AMP over the rest of the central and eastern US were not found to be associated with either of these midlatitude synoptic features. Composite analyses for 1-h AMP days coincident with cutoff lows and the jet stream show that an anomalous moisture flux and upper-level dynamics are responsible for initiating instability and setting up an environment conducive to 1-h AMP events. While hourly precipitation extremes are generally thought to be purely convective in nature, this study shows that large-scale dynamics and baroclinic disturbances may also contribute to precipitation extremes on sub-daily timescales.

  20. Second Greenhouse Gas Information System Workshop

    Science.gov (United States)

    Boland, S. W.; Duren, R. M.; Mitchiner, J.; Rotman, D.; Sheffner, E.; Ebinger, M. H.; Miller, C. E.; Butler, J. H.; Dimotakis, P.; Jonietz, K.

    2009-12-01

    The second Greenhouse Gas Information System (GHGIS) workshop was held May 20-22, 2009 at the Sandia National Laboratories in Albuquerque, New Mexico. The workshop brought together 74 representatives from 28 organizations including U.S. government agencies, national laboratories, and members of the academic community to address issues related to the understanding, operational monitoring, and tracking of greenhouse gas emissions and carbon offsets. The workshop was organized by an interagency collaboration between NASA centers, DOE laboratories, and NOAA. It was motivated by the perceived need for an integrated interagency, community-wide initiative to provide information about greenhouse gas sources and sinks at policy-relevant temporal and spatial scales in order to significantly enhance the ability of national and regional governments, industry, and private citizens to implement and evaluate effective climate change mitigation policies. This talk provides an overview of the second Greenhouse Gas Information System workshop, presents its key findings, and discusses current status and next steps in this interagency collaborative effort.

  1. MATHEON Workshop 2013

    CERN Document Server

    Calderbank, Robert; Kutyniok, Gitta; Vybíral, Jan

    2015-01-01

    Since publication of the initial papers in 2006, compressed sensing has captured the imagination of the international signal processing community, and its mathematical foundations are nowadays quite well understood. Parallel to the progress in mathematics, the potential applications of compressed sensing have been explored by many international groups of, in particular, engineers and applied mathematicians, achieving very promising advances in various areas such as communication theory, imaging sciences, optics, radar technology, sensor networks, and tomography. Since many applications have reached a mature state, the research center MATHEON in Berlin, which focuses on "Mathematics for Key Technologies", invited leading researchers on applications of compressed sensing from mathematics, computer science, and engineering to the "MATHEON Workshop 2013: Compressed Sensing and its Applications" in December 2013. It was the first workshop specifically focusing on the applications of compressed sensing. This book featur...

  2. Computer simulations for the nano-scale

    International Nuclear Information System (INIS)

    Stich, I.

    2007-01-01

    A review of methods for computations at the nano-scale is presented. The paper should provide a convenient starting point into computations for the nano-scale as well as a more in-depth presentation for those already working in the field of atomic/molecular-scale modeling. The argument is divided into chapters covering the methods for description of (i) the electrons, (ii) the ions, and (iii) techniques for efficient solving of the underlying equations. A fairly broad view is taken, covering the Hartree-Fock approximation, density functional techniques and quantum Monte Carlo techniques for electrons. The customary quantum chemistry methods, such as post-Hartree-Fock techniques, are only briefly mentioned. Description of both classical and quantum ions is presented. The techniques cover Ehrenfest, Born-Oppenheimer, and Car-Parrinello dynamics. The strong and weak points, of both a principal and a technical nature, are analyzed. In the second part we introduce a number of applications to demonstrate the different approximations and techniques introduced in the first part. They cover a wide range of applications such as non-simple liquids, surfaces, molecule-surface interactions, applications in nanotechnology, etc. These more in-depth presentations, while certainly not exhaustive, should provide information on technical aspects of the simulations, typical parameters used, and ways of analyzing the huge amounts of data generated in these large-scale supercomputer simulations. (author)
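
    To make the ion-dynamics part concrete, here is a minimal velocity-Verlet propagation of classical ions on a fixed potential surface, in the spirit of Born-Oppenheimer dynamics; a Lennard-Jones pair potential stands in for forces that, in the methods reviewed, would come from a Hartree-Fock, density functional or quantum Monte Carlo calculation at every step. Units, parameters and the three-atom configuration are arbitrary.

        import numpy as np

        def lj_forces(pos, eps=1.0, sigma=1.0):
            """Lennard-Jones pair forces as a stand-in for ab initio forces."""
            f = np.zeros_like(pos)
            for i in range(len(pos)):
                for j in range(i + 1, len(pos)):
                    r = pos[i] - pos[j]
                    d2 = np.dot(r, r)
                    s6 = (sigma * sigma / d2) ** 3
                    fij = 24.0 * eps * (2.0 * s6 * s6 - s6) / d2 * r
                    f[i] += fij
                    f[j] -= fij
            return f

        def velocity_verlet(pos, vel, mass=1.0, dt=1.0e-3, steps=1000):
            f = lj_forces(pos)
            for _ in range(steps):
                vel += 0.5 * dt * f / mass
                pos += dt * vel
                f = lj_forces(pos)       # new forces at the updated positions
                vel += 0.5 * dt * f / mass
            return pos, vel

        pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [0.0, 1.2, 0.0]])
        vel = np.zeros_like(pos)
        print(velocity_verlet(pos, vel)[0])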

  3. Further outlooks: extremely uncomfortable; Die weiteren Aussichten: extrem ungemuetlich

    Energy Technology Data Exchange (ETDEWEB)

    Resenhoeft, T.

    2006-07-01

    Climate has changed dramatically over recent decades. Scientists dealing with extreme weather should not only stare at computer simulations. They also have to turn towards the psyche, take personal experiences seriously, know their statistics, put supposedly sensational reports into perspective and, last but not least, collect more data. (GL)

  4. Workshop on nuclear structure and decay data: Theory and evaluation, 2008

    International Nuclear Information System (INIS)

    Nichols, A.L.; McLaughlin, P.K.

    2008-06-01

    A two-week Workshop on Nuclear Structure and Decay Data under the auspices of the IAEA Nuclear Data Section was organised and held at the Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste, Italy from 28 April to 9 May 2008. This workshop constituted a further development of previous Nuclear Structure and Decay Data Workshops held in 2002, 2003, 2005 and 2006. The aims and contents of the 2008 workshop are summarized, along with the agenda, list of participants, comments and recommendations. All recent workshop material has been assembled in this INDC report, and is also freely available on CD-ROM (all relevant PowerPoint presentations and manuals along with appropriate computer codes). (author)

  5. Parallel Computational Fluid Dynamics 2007 : Implementations and Experiences on Large Scale and Grid Computing

    CERN Document Server

    2009-01-01

    At the 19th Annual Conference on Parallel Computational Fluid Dynamics, held in Antalya, Turkey, in May 2007, the most recent developments and implementations of large-scale and grid computing were presented. This book, comprising the invited and selected papers of this conference, details those advances, which are of particular interest to CFD and CFD-related communities. It also offers results for applications to various scientific and engineering problems involving flows and flow-related topics. Intended for CFD researchers and graduate students, this book is a state-of-the-art presentation of the relevant methodology and implementation techniques of large-scale computing.

  6. Report on a workshop on the application of thermoluminescence dosimetry to large scale individual monitoring, Ispra, 11-13 September 1985

    International Nuclear Information System (INIS)

    Barthe, J.R.; Christensen, P.; Driscoll, C.M.H.; Marshall, T.O.; Harvey, J.R.; Julius, H.W.; Marshall, M.; Oberhoffer, M.

    1987-01-01

    The workshop was held for the benefit of those actually involved in the operation of large scale automatic thermoluminescence dosimetry systems. It was organised around three overall themes subdivided into 13 subject headings: External constraints: User requirements, Quantities and Units, Legal requirements, Testing, Intercomparisons; Dosimetry systems: Materials, Dosemeter design, Reading systems, Annealing procedures, Rogue readings; Management: Distribution and organisation, Reporting and record keeping, Financial aspects. (author)

  7. Kinetic turbulence simulations at extreme scale on leadership-class systems

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Bei [Princeton Univ., Princeton, NJ (United States); Ethier, Stephane [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Tang, William [Princeton Univ., Princeton, NJ (United States); Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Williams, Timothy [Argonne National Lab. (ANL), Argonne, IL (United States); Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Madduri, Kamesh [The Pennsylvania State Univ., University Park, PA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-01-01

    Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20-billion-dollar international burning-plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q, on 786,432 cores of Mira at ALCF and recently on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems by enabling routine simulations at unprecedented size (130 million grid points, ITER scale) and resolution (65 billion particles).

  8. Critical exponents of extremal Kerr perturbations

    Science.gov (United States)

    Gralla, Samuel E.; Zimmerman, Peter

    2018-05-01

    We show that scalar, electromagnetic, and gravitational perturbations of extremal Kerr black holes are asymptotically self-similar under the near-horizon, late-time scaling symmetry of the background metric. This accounts for the Aretakis instability (growth of transverse derivatives) as a critical phenomenon associated with the emergent symmetry. We compute the critical exponent of each mode, which is equivalent to its decay rate. It follows from symmetry arguments that, despite the growth of transverse derivatives, all generally covariant scalar quantities decay to zero.

  9. Workshop Physics Activity Guide, Module 4: Electricity and Magnetism

    Science.gov (United States)

    Laws, Priscilla W.

    2004-05-01

    The Workshop Physics Activity Guide is a set of student workbooks designed to serve as the foundation for a two-semester calculus-based introductory physics course. It consists of 28 units that interweave text materials with activities that include prediction, qualitative observation, explanation, equation derivation, mathematical modeling, quantitative experiments, and problem solving. Students use a powerful set of computer tools to record, display, and analyze data, as well as to develop mathematical models of physical phenomena. The design of many of the activities is based on the outcomes of physics education research. The Workshop Physics Activity Guide is supported by an Instructor's Website that: (1) describes the history and philosophy of the Workshop Physics Project; (2) provides advice on how to integrate the Guide into a variety of educational settings; (3) provides information on computer tools (hardware and software) and apparatus; and (4) includes suggested homework assignments for each unit. Log on to the Workshop Physics Project website at http://physics.dickinson.edu/. Workshop Physics is a component of the Physics Suite, a collection of materials created by a group of educational reformers known as the Activity Based Physics Group. The Physics Suite contains a broad array of curricular materials that are based on physics education research, including: Understanding Physics, by Cummings, Laws, Redish and Cooney (an introductory textbook based on the best-selling text by Halliday/Resnick/Walker); RealTime Physics Laboratory Modules; Physics by Inquiry (intended for use in a workshop setting); Interactive Lecture Demonstrations; Tutorials in Introductory Physics; and Activity Based Tutorials (designed primarily for use in recitations).

  10. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  11. WWER-1000 reactor simulator. Workshop material

    International Nuclear Information System (INIS)

    2003-01-01

    The International Atomic Energy Agency (IAEA) has established an activity in nuclear reactor simulation computer programs to assist its Member States in education. The objective is to provide, for a variety of advanced reactor types, insight and practice in their operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the development and distribution of simulation programs and educational material and sponsors courses and workshops. The workshops are in two parts: techniques and tools for reactor simulator development; and the use of reactor simulators in education. Workshop material for the first part is covered in the IAEA publication: Training Course Series 12, 'Reactor Simulator Development' (2001). Course material for workshops using a pressurized water reactor (PWR) Simulator developed for the IAEA by Cassiopeia Technologies Inc. of Canada is presented in the IAEA publication: Training Course Series No. 22 'Pressurized Water Reactor Simulator' (2003) and Training Course Series No. 23 'Boiling Water Reactor Simulator' (2003). This report consists of course material for workshops using the WWER-1000 Reactor Department Simulator from the Moscow Engineering and Physics Institute, Russian Federation. N. V. Tikhonov and S. B. Vygovsky of the Moscow Engineering and Physics Institute prepared this report for the IAEA

  12. Modelling of spatio-temporal precipitation relevant for urban hydrology with focus on scales, extremes and climate change

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen

    ...correlation lengths for sub-daily extreme precipitation, besides having too low intensities. Especially the wrong spatial correlation structure is disturbing from an urban hydrological point of view, as short-term extremes will cover too much ground if derived directly from bias-corrected regional climate model output... of precipitation are compared and used to rank climate models with respect to performance metrics. The four different observational data sets themselves are compared at daily temporal scale with respect to climate indices for mean and extreme precipitation. Data density seems to be a crucial parameter for good... happening in summer and most of the daily extremes in fall. This behaviour is in good accordance with reality, where short-term extremes originate in convective precipitation cells that occur when it is very warm and longer-term extremes originate in frontal systems that dominate the fall and winter seasons...

  13. PREFACE: Workshop on 'Buried' Interface Science with X-rays and Neutrons

    Science.gov (United States)

    Sakurai, Kenji

    2007-06-01

    The 2007 workshop on `buried' interface science with X-rays and neutrons was held at the Institute of Materials Research, Tohoku University, in Sendai, Japan, on July 22-24, 2007. The workshop was the latest in a series held since 2001; Tsukuba (December 2001), Niigata (September 2002), Nagoya (July 2003), Tsukuba (July 2004), Saitama (March 2005), Yokohama (July 2006), Kusatsu (August 2006) and Tokyo (December 2006). The 2007 workshop had 64 participants and 34 presentations. There are increasing demands for sophisticated metrology in order to observe multilayered materials with nano-structures (dots, wires, etc), which are finding applications in electronic, magnetic, optical and other devices. Unlike many other surface-sensitive methods, X-ray and neutron analysis is known for its ability to see even `buried' function interfaces as well as the surface. It is highly reliable in practice, because the information, which ranges from the atomic to mesoscopic scale, is quantitative and reproducible. The non-destructive nature of this type of analytical method ensures that the same specimen can be measured by other techniques. However, we now realize that the method should be upgraded further to cope with more realistic problems in nano sciences and technologies. In the case of the reflectivity technique and other related methods, which have been the main topics in our workshops over the past 7 years, there are three important directions as illustrated in the Figure. Current X-ray methods can give atomic-scale information for quite a large area on a scale of mm2-cm2. These methods can deliver good statistics for an average, but sometimes we need to be able to see a specific part in nano-scale rather than an average structure. In addition, there is a need to see unstable changing structures and related phenomena in order to understand more about the mechanism of the functioning of nano materials. Quick measurements are therefore important. Furthermore, in order to apply

  14. Research Directions for Cyber Experimentation: Workshop Discussion Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    DeWaard, Elizabeth [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Deccio, Casey [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fritz, David Jakob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Sandia National Laboratories hosted a workshop on August 11, 2017 entitled "Research Directions for Cyber Experimentation," which focused on identifying and addressing research gaps within the field of cyber experimentation, particularly emulation testbeds. This report mainly documents the discussion toward the end of the workshop, which included research gaps such as developing a sustainable research infrastructure, expanding cyber experimentation, and making the field more accessible to subject matter experts who may not have a background in computer science. Other gaps include methodologies for rigorous experimentation, validation, and uncertainty quantification, which, if addressed, also have the potential to bridge the gap between cyber experimentation and cyber engineering. Workshop attendees presented various ways to overcome these research gaps; however, the main conclusion for overcoming these gaps is better communication through increased workshops, conferences, email lists, and Slack channels, among other opportunities.

  15. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    dynamics (NAO, ENSO) on total ozone is a global feature in the northern mid-latitudes (Rieder et al., 2010c). In a next step frequency distributions of extreme events are analyzed on global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through analysis of long-term European ground based stations can be clearly identified as a global phenomenon. By showing results from these three types of studies an overview of extreme events in total ozone (and the dynamical and chemical features leading to those) will be presented from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, Peter, T., and A.D., Davison (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa
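
    The peaks-over-threshold machinery referenced above (Coles, 2001; Ribatet, 2007) can be sketched in a few lines: exceedances over a high threshold are fitted with a generalized Pareto distribution whose shape and scale parameters characterize the tail, and return levels follow from the fit. The synthetic daily ozone values, threshold choice and return period below are illustrative only, not results from the Arosa or other station series.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(3)
        ozone = rng.normal(330.0, 40.0, 20000)          # synthetic daily total ozone values (DU)

        threshold = np.quantile(ozone, 0.95)            # high-percentile threshold
        exceedances = ozone[ozone > threshold] - threshold

        # Fit the generalized Pareto distribution to the exceedances (location fixed at 0)
        shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
        print(f"threshold = {threshold:.1f} DU, GPD shape = {shape:.3f}, scale = {scale:.1f}")

        # Approximate 100-day return level (exceeded on average once per 100 days)
        rate = exceedances.size / ozone.size
        return_level = threshold + genpareto.ppf(1.0 - 1.0 / (100.0 * rate),
                                                 shape, loc=0.0, scale=scale)
        print(f"100-day return level approx. {return_level:.1f} DU")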

  16. Workshop on Subcritical Neutron Production

    International Nuclear Information System (INIS)

    Walter Sadowski; Roald Sagdeev

    2006-01-01

    Executive Summary of the Workshop on Subcritical Neutron Production. A workshop on Subcritical Neutron Production was sponsored by the East-West Center of the University of Maryland on October 11-13, 2004. The subject of the workshop was the application of subcritical neutrons to the transmutation of actinides. The workshop was attended by members of the fission, accelerator and fusion communities. Papers on the state of development of neutron production by accelerators, fusion devices, and fission reactors were presented. Discussions were held on the potential of these technologies to solve the problems of spent nuclear waste storage and nuclear non-proliferation presented by current and future nuclear power reactors. A list of participants, including their affiliations and e-mail addresses, is attached. The workshop concluded that the technologies, presently available or under development, hold out the exciting possibility of improving the environmental quality and long-term energy resources of nuclear power while strengthening proliferation resistance. The workshop participants agreed on the following statements. The workshop considered a number of technologies to deal with spent nuclear fuels and current actinide inventories. The conclusion was reached that a substantial increase in nuclear power production will require that the issue of spent nuclear fuel be resolved. The workshop concluded that 14 MeV fusion neutrons can be used to destroy nuclear reactor by-products, some of which would otherwise have to be stored for geologic periods of time. The production of 14 MeV neutrons is based on existing fusion technologies at different research institutions in several countries around the world. At the present time this technology is used to produce 14 MeV neutrons in JET. More development work will be required, however, to bring fusion technology to the level where it can be used for actinide burning on an industrial scale. The workshop concluded that the potential

  17. Summary of the CSRI Workshop on Combinatorial Algebraic Topology (CAT): Software, Applications, & Algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Visualization and Scientific Computing Dept.; Day, David Minot [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Applied Mathematics and Applications Dept.; Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computer Science and Informatics Dept.

    2009-11-20

    This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.

  18. Proceedings of the workshop on data acquisition system for high energy physics

    International Nuclear Information System (INIS)

    Hayano, R.S.

    1984-09-01

    The workshop ''Data acquisition system for high energy experiment'' was held by the on-line electronics group of KEK in cooperation with the data processing section on May 28-29, 1984, at KEK. This year the proton synchrotron is being prepared for operation after its shutdown, and full-scale construction of the data acquisition system for TRISTAN is under way. One of the major topics was the TKO box proposed by the data acquisition development group of KEK, and its specification is included in this book. The workshop covered a very wide range of subjects, from front-end electronics to large computers, and talks were given on flash analog-to-digital converters and the latest data communication techniques. As a new trial, wish lists on the future development and support of on-line electronics and other matters were collected from the participants and deliberated by all the members; the contents of that discussion are given in this book. The summaries of the lectures presented at the meeting are also collected in this book. Since interchange with the experimental groups is indispensable for the activities of the on-line electronics group, workshops like this one will be held hereafter. (Kako, I.)

  19. Cloud computing as a new technology trend in education

    OpenAIRE

    Шамина, Ольга Борисовна; Буланова, Татьяна Валентиновна

    2014-01-01

    The construction and operation of extremely large-scale, commodity-computer datacenters was the key necessary enabler of Cloud Computing. Cloud Computing could offer services that can be used to good effect in education. With Cloud Computing it is possible to increase the quality of education, improve communicative culture and give teachers and students new application opportunities.

  20. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  1. Improving Nigerian health policymakers' capacity to access and utilize policy relevant evidence: outcome of information and communication technology training workshop.

    Science.gov (United States)

    Uneke, Chigozie Jesse; Ezeoha, Abel Ebeh; Uro-Chukwu, Henry; Ezeonu, Chinonyelum Thecla; Ogbu, Ogbonnaya; Onwe, Friday; Edoga, Chima

    2015-01-01

    Information and communication technology (ICT) tools are known to facilitate communication and processing of information and sharing of knowledge by electronic means. In Nigeria, the lack of adequate capacity on the use of ICT by health sector policymakers constitutes a major impediment to the uptake of research evidence into the policymaking process. The objective of this study was to improve the knowledge and capacity of policymakers to access and utilize policy relevant evidence. A modified "before and after" intervention study design was used in which outcomes were measured on the target participants both before and after the intervention was implemented. A 4-point Likert scale graded by degree of adequacy (1 = grossly inadequate, 4 = very adequate) was employed. This study was conducted in Ebonyi State, south-eastern Nigeria and the participants were career health policy makers. A two-day intensive ICT training workshop was organized for policymakers, with 52 participants in attendance. Topics covered included: (i) intersectoral partnership/collaboration; (ii) engaging ICT in evidence-informed policy making; (iii) use of ICT for evidence synthesis; and (iv) capacity development on the use of computers, the internet and other ICT. The pre-workshop mean of knowledge and capacity for use of ICT ranged from 2.19-3.05, while the post-workshop mean ranged from 2.67-3.67 on the 4-point scale. The percentage increase in mean knowledge and capacity at the end of the workshop ranged from 8.3%-39.1%. Findings of this study suggest that policymakers' ICT competence relevant to evidence-informed policymaking can be enhanced through a training workshop.

  2. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C

    2007-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.
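
    The MAGIC-5 dot-enhancement filter itself is not reproduced here, but the general idea of multi-scale blob enhancement can be sketched with a scale-normalized Laplacian-of-Gaussian response maximized over scales; the scale range and the synthetic volume below are illustrative assumptions only.

```python
import numpy as np
from scipy import ndimage

def multiscale_dot_enhancement(volume, sigmas=(1.0, 2.0, 4.0)):
    """Voxel-wise blob-likeness: maximum over scales of the scale-normalized
    negative Laplacian of Gaussian (bright, roughly spherical spots score high)."""
    response = np.full(volume.shape, -np.inf)
    for sigma in sigmas:
        # The sigma**2 normalization makes responses comparable across scales.
        log = -sigma ** 2 * ndimage.gaussian_laplace(volume, sigma=sigma)
        response = np.maximum(response, log)
    return response

# Toy example: a synthetic volume containing one bright, nodule-like spot.
vol = np.zeros((64, 64, 64))
vol[32, 32, 32] = 1.0
vol = ndimage.gaussian_filter(vol, sigma=2.0)
candidates = multiscale_dot_enhancement(vol)
print("peak response at voxel:", np.unravel_index(candidates.argmax(), vol.shape))
```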

  3. Reports on the 2013 Workshop Program of the Seventh International AAAI Conference on Weblogs and Social Media

    OpenAIRE

    Archambault, Daniel; Celli, Fabio; Daly, Elizabeth M.; Erickson, Ingrid; Geyer, Werner; Halegoua, Germaine; Keegan, Brian; Millen, David R.; Schwartz, Raz; Shami, N. Sadat

    2013-01-01

    The Workshop Program of the Seventh International AAAI Conference on Weblogs and Social Media was held July 11, 2013, in Cambridge, Massachusetts. The program included four workshops: Computational Personality Recognition (Shared Task) (WS-13-01), Social Computing for Workforce 2.0 (WS-13-02), Social Media Visualization 2 (WS-13-03), and When the City Meets the Citizen (WS-13-04). This report summarizes the activities of the four workshops.

  4. Large Scale Influences on Summertime Extreme Precipitation in the Northeastern United States

    Science.gov (United States)

    Collow, Allison B. Marquardt; Bosilovich, Michael G.; Koster, Randal Dean

    2016-01-01

    Observations indicate that over the last few decades there has been a statistically significant increase in precipitation in the northeastern United States and that this can be attributed to an increase in precipitation associated with extreme precipitation events. Here a state-of-the-art atmospheric reanalysis is used to examine such events in detail. Daily extreme precipitation events defined at the 75th and 95th percentile from gridded gauge observations are identified for a selected region within the Northeast. Atmospheric variables from the Modern-Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), are then composited during these events to illustrate the time evolution of associated synoptic structures, with a focus on vertically integrated water vapor fluxes, sea level pressure, and 500-hectopascal heights. Anomalies of these fields move into the region from the northwest, with stronger anomalies present in the 95th percentile case. Although previous studies show tropical cyclones are responsible for the most intense extreme precipitation events, only 10 percent of the events in this study are caused by tropical cyclones. On the other hand, extreme events resulting from cutoff low pressure systems have increased. The time period of the study was divided in half to determine how the mean composite has changed over time. An arc of lower sea level pressure along the East Coast and a change in the vertical profile of equivalent potential temperature suggest a possible increase in the frequency or intensity of synoptic-scale baroclinic disturbances.
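
    A minimal sketch of the percentile-based event selection and compositing described above is given below; the synthetic precipitation series, the wet-day cutoff and the field being composited are placeholders, not the MERRA-2 processing itself.

```python
import numpy as np

# Placeholder daily data for one region: precipitation (mm/day) and a time-aligned
# atmospheric field to composite (e.g., sea level pressure anomalies on a lat/lon grid).
rng = np.random.default_rng(1)
ndays = 5000
precip = rng.gamma(shape=0.8, scale=4.0, size=ndays)
slp_anom = rng.standard_normal((ndays, 20, 30))      # (time, lat, lon)

# Define extreme-precipitation days at the 95th percentile of wet days (assumed > 1 mm/day).
wet = precip > 1.0
p95 = np.percentile(precip[wet], 95)
event_days = np.where(precip >= p95)[0]

# Composite the atmospheric field over the event days (time lags are handled the same way).
composite = slp_anom[event_days].mean(axis=0)
print(f"{event_days.size} events above {p95:.1f} mm/day; composite grid shape {composite.shape}")
```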

  5. Report of the workshop on Aviation Safety/Automation Program

    Science.gov (United States)

    Morello, Samuel A. (Editor)

    1990-01-01

    As part of NASA's responsibility to encourage and facilitate active exchange of information and ideas among members of the aviation community, an Aviation Safety/Automation workshop was organized and sponsored by the Flight Management Division of NASA Langley Research Center. The one-day workshop was held on October 10, 1989, at the Sheraton Beach Inn and Conference Center in Virginia Beach, Virginia. Participants were invited from industry, government, and universities to discuss critical questions and issues concerning the rapid introduction and utilization of advanced computer-based technology into the flight deck and air traffic controller workstation environments. The workshop was attended by approximately 30 discipline experts, automation and human factors researchers, and research and development managers. The goal of the workshop was to address major issues identified by the NASA Aviation Safety/Automation Program. Here, the results of the workshop are documented. The ideas, thoughts, and concepts were developed by the workshop participants. The findings, however, have been synthesized into a final report primarily by the NASA researchers.

  6. A direct method for computing extreme value (Gumbel) parameters for gapped biological sequence alignments.

    Science.gov (United States)

    Quinn, Terrance; Sinkala, Zachariah

    2014-01-01

    We develop a general method for computing extreme value distribution (Gumbel, 1958) parameters for gapped alignments. Our approach uses mixture distribution theory to obtain associated BLOSUM matrices for gapped alignments, which in turn are used for determining significance of gapped alignment scores for pairs of biological sequences. We compare our results with parameters already obtained in the literature.
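
    The authors' mixture-distribution derivation is not reproduced here; as a generic illustration of how Gumbel parameters are used to assess alignment significance, the sketch below fits a Gumbel (extreme value type I) distribution to a placeholder set of null alignment scores and converts an observed score into a p-value.

```python
import numpy as np
from scipy import stats

# Placeholder optimal gapped-alignment scores from many pairs of shuffled (random)
# sequences; in practice these would come from an actual aligner.
rng = np.random.default_rng(2)
null_scores = stats.gumbel_r.rvs(loc=20.0, scale=4.0, size=2000, random_state=rng)

# Fit the Gumbel location and scale parameters to the null score distribution.
loc, scale = stats.gumbel_r.fit(null_scores)

# Significance of an observed alignment score under the fitted null distribution.
observed = 42.0
p_value = stats.gumbel_r.sf(observed, loc=loc, scale=scale)
print(f"Gumbel fit: loc={loc:.2f}, scale={scale:.2f}; P(S >= {observed}) = {p_value:.2e}")
```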

  7. Maintaining SCALE as a reliable computational system for criticality safety analysis

    International Nuclear Information System (INIS)

    Bowmann, S.M.; Parks, C.V.; Martin, S.K.

    1995-01-01

    Accurate and reliable computational methods are essential for nuclear criticality safety analyses. The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer code system was originally developed at Oak Ridge National Laboratory (ORNL) to enable users to easily set up and perform criticality safety analyses, as well as shielding, depletion, and heat transfer analyses. Over the fifteen-year life of SCALE, the mainstay of the system has been the criticality safety analysis sequences that have featured the KENO-IV and KENO-V.A Monte Carlo codes and the XSDRNPM one-dimensional discrete-ordinates code. The criticality safety analysis sequences provide automated material and problem-dependent resonance processing for each criticality calculation. This report details configuration management, which is essential because SCALE consists of more than 25 computer codes (referred to as modules) that share libraries of commonly used subroutines. Changes to a single subroutine in some cases affect almost every module in SCALE! Controlled access to program source and executables and accurate documentation of modifications are essential to maintaining SCALE as a reliable code system. The modules and subroutine libraries in SCALE are programmed by a staff of approximately ten Code Managers. The SCALE Software Coordinator maintains the SCALE system and is the only person who modifies the production source, executables, and data libraries. All modifications must be authorized by the SCALE Project Leader prior to implementation.

  8. VI International Workshop on Nature Inspired Cooperative Strategies for Optimization

    CERN Document Server

    Otero, Fernando; Masegosa, Antonio

    2014-01-01

    Biological and other natural processes have always been a source of inspiration for computer science and information technology. Many emerging problem solving techniques integrate advanced evolution and cooperation strategies, encompassing a range of spatio-temporal scales for visionary conceptualization of evolutionary computation. This book is a collection of research works presented in the VI International Workshop on Nature Inspired Cooperative Strategies for Optimization (NICSO) held in Canterbury, UK. Previous editions of NICSO were held in Granada, Spain (2006 & 2010), Acireale, Italy (2007), Tenerife, Spain (2008), and Cluj-Napoca, Romania (2011). NICSO 2013 and this book provide a place where state-of-the-art research, the latest ideas and emerging areas of nature inspired cooperative strategies for problem solving are vigorously discussed and exchanged among the scientific community. The breadth and variety of articles in this book report on nature inspired methods and applications such as Swarm In...

  9. National Postirradiation Examination Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Schulthess, Jason L

    2011-06-01

    A National Post-Irradiation-Examination (PIE) Workshop was held March 29-30, 2011, in Washington D.C., stimulated by the DOE Acting Assistant Secretary for Nuclear Energy approval on January 31, 2011 of the “Mission Need Statement for Advanced Post-Irradiation Examination Capability”. As stated in the Mission Need, “A better understanding of nuclear fuels and material performance in the nuclear environment, at the nanoscale and lower, is critical to the development of innovative fuels and materials required for tomorrow’s nuclear energy systems.” (2011) Developing an advanced post-irradiation capability is the most important thing we can do to advance nuclear energy as an option for meeting national energy goals. Understanding the behavior of fuels and materials in a nuclear reactor irradiation environment is the limiting factor in nuclear plant safety, longevity, efficiency, and economics. The National PIE Workshop is part of fulfilling or addressing Department of Energy (DOE) missions in safe and publicly acceptable nuclear energy. Several presentations were given during the opening of the workshop. Generally speaking, these presentations established that we cannot continue to rely on others in the world to provide the capabilities we need to move forward with nuclear energy technology. These presentations also generally identified the need for increased microstructural understanding of fuels and materials to be coupled with modeling and simulation, and increased accessibility and infrastructure to facilitate the interaction between national laboratories and participating organizations. The overall results of the work of the presenters and panels were distilled into four primary needs: 1. Understanding material changes in the extreme nuclear environment at the nanoscale. Nanoscale studies have significant importance due to the mechanisms that cause materials to degrade, which actually occur on the nanoscale. 2. Enabling additional proficiency in

  10. Support for the Core Research Activities and Studies of the Computer Science and Telecommunications Board (CSTB)

    Energy Technology Data Exchange (ETDEWEB)

    Jon Eisenberg, Director, CSTB

    2008-05-13

    The Computer Science and Telecommunications Board of the National Research Council considers technical and policy issues pertaining to computer science (CS), telecommunications, and information technology (IT). The functions of the board include: (1) monitoring and promoting the health of the CS, IT, and telecommunications fields, including attention as appropriate to issues of human resources and funding levels and program structures for research; (2) initiating studies involving CS, IT, and telecommunications as critical resources and sources of national economic strength; (3) responding to requests from the government, non-profit organizations, and private industry for expert advice on CS, IT, and telecommunications issues; and to requests from the government for expert advice on computer and telecommunications systems planning, utilization, and modernization; (4) fostering interaction among CS, IT, and telecommunications researchers and practitioners, and with other disciplines; and providing a base of expertise in the National Research Council in the areas of CS, IT, and telecommunications. This award has supported the overall operation of CSTB. Reports resulting from the Board's efforts have been widely disseminated in both electronic and print form, and all CSTB reports are available at its World Wide Web home page at cstb.org. The following reports, resulting from projects that were separately funded by a wide array of sponsors, were completed and released during the award period: 2007: * Summary of a Workshop on Software-Intensive Systems and Uncertainty at Scale * Social Security Administration Electronic Service Provision: A Strategic Assessment * Toward a Safer and More Secure Cyberspace * Software for Dependable Systems: Sufficient Evidence? * Engaging Privacy and Information Technology in a Digital Age * Improving Disaster Management: The Role of IT in Mitigation, Preparedness, Response, and Recovery 2006: * Renewing U.S. Telecommunications

  11. Highlights of the Workshop

    Science.gov (United States)

    Noor, Ahmed K.

    1997-01-01

    Economic stresses are forcing many industries to reduce cost and time-to-market, and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce the design and development times. Several computational tools and facilities have been developed to support the design process. Some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities which have high potential for use in future design environment for aerospace systems. The outline for the introductory remarks is given. First, the characteristics and design drivers for future aerospace systems are outlined; second, simulation-based design environment, and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center and JPL is presented. The anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.

  12. Presentations from the Aeroelastic Workshop – latest results from AeroOpt

    DEFF Research Database (Denmark)

    Hansen, Morten Hartvig

    This report contains the slides of the presentations at the Aeroelastic Workshop held at Risø-DTU for the wind energy industry in Denmark on January 27, 2011. The scientific part of the agenda at this workshop was • Anisotropic beam element in HAWC2 for modelling of composite lay-ups (Taeseong Kim...... (Robert Mikkelsen) • Potential of fatigue and extreme load reductions on swept blades using HAWC2 (David Verelst) • Aeroelastic modal analysis of backward swept blades using HAWCStab2 (Morten H. Hansen) • Aeroelastic rotor design minimizing the loads (Christian Bak) • A small study of flat back airfoils...

  13. Proceedings of the workshop on high resolution computed microtomography (CMT)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    The purpose of the workshop was to determine the status of the field, to define instrumental and computational requirements, and to establish minimum specifications required by possible users. The most important message sent by implementers was the reminder that CMT is a tool. It solves a wide spectrum of scientific problems and is complementary to other microscopy techniques, with certain important advantages that the other methods do not have. High-resolution CMT can be used non-invasively and non-destructively to study a variety of hierarchical three-dimensional microstructures, which in turn control body function. X-ray computed microtomography can also be used at the frontiers of physics, in the study of granular systems, for example. With high-resolution CMT, for example, three-dimensional pore geometries and topologies of soils and rocks can be obtained readily and implemented directly in transport models. In turn, these geometries can be used to calculate fundamental physical properties, such as permeability and electrical conductivity, from first principles. Clearly, use of the high-resolution CMT technique will contribute tremendously to the advancement of current R and D technologies in the production, transport, storage, and utilization of oil and natural gas. It can also be applied to problems related to environmental pollution, particularly to spilling and seepage of hazardous chemicals into the Earth's subsurface. Applications to energy and environmental problems will be far-ranging and may soon extend to disciplines such as materials science--where the method can be used in the manufacture of porous ceramics, filament-resin composites, and microelectronics components--and to biomedicine, where it could be used to design biocompatible materials such as artificial bones, contact lenses, or medication-releasing implants. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  14. Topic 14+16: High-performance and scientific applications and extreme-scale computing (Introduction)

    KAUST Repository

    Downes, Turlough P.; Roller, Sabine P.; Seitsonen, Ari Paavo; Valcke, Sophie; Keyes, David E.; Sawley, Marie Christine; Schulthess, Thomas C.; Shalf, John M.

    2013-01-01

    and algorithms to address the varied, complex and increasing challenges of modern research throughout both the "hard" and "soft" sciences. This necessitates being able to use large numbers of compute nodes, many of which are equipped with accelerators

  15. HEPVIS96 workshop on visualization in high-energy physics. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L; Vandoni, C E [eds.

    1997-01-29

    This report constitutes the formal proceedings of the HEPVIS96 workshop on visualization in high-energy physics, which was held at CERN from 2nd to 4th of September 1996. The workshop, which is in the HEPVIS series, covered the topics of event visualization, computer graphics technologies and standards, and data analysis and visualization in high-energy physics. (orig.).

  16. HEPVIS96 workshop on visualization in high-energy physics. Proceedings

    International Nuclear Information System (INIS)

    Taylor, L.; Vandoni, C.E.

    1997-01-01

    This report constitutes the formal proceedings of the HEPVIS96 workshop on visualization in high-energy physics, which was held at CERN from 2nd to 4th of September 1996. The workshop, which is in the HEPVIS series, covered the topics of event visualization, computer graphics technologies and standards, and data analysis and visualization in high-energy physics. (orig.)

  17. Toward Improving Predictability of Extreme Hydrometeorological Events: the Use of Multi-scale Climate Modeling in the Northern High Plains

    Science.gov (United States)

    Munoz-Arriola, F.; Torres-Alavez, J.; Mohamad Abadi, A.; Walko, R. L.

    2014-12-01

    Our goal is to investigate possible sources of predictability of hydrometeorological extreme events in the Northern High Plains. Hydrometeorological extreme events are considered the most costly natural phenomena. Water deficits and surpluses highlight how the water-climate interdependence becomes crucial in areas where single activities drive economies, such as agriculture in the NHP. While we recognize the water-climate interdependence and the regulatory role that human activities play, we still grapple to identify what sources of predictability could be added to flood and drought forecasts. To identify the benefit of multi-scale climate modeling and the role of initial conditions in flood and drought predictability in the NHP, we use the Ocean Land Atmospheric Model (OLAM). OLAM is characterized by a dynamic core with a global geodesic grid with hexagonal (and variably refined) mesh cells, a finite volume discretization of the full compressible Navier Stokes equations, and a cut-grid cell method for topography (which reduces errors in gradient computation and anomalous vertical dispersion). Our hypothesis is that wet initial conditions will drive OLAM's simulations of precipitation toward wetter conditions, affecting both flood and drought forecasts. To test this hypothesis we simulate precipitation during identified historical flood events followed by drought events in the NHP (i.e. the 2011-2012 years). We initialized OLAM with CFS data 1-10 days prior to a flooding event (as initial conditions) to explore (1) short-term, high-resolution and (2) long-term, coarse-resolution simulations of flood and drought events, respectively. While floods are assessed during a maximum of 15 days of refined-mesh simulations, drought is evaluated during the following 15 months. Simulated precipitation will be compared with the Sub-continental Observation Dataset, a gridded 1/16th degree resolution dataset obtained from climatological stations in Canada, US, and

  18. The Second International Workshop on Bioprinting, Biopatterning and Bioassembly.

    Science.gov (United States)

    Mironov, Vladimir

    2005-08-01

    The Second International Workshop on Bioprinting, Biopatterning and Bioassembly was held at the Medical University of South Carolina (MUSC), located in the beautiful, historic city of Charleston. The workshop attracted > 50 participants from 10 different countries, including mechanical and chemical engineers, molecular, cell and developmental biologists, biophysicists, mathematicians, clinicians, humanists and artists. Bioprinting can be defined as computer-aided, automatic, layer-by-layer deposition, transfer and patterning of biologically relevant materials. The workshop goal was to gather the world's experts and leaders, present the latest results, assess future trends, explore new applications, and promote international collaborations and academic-industrial partnerships. The workshop demonstrated the multidisciplinary and global character of ongoing efforts in the development of bioprinting technology, galvanised an evolving community of bioprintists, and demonstrated feasibility as well as strong potential for a broad spectrum of applications of bioprinting technology. The Third International Workshop on Bioprinting, Biopatterning and Bioassembly is planned for Japan in 2006.

  19. REPORT ON FIRST INTERNATIONAL WORKSHOP ON ROBOTIC SURGERY IN THORACIC ONCOLOGY

    Directory of Open Access Journals (Sweden)

    Giulia Veronesi

    2016-10-01

    A workshop of experts from France, Germany, Italy and the United States took place at Humanitas Research Hospital Milan, Italy, on 10-11 February 2016, to examine techniques for and applications of robotic surgery to thoracic oncology. The main topics of presentation and discussion were: robotic surgery for lung resection; robot-assisted thymectomy; minimally invasive surgery for esophageal cancer; new developments in computer-assisted surgery and medical applications of robots; the challenge of costs; and future clinical research in robotic thoracic surgery. The following article summarizes the main contributions to the workshop. The Workshop consensus was that, since video-assisted thoracoscopic surgery (VATS) is becoming the mainstream approach to resectable lung cancer in North America and Europe, robotic surgery for thoracic oncology is likely to be embraced by an increasing number of thoracic surgeons, since it has technical advantages over VATS, including intuitive movements, tremor filtration, more degrees of manipulative freedom, motion scaling, and high definition stereoscopic vision. These advantages may make robotic surgery more accessible than VATS to trainees and experienced surgeons, and also lead to expanded indications. However the high costs of robotic surgery and absence of tactile feedback remain obstacles to widespread dissemination. A prospective multicentric randomized trial (NCT02804893) to compare robotic and VATS approaches to stage I and II lung cancer will start shortly.

  20. Report on First International Workshop on Robotic Surgery in Thoracic Oncology.

    Science.gov (United States)

    Veronesi, Giulia; Cerfolio, Robert; Cingolani, Roberto; Rueckert, Jens C; Soler, Luc; Toker, Alper; Cariboni, Umberto; Bottoni, Edoardo; Fumagalli, Uberto; Melfi, Franca; Milli, Carlo; Novellis, Pierluigi; Voulaz, Emanuele; Alloisio, Marco

    2016-01-01

    A workshop of experts from France, Germany, Italy, and the United States took place at Humanitas Research Hospital Milan, Italy, on February 10 and 11, 2016, to examine techniques for and applications of robotic surgery to thoracic oncology. The main topics of presentation and discussion were robotic surgery for lung resection; robot-assisted thymectomy; minimally invasive surgery for esophageal cancer; new developments in computer-assisted surgery and medical applications of robots; the challenge of costs; and future clinical research in robotic thoracic surgery. The following article summarizes the main contributions to the workshop. The Workshop consensus was that since video-assisted thoracoscopic surgery (VATS) is becoming the mainstream approach to resectable lung cancer in North America and Europe, robotic surgery for thoracic oncology is likely to be embraced by an increasing number of thoracic surgeons, since it has technical advantages over VATS, including intuitive movements, tremor filtration, more degrees of manipulative freedom, motion scaling, and high-definition stereoscopic vision. These advantages may make robotic surgery more accessible than VATS to trainees and experienced surgeons and also lead to expanded indications. However, the high costs of robotic surgery and absence of tactile feedback remain obstacles to widespread dissemination. A prospective multicentric randomized trial (NCT02804893) to compare robotic and VATS approaches to stages I and II lung cancer will start shortly.

  1. First Django Girls workshop in Geneva

    CERN Multimedia

    Julliard, Laure

    2016-01-01

    A Django girls workshop organised by the R0SEH1PSters community from Geneva and supported by the CERN diversity team and the IT department took place at IdeaSquare on 26th and 27th February. Django Girls is a volunteer-run organisation with hundreds of people contributing to bring more women without prior IT backgrounds to the Python and Django community. Python is a widely used general-purpose and dynamic programming language while Django is a high-level Python Web framework that makes it easier to build better Web apps more quickly and with less code. Over 155 free workshops in 125 cities and 57 countries have been organised worldwide regularly since 2014. The aim of the workshop was to introduce participants to the world of computer programming and technology by teaching them how to successfully create a blog application and deploy it to the internet.
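
    To illustrate the "less code" point, a Django blog application of the kind built in such workshops typically centres on a model like the sketch below; this is a generic example, not the exact tutorial code used at the event.

```python
# blog/models.py -- a minimal Django model for a blog post (illustrative only).
from django.db import models
from django.utils import timezone

class Post(models.Model):
    title = models.CharField(max_length=200)
    text = models.TextField()
    created_date = models.DateTimeField(default=timezone.now)
    published_date = models.DateTimeField(blank=True, null=True)

    def publish(self):
        # Mark the post as published now and persist the change.
        self.published_date = timezone.now()
        self.save()

    def __str__(self):
        return self.title
```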

  2. Advanced Computational Materials Science: Application to Fusion and Generation IV Fission Reactors (Workshop Report)

    Energy Technology Data Exchange (ETDEWEB)

    Stoller, RE

    2004-07-15

    The ''Workshop on Advanced Computational Materials Science: Application to Fusion and Generation IV Fission Reactors'' was convened to determine the degree to which an increased effort in modeling and simulation could help bridge the gap between the data that is needed to support the implementation of these advanced nuclear technologies and the data that can be obtained in available experimental facilities. The need to develop materials capable of performing in the severe operating environments expected in fusion and fission (Generation IV) reactors represents a significant challenge in materials science. There is a range of potential Gen-IV fission reactor design concepts and each concept has its own unique demands. Improved economic performance is a major goal of the Gen-IV designs. As a result, most designs call for significantly higher operating temperatures than the current generation of LWRs to obtain higher thermal efficiency. In many cases, the desired operating temperatures rule out the use of the structural alloys employed today. The very high operating temperature (up to 1000 C) associated with the NGNP is a prime example of an attractive new system that will require the development of new structural materials. Fusion power plants represent an even greater challenge to structural materials development and application. The operating temperatures, neutron exposure levels and thermo-mechanical stresses are comparable to or greater than those for proposed Gen-IV fission reactors. In addition, the transmutation products created in the structural materials by the high energy neutrons produced in the DT plasma can profoundly influence the microstructural evolution and mechanical behavior of these materials. Although the workshop addressed issues relevant to both Gen-IV and fusion reactor materials, much of the discussion focused on fusion; the same focus is reflected in this report. Most of the physical models and computational methods

  3. Simple, parallel, high-performance virtual machines for extreme computations

    International Nuclear Information System (INIS)

    Chokoufe Nejad, Bijan; Ohl, Thorsten; Reuter, Jurgen

    2014-11-01

    We introduce a high-performance virtual machine (VM) written in a numerically fast language like Fortran or C to evaluate very large expressions. We discuss the general concept of how to perform computations in terms of a VM and present specifically a VM that is able to compute tree-level cross sections for any number of external legs, given the corresponding byte code from the optimal matrix element generator, O'Mega. Furthermore, this approach allows the parallel computation of a single phase space point to be formulated in a simple and obvious way. We analyze the scaling behaviour with multiple threads as well as the benefits and drawbacks introduced by this method. Our implementation of a VM can run faster than the corresponding native, compiled code for certain processes and compilers, especially for very high multiplicities, and in general has runtimes of the same order of magnitude. By avoiding the tedious compile and link steps, which may fail for source code files of gigabyte sizes, new processes or complex higher order corrections that are currently out of reach could be evaluated with a VM given enough computing power.
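
    A toy illustration of the general idea of evaluating an expression from byte code rather than compiling it is given below; the instruction set and encoding are invented for this sketch and are unrelated to the actual O'Mega byte code or the Fortran/C implementation.

```python
import math

def run(bytecode, inputs):
    """Evaluate a stack-based byte-code program; 'inputs' holds external values."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH_CONST":
            stack.append(arg)
        elif op == "LOAD":                 # load an external input by index
            stack.append(inputs[arg])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "SIN":
            stack.append(math.sin(stack.pop()))
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack.pop()

# Byte code for sin(x0 * x1) + 2.0, generated once and re-run for many input points.
program = [("LOAD", 0), ("LOAD", 1), ("MUL", None), ("SIN", None),
           ("PUSH_CONST", 2.0), ("ADD", None)]
print(run(program, [0.3, 1.2]))
```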

  4. Characterization and prediction of extreme events in turbulence

    Science.gov (United States)

    Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.

    2017-11-01

    Extreme events in Nature such as tornadoes, large floods and strong earthquakes are rare but can have devastating consequences. The predictability of these events is very limited at present. Extreme events in turbulence are the very large events in small scales that are intermittent in character. We examine events in energy dissipation rate and enstrophy which are several tens to hundreds to thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence with Taylor Reynolds numbers spanning a decade, computed with different small scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with an aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than can be done by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).
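
    A minimal sketch of the event definition used above (values many times the mean) is shown below; the log-normal field and the factor of 100 are placeholders standing in for the DNS dissipation or enstrophy fields.

```python
import numpy as np

# Placeholder 3D snapshot of the energy dissipation rate; a log-normal field mimics
# the strong small-scale intermittency of real DNS data.
rng = np.random.default_rng(3)
dissipation = np.exp(1.5 * rng.standard_normal((128, 128, 128)))

# Label grid points where dissipation exceeds many times its mean value.
factor = 100.0
extreme_mask = dissipation > factor * dissipation.mean()
print(f"fraction of grid points above {factor:.0f}x the mean: {extreme_mask.mean():.2e}")
```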

  5. Optimization and large scale computation of an entropy-based moment closure

    Science.gov (United States)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

    We present computational advances and results in the implementation of an entropy-based moment closure, MN, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as PN, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load balancing issues in scaling the MN algorithm that do not appear for the PN algorithm. We also observe that in weak scaling tests, the ratio in time to solution of MN to PN decreases.

  6. Cryptography and computational number theory

    CERN Document Server

    Shparlinski, Igor; Wang, Huaxiong; Xing, Chaoping; Workshop on Cryptography and Computational Number Theory, CCNT'99

    2001-01-01

    This volume contains the refereed proceedings of the Workshop on Cryptography and Computational Number Theory, CCNT'99, which was held in Singapore during the week of November 22-26, 1999. The workshop was organized by the Centre for Systems Security of the National University of Singapore. We gratefully acknowledge the financial support from the Singapore National Science and Technology Board under the grant number RP960668/M. The idea for this workshop grew out of the recognition of the recent, rapid development in various areas of cryptography and computational number theory. The event followed the concept of the research programs at such well-known research institutions as the Newton Institute (UK), Oberwolfach and Dagstuhl (Germany), and Luminy (France). Accordingly, there were only invited lectures at the workshop with plenty of time for informal discussions. It was hoped and successfully achieved that the meeting would encourage and stimulate further research in information and computer s...

  7. PREFACE: Proceedings of the International Workshop on Current Challenges in Liquid and Glass Science, (The Cosener's House, Abingdon 10 12 January 2007)

    Science.gov (United States)

    Hannon, Alex C.; Salmon, Philip S.; Soper, Alan K.

    2007-10-01

    The workshop was held to discuss current experimental and theoretical challenges in liquid and glass science and to honour the contribution made by Spencer Howells (ISIS, UK) to the field of neutron scattering from liquids and glasses. The meeting was attended by 70 experimentalists, theorists and computer simulators from Europe, Japan and North America and comprised 34 oral presentations together with two lively poster sessions. Three major themes were discussed, namely (i) the glass transition and properties of liquids and glasses under extreme conditions; (ii) the complementarity of neutron and x-ray scattering techniques with other experimental methods; and (iii) the modelling of liquid and glass structure. These themes served to highlight (a) recent advances in neutron and x-ray instrumentation used to investigate liquid and glassy materials under extreme conditions; (b) the relationship between the results obtained from different experimental and theoretical/computational methods; and (c) the modern methods used to interpret experimental results. The presentations ranged from polyamorphism in liquids and glasses to protein folding in aqueous solution and included the dynamics of fresh and freeze-dried strawberries and red onions. The properties of liquid phosphorus were also memorably demonstrated! The formal highlight was the 'Spencerfest' dinner where Neil Cowlam (Sheffield, UK) gave an excellent after dinner speech. The organisation of the workshop benefited tremendously from the secretarial skills of Carole Denning (ISIS, UK). The financial support of the Council for the Central Laboratory of the Research Councils (CCLRC), the Liquids and Complex Fluids Group of the Institute of Physics, The ISIS Disordered Materials Group, the CCLRC Centre for Materials Physics and Chemistry and the CCLRC Centre for Molecular Structure and Dynamics is gratefully acknowledged. Finally, it is a pleasure to thank all the workshop participants whose lively contributions led

  8. Measurement Properties of the Lower Extremity Functional Scale: A Systematic Review.

    Science.gov (United States)

    Mehta, Saurabh P; Fulton, Allison; Quach, Cedric; Thistle, Megan; Toledo, Cesar; Evans, Neil A

    2016-03-01

    Systematic review of measurement properties. Many primary studies have examined the measurement properties, such as reliability, validity, and sensitivity to change, of the Lower Extremity Functional Scale (LEFS) in different clinical populations. A systematic review summarizing these properties for the LEFS may provide an important resource. To locate and synthesize evidence on the measurement properties of the LEFS and to discuss the clinical implications of the evidence. A literature search was conducted in 4 databases (PubMed, MEDLINE, Embase, and CINAHL), using predefined search terms. Two reviewers performed a critical appraisal of the included studies using a standardized assessment form. A total of 27 studies were included in the review, of which 18 achieved a very good to excellent methodological quality level. The LEFS scores demonstrated excellent test-retest reliability (intraclass correlation coefficients ranging between 0.85 and 0.99) and demonstrated the expected relationships with measures assessing similar constructs (Pearson correlation coefficient values of greater than 0.7). The responsiveness of the LEFS scores was excellent, as suggested by consistently high effect sizes (greater than 0.8) in patients with different lower extremity conditions. Minimal detectable change at the 90% confidence level (MDC90) for the LEFS scores varied between 8.1 and 15.3 across different reassessment intervals in a wide range of patient populations. The pooled estimate of the MDC90 was 6 points and the minimal clinically important difference was 9 points in patients with lower extremity musculoskeletal conditions, which are indicative of true change and clinically meaningful change, respectively. The results of this review support the reliability, validity, and responsiveness of the LEFS scores for assessing functional impairment in a wide array of patient groups with lower extremity musculoskeletal conditions.
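
    The MDC90 values quoted above come from the primary studies; for readers unfamiliar with the quantity, the sketch below shows the standard formula usually used to derive it from test-retest statistics (SEM = SD*sqrt(1 - ICC), MDC90 = 1.65*SEM*sqrt(2)). The numbers plugged in are illustrative, not taken from the review.

```python
import math

def mdc90(sd_baseline, icc):
    """Minimal detectable change at the 90% confidence level, via the
    standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    sem = sd_baseline * math.sqrt(1.0 - icc)
    return 1.65 * sem * math.sqrt(2.0)

# Illustrative inputs only: baseline SD of 12 LEFS points, test-retest ICC of 0.90.
print(f"MDC90 = {mdc90(sd_baseline=12.0, icc=0.90):.1f} LEFS points")
```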

  9. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. By and large, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, engaged in software technology R&D, and engaged in close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.

  10. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers, with the proper software, can multiply the performance of a single computer. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom built desktop computers can be arranged in a rack mount situation, gaining the space saving of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components, multiplying the performance of a single desktop machine while minimizing occupied space and still remaining cost effective.

  11. Combinations of large-scale circulation anomalies conducive to precipitation extremes in the Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Kašpar, Marek; Müller, Miloslav

    2014-01-01

    Roč. 138, March 2014 (2014), s. 205-212 ISSN 0169-8095 R&D Projects: GA ČR(CZ) GAP209/11/1990 Institutional support: RVO:68378289 Keywords : precipitation extreme * synoptic-scale cause * re-analysis * circulation anomaly Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 2.844, year: 2014 http://www.sciencedirect.com/science/article/pii/S0169809513003372

  12. Report of the workshop on particle process (first meeting). A report of the Yayoi study meeting

    International Nuclear Information System (INIS)

    1994-09-01

    In the Nuclear Engineering Research Laboratory of the University of Tokyo, more than 10 short-period workshops called Yayoi workshops have been held yearly as one of the activities of the joint utilization of the reactor 'Yayoi' and an electron linear accelerator by universities. In this report, the gists of the lectures given at the workshop on particle process, held on August 8, 1994, are summarized. The development of scientific and technological computations in the atomic energy field is briefly mentioned. The recent advance of numerical fluid dynamics is conspicuous, but it still includes many unsatisfactory points. This workshop was held to bring together computation methods using particles and computation methods that do not use grids, for application to fluids. Lectures were given on the SPH method in astrophysics, fragmentation of isothermal sheet-like clouds, the lattice Bhatnagar-Gross-Krook method for fluid dynamics and compressible, thermal and multi-phase models, analysis techniques for compressible and incompressible fluids including movable boundaries by the PIC method, the numerical computation of high Reynolds number flow by a gridless method, the development of a particle method for analyzing incompressible viscous flow accompanied by breakers, the calculation of neutron and photon transport by the Monte Carlo method using vector and parallel computers, and the paradigm of super-parallel computation. (K.I.)

  13. Front-end vision and multi-scale image analysis multi-scale computer vision theory and applications, written in Mathematica

    CERN Document Server

    Romeny, Bart M Haar

    2008-01-01

    Front-End Vision and Multi-Scale Image Analysis is a tutorial in multi-scale methods for computer vision and image processing. It builds on the cross fertilization between human visual perception and multi-scale computer vision (`scale-space') theory and applications. The multi-scale strategies recognized in the first stages of the human visual system are carefully examined, and taken as inspiration for the many geometric methods discussed. All chapters are written in Mathematica, a spectacular high-level language for symbolic and numerical manipulations. The book presents a new and effective

  14. Preliminary Computational Analysis of the (HIRENASD) Configuration in Preparation for the Aeroelastic Prediction Workshop

    Science.gov (United States)

    Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.

    2011-01-01

    This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to the HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of the grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.

  15. Pressurized water reactor simulator. Workshop material. 2. ed

    International Nuclear Information System (INIS)

    2005-01-01

    The International Atomic Energy Agency (IAEA) has established an activity in nuclear reactor simulation computer programs to assist its Member States in education. The objective is to provide, for a variety of advanced reactor types, insight and practice in their operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the development and distribution of simulation programs and educational material and sponsors courses and workshops. The workshops are in two parts: techniques and tools for reactor simulator development, and the use of reactor simulators in education. Workshop material for the first part is covered in the IAEA Training Course Series No. 12, 'Reactor Simulator Development' (2001). Course material for workshops using a WWER-1000 reactor department simulator from the Moscow Engineering and Physics Institute, the Russian Federation, is presented in the IAEA Training Course Series No. 21, 2nd edition, 'WWER-1000 Reactor Simulator' (2005). Course material for workshops using a boiling water reactor simulator developed for the IAEA by Cassiopeia Technologies Incorporated of Canada (CTI) is presented in the IAEA publication: Training Course Series No.23, 2nd edition, 'Boiling Water Reactor Simulator' (2005). This report consists of course material for workshops using a pressurized water reactor simulator

  16. Virginia Tech to host virtual reality, robotics, and web workshops for middle school students

    OpenAIRE

    Felker, Susan B.

    2004-01-01

    Three summer workshops on web development, virtual reality, and robotics will offer aspiring middle school web designers, writers, and computer scientists a high-tech learning adventure designed to teach skills in math, science, computers, and oral and written communication. Virginia Tech's Continuing and Professional Education and the Center for Instructional Technology Solutions in Industry and Education developed the workshops with support from Montgomery County and Salem schools. Classes ...

  17. Predicting Benefit from a Gestalt Therapy Marathon Workshop.

    Science.gov (United States)

    Healy, James; Dowd, E. Thomas

    1981-01-01

    Tested the utility of the Personal Orientation Inventory (POI), the Myers-Briggs Type Indicator, and the Girona Affect Scale in predicting the outcomes of a marathon Gestalt therapy workshop. Significant predictive equations were generated that use the POI to predict gains on the Girona Affect Scale. (Author/RC)

  18. Workshops and problems for benchmarking eddy current codes

    Energy Technology Data Exchange (ETDEWEB)

    Turner, L.R.; Davey, K.; Ida, N.; Rodger, D.; Kameari, A.; Bossavit, A.; Emson, C.R.I.

    1988-08-01

    A series of six workshops was held in 1986 and 1987 to compare eddy current codes, using six benchmark problems. The problems included transient and steady-state ac magnetic fields, close and far boundary conditions, magnetic and non-magnetic materials. All the problems were based either on experiments or on geometries that can be solved analytically. The workshops and solutions to the problems are described. Results show that many different methods and formulations give satisfactory solutions, and that in many cases reduced dimensionality or coarse discretization can give acceptable results while reducing the computer time required. A second two-year series of TEAM (Testing Electromagnetic Analysis Methods) workshops, using six more problems, is underway. 12 refs., 15 figs., 4 tabs.

  19. Workshops and problems for benchmarking eddy current codes

    International Nuclear Information System (INIS)

    Turner, L.R.; Davey, K.; Ida, N.; Rodger, D.; Kameari, A.; Bossavit, A.; Emson, C.R.I.

    1988-08-01

    A series of six workshops was held in 1986 and 1987 to compare eddy current codes, using six benchmark problems. The problems included transient and steady-state ac magnetic fields, close and far boundary conditions, magnetic and non-magnetic materials. All the problems were based either on experiments or on geometries that can be solved analytically. The workshops and solutions to the problems are described. Results show that many different methods and formulations give satisfactory solutions, and that in many cases reduced dimensionality or coarse discretization can give acceptable results while reducing the computer time required. A second two-year series of TEAM (Testing Electromagnetic Analysis Methods) workshops, using six more problems, is underway. 12 refs., 15 figs., 4 tabs

  20. The spatial return level of aggregated hourly extreme rainfall in Peninsular Malaysia

    Science.gov (United States)

    Shaffie, Mardhiyyah; Eli, Annazirin; Wan Zin, Wan Zawiah; Jemain, Abdul Aziz

    2015-07-01

    This paper is intended to ascertain the spatial pattern of extreme rainfall distribution in Peninsular Malaysia at several short time intervals, i.e., on an hourly basis. This research is motivated by historical records of extreme rainfall in Peninsular Malaysia, where many hydrological disasters in this region occur within a short time period. The hourly periods considered are 1, 2, 3, 6, 12, and 24 h. Many previous hydrological studies dealt with daily rainfall data; thus, this study enables a comparison of the estimates obtained from daily and hourly rainfall data analyses, so as to identify the impact of extreme rainfall at a shorter time scale. Return levels based on the time aggregates considered are also computed. Parameter estimation using the L-moment method was conducted for four probability distributions, namely the generalized extreme value (GEV), generalized logistic (GLO), generalized Pareto (GPA), and Pearson type III (PE3) distributions. Aided by the L-moment diagram test and the mean square error (MSE) test, GLO was found to be the most appropriate distribution to represent the extreme rainfall data. At the return periods considered (10, 50, and 100 years), the spatial patterns revealed that the rainfall distribution across the peninsula differs for 1- and 24-h extreme rainfalls. The outcomes of this study would provide additional information regarding patterns of extreme rainfall in Malaysia which may not be detected when considering only a coarser time scale such as daily; thus, appropriate measures for shorter time scales of extreme rainfall can be planned. The implementation of such measures would be beneficial to the authorities to reduce the impact of any disastrous natural event.
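
    The return levels mentioned above come from inverting the fitted distribution at a non-exceedance probability of 1 - 1/T. As a rough illustration only, here is a minimal Python sketch using the generalized logistic (GLO) quantile in Hosking's parameterization; the location, scale, and shape values are placeholders, not the study's L-moment estimates.

      import numpy as np

      def glo_quantile(F, loc, scale, k):
          """Quantile of the generalized logistic (GLO) distribution in Hosking's
          parameterization; k = 0 reduces to the ordinary logistic."""
          if k == 0:
              return loc - scale * np.log((1.0 - F) / F)
          return loc + (scale / k) * (1.0 - ((1.0 - F) / F) ** k)

      def return_level(T, loc, scale, k):
          """T-year return level: the quantile at non-exceedance probability 1 - 1/T."""
          return glo_quantile(1.0 - 1.0 / T, loc, scale, k)

      # Placeholder parameters for a 1-hour annual-maximum rainfall series (mm).
      loc, scale, k = 60.0, 12.0, -0.15
      for T in (10, 50, 100):
          print(f"{T:>3}-year return level: {return_level(T, loc, scale, k):.1f} mm")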

  1. Final Report Extreme Computing and U.S. Competitiveness DOE Award. DE-FG02-11ER26087/DE-SC0008764

    Energy Technology Data Exchange (ETDEWEB)

    Mustain, Christopher J. [Council on Competitiveness, Washington, DC (United States)

    2016-01-13

    The Council has acted on each of the grant deliverables during the funding period. The deliverables are: (1) convening the Council’s High Performance Computing Advisory Committee (HPCAC) on a bi-annual basis; (2) broadening public awareness of high performance computing (HPC) and exascale developments; (3) assessing the industrial applications of extreme computing; and (4) establishing a policy and business case for an exascale economy.

  2. Proceedings of the 2004 NASA/ONR Circulation Control Workshop, Part 2

    Science.gov (United States)

    Jones, Gregory S. (Editor); Joslin, Ronald D. (Editor)

    2005-01-01

    This conference proceeding comprises papers that were presented at the NASA/ONR Circulation Control Workshop held 16-17 March 2004 at the Radisson-Hampton in Hampton, VA. Over two full days, 30 papers and 4 posters were presented with 110 scientists and engineers in attendance, representing 3 countries. As technological advances influence the efficiency and effectiveness of aerodynamic and hydrodynamic applications, designs, and operations, this workshop was intended to address the technologies, systems, challenges and successes specific to Coanda-driven circulation control in aerodynamics and hydrodynamics. A major goal of this workshop was to determine the state-of-the-art in circulation control and to assess the future directions and applications for circulation control. The 2004 workshop addressed applications, experiments, computations, and theories related to circulation control, emphasizing fundamental physics, systems analysis, and applied research. The workshop consisted of single-session oral presentations, posters, and written papers that are documented in this unclassified conference proceeding. The format of this written proceeding follows the agenda of the workshop. Each paper is followed by the presentation given at the workshop. The editors compiled brief summaries for each effort; these appear at the end of this proceeding. These summaries include the paper, oral presentation, and questions or comments that occurred during the workshop. The 2004 Circulation Control Workshop focused on applications including Naval vehicles (surface and underwater vehicles), fixed wing aviation (general aviation, commercial, cargo, and business aircraft); V/STOL platforms (helicopters, military aircraft, tilt rotors); propulsion systems (propellers, jet engines, gas turbines), and ground vehicles (automotive, trucks, and other); wind turbines, and other nontraditional applications (e.g., vacuum cleaner, ceiling fan). As part of the CFD focus area of the 2004 CC

  3. Does one workshop on respecting cultural differences increase health professionals' confidence to improve the care of Australian Aboriginal patients with cancer? An evaluation.

    Science.gov (United States)

    Durey, Angela; Halkett, Georgia; Berg, Melissa; Lester, Leanne; Kickett, Marion

    2017-09-15

    Aboriginal Australians have worse cancer survival rates than other Australians. Reasons include fear of a cancer diagnosis, reluctance to attend mainstream health services and discrimination from health professionals. Offering health professionals education in care focusing on Aboriginal patients' needs is important. The aim of this paper was to evaluate whether participating in a workshop improved the confidence of radiation oncology health professionals in their knowledge, communication and ability to offer culturally safe healthcare to Aboriginal Australians with cancer. Mixed methods using pre and post workshop online surveys, and one delivered 2 months later, were evaluated. Statistical analysis determined the relative proportion of participants who changed from not at all/a little confident at baseline to fairly/extremely confident immediately and 2 months after the workshop. Factor analysis identified underlying dimensions in the items and nonparametric tests recorded changes in mean dimension scores over and between times. Qualitative data was analysed for emerging themes. Fifty-nine participants attended the workshops, 39 (66% response rate) completed pre-workshop surveys, 32 (82% of study participants) completed post-workshop surveys and 25 (64% of study participants) completed surveys 2 months later. A significant increase in the proportion of attendees who reported fair/extreme confidence within 2 days of the workshop was found in nine of 14 items, which was sustained for all but one item 2 months later. Two additional items had a significant increase in the proportion of fair/extremely confident attendees 2 months post workshop compared to baseline. An exploratory factor analysis identified three dimensions: communication; relationships; and awareness. All dimensions' mean scores significantly improved within 2 days (p Aboriginal Australians that in some cases resulted in improved care. Single workshops co-delivered by an Aboriginal and non

  4. Developing a New Computer Game Attitude Scale for Taiwanese Early Adolescents

    Science.gov (United States)

    Liu, Eric Zhi-Feng; Lee, Chun-Yi; Chen, Jen-Huang

    2013-01-01

    With ever increasing exposure to computer games, gaining an understanding of the attitudes held by young adolescents toward such activities is crucial; however, few studies have provided scales with which to accomplish this. This study revisited the Computer Game Attitude Scale developed by Chappell and Taylor in 1997, reworking the overall…

  5. Summary from the epistemic uncertainty workshop: consensus amid diversity

    International Nuclear Information System (INIS)

    Ferson, Scott; Joslyn, Cliff A.; Helton, Jon C.; Oberkampf, William L.; Sentz, Kari

    2004-01-01

    The 'Epistemic Uncertainty Workshop' sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6-7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster-Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of
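
    One recurring question behind the Challenge Problems, how to propagate epistemic and aleatory uncertainty through a model without conflating them, is often illustrated with a double-loop (second-order) Monte Carlo. The sketch below is a toy example under invented assumptions (model, interval bounds, distributions), not one of the actual Challenge Problems.

      import numpy as np

      rng = np.random.default_rng(0)

      def response(x, theta):
          """Toy model output; x is an aleatory input, theta an epistemic parameter."""
          return theta * x ** 2

      theta_lo, theta_hi = 0.8, 1.2   # epistemic: theta only known to lie in an interval
      n_outer, n_inner = 200, 2000

      q95_estimates = []
      for theta in rng.uniform(theta_lo, theta_hi, n_outer):   # outer (epistemic) loop
          x = rng.normal(10.0, 1.0, n_inner)                   # inner (aleatory) loop
          q95_estimates.append(np.quantile(response(x, theta), 0.95))

      # The spread across the outer loop reflects epistemic uncertainty; each inner
      # loop sample reflects aleatory variability.
      print(min(q95_estimates), max(q95_estimates))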

  6. Effect of Variable Spatial Scales on USLE-GIS Computations

    Science.gov (United States)

    Patil, R. J.; Sharma, S. K.

    2017-12-01

    Use of an appropriate spatial scale is very important in Universal Soil Loss Equation (USLE) based spatially distributed soil erosion modelling. This study aimed at assessment of annual rates of soil erosion at different spatial scales/grid sizes and analysing how changes in spatial scale affect USLE-GIS computations using simulation and statistical variabilities. Efforts have been made in this study to recommend an optimum spatial scale for further USLE-GIS computations for management and planning in the study area. The present research study was conducted in Shakkar River watershed, situated in Narsinghpur and Chhindwara districts of Madhya Pradesh, India. Remote Sensing and GIS techniques were integrated with the Universal Soil Loss Equation (USLE) to predict the spatial distribution of soil erosion in the study area at four different spatial scales, viz., 30 m, 50 m, 100 m, and 200 m. Rainfall data, a soil map, a digital elevation model (DEM), an executable C++ program, and a satellite image of the area were used for preparation of the thematic maps for the various USLE factors. Annual rates of soil erosion were estimated for 15 years (1992 to 2006) at four different grid sizes. The statistical analysis of the four estimated datasets showed that the sediment loss dataset at the 30 m spatial scale has the minimum standard deviation (2.16), variance (4.68), and percent deviation from observed values (2.68 - 18.91%), and the highest coefficient of determination (R2 = 0.874) among all four datasets. Thus, it is recommended to adopt this spatial scale for USLE-GIS computations in the study area due to its minimum statistical variability and better agreement with the observed sediment loss data. This study also indicates large scope for use of finer spatial scales in spatially distributed soil erosion modelling.
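
    For orientation, the underlying computation is the cell-by-cell USLE product A = R * K * LS * C * P evaluated on co-registered factor rasters at a chosen grid size. The sketch below uses synthetic rasters and crude block-averaging in place of proper GIS resampling; it only illustrates how grid size enters the calculation, and none of the values are from the Shakkar watershed.

      import numpy as np

      def usle_soil_loss(R, K, LS, C, P):
          """Cell-by-cell USLE soil loss A = R * K * LS * C * P for factor rasters
          that share the same grid size."""
          return R * K * LS * C * P

      def resample_mean(grid, block):
          """Crude coarsening by block-averaging, e.g. 30 m -> 60 m when block = 2.
          (A GIS would normally do this resampling; shown only to illustrate how
          the grid size enters the computation.)"""
          ny, nx = (s // block * block for s in grid.shape)
          g = grid[:ny, :nx]
          return g.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))

      # Illustrative 30 m factor rasters (synthetic, not Shakkar watershed data).
      shape = (400, 400)
      rng = np.random.default_rng(1)
      R  = np.full(shape, 550.0)              # rainfall erosivity
      K  = rng.uniform(0.1, 0.4, shape)       # soil erodibility
      LS = rng.uniform(0.5, 8.0, shape)       # slope length-steepness
      C  = rng.uniform(0.05, 0.6, shape)      # cover management
      P  = np.full(shape, 1.0)                # support practice

      A30 = usle_soil_loss(R, K, LS, C, P)
      A60 = usle_soil_loss(*(resample_mean(g, 2) for g in (R, K, LS, C, P)))
      print(A30.mean(), A60.mean())           # compare mean soil loss across scales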

  7. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and

  8. Extremely Severe Space Weather and Geomagnetically Induced Currents in Regions with Locally Heterogeneous Ground Resistivity

    Science.gov (United States)

    Fujita, Shigeru; Kataoka, Ryuho; Pulkkinen, Antti; Watari, Shinichi

    2016-01-01

    Large geomagnetically induced currents (GICs) triggered by extreme space weather events are now regarded as one of the serious natural threats to the modern electrified society. The risk is described in detail in High-Impact, Low-Frequency Event Risk, A Jointly-Commissioned Summary Report of the North American Electric Reliability Corporation and the US Department of Energy's November 2009 Workshop, June 2010. For example, the March 13-14, 1989 storm caused a large-scale blackout affecting about 6 million people in Quebec, Canada, and resulted in substantial economic losses in Canada and the USA (Bolduc 2002). Therefore, European and North American nations have invested in GIC research such as the Solar Shield project in the USA (Pulkkinen et al. 2009, 2015a). In 2015, the Japanese government (Ministry of Economy, Trade and Industry, METI) acknowledged the importance of GIC research in Japan. After reviewing the serious damages caused by the 2011 Tohoku-Oki earthquake, METI recognized the potential risk to the electric power grid posed by extreme space weather. During extreme events, GICs can be concerning even in mid- and low-latitude countries and have become a global issue.

  9. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing will be involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to demonstrate the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing multi-scale structural analysis simulations.
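
    A minimal single-machine sketch of the decomposition idea follows: one simplified global model whose results drive several detailed component models evaluated in parallel, with Python's concurrent.futures standing in for the paper's cluster-to-cluster grid framework. All model functions and numbers are hypothetical placeholders.

      from concurrent.futures import ProcessPoolExecutor

      def global_model(load):
          """Simplified global analysis: returns a boundary force per component.
          (Placeholder for the simplified global structural model.)"""
          return {f"component-{i}": load * f for i, f in enumerate((0.5, 0.3, 0.2))}

      def component_model(name, boundary_force):
          """Detailed component analysis at a finer scale (placeholder)."""
          stress = boundary_force / 0.01      # toy: force over a small section area
          return name, stress

      if __name__ == "__main__":
          boundary = global_model(load=1000.0)    # step 1: coarse global solution
          # Step 2: farm the detailed component models out in parallel; on the grid
          # framework these would run on separate clusters instead of local processes.
          with ProcessPoolExecutor() as pool:
              results = dict(pool.map(component_model, boundary.keys(), boundary.values()))
          print(results)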

  10. The 1995 Science Information Management and Data Compression Workshop

    Science.gov (United States)

    Tilton, James C. (Editor)

    1995-01-01

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on October 26-27, 1995, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of fourteen presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

  11. 77 FR 31371 - Public Workshop: Privacy Compliance Workshop

    Science.gov (United States)

    2012-05-25

    DEPARTMENT OF HOMELAND SECURITY, Office of the Secretary. Public Workshop: Privacy Compliance Workshop. The Department of Homeland Security Privacy Office will host a public workshop, ``Privacy Compliance Workshop,'' with presentations including privacy compliance fundamentals, privacy and data security, and the privacy... DATES: The...

  12. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  13. 2016 MICCAI Workshop

    CERN Document Server

    Ghosh, Aurobrata; Kaden, Enrico; Rathi, Yogesh; Reisert, Marco

    2017-01-01

    This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity, while also sharing new perspectives and insights on the latest research challenges for those currently working in the field. Over the last decade, interest in diffusion MRI has virtually exploded. The technique provides unique insights into the microstructure of living tissue and enables in-vivo connectivity mapping of the brain. Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic, while new processing methods are essential to addressing issues at each stage of the diffusion MRI pipeline: acquisition, reconstruction, modeling and model fitting, image processing, fiber tracking, connectivity mapping, visualization, group studies and inference. These papers from the 2016 MICCAI WorkshopComputational Diffusion MRI” – which was intended to provide a snapshot of the la...

  14. 2014 MICCAI Workshop

    CERN Document Server

    Nedjati-Gilani, Gemma; Rathi, Yogesh; Reisert, Marco; Schneider, Torben

    2014-01-01

    This book contains papers presented at the 2014 MICCAI Workshop on Computational Diffusion MRI, CDMRI’14. Detailing new computational methods applied to diffusion magnetic resonance imaging data, it offers readers a snapshot of the current state of the art and covers a wide range of topics from fundamental theoretical work on mathematical modeling to the development and evaluation of robust algorithms and applications in neuroscientific studies and clinical practice.   Inside, readers will find information on brain network analysis, mathematical modeling for clinical applications, tissue microstructure imaging, super-resolution methods, signal reconstruction, visualization, and more. Contributions include both careful mathematical derivations and a large number of rich full-color visualizations.   Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic. This volume will offer a valuable starting point for anyone interested i...

  15. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using OpenACC was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transportation between the GPU and the CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.
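
    The computational core that gets offloaded to the GPU in such models is an explicit finite-volume update. The sketch below is a heavily simplified serial, one-dimensional version with a local Lax-Friedrichs (Rusanov) flux, not the paper's unstructured Godunov-type scheme or its OpenACC offloading; it is meant only to show the shape of the per-cell loop that is parallelized.

      import numpy as np

      g = 9.81  # gravitational acceleration (m/s^2)

      def shallow_water_step(h, hu, dx, dt):
          """One explicit finite-volume update of the 1D shallow water equations
          using a local Lax-Friedrichs flux (a simplified stand-in for an
          unstructured Godunov-type scheme)."""
          # Primitive variables and physical fluxes
          u = np.where(h > 1e-8, hu / h, 0.0)
          f_h = hu
          f_hu = hu * u + 0.5 * g * h ** 2

          # Numerical flux at interfaces i+1/2
          c = np.abs(u) + np.sqrt(g * h)            # local wave speed per cell
          a = np.maximum(c[:-1], c[1:])             # interface wave speed
          F_h = 0.5 * (f_h[:-1] + f_h[1:]) - 0.5 * a * (h[1:] - h[:-1])
          F_hu = 0.5 * (f_hu[:-1] + f_hu[1:]) - 0.5 * a * (hu[1:] - hu[:-1])

          # Update interior cells (boundary cells left untouched here)
          h[1:-1] -= dt / dx * (F_h[1:] - F_h[:-1])
          hu[1:-1] -= dt / dx * (F_hu[1:] - F_hu[:-1])
          return h, hu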

  16. A computer literacy scale for newly enrolled nursing college students: development and validation.

    Science.gov (United States)

    Lin, Tung-Cheng

    2011-12-01

    Increasing application and use of information systems and mobile technologies in the healthcare industry require increasing nurse competency in computer use. Computer literacy is defined as basic computer skills, whereas computer competency is defined as the computer skills necessary to accomplish job tasks. Inadequate attention has been paid to computer literacy and computer competency scale validity. This study developed a computer literacy scale with good reliability and validity and investigated the current computer literacy of newly enrolled students to develop computer courses appropriate to students' skill levels and needs. This study referenced Hinkin's process to develop a computer literacy scale. Participants were newly enrolled first-year undergraduate students, with nursing or nursing-related backgrounds, currently attending a course entitled Information Literacy and Internet Applications. Researchers examined reliability and validity using confirmatory factor analysis. The final version of the developed computer literacy scale included six constructs (software, hardware, multimedia, networks, information ethics, and information security) and 22 measurement items. Confirmatory factor analysis showed that the scale possessed good content validity, reliability, convergent validity, and discriminant validity. This study also found that participants earned the highest scores for the network domain and the lowest score for the hardware domain. With increasing use of information technology applications, courses related to hardware topic should be increased to improve nurse problem-solving abilities. This study recommends that emphases on word processing and network-related topics may be reduced in favor of an increased emphasis on database, statistical software, hospital information systems, and information ethics.

  17. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores
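
    As a toy, single-node analogue of the load-balancing idea (the dissertation uses work stealing and dynamic task counters on distributed-memory machines, with optimal alignment kernels), the sketch below lets worker processes claim blocks of sequence pairs through a shared counter. The difflib similarity, the 0.8 threshold, and the sequences are placeholders, not the methods or data of the study.

      import multiprocessing as mp
      from difflib import SequenceMatcher   # stand-in for an optimal alignment kernel

      def worker(seqs, pairs, counter, lock, block, results):
          """Repeatedly claim the next block of sequence pairs via a shared counter,
          a single-node analogue of dynamic task counters for load balancing."""
          while True:
              with lock:                      # atomic fetch-and-add on the counter
                  start = counter.value
                  counter.value += block
              if start >= len(pairs):
                  return
              for i, j in pairs[start:start + block]:
                  score = SequenceMatcher(None, seqs[i], seqs[j]).ratio()
                  if score >= 0.8:            # keep only highly similar pairs
                      results.append((i, j, score))

      if __name__ == "__main__":
          seqs = ["ACGTACGTGG", "ACGTACGAGG", "TTTTCCCCAA", "ACGAACGTGG"]
          pairs = [(i, j) for i in range(len(seqs)) for j in range(i + 1, len(seqs))]

          manager = mp.Manager()
          results = manager.list()
          counter, lock = mp.Value("i", 0), mp.Lock()
          procs = [mp.Process(target=worker,
                              args=(seqs, pairs, counter, lock, 2, results))
                   for _ in range(4)]
          for p in procs:
              p.start()
          for p in procs:
              p.join()
          print(sorted(results))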

  18. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    International Nuclear Information System (INIS)

    Daily, Jeffrey A.

    2015-01-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or 'homologous') on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K

  19. Audit and Evaluation of Computer Security. Computer Science and Technology.

    Science.gov (United States)

    Ruthberg, Zella G.

    This is a collection of consensus reports, each produced at a session of an invitational workshop sponsored by the National Bureau of Standards. The purpose of the workshop was to explore the state-of-the-art and define appropriate subjects for future research in the audit and evaluation of computer security. Leading experts in the audit and…

  20. Workshops som forskningsmetode [Workshops as a research method]

    OpenAIRE

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learnin...

  1. [Clinical and communication simulation workshop for fellows in gastroenterology: the trainees' perspective].

    Science.gov (United States)

    Lang, Alon; Melzer, Ehud; Bar-Meir, Simon; Eliakim, Rami; Ziv, Amitai

    2006-11-01

    The continuing development of computer-based medical simulators provides an ideal platform for simulator-assisted training programs for medical trainees. Computer-based endoscopic simulators provide a virtual reality environment for training in endoscopic procedures. This study illustrates the use of a comprehensive training model combining endoscopic simulators with simulated (actor) patients (SP), and evaluates the effectiveness of a comprehensive simulation workshop from the trainee perspective. Four case studies were developed with emphasis on communication skills. Three workshops, with 10 fellows in each, were conducted. During each workshop the trainees spent half of the time in SP case studies and the remaining half working with computerized endoscopic simulators with continuous guidance by an expert endoscopist. Questionnaires were completed by the fellows at the end of the workshop. Seventy percent of the fellows felt that the endoscopic simulator was close or very close to reality for gastroscopy, and 63% for colonoscopy. Eighty-eight percent thought the close guidance was important for the learning process with the simulator. Eighty percent felt that the case studies were an important learning experience for risk management. Further evaluation of multi-modality simulation workshops in gastroenterology training is needed to identify how best to incorporate this form of instruction into the training of gastroenterologists.

  2. Power-law scaling of extreme dynamics near higher-order exceptional points

    Science.gov (United States)

    Zhong, Q.; Christodoulides, D. N.; Khajavikhan, M.; Makris, K. G.; El-Ganainy, R.

    2018-02-01

    We investigate the extreme dynamics of non-Hermitian systems near higher-order exceptional points in photonic networks constructed using the bosonic algebra method. We show that strong power oscillations for certain initial conditions can occur as a result of the peculiar eigenspace geometry and its dimensionality collapse near these singularities. By using complementary numerical and analytical approaches, we show that, in the parity-time (PT ) phase near exceptional points, the logarithm of the maximum optical power amplification scales linearly with the order of the exceptional point. We focus in our discussion on photonic systems, but we note that our results apply to other physical systems as well.

  3. "Teaching students how to wear their Computer"

    DEFF Research Database (Denmark)

    Guglielmi, Michel; Johannesen, Hanne Louise

    2005-01-01

    This paper intends to present the goal, results and methodology of a workshop run in collaboration with Visual Culture (humanities), University of Copenhagen, the Danish Academy of Design in Copenhagen and Media Lab Aalborg, University of Aalborg. The workshop was related to a design competition... Computers are a ubiquitous part... The workshop set out to address this question through the angle of what we called 'Physical Computing', asking ourselves and the students whether new fields like 'tangible media' or 'wearable computers' can contribute to improvements of life, and whose life improvement we are aiming for. Through the workshop the students were encouraged to disrupt the myth of how a computer should be used and to focus on human-human interaction (HHI) through the computer rather than human-computer interaction (HCI). The physical computing approach furthermore offered a unique opportunity to break down...

  4. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences in deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that will arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment based on market forces. We will present how we enabled highly fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.

  5. Computer assisted radiology and surgery. CARS 2010

    International Nuclear Information System (INIS)

    Anon.

    2010-01-01

    The conference proceedings include contributions to the following topics: (1) CARS Clinical Day: minimally invasive spiral surgery, interventional radiology; (2) CARS - computer assisted radiology and surgery: ophthalmology, stimulation methods, new approaches to diagnosis and therapy; (3) Computer assisted radiology 24th International congress and exhibition: computer tomography and magnetic resonance, digital angiographic imaging, digital radiography, ultrasound, computer assisted radiation therapy, medical workstations, image processing and display; (4) 14th Annual conference of the International Society for computer aided surgery; ENT-CMF head and neck surgery computer-assisted neurosurgery, cardiovascular surgery, image guided liver surgery, abdominal and laparoscopic surgery, computer-assisted orthopedic surgery, image processing and visualization, surgical robotics and instrumentation, surgical modeling, simulation and education; (5) 28th International EuroPACS meeting: image distribution and integration strategies, planning and evaluation, telemedicine and standards, workflow and data flow in radiology; (6) 11th CARS/SPIE/EuroPACS joint workshop on surgical PACS and the digital operating, management and assessment of OR systems and integration; (7) 12th International workshop on computer-aided diagnosis: special session on breast CAD, special session on thoracic CAD, special session on abdominal brain, lumbar spine CAD; (8) 16th computed Maxillofacial imaging congress: computed maxillofacial imaging in dental implantology, orthodontics and dentofacial orthopedics; approaches to 3D maxillofacial imaging; surgical navigation; (9) 2nd EuroNOTES/CARS workshop on NOTES: an interdisciplinary challenge; (10) 2nd EPMA/CARS workshop on personalized medicine and ICT.; (11)poster sessions.

  6. Computer assisted radiology and surgery. CARS 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-06-15

    The conference proceedings include contributions to the following topics: (1) CARS Clinical Day: minimally invasive spiral surgery, interventional radiology; (2) CARS - computer assisted radiology and surgery: ophthalmology, stimulation methods, new approaches to diagnosis and therapy; (3) Computer assisted radiology 24th International congress and exhibition: computer tomography and magnetic resonance, digital angiographic imaging, digital radiography, ultrasound, computer assisted radiation therapy, medical workstations, image processing and display; (4) 14th Annual conference of the International Society for computer aided surgery; ENT-CMF head and neck surgery computer-assisted neurosurgery, cardiovascular surgery, image guided liver surgery, abdominal and laparoscopic surgery, computer-assisted orthopedic surgery, image processing and visualization, surgical robotics and instrumentation, surgical modeling, simulation and education; (5) 28th International EuroPACS meeting: image distribution and integration strategies, planning and evaluation, telemedicine and standards, workflow and data flow in radiology; (6) 11th CARS/SPIE/EuroPACS joint workshop on surgical PACS and the digital operating, management and assessment of OR systems and integration; (7) 12th International workshop on computer-aided diagnosis: special session on breast CAD, special session on thoracic CAD, special session on abdominal brain, lumbar spine CAD; (8) 16th computed Maxillofacial imaging congress: computed maxillofacial imaging in dental implantology, orthodontics and dentofacial orthopedics; approaches to 3D maxillofacial imaging; surgical navigation; (9) 2nd EuroNOTES/CARS workshop on NOTES: an interdisciplinary challenge; (10) 2nd EPMA/CARS workshop on personalized medicine and ICT.; (11)poster sessions.

  7. Rain Characteristics and Large-Scale Environments of Precipitation Objects with Extreme Rain Volumes from TRMM Observations

    Science.gov (United States)

    Zhou, Yaping; Lau, William K M.; Liu, Chuntao

    2013-01-01

    This study adopts a "precipitation object" approach by using 14 years of Tropical Rainfall Measuring Mission (TRMM) Precipitation Feature (PF) and National Centers for Environmental Prediction (NCEP) reanalysis data to study rainfall structure and environmental factors associated with extreme heavy rain events. Characteristics of instantaneous extreme volumetric PFs are examined and compared to those of intermediate and small systems. It is found that instantaneous PFs exhibit a much wider scale range compared to the daily gridded precipitation accumulation range. The top 1% of the rainiest PFs contribute over 55% of total rainfall and have rain volumes 2 orders of magnitude greater than those of the median PFs. We find a threshold near the top 10% beyond which the PFs grow exponentially into larger, deeper, and colder rain systems. NCEP reanalyses show that midlevel relative humidity and total precipitable water increase steadily with increasingly larger PFs, along with a rapid increase of 500 hPa upward vertical velocity beyond the top 10%. This provides the necessary moisture convergence to amplify and sustain the extreme events. The rapid increase in vertical motion is associated with the release of convective available potential energy (CAPE) in mature systems, as is evident in the increase in CAPE for PFs up to the top 10% and the subsequent dropoff. The study illustrates distinct stages in the development of an extreme rainfall event including: (1) a systematic buildup in large-scale temperature and moisture, (2) a rapid change in rain structure, (3) explosive growth of the PF size, and (4) a release of CAPE before the demise of the event.

  8. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); Glotzer, Sharon [University of Michigan; McCurdy, Bill [University of California Davis; Roberto, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2010-07-26

    The workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of

  9. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  10. Earth Exploration Toolbook Workshops: Helping Teachers and Students Analyze Web-based Scientific Data

    Science.gov (United States)

    McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.

    2007-12-01

    One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. They also receive support via Elluminate, a Web

  11. Coherent structures in tokamak plasmas workshop: Proceedings

    International Nuclear Information System (INIS)

    Koniges, A.E.; Craddock, G.G.

    1992-08-01

    Coherent structures have the potential to impact a variety of theoretical and experimental aspects of tokamak plasma confinement. This includes the basic processes controlling plasma transport, propagation and efficiency of external mechanisms such as wave heating and the accuracy of plasma diagnostics. While the role of coherent structures in fluid dynamics is better understood, this is a new topic for consideration by plasma physicists. This informal workshop arose out of the need to identify the magnitude of structures in tokamaks and in doing so, to bring together for the first time the surprisingly large number of plasma researchers currently involved in work relating to coherent structures. The primary purpose of the workshop, in addition to the dissemination of information, was to develop formal and informal collaborations, set the stage for future formation of a coherent structures working group or focus area under the heading of the Tokamak Transport Task Force, and to evaluate the need for future workshops on coherent structures. The workshop was concentrated in four basic areas with a keynote talk in each area as well as 10 additional presentations. The issues of discussion in each of these areas was as follows: Theory - Develop a definition of structures and coherent as it applies to plasmas. Experiment - Review current experiments looking for structures in tokamaks, discuss experimental procedures for finding structures, discuss new experiments and techniques. Fluids - Determine how best to utilize the resource of information available from the fluids community both on the theoretical and experimental issues pertaining to coherent structures in plasmas. Computation - Discuss computational aspects of studying coherent structures in plasmas as they relate to both experimental detection and theoretical modeling

  12. Computer Graphics for Student Engagement in Science Learning.

    Science.gov (United States)

    Cifuentes, Lauren; Hsieh, Yi-Chuan Jane

    2001-01-01

    Discusses student use of computer graphics software and presents documentation from a visualization workshop designed to help learners use computer graphics to construct meaning while they studied science concepts. Describes problems and benefits when delivering visualization workshops in the natural setting of a middle school. (Author/LRW)

  13. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processing cores in total. The cluster runs an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, and was carried out using a simple MPI 'Hello' program written in C. In addition, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four runs were made with the same code using a single processor, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small scale cluster computer using common hardware that is capable of higher computing power compared to a single-CPU computer, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
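
    The study's communication test used a simple MPI 'Hello' program written in C. An equivalent minimal sketch in Python, assuming the mpi4py package is installed on top of MPICH2, looks like this; it is launched with mpiexec, and if all eight ranks report hostnames from both machines, the nodes are communicating.

      # hello_mpi.py -- run with: mpiexec -n 8 python hello_mpi.py
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()           # this process's id within the communicator
      size = comm.Get_size()           # total number of MPI processes launched
      node = MPI.Get_processor_name()  # hostname of the machine running this rank

      print(f"Hello from rank {rank} of {size} on {node}")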

  14. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximate four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use was measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working 20 or more hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.
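
    For reference, the effort-reward imbalance ratio is conventionally computed as e / (r * c), where e and r are the summed effort and reward item scores and c corrects for the unequal numbers of items. The sketch below uses placeholder item counts and responses, not the questionnaire used in this study.

      def eri_ratio(effort_items, reward_items):
          """Effort-reward imbalance ratio e / (r * c), where the correction factor
          c = (number of effort items) / (number of reward items) adjusts for the
          unequal lengths of the two scales."""
          e = sum(effort_items)
          r = sum(reward_items)
          c = len(effort_items) / len(reward_items)
          return e / (r * c)

      # Placeholder responses on 1-5 Likert items (not the study's questionnaire):
      effort = [4, 5, 3, 4, 4, 5]                  # 6 effort items
      reward = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3, 2]   # 11 reward items
      print(f"ERI ratio = {eri_ratio(effort, reward):.2f}")  # > 1 indicates imbalance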

  15. HTTR workshop (workshop on hydrogen production technology)

    International Nuclear Information System (INIS)

    Shiina, Yasuaki; Takizuka, Takakazu

    2004-12-01

    Various research and development efforts have been performed to solve the global energy and environmental problems caused by large consumption of fossil fuels. Research activities on advanced hydrogen production technology using nuclear heat from high temperature gas cooled reactors, for example, have flourished in universities, research institutes and companies in many countries. The Department of HTTR Project and the Department of Advanced Nuclear Heat Technology of JAERI held the HTTR Workshop (Workshop on Hydrogen Production Technology) on July 5 and 6, 2004 to grasp the present status of R and D on HTGR technology and nuclear hydrogen production worldwide, and to discuss the necessity of nuclear hydrogen production and the technical problems for future development of the technology. More than 110 participants attended the Workshop, including foreign participants from the USA, France, Korea, Germany, Canada and the United Kingdom. In the Workshop, presentations were made on such topics as R and D programs for nuclear energy and hydrogen production technologies by thermo-chemical or other processes. Also, the possibility of nuclear hydrogen production in the future society was discussed. The workshop showed that R and D on hydrogen production by the thermo-chemical process has been performed in many countries. The workshop affirmed that nuclear hydrogen production could be one of the competitive suppliers of hydrogen in the future. The second HTTR Workshop will be held in the autumn next year. (author)

  16. Validity and Reliability of the Upper Extremity Work Demands Scale.

    Science.gov (United States)

    Jacobs, Nora W; Berduszek, Redmar J; Dijkstra, Pieter U; van der Sluis, Corry K

    2017-12-01

    Purpose To evaluate validity and reliability of the upper extremity work demands (UEWD) scale. Methods Participants from different levels of physical work demands, based on the Dictionary of Occupational Titles categories, were included. A historical database of 74 workers was added for factor analysis. Criterion validity was evaluated by comparing observed and self-reported UEWD scores. To assess structural validity, a factor analysis was executed. For reliability, the difference between two self-reported UEWD scores, the smallest detectable change (SDC), test-retest reliability and internal consistency were determined. Results Fifty-four participants were observed at work and 51 of them filled in the UEWD twice with a mean interval of 16.6 days (SD 3.3, range = 10-25 days). Criterion validity of the UEWD scale was moderate (r = .44, p = .001). Factor analysis revealed that 'force and posture' and 'repetition' subscales could be distinguished with Cronbach's alpha of .79 and .84, respectively. Reliability was good; there was no significant difference between repeated measurements. An SDC of 5.0 was found. Test-retest reliability was good (intraclass correlation coefficient for agreement = .84) and all item-total correlations were >.30. There were two pairs of highly related items. Conclusion Reliability of the UEWD scale was good, but criterion validity was moderate. Based on current results, a modified UEWD scale (2 items removed, 1 item reworded, divided into 2 subscales) was proposed. Since observation appeared to be an inappropriate gold standard, we advise to investigate other types of validity, such as construct validity, in further research.
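
    The reliability statistics reported here follow standard formulas: Cronbach's alpha for internal consistency, and the smallest detectable change SDC = 1.96 * sqrt(2) * SEM with SEM = SD * sqrt(1 - ICC). Below is a minimal sketch with made-up item scores, not UEWD data.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_subjects x n_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars / total_var)

      def smallest_detectable_change(scores_t1, scores_t2, icc):
          """SDC = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC); SD is the
          standard deviation of the pooled test-retest scores."""
          pooled_sd = np.std(np.r_[scores_t1, scores_t2], ddof=1)
          sem = pooled_sd * np.sqrt(1.0 - icc)
          return 1.96 * np.sqrt(2.0) * sem

      # Made-up data: 6 respondents x 4 items, scored 1-5 (illustrative only).
      X = np.array([[3, 4, 3, 4], [2, 2, 3, 2], [5, 4, 5, 4],
                    [1, 2, 1, 2], [4, 4, 3, 4], [3, 3, 2, 3]])
      print(f"alpha = {cronbach_alpha(X):.2f}")
      print(f"SDC   = {smallest_detectable_change(X.sum(1), X.sum(1) + 1, icc=0.84):.2f}")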

  17. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Sprague, Michael A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Boldyrev, Stanislav [Univ. of Wisconsin, Madison, WI (United States); Fischer, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois, Urbana-Champaign, IL (United States); Grout, Ray [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gustafson, Jr., William I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moser, Robert [Univ. of Texas, Austin, TX (United States)

    2017-01-01

    This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  18. 7th International Workshop on Advanced Optical Imaging and Metrology

    CERN Document Server

    2014-01-01

    In continuation of the FRINGE Workshop Series, these proceedings contain all contributions presented at the 7th International Workshop on Advanced Optical Imaging and Metrology. The FRINGE Workshop Series is dedicated to the presentation, discussion and dissemination of recent results in Optical Imaging and Metrology. Topics of particular interest for the 7th Workshop are: - New methods and tools for the generation, acquisition, processing, and evaluation of data in Optical Imaging and Metrology (digital wavefront engineering, computational imaging, model-based reconstruction, compressed sensing, inverse problems solution) - Application-driven technologies in Optical Imaging and Metrology (high-resolution, adaptive, active, robust, reliable, flexible, in-line, real-time) - High-dynamic range solutions in Optical Imaging and Metrology (from macro to nano) - Hybrid technologies in Optical Imaging and Metrology (hybrid optics, sensor and data fusion, model-based solutions, multimodality) - New optical sensors, imagi...

  19. Large scale computing in theoretical physics: Example QCD

    International Nuclear Information System (INIS)

    Schilling, K.

    1986-01-01

    The limitations of the classical mathematical analysis of Newton and Leibniz appear to be more and more overcome by the power of modern computers. Large scale computing techniques - which closely resemble the methods used in simulations within statistical mechanics - allow the treatment of nonlinear systems with many degrees of freedom, such as field theories in nonperturbative situations where analytical methods fail. The computation of the hadron spectrum within the framework of lattice QCD sets a demanding goal for the application of supercomputers in basic science. It requires both big computer capacities and clever algorithms to fight all the numerical evils that one encounters in the Euclidean world. The talk will attempt to describe both the computer aspects and the present state of the art of spectrum calculations within lattice QCD. (orig.)

  20. Proceedings of the Thirteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  1. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  2. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  3. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    Science.gov (United States)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved mega-thrust and a system of splay faults, as well as the seismic wave field and seafloor displacement, with frequency content up to 2.2 Hz. We validate the scenario against geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.

  4. Promoting Cultural Awareness: A Faculty Development Workshop on Cultural Competency.

    Science.gov (United States)

    Carnevale, Franco A; Macdonald, Mary Ellen; Razack, Saleem; Steinert, Yvonne

    2015-06-01

    An interdisciplinary faculty development workshop on cultural competency (CC) was implemented and evaluated for the Faculty of Medicine at McGill University. It consisted of a 4-hour workshop and 2 follow-up sessions. A reflective practice framework was used. The project was evaluated using the Multicultural Assessment Questionnaire (MAQ), evaluation forms completed by participants, and detailed field notes taken during the sessions. The workshop was attended by 49 faculty members with diverse professional backgrounds. Statistically significant improvements were measured using the MAQ. On a scale of 1 to 5 (5 = very useful) on the evaluation form, the majority of participants (76.1%) gave the workshop a score of 4 or 5 for overall usefulness. A thematic analysis of field-note data highlighted participant responses to specific activities in the workshop. Participants expressed a need for faculty development initiatives on CC such as this one. Copyright© by Ingram School of Nursing, McGill University.

  5. Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.

    2017-07-01

    One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.

  6. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  7. Energy and nuclear power planning using the IAEA's ENPEP computer package. Proceedings of a workshop

    International Nuclear Information System (INIS)

    1997-09-01

    The Regional (Europe) Technical Co-operation Project on the Study of Energy Options Using the IAEA Planning Methodologies was first implemented by the IAEA in 1995. The project aims at improving national capabilities for energy, electricity and nuclear power planning and promoting regional co-operation among participating countries in the European region. The project includes the organization of workshops, training activities at the regional and national levels, scientific visits, etc. The proceedings of a workshop held in Warsaw, Poland, from 4 to 8 September 1995 are contained herein. The workshop had as a basic objective the analysis of the specific problems encountered by the represented countries during application of the IAEA's ENPEP package in the conduct of national studies and to provide a forum for further co-operation among participating countries. A second objective of the workshop was to make proposals for future activities to be organized within the project. This publication is intended to serve as reference for the users of the IAEA's ENPEP package, as well as for energy and electricity planners in general. Refs, figs, tabs

  8. The scaling of population persistence with carrying capacity does not asymptote in populations of a fish experiencing extreme climate variability.

    Science.gov (United States)

    White, Richard S A; Wintle, Brendan A; McHugh, Peter A; Booker, Douglas J; McIntosh, Angus R

    2017-06-14

    Despite growing concerns regarding increasing frequency of extreme climate events and declining population sizes, the influence of environmental stochasticity on the relationship between population carrying capacity and time-to-extinction has received little empirical attention. While time-to-extinction increases exponentially with carrying capacity in constant environments, theoretical models suggest increasing environmental stochasticity causes asymptotic scaling, thus making minimum viable carrying capacity vastly uncertain in variable environments. Using empirical estimates of environmental stochasticity in fish metapopulations, we showed that increasing environmental stochasticity resulting from extreme droughts was insufficient to create asymptotic scaling of time-to-extinction with carrying capacity in local populations as predicted by theory. Local time-to-extinction increased with carrying capacity due to declining sensitivity to demographic stochasticity, and the slope of this relationship declined significantly as environmental stochasticity increased. However, recent 1 in 25 yr extreme droughts were insufficient to extirpate populations with large carrying capacity. Consequently, large populations may be more resilient to environmental stochasticity than previously thought. The lack of carrying capacity-related asymptotes in persistence under extreme climate variability reveals how small populations affected by habitat loss or overharvesting, may be disproportionately threatened by increases in extreme climate events with global warming. © 2017 The Author(s).
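
    The asymptotic-scaling argument above can be illustrated with a toy stochastic model. The sketch below is hypothetical and is not the model used in the study: it assumes NumPy, a Ricker-type growth rule, Poisson demographic noise, and multiplicative lognormal environmental noise whose strength sigma_env stands in for environmental stochasticity.

        import numpy as np

        rng = np.random.default_rng(1)

        def time_to_extinction(K, sigma_env, r=1.0, t_max=5_000):
            """One realization of a Ricker-type model with Poisson demographic
            noise and lognormal (mean-one) environmental noise."""
            n = K
            for t in range(1, t_max + 1):
                env = np.exp(rng.normal(0.0, sigma_env) - 0.5 * sigma_env ** 2)
                n = rng.poisson(n * np.exp(r * (1.0 - n / K)) * env)
                if n == 0:
                    return t
            return t_max  # censored: persisted for the whole horizon

        # Mean persistence time versus carrying capacity K for several
        # levels of environmental stochasticity.
        for sigma in (0.0, 0.5, 1.0):
            means = [np.mean([time_to_extinction(K, sigma) for _ in range(100)])
                     for K in (5, 10, 20, 40)]
            print(sigma, means)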

  9. Changes in intensity of precipitation extremes in Romania on very high temporal scale and implications for the validity of the Clausius-Clapeyron relation

    Science.gov (United States)

    Busuioc, Aristita; Baciu, Madalina; Breza, Traian; Dumitrescu, Alexandru; Stoica, Cerasela; Baghina, Nina

    2016-04-01

    Many observational, theoretical and climate-model-based studies have suggested that warmer climates lead to more intense precipitation events, even when the total annual precipitation is slightly reduced. In this way, it was suggested that extreme precipitation events may increase at the Clausius-Clapeyron (CC) rate under global warming and the constraint of constant relative humidity. However, recent studies show that the relationship between extreme rainfall intensity and atmospheric temperature is much more complex than would be suggested by the CC relationship and is mainly dependent on precipitation temporal resolution, region, storm type and whether the analysis is conducted on storm events rather than fixed intervals. The present study examines the dependence between very high temporal scale extreme rainfall intensity and daily temperatures, with respect to verification of the CC relation. To address this objective, the analysis is conducted on rainfall events rather than fixed intervals, using rainfall data based on graphic records that include intensities (mm/min.) calculated over each interval with constant intensity per minute. The annual interval with such data available (April to October) is considered at 5 stations over the period 1950-2007. For the Bucuresti-Filaret station the analysis is extended over a longer interval (1898-2007). For each rainfall event, the maximum intensity (mm/min.) is retained and these time series are used for the further analysis (abbreviated in the following as IMAX). The IMAX data were divided, based on the daily mean temperature, into 2°C-wide bins. Bins with fewer than 100 values were excluded. The 90th, 99th and 99.9th percentiles were computed from the binned data using the empirical distribution, and their variability has been compared to the CC scaling (e.g. an exponential relation given by a 7% increase per degree of temperature rise). The results show a dependence close to double the CC relation for
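
    The binning procedure described in this abstract (event-maximum intensities grouped into 2°C-wide bins of daily mean temperature, bins with fewer than 100 values discarded, and percentiles compared against a roughly 7% per degree Clausius-Clapeyron reference) can be sketched as follows. The code is a schematic illustration assuming NumPy and hypothetical input arrays imax and temp of equal length, not the authors' implementation.

        import numpy as np

        def percentiles_by_temperature(imax, temp, bin_width=2.0, min_count=100):
            """Bin event-maximum intensities (imax, mm/min) by daily mean
            temperature (temp, deg C) and return per-bin percentiles."""
            imax, temp = np.asarray(imax, float), np.asarray(temp, float)
            edges = np.arange(np.floor(temp.min()),
                              np.ceil(temp.max()) + bin_width, bin_width)
            centers, pcts = [], []
            for lo, hi in zip(edges[:-1], edges[1:]):
                sel = imax[(temp >= lo) & (temp < hi)]
                if sel.size < min_count:        # drop poorly sampled bins
                    continue
                centers.append(0.5 * (lo + hi))
                pcts.append(np.percentile(sel, [90.0, 99.0, 99.9]))
            return np.array(centers), np.array(pcts)

        def cc_reference(t, t_ref, p_ref, rate=0.07):
            """Clausius-Clapeyron reference curve: ~7% increase per deg C."""
            return p_ref * (1.0 + rate) ** (t - t_ref)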

  10. XACC - eXtreme-scale Accelerator Programming Framework

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is currently lacking. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-generation computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  11. Preparing Residents for Teaching Careers: The Faculty for Tomorrow Resident Workshop.

    Science.gov (United States)

    Lin, Steven; Gordon, Paul

    2017-03-01

    Progress toward growing the primary care workforce is at risk of being derailed by an emerging crisis: a critical shortage of family medicine faculty. In response to the faculty shortage, the Society of Teachers of Family Medicine (STFM) launched a 2-year initiative called "Faculty for Tomorrow" (F4T). The F4T Task Force created a workshop designed to increase residents' interest in, and prepare them for, careers in academic family medicine. We aimed to evaluate the effectiveness of this workshop. Participants were family medicine residents who preregistered for and attended the F4T Resident Workshop at the 2016 STFM Annual Spring Conference. The intervention was a full-day, 9-hour preconference workshop delivered by a multi-institutional faculty team. Participants were asked to complete a questionnaire before and immediately after the workshop. Data collected included demographics, residency program characteristics, future career plans, self-reported confidence in skills, and general knowledge relevant to becoming faculty. A total of 75 participants attended the workshop. The proportion of those who were "extremely likely" to pursue a career in academic family medicine increased from 58% to 72%. Participants reported statistically significant improvements in their confidence in clinical teaching, providing feedback to learners, writing an effective CV, knowledge about the structure of academic family medicine, and knowledge about applying for a faculty position. The STFM F4T Resident Workshop was effective at increasing participants' interest in academic careers, as well as self-reported confidence in skills and knowledge relevant to becoming faculty. The data collected from participants regarding their career plans may inform future interventions.

  12. European Workshop on Renewable Rural Energy Applications in North-East Europe

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop is a part of the E.C. Thermie B project 'Dissemination of Promising Renewable Rural Energy Applications in North-Eastern Europe'. The presentations held in the workshop are collected in this publication. The subjects are: TEKES (Technology Development Centre) Boost Technology; Renewable Energy in Latvia; Rural Renewable energy (Prospects) in Estonia; Renewable energy from Rural Electrification; Techno-Economic Analysis published as a summary; Practical Experiences of Small-Scale Heat Generation from Fuelwood in Finland; Solar systems for Domestic Hot Water and Space Heating; Biomass for Energy: Small-Scale Technologies; Photovoltaic Applications for Rural Areas in the North-East Europe

  13. European Workshop on Renewable Rural Energy Applications in North-East Europe

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    This workshop is a part of the E.C. Thermie B project 'Dissemination of Promising Renewable Rural Energy Applications in North-Eastern Europe'. The presentations held in the workshop are collected in this publication. The subjects are: TEKES (Technology Development Centre) Boost Technology; Renewable Energy in Latvia; Rural Renewable energy (Prospects) in Estonia; Renewable energy from Rural Electrification; Techno-Economic Analysis published as a summary; Practical Experiences of Small-Scale Heat Generation from Fuelwood in Finland; Solar systems for Domestic Hot Water and Space Heating; Biomass for Energy: Small-Scale Technologies; Photovoltaic Applications for Rural Areas in the North-East Europe

  14. Imaging Sciences Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.

    1996-11-21

    This report contains the proceedings of the Imaging Sciences Workshop sponsored by C.A.S.I.S., the Center for Advanced Signal & Image Sciences. The Center, established primarily to provide a forum where researchers can freely exchange ideas on the signal and image sciences in a comfortable intellectual environment, has grown over the last two years with the opening of a Reference Library (located in Building 272). The Technical Program for the 1996 Workshop includes a variety of efforts in the Imaging Sciences, including applications in Microwave Imaging, highlighted by the Micro-Impulse Radar (MIR) system invented at LLNL, as well as other applications in this area. Special sessions organized by various individuals in Speech, Acoustic Ocean Imaging, Radar Ocean Imaging, Ultrasonic Imaging, and Optical Imaging discuss various applications to real-world problems. On the more theoretical side, sessions on Imaging Algorithms and Computed Tomography were organized, while the more pragmatic side featured a session on Imaging Systems.

  15. Workshop on Learning Technology for Education in Cloud

    CERN Document Server

    Rodríguez, Emilio; Santana, Juan; Prieta, Fernando

    2012-01-01

    Learning Technology for Education in Cloud investigates how cloud computing can be used to design applications to support real-time, on-demand learning using technologies. The workshop proceedings provide opportunities for delegates to discuss the latest research in TEL (Technology Enhanced Learning) and its impact on learners and institutions using the cloud. The Workshop on Learning Technology for Education in Cloud (LTEC '12) was a forum where researchers, educators and practitioners came together to discuss ideas, projects and lessons learned related to the use of learning technology in the cloud, on 11-13 July 2012 in Salamanca, Spain.

  16. Workshops and problems for benchmarking eddy current codes

    International Nuclear Information System (INIS)

    Turner, L.R.; Davey, K.; Ida, N.; Rodger, D.; Kameari, A.; Bossavit, A.; Emson, C.R.I.

    1988-02-01

    A series of six workshops was held to compare eddy current codes, using six benchmark problems. The problems include transient and steady-state ac magnetic fields, close and far boundary conditions, magnetic and non-magnetic materials. All the problems are based either on experiments or on geometries that can be solved analytically. The workshops and solutions to the problems are described. Results show that many different methods and formulations give satisfactory solutions, and that in many cases reduced dimensionality or coarse discretization can give acceptable results while reducing the computer time required. 13 refs., 1 tab

  17. Proceedings of the Third International Workshop on Jointed Structures.

    Energy Technology Data Exchange (ETDEWEB)

    Starr, Michael James; Brake, Matthew Robert; Segalman, Daniel Joseph; Bergman, Lawrence A.; Ewins, David J.

    2013-08-01

    The Third International Workshop on Jointed Structures was held from August 16th to 17th, 2012, in Chicago, Illinois, following the ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. Thirty-two researchers from both the United States and international locations convened to discuss recent progress in research related to mechanical joints and associated efforts, in addition to developing a roadmap for the challenges to be addressed over the next five to ten years. These proceedings from the workshop include the minutes of the discussions and follow-up from the 2009 workshop [1], presentations, and outcomes of the workshop. Specifically, twelve challenges were formulated from the discussions at the workshop, which focus on developing a better understanding of uncertainty and variability in jointed structures, incorporating high fidelity models of joints in simulations that are tractable/efficient, motivating a new generation of researchers and funding agents as to the importance of joint mechanics research, and developing new insights into the physical phenomena that give rise to energy dissipation in jointed structures. The ultimate goal of these research efforts is to develop a predictive model of joint mechanics.

  18. PREFACE: 4th Workshop on Theory, Modelling and Computational Methods for Semiconductors (TMCSIV)

    Science.gov (United States)

    Tomić, Stanko; Probert, Matt; Migliorato, Max; Pal, Joydeep

    2014-06-01

    These conference proceedings contain the written papers of the contributions presented at the 4th International Conference on Theory, Modelling and Computational Methods for Semiconductor materials and nanostructures. The conference was held at the MediaCityUK, University of Salford, Manchester, UK on 22-24 January 2014. The previous conferences in this series took place in 2012 at the University of Leeds, in 2010 at St William's College, York and in 2008 at the University of Manchester, UK. The development of high-performance computer architectures is finally allowing the routine use of accurate methods for calculating the structural, thermodynamic, vibrational, optical and electronic properties of semiconductors and their hetero- and nano-structures. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in semiconductor science and technology, where there is substantial potential for time-saving in R&D. Theoretical approaches represented in this meeting included: Density Functional Theory, Semi-empirical Electronic Structure Methods, Multi-scale Approaches, Modelling of PV devices, Electron Transport, and Graphene. Topics included, but were not limited to: Optical Properties of Quantum Nanostructures including Colloids and Nanotubes, Plasmonics, Magnetic Semiconductors, Photonic Structures, and Electronic Devices. This workshop ran for three days, with the objective of bringing together UK and international leading experts in the theoretical modelling of Group IV, III-V and II-VI semiconductors, as well as students, postdocs and early-career researchers. The first day focused on providing an introduction and overview of this vast field, aimed particularly at students, with several lectures given by recognized experts in various theoretical approaches. The following two days showcased some of the best theoretical research carried out in the UK in this field, with several contributions also from representatives of

  19. SCALE-4 [Standardized Computer Analyses for Licensing Evaluation]: An improved computational system for spent-fuel cask analysis

    International Nuclear Information System (INIS)

    Parks, C.V.

    1989-01-01

    The purpose of this paper is to provide specific information regarding improvements available with Version 4.0 of the SCALE system and discuss the future of SCALE within the current computing and regulatory environment. The emphasis focuses on the improvements in SCALE-4 over that available in SCALE-3. 10 refs., 1 fig., 1 tab

  20. Engineering of an Extreme Rainfall Detection System using Grid Computing

    Directory of Open Access Journals (Sweden)

    Olivier Terzo

    2012-10-01

    This paper describes a new approach for intensive rainfall data analysis. ITHACA's Extreme Rainfall Detection System (ERDS) is conceived to provide near real-time alerts related to potential exceptional rainfall worldwide, which can be used by WFP or other humanitarian assistance organizations to evaluate the event and understand the potentially floodable areas where their assistance is needed. This system is based on precipitation analysis and it uses satellite rainfall data at worldwide extent. This project uses the Tropical Rainfall Measuring Mission Multisatellite Precipitation Analysis dataset, a NASA-delivered near real-time product for current rainfall condition monitoring over the world. Considering the great deal of data to process, this paper presents an architectural solution based on Grid Computing techniques. Our focus is on the advantages of using a distributed architecture, in terms of performance, for this specific purpose.

  1. 7th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Jung, Jason; Badica, Costin

    2014-01-01

    This book represents the combined peer-reviewed proceedings of the Seventh International Symposium on Intelligent Distributed Computing - IDC-2013, of the Second Workshop on Agents for Clouds - A4C-2013, of the Fifth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2013, and of the International Workshop on Intelligent Robots - iR-2013. All the events were held in Prague, Czech Republic during September 4-6, 2013. The 41 contributions published in this book address many topics related to theory and applications of intelligent distributed computing and multi-agent systems, including: agent-based data processing, ambient intelligence, bio-informatics, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, intelligent robotics, knowledge management, linked data, mobile agents, ontologies, pervasive computing, self-organizing systems, peer-to-peer computing, social networks and trust, and swarm intelligence.

  2. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  3. Scaling and clustering effects of extreme precipitation distributions

    Science.gov (United States)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Li, Jianfeng

    2012-08-01

    One of the impacts of climate change and human activities on the hydrological cycle is the change in the precipitation structure. Closely related to the precipitation structure are two characteristics: the volume (m) of wet periods (WPs) and the time interval between WPs, or waiting time (t). Using daily precipitation data for the period 1960-2005 from 590 rain gauge stations in China, these two characteristics are analyzed, involving scaling and clustering of precipitation episodes. Our findings indicate that m and t follow similar probability distribution curves, implying that precipitation processes are controlled by similar underlying thermodynamics. Analysis of conditional probability distributions shows a significant dependence of m and t on their previous values of similar volumes, and the dependence tends to be stronger when m is larger or t is longer. It indicates that a higher probability can be expected when high-intensity precipitation is followed by precipitation episodes of similar intensity, and when longer waiting time between WPs is followed by a waiting time of similar duration. This result indicates the clustering of extreme precipitation episodes, and severe droughts or floods are apt to occur in groups.
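
    For illustration, the two characteristics analysed above, the volume m of each wet period (WP) and the waiting time t between WPs, can be extracted from a daily series in a single pass. The sketch below assumes NumPy and a hypothetical wet-day threshold; it is not the authors' code.

        import numpy as np

        def wet_period_stats(daily_precip, wet_threshold=0.1):
            """Return volumes m of wet periods (WPs) and waiting times t
            (dry-spell lengths, in days) between consecutive WPs."""
            m, t = [], []
            vol, dry, seen_wp = 0.0, 0, False
            for p in np.asarray(daily_precip, float):
                if p >= wet_threshold:
                    if vol == 0.0 and seen_wp and dry > 0:
                        t.append(dry)          # waiting time between two WPs
                    dry = 0
                    vol += p
                else:
                    if vol > 0.0:
                        m.append(vol)          # close the current wet period
                        vol, seen_wp = 0.0, True
                    dry += 1
            if vol > 0.0:
                m.append(vol)                  # series ended inside a WP
            return np.array(m), np.array(t)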

  4. Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project

    Science.gov (United States)

    Bolstad, Rachel

    2016-01-01

    This report evaluates a game coding workshop offered to young people and adults in seven public libraries around New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…

  5. Towards an integrated multiscale simulation of turbulent clouds on PetaScale computers

    International Nuclear Information System (INIS)

    Wang Lianping; Ayala, Orlando; Parishani, Hossein; Gao, Guang R; Kambhamettu, Chandra; Li Xiaoming; Rossi, Louis; Orozco, Daniel; Torres, Claudio; Grabowski, Wojciech W; Wyszogrodzki, Andrzej A; Piotrowski, Zbigniew

    2011-01-01

    The development of precipitating warm clouds is affected by several effects of small-scale air turbulence including enhancement of droplet-droplet collision rate by turbulence, entrainment and mixing at the cloud edges, and coupling of mechanical and thermal energies at various scales. Large-scale computation is a viable research tool for quantifying these multiscale processes. Specifically, top-down large-eddy simulations (LES) of shallow convective clouds typically resolve scales of turbulent energy-containing eddies while the effects of turbulent cascade toward viscous dissipation are parameterized. Bottom-up hybrid direct numerical simulations (HDNS) of cloud microphysical processes resolve fully the dissipation-range flow scales but only partially the inertial subrange scales. It is desirable to systematically decrease the grid length in LES and increase the domain size in HDNS so that they can be better integrated to address the full range of scales and their coupling. In this paper, we discuss computational issues and physical modeling questions in expanding the ranges of scales realizable in LES and HDNS, and in bridging LES and HDNS. We review our ongoing efforts in transforming our simulation codes towards PetaScale computing, in improving physical representations in LES and HDNS, and in developing better methods to analyze and interpret the simulation results.

  6. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate applicable significance with respect to cloud utilization concerning response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.

  7. Dynamics of group knowledge production in facilitated modelling workshops

    DEFF Research Database (Denmark)

    Tavella, Elena; Franco, L. Alberto

    2015-01-01

    by which models are jointly developed with group members interacting face-to-face, with or without computer support. The models produced are used to inform negotiations about the nature of the issues faced by the group, and how to address them. While the facilitated modelling literature is impressive......, the workshop. Drawing on the knowledge-perspective of group communication, we conducted a micro-level analysis of a transcript of a facilitated modelling workshop held with the management team of an Alternative Food Network in the UK. Our analysis suggests that facilitated modelling interactions can take...

  8. The Astronomy Workshop

    Science.gov (United States)

    Hamilton, D. P.

    2005-05-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive online educational tools developed for use by students, educators, and the general public. The more than 20 tools in the Astronomy Workshop are rated for ease-of-use, and have been extensively tested in large university survey courses, classes for undergraduate majors, and High Schools. Here we briefly describe a few of the more popular tools. The Life of the Sun (New!): The history of the Sun is animated as a movie, showing students how the size and color of our star has evolved and will evolve in time. Animated Orbits of Planets and Moons: The orbital motions of planets, moons, asteroids, and comets are animated at their correct relative speeds in accurate to-scale drawings. Solar System Collisions: This most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet. The program calculates many effects, including the country of impact (if Earth is the target), energy of explosion, crater size, and magnitude of the "planetquake" generated. It also displays a relevant image (e.g. terrestrial crater, lunar crater, etc.). Astronomical Distances: Travel away from the Earth at a chosen speed and see how long it takes to reach other planets, stars and galaxies. This tool helps students visualize astronomical distances in an intuitive way. Funding for the Astronomy Workshop is provided by a NASA EPO grant.

  9. International Workshop on Carbon Cycling and Coral Reef Metabolism; Sangosho no tanso junkan ni kansuru kokusai workshop hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-16

    This paper describes the International Workshop on Carbon Cycling and Coral Reef Metabolism, which was held at Miyako-jima, Okinawa Prefecture, on October 17-24, 1995. In the workshop, researchers involved in marine chemistry, marine biology, coral ecology, and environmental science got together and discussed the carbon cycling and metabolism of coral reefs. Discussions were made on what the coral reef ecosystem is, and what the definition of a sink or a source for CO2 is. Also discussed were the scales of time and space that should be considered to make these issues clear. Further, it was proposed that it is necessary to investigate the carbon balance of both the whole system and the components of the system, and to keep track of mass transfer among neighboring components of the system. Seventeen presentations were given. The workshop obtained a definite consensus on the carbon balance of the coral reef system. 123 refs., 39 figs., 9 tabs.

  10. Extreme weather events in southern Germany - Climatological risk and development of a large-scale identification procedure

    Science.gov (United States)

    Matthies, A.; Leckebusch, G. C.; Rohlfing, G.; Ulbrich, U.

    2009-04-01

    Extreme weather events such as thunderstorms, hail and heavy rain or snowfall can pose a threat to human life and to considerable tangible assets. Yet there is a lack of knowledge about the present-day climatological risk, its economic effects, and its changes due to rising greenhouse gas concentrations. Therefore, parts of the economy that are particularly sensitive to extreme weather events, such as insurance companies and airports, require regional risk analyses, early warning and prediction systems to cope with such events. Such an attempt is made for southern Germany, in close cooperation with stakeholders. Comparing ERA40 and station data with impact records of Munich Re and Munich Airport, the 90th percentile was found to be a suitable threshold for extreme, impact-relevant precipitation events. Different methods for the classification of the causative synoptic situations have been tested on ERA40 reanalyses. An objective scheme for the classification of Lamb's circulation weather types (CWTs) has proved to be most suitable for correct classification of the large-scale flow conditions. Certain CWTs have turned out to be prone to heavy precipitation or, on the other hand, to have a very low risk of such events. Other large-scale parameters are tested in connection with CWTs to find a combination that has the highest skill in identifying extreme precipitation events in climate model data (ECHAM5 and CLM). For example, vorticity advection at 700 hPa shows good results, but assumes knowledge of regional orographic particularities. Therefore, ongoing work is focused on additional testing of parameters that indicate deviations from a basic state of the atmosphere, like the Eady Growth Rate or the newly developed Dynamic State Index. Evaluation results will be used to estimate the skill of the regional climate model CLM concerning the simulation of frequency and intensity of the extreme weather events. Data of the A1B scenario (2000-2050) will be examined for a possible climate change
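
    As a rough illustration of the threshold choice mentioned above, the sketch below computes a station's 90th percentile of wet-day precipitation and flags the days exceeding it. It assumes NumPy and a hypothetical 1 mm wet-day definition; it is not the study's actual identification procedure.

        import numpy as np

        def extreme_precipitation_days(daily_precip, wet_day_min=1.0, q=90.0):
            """Return the wet-day 90th-percentile threshold and the indices
            of days exceeding it."""
            p = np.asarray(daily_precip, float)
            wet = p >= wet_day_min
            threshold = np.percentile(p[wet], q)
            return threshold, np.flatnonzero(p > threshold)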

  11. Identification of discrete vascular lesions in the extremities using post-mortem computed tomography angiography – Case reports

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2017-01-01

    In this case report, we introduced post-mortem computed tomography angiography (PMCTA) in three cases suffering from vascular lesions in the upper extremities. In each subject, the third part of the axillary arteries and veins were used to catheterize the arms. The vessels were filled with a barium

  12. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one, half-hour overview-type presentations and three exhibits by vendors.

  13. Learning to consult with computers.

    Science.gov (United States)

    Liaw, S T; Marty, J J

    2001-07-01

    To develop and evaluate a strategy to teach skills and issues associated with computers in the consultation. An overview lecture plus a workshop before and a workshop after practice placements, during the 10-week general practice (GP) term in the 5th year of the University of Melbourne medical course. Pre- and post-intervention study using a mix of qualitative and quantitative methods within a strategic evaluation framework. Self-reported attitudes and skills with clinical applications before, during and after the intervention. Most students had significant general computer experience but little in the medical area. They found the workshops relevant, interesting and easy to follow. The role-play approach facilitated students' learning of relevant communication and consulting skills and an appreciation of issues associated with using the information technology tools in simulated clinical situations to augment and complement their consulting skills. The workshops and exposure to GP systems were associated with an increase in the use of clinical software, more realistic expectations of existing clinical and medical record software and an understanding of the barriers to the use of computers in the consultation. The educational intervention assisted students to develop and express an understanding of the importance of consulting and communication skills in teaching and learning about medical informatics tools, hardware and software design, workplace issues and the impact of clinical computer systems on the consultation and patient care.

  14. Report on the SNL/AWE/NSF international workshop on joint mechanics, Dartington, United Kingdom, 27-29 April 2009.

    Energy Technology Data Exchange (ETDEWEB)

    Ewins, David J. (University of Bristol, UK); Bergman, Lawrence A. (University of Illinois, Urbana, IL); Segalman, Daniel Joseph

    2010-08-01

    The SNL/AWE joint mechanics workshop, held in Dartington Hall, Totnes, Devon, UK, 26-29 April 2009, was a follow-up to another international joints workshop held in Arlington, Virginia, in October 2006. The preceding workshop focused on identifying what length scales and interactions would be necessary to provide a scientific basis for analyzing and understanding joint mechanics from the atomistic scale on upward. In contrast, the workshop discussed in this report focused much more on the identification and development of methods at longer length scales that can have a nearer-term impact on engineering analysis, design, and prediction of the dynamics of jointed structures. Also, the 2009 meeting employed fewer technical presentations and more breakout sessions for developing focused strategies than was the case with the earlier workshop. Several 'challenges' were identified and assignments were made to teams to develop approaches to address those challenges.

  15. Ground Motion Saturation Evaluation (GMSE) Data Needs Workshop

    International Nuclear Information System (INIS)

    NA

    2004-01-01

    The objective of the data needs workshop is to identify potential near-term (12-18 month) studies that would reduce uncertainty in extremely low probability (10^-5/yr) earthquake ground motions at Yucca Mountain. Recommendations made at the workshop will be considered by BSC and DOE management in formulating plans for FY05 seismic-related investigations. Based on studies done earlier this year, a bound on peak ground velocities (PGVs), consisting of a uniform distribution from 150 cm/s to 500 cm/s, has been applied to the existing PGV hazard curve for the underground repository horizon, for use in the forthcoming License Application. The technical basis for this bounding distribution is being documented, along with the basis for a slightly less conservative bound in the form of a roughly triangular distribution from 153 cm/s to 451 cm/s. The objective of the GMSE studies is to provide a technical basis for reducing remaining excessive conservatism, if any, in the extremely low probability ground motions that are used in postclosure performance assessments. Potential studies that have already been suggested include: (1) Additional tests of failure strains of repository rocks at, above, and below the repository horizon; (2) Identification and evaluation of nuclear explosion data that may help establish strain limits in tuff; (3) Numerical modeling of seismic wave propagation through the repository rock column to test the hypothesis that nonwelded tuffs below the repository horizon would fail in tension and prevent extreme strains from being transmitted to the repository; (4) Evaluation of the seismic failure threshold of bladed, fragile-appearing lithophysal crystals; (5) Evaluation of whether a ground motion parameter other than PGV would correlate better with calculated drip-shield and waste-package damage states; (6) Qualification and use of a finite seismic-source model to evaluate probabilities of extreme ground motions from extreme scenario earthquakes (e.g., magnitude 6

  16. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
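
    The spatiotemporal indexing idea sketched above, placing an index over the stored data in a relational database so that queries map quickly to data locations, can be illustrated with a toy example. The following sketch uses SQLite from the Python standard library, with hypothetical table and column names and a made-up HDFS path; it is a schematic of the concept only, not the NCCS implementation.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE chunk_index (
            path TEXT, variable TEXT, t_start TEXT, t_end TEXT,
            lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")

        def register(path, variable, t_start, t_end, bbox):
            # bbox = (lat_min, lat_max, lon_min, lon_max)
            con.execute("INSERT INTO chunk_index VALUES (?,?,?,?,?,?,?,?)",
                        (path, variable, t_start, t_end, *bbox))

        def locate(variable, t0, t1, bbox):
            # Return the stored files/blocks whose extent overlaps the query.
            lat0, lat1, lon0, lon1 = bbox
            rows = con.execute("""SELECT path FROM chunk_index
                WHERE variable = ? AND t_end >= ? AND t_start <= ?
                  AND lat_max >= ? AND lat_min <= ?
                  AND lon_max >= ? AND lon_min <= ?""",
                (variable, t0, t1, lat0, lat1, lon0, lon1)).fetchall()
            return [r[0] for r in rows]

        register("hdfs:///merra/T2M.1980.nc4", "T2M",
                 "1980-01-01", "1980-12-31", (-90, 90, -180, 180))
        print(locate("T2M", "1980-06-01", "1980-06-30", (30, 60, -10, 40)))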

  17. AN AUTOMATIC DETECTION METHOD FOR EXTREME-ULTRAVIOLET DIMMINGS ASSOCIATED WITH SMALL-SCALE ERUPTION

    Energy Technology Data Exchange (ETDEWEB)

    Alipour, N.; Safari, H. [Department of Physics, University of Zanjan, P.O. Box 45195-313, Zanjan (Iran, Islamic Republic of); Innes, D. E. [Max-Planck Institut fuer Sonnensystemforschung, 37191 Katlenburg-Lindau (Germany)

    2012-02-10

    Small-scale extreme-ultraviolet (EUV) dimming often surrounds sites of energy release in the quiet Sun. This paper describes a method for the automatic detection of these small-scale EUV dimmings using a feature-based classifier. The method is demonstrated using sequences of 171 Å images taken by the STEREO/Extreme UltraViolet Imager (EUVI) on 2007 June 13 and by Solar Dynamics Observatory/Atmospheric Imaging Assembly on 2010 August 27. The feature identification relies on recognizing structure in sequences of space-time 171 Å images using the Zernike moments of the images. The Zernike moments of space-time slices with events and non-events are distinctive enough to be separated using a support vector machine (SVM) classifier. The SVM is trained using 150 event and 700 non-event space-time slices. We find a total of 1217 events in the EUVI images and 2064 events in the AIA images on the days studied. Most of the events are found between latitudes -35° and +35°. The sizes and expansion speeds of central dimming regions are extracted using a region-grow algorithm. The histograms of the sizes in both EUVI and AIA follow a steep power law with slope of about -5. The AIA slope extends to smaller sizes before turning over. The mean velocity of 1325 dimming regions seen by AIA is found to be about 14 km s^-1.
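
    A minimal sketch of the detection pipeline described above, Zernike-moment features of space-time slices fed to a support vector machine, is given below. It assumes the mahotas and scikit-learn libraries and hypothetical radius/degree parameters, and is an illustration of the general approach rather than the authors' code.

        import numpy as np
        from mahotas.features import zernike_moments
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def zernike_features(space_time_slice, radius=32, degree=8):
            """Zernike moments of one 2-D space-time slice, after normalization."""
            img = np.asarray(space_time_slice, float)
            img = (img - img.min()) / (np.ptp(img) + 1e-12)
            return zernike_moments(img, radius, degree=degree)

        def train_dimming_classifier(event_slices, nonevent_slices):
            """Fit an SVM on Zernike features of event and non-event slices."""
            event_slices, nonevent_slices = list(event_slices), list(nonevent_slices)
            X = np.array([zernike_features(s) for s in event_slices + nonevent_slices])
            y = np.array([1] * len(event_slices) + [0] * len(nonevent_slices))
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
            clf.fit(X, y)
            return clf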

  18. Workshop on nuclear structure and decay data: Theory and evaluation manual - Pt. 2

    International Nuclear Information System (INIS)

    Nichols, A.L.; McLaughlin, P.K.

    2004-11-01

    A two-week Workshop on Nuclear Structure and Decay Data: Theory and Evaluation was organized and administrated by the IAEA Nuclear Data Section, and hosted at the Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste, Italy from 17 to 28 November 2003. The aims and contents of this workshop are summarized, along with the agenda, list of participants, comments and recommendations. Workshop materials are also included that are freely available on CD-ROM (all relevant PowerPoint presentations and manuals along with appropriate computer codes). (author)

  19. Workshop on nuclear structure and decay data: Theory and evaluation manual - Pt. 1

    International Nuclear Information System (INIS)

    Nichols, A.L.; McLaughlin, P.K.

    2004-11-01

    A two-week Workshop on Nuclear Structure and Decay Data: Theory and Evaluation was organized and administrated by the IAEA Nuclear Data Section, and hosted at the Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste, Italy from 17 to 28 November 2003. The aims and contents of this workshop are summarized, along with the agenda, list of participants, comments and recommendations. Workshop materials are also included that are freely available on CD-ROM (all relevant PowerPoint presentations and manuals along with appropriate computer codes). (author)

  20. 22nd Italian Workshop on Neural Nets

    CERN Document Server

    Bassis, Simone; Esposito, Anna; Morabito, Francesco

    2013-01-01

    This volume collects a selection of contributions presented at the 22nd Italian Workshop on Neural Networks, the yearly meeting of the Italian Society for Neural Networks (SIREN). The conference was held in Vietri sul Mare (Salerno), Italy, during May 17-19, 2012. The annual meeting of SIREN is sponsored by the International Neural Network Society (INNS), the European Neural Network Society (ENNS) and the IEEE Computational Intelligence Society (CIS). The book, as well as the workshop, is organized in three main components: two special sessions and a group of regular sessions featuring different aspects and points of view of artificial neural networks and natural intelligence, also including applications of present compelling interest.

  1. Frontiers of interfacial water research :workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Cygan, Randall Timothy; Greathouse, Jeffery A.

    2005-10-01

    Water is the critical natural resource of the new century. Significant improvements in traditional water treatment processes require novel approaches based on a fundamental understanding of nanoscale and atomic interactions at interfaces between aqueous solution and materials. To better understand these critical issues and to promote an open dialog among leading international experts in water-related specialties, Sandia National Laboratories sponsored a workshop on April 24-26, 2005 in Santa Fe, New Mexico. The "Frontiers of Interfacial Water Research Workshop" provided attendees with a critical review of water technologies and emphasized the new advances in surface and interfacial microscopy, spectroscopy, diffraction, and computer simulation needed for the development of new materials for water treatment.

  2. Talks in CNLS workshop: "Plasma Energization: Exchanges Between Fluid and Kinetic Scales"

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Fan [Los Alamos National Laboratory

    2015-08-03

    This document is a collection of the slides from the talks presented at the workshop held in Los Alamos, New Mexico, May 4, 2015. Most of the presentations have been indexed individually for inclusion in the database.

  3. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    Energy Technology Data Exchange (ETDEWEB)

    O'Leary, Patrick [Kitware, Inc., Clifton Park, NY (United States)

    2017-09-13

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time this effort started in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  4. Can Tablet Computers Enhance Faculty Teaching?

    Science.gov (United States)

    Narayan, Aditee P; Whicker, Shari A; Benjamin, Robert W; Hawley, Jeffrey; McGann, Kathleen A

    2015-06-01

    Learner benefits of tablet computer use have been demonstrated, yet there is little evidence regarding faculty tablet use for teaching. Our study sought to determine if supplying faculty with tablet computers and peer mentoring provided benefits to learners and faculty beyond that of non-tablet-based teaching modalities. We provided faculty with tablet computers and three 2-hour peer-mentoring workshops on tablet-based teaching. Faculty used tablets to teach, in addition to their current, non-tablet-based methods. Presurveys, postsurveys, and monthly faculty surveys assessed feasibility, utilization, and comparisons to current modalities. Learner surveys assessed perceived effectiveness and comparisons to current modalities. All feedback received from open-ended questions was reviewed by the authors and organized into categories. Of 15 eligible faculty, 14 participated. Each participant attended at least 2 of the 3 workshops, with 10 to 12 participants at each workshop. All participants found the workshops useful, and reported that the new tablet-based teaching modality added value beyond that of current teaching methods. Respondents developed the following tablet-based outputs: presentations, photo galleries, evaluation tools, and online modules. Of the outputs, 60% were used in the ambulatory clinics, 33% in intensive care unit bedside teaching rounds, and 7% in inpatient medical unit bedside teaching rounds. Learners reported that common benefits of tablet computers were: improved access/convenience (41%), improved interactive learning (38%), and improved bedside teaching and patient care (13%). A common barrier faculty identified was inconsistent wireless access (14%), while no barriers were identified by the majority of learners. Providing faculty with tablet computers and having peer-mentoring workshops to discuss their use was feasible and added value.

  5. Technology transfer - insider protection workshop (Safeguards Evaluation Method - Insider Threat)

    International Nuclear Information System (INIS)

    Strait, R.S.; Renis, T.A.

    1986-01-01

    The Safeguards Evaluation Method - Insider Threat, developed by Lawrence Livermore National Laboratory, is a field-applicable tool to evaluate facility safeguards against theft or diversion of special nuclear material (SNM) by nonviolent insiders. To ensure successful transfer of this technology from the laboratory to DOE field offices and contractors, LLNL developed a three-part package. The package includes a workbook, user-friendly microcomputer software, and a three-day training program. The workbook guides an evaluation team through the Safeguards Evaluation Method and provides forms for gathering data. The microcomputer software assists in the evaluation of safeguards effectiveness. The software is designed for safeguards analysts with no previous computer experience. It runs on an IBM Personal Computer or any compatible machine. The three-day training program is called the Insider Protection Workshop. The workshop students learn how to use the workbook and the computer software to assess insider vulnerabilities and to evaluate the benefits and costs of potential improvements. These activities increase the students' appreciation of the insider threat. The workshop format is informal and interactive, employing four different instruction modes: classroom presentations, small-group sessions, a practical exercise, and ''hands-on'' analysis using microcomputers. This approach to technology transfer has been successful: over 100 safeguards planners and analysts have been trained in the method, and it is being used at facilities throughout the DOE complex.

  6. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  7. Standardizing Scale Height Computation of MAVEN NGIMS Neutral Data and Variations Between Exobase and Homopause Scale Heights

    Science.gov (United States)

    Elrod, M. K.; Slipski, M.; Curry, S.; Williamson, H. N.; Benna, M.; Mahaffy, P. R.

    2017-12-01

    The MAVEN NGIMS team produces a level 3 product which includes the computation of the Ar scale height and the atmospheric temperature at 200 km. In the latest version (v05_r01) this has been revised to include scale height fits for CO2, N2, O, and CO. Members of the MAVEN team have used various methods to compute scale heights, leading to significant variations in scale height values depending on the fits and techniques used, even within a few orbits or, occasionally, the same pass. Additionally, fitting scale heights in a very stable atmosphere, such as on the dayside versus the nightside, can give different results depending on the boundary conditions. Currently, most methods compute only Ar scale heights, as Ar is the most stable species and reacts least with the instrument. The NGIMS team has chosen to expand these fitting techniques to include fitted scale heights for CO2, N2, CO, and O. Having compared multiple techniques, the method found to be most reliable under most conditions is a simple fit method: it takes the exobase altitude of the CO2 atmosphere as the highest point for fitting, uses periapsis as the lowest point, and then fits log(density) versus altitude. The slope of log(density) versus altitude is -1/H, where H is the scale height of the atmosphere for each species. Since this region lies between the homopause and the exobase, each species has a different scale height at this point. This is being released as a new standardization for the level 3 product, with the understanding that scientists and team members will continue to compute more precise scale heights and temperatures as needed based on science and model demands. It appears in the PDS NGIMS level 3 v05 files for August 2017. Additionally, we are examining these scale heights for variations seasonally, diurnally, and above and below the exobase. The atmosphere is significantly more stable on the dayside than on the nightside. We have also found
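    As a compact illustration of the fitting relation quoted above, the sketch below recovers a scale height from a synthetic exponential density profile by regressing log(density) on altitude; the altitudes, density values, and scale height are invented for the example and are not NGIMS data.

        import numpy as np

        # Synthetic profile: density falling off exponentially with a 10 km scale height
        # between a notional periapsis (150 km) and exobase (200 km).
        altitude_km = np.linspace(150.0, 200.0, 60)
        true_H_km = 10.0
        density = 1e9 * np.exp(-(altitude_km - 150.0) / true_H_km)

        # Fit log(density) versus altitude; the slope is -1/H, so H = -1/slope.
        slope, intercept = np.polyfit(altitude_km, np.log(density), 1)
        print(f"fitted scale height: {-1.0 / slope:.2f} km")   # ~10.00 km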

  8. Numerical Ship Hydrodynamics: An Assessment of the Gothenburg 2010 Workshop

    National Research Council Canada - National Science Library

    Larsson, Lars; Stern, Frederick (Professor of engineering); Visonneau, Michel

    2014-01-01

    "This book assesses the state-of-the-art in computational fluid dynamics (CFD) applied to ship hydrodynamics and provides guidelines for the future developments in the field based on the Gothenburg 2010 Workshop...

  9. Proceedings of the SLAC/KEK ATF lattice workshop

    International Nuclear Information System (INIS)

    Urakawa, Junji

    1993-04-01

    The SLAC/KEK ATF Lattice Workshop was held on December 8-11, 1992 at KEK, National Laboratory for High Energy Physics. The purpose of this workshop was to critically review the ATF lattice design for any possible improvements, and also to bring SLAC colleagues up to date on recent progress at KEK. At KEK, studies on intense multi-bunch beam acceleration and emittance reduction have been actively pursued, evolving into the ATF project since 1990. In 1991 we launched a large-scale reconstruction of the experimental hall. This is to build the shielded housing for the 1.54 GeV injector linac and the test damping ring. Our plan is to begin construction of the linac in March 1993. Some results from the discussions during the Workshop have already been incorporated in the revised ATF lattice design. (J.P.N.)

  10. 16th International Conference on Medical Image Computing and Computer Assisted Intervention

    CERN Document Server

    Klinder, Tobias; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The works included in this book present and discuss new trends in those fields, using several methods and techniques to address more efficiently a range of timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis.

  11. Reliability in Warehouse-Scale Computing: Why Low Latency Matters

    DEFF Research Database (Denmark)

    Nannarelli, Alberto

    2015-01-01

    Warehouse sized buildings are nowadays hosting several types of large computing systems: from supercomputers to large clusters of servers to provide the infrastructure to the cloud. Although the main target, especially for high-performance computing, is still to achieve high throughput, the limiting factor of these warehouse-scale data centers is the power dissipation. Power is dissipated not only in the computation itself, but also in heat removal (fans, air conditioning, etc.) to keep the temperature of the devices within the operating ranges. The need to keep the temperature low within...

  12. Scholarship for Service: IA Tutorials and Workshops for Educators

    National Research Council Canada - National Science Library

    Irvine, Cynthia E; Falby, Naomi B

    2005-01-01

    ... of Information Assurance (IA) and computer security. The target audience of the workshops has been 2-year college, 4-year college, and university-level educators who have responsibility for teaching curricula that are, or could be, related to IA issues...

  13. Dark Sectors 2016 Workshop: Community Report

    CERN Document Server

    Alexander, Jim; Echenard, Bertrand; Essig, Rouven; Graham, Matthew; Izaguirre, Eder; Jaros, John; Krnjaic, Gordan; Mardon, Jeremy; Morrissey, David; Nelson, Tim; Perelstein, Maxim; Pyle, Matt; Ritz, Adam; Schuster, Philip; Shuve, Brian; Toro, Natalia; Van De Water, Richard G.; Akerib, Daniel; An, Haipeng; Aniol, Konrad; Arnquist, Isaac J.; Asner, David M.; Back, Henning O.; Baker, Keith; Baltzell, Nathan; Banerjee, Dipanwita; Batell, Brian; Bauer, Daniel; Beacham, James; Benesch, Jay; Bjorken, James; Blinov, Nikita; Boehm, Celine; Bondi, Mariangela; Bonivento, Walter; Bossi, Fabio; Brodsky, Stanley J.; Budnik, Ran; Bueltmann, Stephen; Bukhari, Masroor H.; Bunker, Raymond; Carpinelli, Massimo; Cartaro, Concetta; Cassel, David; Cavoto, Gianluca; Celentano, Andrea; Chaterjee, Animesh; Chaudhuri, Saptarshi; Chiodini, Gabriele; Cho, Hsiao-Mei Sherry; Church, Eric D.; Cooke, D.A.; Cooley, Jodi; Cooper, Robert; Corliss, Ross; Crivelli, Paolo; Curciarello, Francesca; D'Angelo, Annalisa; Davoudiasl, Hooman; De Napoli, Marzio; De Vita, Raffaella; Denig, Achim; deNiverville, Patrick; Deshpande, Abhay; Dharmapalan, Ranjan; Dobrescu, Bogdan; Donskov, Sergey; Dupre, Raphael; Estrada, Juan; Fegan, Stuart; Ferber, Torben; Field, Clive; Figueroa-Feliciano, Enectali; Filippi, Alessandra; Fornal, Bartosz; Freyberger, Arne; Friedland, Alexander; Galon, Iftach; Gardner, Susan; Girod, Francois-Xavier; Gninenko, Sergei; Golutvin, Andrey; Gori, Stefania; Grab, Christoph; Graziani, Enrico; Griffioen, Keith; Haas, Andrew; Harigaya, Keisuke; Hearty, Christopher; Hertel, Scott; Hewett, JoAnne; Hime, Andrew; Hitlin, David; Hochberg, Yonit; Holt, Roy J.; Holtrop, Maurik; Hoppe, Eric W.; Hossbach, Todd W.; Hsu, Lauren; Ilten, Phil; Incandela, Joe; Inguglia, Gianluca; Irwin, Kent; Jaegle, Igal; Johnson, Robert P.; Kahn, Yonatan; Kalicy, Grzegorz; Kang, Zhong-Bo; Khachatryan, Vardan; Kozhuharov, Venelin; Krasnikov, N.V.; Kubarovsky, Valery; Kuflik, Eric; Kurinsky, Noah; Laha, Ranjan; Lanfranchi, Gaia; Li, Dale; Lin, Tongyan; Lisanti, Mariangela; Liu, Kun; Liu, Ming; Loer, Ben; Loomba, Dinesh; Lyubovitskij, Valery E.; Manalaysay, Aaron; Mandaglio, Giuseppe; Mans, Jeremiah; Marciano, W.J.; Markiewicz, Thomas; Marsicano, Luca; Maruyama, Takashi; Matveev, Victor A.; McKeen, David; McKinnon, Bryan; McKinsey, Dan; Merkel, Harald; Mock, Jeremy; Monzani, Maria Elena; Moreno, Omar; Nantais, Corina; Paul, Sebouh; Peskin, Michael; Poliakov, Vladimir; Polosa, Antonio D.; Pospelov, Maxim; Rachek, Igor; Radics, Balint; Raggi, Mauro; Randazzo, Nunzio; Ratcliff, Blair; Rizzo, Alessandro; Rizzo, Thomas; Robinson, Alan; Rubbia, Andre; Rubin, David; Rueter, Dylan; Saab, Tarek; Santopinto, Elena; Schnee, Richard; Shelton, Jessie; Simi, Gabriele; Simonyan, Ani; Sipala, Valeria; Slone, Oren; Smith, Elton; Snowden-Ifft, Daniel; Solt, Matthew; Sorensen, Peter; Soreq, Yotam; Spagnolo, Stefania; Spencer, James; Stepanyan, Stepan; Strube, Jan; Sullivan, Michael; Tadepalli, Arun S.; Tait, Tim; Taiuti, Mauro; Tanedo, Philip; Tayloe, Rex; Thaler, Jesse; Tran, Nhan V.; Tulin, Sean; Tully, Christopher G.; Uemura, Sho; Ungaro, Maurizio; Valente, Paolo; Vance, Holly; Vavra, Jerry; Volansky, Tomer; von Krosigk, Belina; Whitbeck, Andrew; Williams, Mike; Wittich, Peter; Wojtsekhowski, Bogdan; Xue, Wei; Yoon, Jong Min; Yu, Hai-Bo; Yu, Jaehoon; Yu, Tien-Tien; Zhang, Yue; Zhao, Yue; Zhong, Yiming; Zurek, Kathryn

    2016-01-01

    This report, based on the Dark Sectors workshop at SLAC in April 2016, summarizes the scientific importance of searches for dark sector dark matter and forces at masses beneath the weak-scale, the status of this broad international field, the important milestones motivating future exploration, and promising experimental opportunities to reach these milestones over the next 5-10 years.

  14. Dark Sectors 2016 Workshop: Community Report

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Jim; et al.

    2016-08-30

    This report, based on the Dark Sectors workshop at SLAC in April 2016, summarizes the scientific importance of searches for dark sector dark matter and forces at masses beneath the weak-scale, the status of this broad international field, the important milestones motivating future exploration, and promising experimental opportunities to reach these milestones over the next 5-10 years.

  15. The Fifth Workshop on HPC Best Practices: File Systems and Archives

    Energy Technology Data Exchange (ETDEWEB)

    Hick, Jason; Hules, John; Uselton, Andrew

    2011-11-30

    The workshop on High Performance Computing (HPC) Best Practices on File Systems and Archives was the fifth in a series sponsored jointly by the Department Of Energy (DOE) Office of Science and DOE National Nuclear Security Administration. The workshop gathered technical and management experts for operations of HPC file systems and archives from around the world. Attendees identified and discussed best practices in use at their facilities, and documented findings for the DOE and HPC community in this report.

  16. Computing the universe: how large-scale simulations illuminate galaxies and dark energy

    Science.gov (United States)

    O'Shea, Brian

    2015-04-01

    High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these are structures that operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and whose complexity and nonlinearity often defy analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.

  17. WORKSHOP: Stable particle motion

    International Nuclear Information System (INIS)

    Ruggiero, Alessandro G.

    1993-01-01

    Full text: Particle beam stability is crucial to any accelerator or collider, particularly big ones, such as Brookhaven's RHIC heavy ion collider and the larger SSC and LHC proton collider schemes. A workshop on the Stability of Particle Motion in Storage Rings held at Brookhaven in October dealt with the important issue of determining the short- and long-term stability of single particle motion in hadron storage rings and colliders, and explored new methods for ensuring it. In the quest for realistic environments, the imperfections of superconducting magnets and the effects of field modulation and noise were taken into account. The workshop was divided into three study groups: Short-Term Stability in storage rings, including chromatic and geometric effects and correction strategies; Long-Term Stability, including modulation and random noise effects and slow varying effects; and Methods for determining the stability of particle motion. The first two were run in parallel, but the third was attended by everyone. Each group considered analytical, computational and experimental methods, reviewing work done so far, comparing results and approaches and underlining outstanding issues. By resolving conflicts, it was possible to identify problems of common interest. The workshop reaffirmed the validity of methods proposed several years ago. Major breakthroughs have been in the rapid improvement of computer capacity and speed, in the development of more sophisticated mathematical packages, and in the introduction of more powerful analytic approaches. In a typical storage ring, a particle may be required to circulate for about a billion revolutions. While ten years ago it was only possible to predict accurately stability over about a thousand revolutions, it is now possible to predict over as many as one million turns. If this trend continues, in ten years it could become feasible to predict particle stability over the entire storage period. About ninety participants

  18. Intelligent Distributed Computing VI : Proceedings of the 6th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Badica, Costin; Malgeri, Michele; Unland, Rainer

    2013-01-01

    This book represents the combined peer-reviewed proceedings of the Sixth International Symposium on Intelligent Distributed Computing -- IDC~2012, of the International Workshop on Agents for Cloud -- A4C~2012 and of the Fourth International Workshop on Multi-Agent Systems Technology and Semantics -- MASTS~2012. All the events were held in Calabria, Italy during September 24-26, 2012. The 37 contributions published in this book address many topics related to theory and applications of intelligent distributed computing and multi-agent systems, including: adaptive and autonomous distributed systems, agent programming, ambient assisted living systems, business process modeling and verification, cloud computing, coalition formation, decision support systems, distributed optimization and constraint satisfaction, gesture recognition, intelligent energy management in WSNs, intelligent logistics, machine learning, mobile agents, parallel and distributed computational intelligence, parallel evolutionary computing, trus...

  19. Preface to Proceedings of the 12th European Workshop on Natural Language Generation (ENLG 2009)

    NARCIS (Netherlands)

    Krahmer, E.; Krahmer, E.; Theune, Mariet

    We are pleased to present the Proceedings of the 12th European Workshop on Natural Language Generation (ENLG 2009). ENLG 2009 was held in Athens, Greece, as a workshop at the 12th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2009). Following our call, we

  20. Developing workshop module of realistic mathematics education: Follow-up workshop

    Science.gov (United States)

    Palupi, E. L. W.; Khabibah, S.

    2018-01-01

    Realistic Mathematics Education (RME) is a learning approach which fits the aim of the curriculum. The success of RME in teaching mathematics concepts, triggering students' interest in mathematics, and teaching high-order thinking skills will make teachers start to learn RME. Hence, RME workshops are often offered and held. This study applied the development model proposed by Plomp. Based on the study by the RME team, there are three kinds of RME workshop: the start-up workshop, the follow-up workshop, and the quality boost. However, there is no standardized or validated module used in those workshops. This study aims to develop a module for the RME follow-up workshop which is valid and usable. Plomp's developmental model includes materials analysis, design, realization, implementation, and evaluation. Based on the validation, the developed module is valid, and the field test shows that it can be used effectively.

  1. Detecting Silent Data Corruption for Extreme-Scale Applications through Data Mining

    Energy Technology Data Exchange (ETDEWEB)

    Bautista-Gomez, Leonardo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-16

    Supercomputers allow scientists to study natural phenomena by means of computer simulations. Next-generation machines are expected to have more components and, at the same time, consume several times less energy per operation. These trends are pushing supercomputer construction to the limits of miniaturization and energy-saving strategies. Consequently, the number of soft errors is expected to increase dramatically in the coming years. While mechanisms are in place to correct or at least detect some soft errors, a significant percentage of those errors pass unnoticed by the hardware. Such silent errors are extremely damaging because they can make applications silently produce wrong results. In this work we propose a technique that leverages certain properties of high-performance computing applications in order to detect silent errors at the application level. Our technique detects corruption solely based on the behavior of the application datasets and is completely application-agnostic. We propose multiple corruption detectors, and we couple them to work together in a fashion transparent to the user. We demonstrate that this strategy can detect the majority of corruptions, while incurring negligible overhead. We show that with the help of these detectors, applications can have up to 80% coverage against data corruption.
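    To make the idea of application-level, data-driven detection concrete, the sketch below flags values that deviate strongly from a simple extrapolation of previous time steps. It is only an illustrative stand-in for the detectors described in the report; the extrapolation rule, the threshold k, and the synthetic data are assumptions for the example.

        import numpy as np

        def detect_silent_corruption(prev2, prev1, current, k=5.0):
            """Flag points whose new value deviates far from a linear extrapolation
            of the two previous time steps (a sketch of one data-driven detector,
            not the authors' actual detectors; k is an illustrative threshold)."""
            predicted = 2.0 * prev1 - prev2                 # linear extrapolation
            residual = np.abs(current - predicted)
            scale = np.abs(prev1 - prev2) + 1e-12           # recent local variability
            return residual > k * scale                     # boolean mask of suspects

        # Smoothly evolving field with one silently corrupted cell.
        t0 = np.linspace(0.0, 1.0, 1000)
        t1 = t0 + 0.01
        t2 = t0 + 0.02
        t2[123] += 10.0                                     # injected bit-flip-like error
        print(np.flatnonzero(detect_silent_corruption(t0, t1, t2)))   # -> [123]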

  2. Influence of climate variability versus change at multi-decadal time scales on hydrological extremes

    Science.gov (United States)

    Willems, Patrick

    2014-05-01

    Recent studies have shown that rainfall and hydrological extremes do not occur randomly in time, but are subject to multidecadal oscillations. In addition to these oscillations, there are temporal trends due to climate change. Design statistics, such as intensity-duration-frequency (IDF) for extreme rainfall or flow-duration-frequency (QDF) relationships, are affected by both types of temporal changes (short term and long term). This presentation discusses these changes, how they influence water engineering design and decision making, and how this influence can be assessed and taken into account in practice. The multidecadal oscillations in rainfall and hydrological extremes were studied based on a technique for the identification and analysis of changes in extreme quantiles. The statistical significance of the oscillations was evaluated by means of a non-parametric bootstrapping method. Oscillations in large-scale atmospheric circulation were identified as the main drivers for the temporal oscillations in rainfall and hydrological extremes. They also explain why spatial phase shifts (e.g. north-south variations in Europe) exist between the oscillation highs and lows. Next to the multidecadal climate oscillations, several stations show trends during the most recent decades, which may be attributed to climate change as a result of anthropogenic global warming. Such attribution to anthropogenic global warming is, however, uncertain. It can be done based on simulation results with climate models, but it is shown that the climate model results are too uncertain to enable a clear attribution. Water engineering design statistics, such as extreme rainfall IDF or peak or low flow QDF statistics, obviously are influenced by these temporal variations (oscillations, trends). It is shown in the paper, based on the Brussels 10-minute rainfall data, that rainfall design values may be about 20% biased or different when based on short rainfall series 10 to 15 years in length, and
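    A simple way to probe whether multidecadal variability in extremes exceeds what random ordering would produce is to resample the time axis. The sketch below uses a permutation-style variant of the non-parametric resampling idea on synthetic annual maxima; the statistic, window length, and data are illustrative assumptions only, not the paper's quantile-perturbation method.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical 100-year series of annual rainfall maxima (arbitrary units).
        annual_max = rng.gumbel(loc=30.0, scale=8.0, size=100)

        def window_quantile_range(x, win=15, q=0.9):
            """Range of the q-quantile computed in sliding windows: a simple
            statistic for multidecadal variability of extremes."""
            qs = [np.quantile(x[i:i + win], q) for i in range(len(x) - win + 1)]
            return np.ptp(qs)

        observed = window_quantile_range(annual_max)

        # Null distribution: shuffle the years, destroying any temporal structure.
        null = np.array([window_quantile_range(rng.permutation(annual_max))
                         for _ in range(1000)])
        p_value = np.mean(null >= observed)
        print(f"observed range: {observed:.2f}, p-value under shuffling: {p_value:.3f}")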

  3. Creating Fantastic PI Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Biedermann, Laura B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clark, Blythe G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Colbert, Rachel S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dagel, Amber Lynn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gupta, Vipin P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hibbs, Michael R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perkins, David Nikolaus [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); West, Roger Derek [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The goal of this SAND report is to provide guidance for other groups hosting workshops and peer-to-peer learning events at Sandia. This SAND report therefore provides detail about our team structure, how we brainstormed workshop topics, and how we developed the workshop structure. A Workshop “Nuts and Bolts” section provides our timeline and checklist for workshop activities. The survey section provides examples of the questions we asked and how we adapted the workshop in response to the feedback.

  4. Increasing Students' Motivation by Using Computers

    Directory of Open Access Journals (Sweden)

    Rodríguez Aura Stella

    2000-08-01

    Addressing the lack of motivation among the 9th grade students of Tomás Rueda Vargas School was the objective of this project, so we planned a series of workshops in Microsoft Word to apply in the computer lab. We observed that by working in groups of four in the computer lab, the students did the activities with enthusiasm. It could also be noticed that the workshops were effective in reinforcing English learning.

  5. A multiple-scaling method of the computation of threaded structures

    International Nuclear Information System (INIS)

    Andrieux, S.; Leger, A.

    1989-01-01

    The numerical computation of threaded structures usually leads to very large finite element problems, making it very difficult to carry out parametric studies, especially in non-linear cases involving plasticity or unilateral contact conditions. Nevertheless, these parametric studies are essential in many industrial problems, for instance for the evaluation of various repair processes for the closure studs of PWRs. It is well known that such repairs generally involve several modifications of the thread geometry, of the number of active threads, of the flange clamping conditions, and so on. This paper is devoted to the description of a two-scale method which easily allows parametric studies. The main idea of this method consists of dividing the problem into a global part and a local part. The local problem is solved by F.E.M. on the precise geometry of the thread for some elementary loadings. The global problem is formulated at the gudgeon scale and is reduced to a one-dimensional one. The resolution of this global problem entails an insignificant computational cost. Then, a post-processing step gives the stress field at the thread scale anywhere in the assembly. After recalling some principles of the two-scale approach, the method is described. The validation by comparison with a direct F.E. computation and some further applications are presented.

  6. Energy and nuclear power planning using the IAEA`s ENPEP computer package. Proceedings of a workshop

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    The Regional (Europe) Technical Co-operation Project on the Study of Energy Options Using the IAEA Planning Methodologies was first implemented by the IAEA in 1995. The project aims at improving national capabilities for energy, electricity and nuclear power planning and promoting regional co-operation among participating countries in the European region. The project includes the organization of workshops, training activities at the regional and national levels, scientific visits, etc. The proceedings of a workshop held in Warsaw, Poland, from 4 to 8 September 1995 are contained herein. The workshop had as a basic objective the analysis of the specific problems encountered by the represented countries during application of the IAEA`s ENPEP package in the conduct of national studies and to provide a forum for further co-operation among participating countries. A second objective of the workshop was to make proposals for future activities to be organized within the project. This publication is intended to serve as reference for the users of the IAEA`s ENPEP package, as well as for energy and electricity planners in general. Refs, figs, tabs.

  7. Parallel multiple instance learning for extremely large histopathology image analysis.

    Science.gov (United States)

    Xu, Yan; Li, Yeshu; Shen, Zhengyang; Wu, Ziwei; Gao, Teng; Fan, Yubo; Lai, Maode; Chang, Eric I-Chao

    2017-08-03

    Histopathology images are critical for medical diagnosis, e.g., cancer and its treatment. A standard histopathology slice can be easily scanned at a high resolution of, say, 200,000×200,000 pixels. These high resolution images can make most existing image processing tools infeasible or less effective when operated on a single machine with limited memory, disk space and computing power. In this paper, we propose an algorithm tackling this new emerging "big data" problem utilizing parallel computing on High-Performance-Computing (HPC) clusters. Experimental results on a large-scale data set (1318 images at a scale of 10 billion pixels each) demonstrate the efficiency and effectiveness of the proposed algorithm for low-latency real-time applications. The proposed framework provides an effective and efficient system for extremely large histopathology image analysis. It is based on the multiple instance learning formulation for weakly supervised image classification, segmentation and clustering. When a max-margin concept is adopted for different clusters, we obtain further improvement in clustering performance.
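    The multiple instance formulation mentioned above can be illustrated at a toy, single-machine scale: each image is a bag of patch feature vectors, the bag label is propagated to its patches for training, and a bag is scored by its most suspicious patch. This is only a naive baseline sketch on invented data, not the parallel HPC system described here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)

        # Toy multiple-instance setting: each "image" is a bag of patch features;
        # a bag is positive if at least one patch is positive.
        def make_bag(positive, n_patches=50, dim=16):
            X = rng.normal(size=(n_patches, dim))
            if positive:
                X[rng.integers(n_patches)] += 3.0          # one abnormal patch
            return X

        bags = [make_bag(i % 2 == 0) for i in range(200)]
        bag_labels = np.array([1 if i % 2 == 0 else 0 for i in range(200)])

        # Naive MIL baseline: train a patch classifier with bag labels propagated
        # to patches, then score a bag by its maximum patch probability.
        X_patches = np.vstack(bags)
        y_patches = np.repeat(bag_labels, [len(b) for b in bags])
        clf = LogisticRegression(max_iter=1000).fit(X_patches, y_patches)

        bag_scores = np.array([clf.predict_proba(b)[:, 1].max() for b in bags])
        print("bag-level accuracy:", np.mean((bag_scores > 0.5) == bag_labels))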

  8. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  9. Report of the Advanced Neutron Source (ANS) safety workshop, Knoxville, Tennessee, October 25--26, 1988

    International Nuclear Information System (INIS)

    Buchanan, J.R.; Dumont, J.N.; Kendrick, C.M.; Row, T.H.; Thompson, P.B.; West, C.D.; Marchaterre, J.F.; Muhlheim, M.D.; McBee, M.R.

    1988-12-01

    On October 25--26, 1988, about 60 people took part in an Advanced Neutron Source (ANS) Safety Workshop, organized in cooperation with the Oak Ridge Operations (ORO) Office of the Department of Energy (DOE) and held in Knoxville, Tennessee. After a plenary session at which ANS Project staff presented status reports on the ANS design, research and development (R and D), and safety analysis efforts, the workshop broke into three working groups, each covering a different topic: Environmental and Waste Management, Applicable Regulatory Safety Criteria and Goals, and Reactor Concepts. Each group was asked to review the Project's approach to safety-related issues and to provide guidance on future reactor safety needs or directions for the Project. With the help of able chairmen, assisted by reporters and secretarial support, the working groups were extremely successful. Draft reports from each group were prepared before the workshop closed, and the major findings of each group were presented for review and discussion by the entire workshop attendance. This report contains the final version of the group reports, incorporating the results of the overall review by all the workshop participants

  10. Towards short wavelengths FELs workshop

    International Nuclear Information System (INIS)

    Ben-Zvi, I.; Winick, H.

    1993-01-01

    This workshop was called because of the growing perception in the FEL source community that recent advances have made it possible to extend FEL operation to wavelengths about two orders of magnitude shorter than the 240 nm that has been achieved to date. In addition, short wavelength FELs offer the possibilities of extremely high peak power (several gigawatts) and very short pulses (of the order of 100 fs). Several groups in the USA are developing plans for such short wavelength FEL facilities. However, reviewers of these plans have pointed out that it would be highly desirable to first carry out proof-of-principle experiments at longer wavelengths to increase confidence that the shorter wavelength devices will indeed perform as calculated. The need for such experiments has now been broadly accepted by the FEL community. Such experiments were the main focus of this workshop as described in the following objectives distributed to attendees: (1) Define measurements needed to gain confidence that short wavelength FELs will perform as calculated. (2) List possible hardware that could be used to carry out these measurements in the near term. (3) Define a prioritized FEL physics experimental program and suggested timetable. (4) Form collaborative teams to carry out this program

  11. Towards short wavelengths FELs workshop

    Science.gov (United States)

    Ben-Zvi, I.; Winick, H.

    1993-11-01

    This workshop was called because of the growing perception in the FEL source community that recent advances have made it possible to extend FEL operation to wavelengths about two orders of magnitude shorter than the 240 nm that has been achieved to date. In addition, short wavelength FELs offer the possibilities of extremely high peak power (several gigawatts) and very short pulses (of the order of 100 fs). Several groups in the USA are developing plans for such short wavelength FEL facilities. However, reviewers of these plans have pointed out that it would be highly desirable to first carry out proof-of-principle experiments at longer wavelengths to increase confidence that the shorter wavelength devices will indeed perform as calculated. The need for such experiments has now been broadly accepted by the FEL community. Such experiments were the main focus of this workshop as described in the following objectives distributed to attendees: (1) Define measurements needed to gain confidence that short wavelength FELs will perform as calculated. (2) List possible hardware that could be used to carry out these measurements in the near term. (3) Define a prioritized FEL physics experimental program and suggested timetable. (4) Form collaborative teams to carry out this program.

  12. Workshop on ROVs and deep submergence

    Science.gov (United States)

    The deep-submergence community has an opportunity on March 6 to participate in a unique teleconferencing demonstration of a state-of-the-art, remotely operated underwater research vehicle known as the Jason-Medea System. Jason-Medea has been developed over the past decade by scientists, engineers, and technicians at the Deep Submergence Laboratory at Woods Hole Oceanographic Institution. The U.S. Navy, the Office of the Chief of Naval Research, and the National Science Foundation are sponsoring the workshop to explore the roles that modern computational, communications, and robotics technologies can play in deep-sea oceanographic research.Through the cooperation of Electronic Data Systems, Inc., the Jason Foundation, and Turner Broadcasting System, Inc., 2-1/2 hours of air time will be available from 3:00 to 5:30 PM EST on March 6. Twenty-seven satellite downlink sites will link one operating research vessel and the land-based operation with workshop participants in the United States, Canada, the United Kingdom, and Bermuda. The research ship Laney Chouest will be in the midst of a 3-week educational/research program in the Sea of Cortez, between Baja California and mainland Mexico. This effort is focused on active hydrothermal vents driven by heat flow from the volcanically active East Pacific Rise, which underlies the sediment-covered Guaymas Basin. The project combines into a single-operation, newly-developed robotic systems, state-of-the-art mapping and sampling tools, fiber-optic data transmission from the seafloor, instantaneous satellite communication from ship to shore, and a sophisticated array of computational and telecommunications networks. During the workshop, land-based scientists will observe and participate directly with their seagoing colleagues as they conduct seafloor research.

  13. Spatial extreme value analysis to project extremes of large-scale indicators for severe weather.

    Science.gov (United States)

    Gilleland, Eric; Brown, Barbara G; Ammann, Caspar M

    2013-09-01

    Concurrently high values of the maximum potential wind speed of updrafts (Wmax) and 0-6 km wind shear (Shear) have been found to represent conducive environments for severe weather, which subsequently provides a way to study severe weather in future climates. Here, we employ a model for the product of these variables (WmSh) from the National Center for Atmospheric Research/United States National Center for Environmental Prediction reanalysis over North America conditioned on their having extreme energy in the spatial field in order to project the predominant spatial patterns of WmSh. The approach is based on the Heffernan and Tawn conditional extreme value model. Results suggest that this technique estimates the spatial behavior of WmSh well, which allows for exploring possible changes in the patterns over time. While the model enables a method for inferring the uncertainty in the patterns, such analysis is difficult with the currently available inference approach. A variation of the method is also explored to investigate how this type of model might be used to qualitatively understand how the spatial patterns of WmSh correspond to extreme river flow events. A case study for river flows from three rivers in northwestern Tennessee is studied, and it is found that advection of WmSh from the Gulf of Mexico prevails while elsewhere, WmSh is generally very low during such extreme events. © 2013 The Authors. Environmetrics published by John Wiley & Sons, Ltd.

  14. Remote monitoring system workshop and technical cooperation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Soo; Kwack, E. H.; Yoon, W. K.; Kim, J. S.; Cha, H. Y.; Na, W.W

    2000-06-01

    This year's RMS workshop focused on installing the material monitoring system (MMS) at the technology lab within TCNC. The system was developed by the Cooperative Monitoring Center (CMC) at Sandia National Laboratories. The MMS consists of a data storage computer and a data collection computer, and connects easily to the DCM-14 camera used by the IAEA for monitoring nuclear power plants. The system is triggered when motion is detected and stores the event data on the MMS server. The system is also connected to the Internet, and only authenticated persons can access and check the event data.

  15. Remote monitoring system workshop and technical cooperation

    International Nuclear Information System (INIS)

    Kim, Jung Soo; Kwack, E. H.; Yoon, W. K.; Kim, J. S.; Cha, H. Y.; Na, W.W.

    2000-06-01

    This year's RMS workshop focused on installing the material monitoring system (MMS) at the technology lab within TCNC. The system was developed by the Cooperative Monitoring Center (CMC) at Sandia National Laboratories. The MMS consists of a data storage computer and a data collection computer, and connects easily to the DCM-14 camera used by the IAEA for monitoring nuclear power plants. The system is triggered when motion is detected and stores the event data on the MMS server. The system is also connected to the Internet, and only authenticated persons can access and check the event data.

  16. Topology-oblivious optimization of MPI broadcast algorithms on extreme-scale platforms

    KAUST Repository

    Hasanov, Khalid

    2015-11-01

    © 2015 Elsevier B.V. All rights reserved. Significant research has been conducted in collective communication operations, in particular in MPI broadcast, on distributed memory platforms. Most of the research efforts aim to optimize the collective operations for particular architectures by taking into account either their topology or platform parameters. In this work we propose a simple but general approach to optimization of the legacy MPI broadcast algorithms, which are widely used in MPICH and Open MPI. The proposed optimization technique is designed to address the challenge of extreme scale of future HPC platforms. It is based on hierarchical transformation of the traditionally flat logical arrangement of communicating processors. Theoretical analysis and experimental results on IBM BlueGene/P and a cluster of the Grid'5000 platform are presented.
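    The hierarchical transformation can be pictured as a two-level broadcast: the message first goes to one leader per group of processes, and each leader then broadcasts within its group. The mpi4py sketch below illustrates that idea under an assumed group size; it is not the authors' optimized implementation.

        from mpi4py import MPI

        def hierarchical_bcast(data, comm, group_size=4):
            """Two-level broadcast: leaders first, then within each group."""
            rank = comm.Get_rank()
            color = rank // group_size                     # which group this rank is in
            group = comm.Split(color, rank)                # intra-group communicator
            leaders = comm.Split(0 if group.Get_rank() == 0 else MPI.UNDEFINED, rank)

            if leaders != MPI.COMM_NULL:                   # step 1: among group leaders
                data = leaders.bcast(data, root=0)
            data = group.bcast(data, root=0)               # step 2: inside each group

            group.Free()
            if leaders != MPI.COMM_NULL:
                leaders.Free()
            return data

        if __name__ == "__main__":
            comm = MPI.COMM_WORLD
            msg = {"payload": list(range(8))} if comm.Get_rank() == 0 else None
            msg = hierarchical_bcast(msg, comm)
            print(comm.Get_rank(), msg["payload"][:3])     # run with e.g. mpirun -n 16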

  17. Exascale Workshop Panel Report Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2010-07-01

    The Exascale Review Panel consists of 12 scientists and engineers with experience in various aspects of high-performance computing and its application, development, and management. The Panel heard presentations by several representatives of the workshops and town meetings convened over the past few years to examine the need for exascale computation capability and the justification for a U.S. Department of Energy (DOE) program to develop such capability. This report summarizes information provided by the presenters and substantial written reports to the Panel in advance of the meeting in Washington, D.C. on January 19-20, 2010. The report also summarizes the Panel's conclusions with regard to the justification of a DOE-led exascale initiative.

  18. Reliability, validity, and sensitivity to change of the lower extremity functional scale in individuals affected by stroke.

    Science.gov (United States)

    Verheijde, Joseph L; White, Fred; Tompkins, James; Dahl, Peder; Hentz, Joseph G; Lebec, Michael T; Cornwall, Mark

    2013-12-01

    To investigate reliability, validity, and sensitivity to change of the Lower Extremity Functional Scale (LEFS) in individuals affected by stroke. The secondary objective was to test the validity and sensitivity of a single-item linear analog scale (LAS) of function. Prospective cohort reliability and validation study. A single rehabilitation department in an academic medical center. Forty-three individuals receiving neurorehabilitation for lower extremity dysfunction after stroke were studied. Their ages ranged from 32 to 95 years, with a mean of 70 years; 77% were men. Test-retest reliability was assessed by calculating the classical intraclass correlation coefficient and the Bland-Altman limits of agreement. Validity was assessed by calculating the Pearson correlation coefficient between the instruments. Sensitivity to change was assessed by comparing baseline scores with end of treatment scores. Measurements were taken at baseline, after 1-3 days, and at 4 and 8 weeks. The LEFS, Short-Form-36 Physical Function Scale, Berg Balance Scale, Six-Minute Walk Test, Five-Meter Walk Test, Timed Up-and-Go test, and the LAS of function were used. The test-retest reliability of the LEFS was found to be excellent (ICC = 0.96). Correlations with the 6 other measures of function studied indicated moderate to high validity of the LEFS (r = 0.40-0.71). Regarding sensitivity to change, the mean LEFS score increased by 1.2 SD from baseline to study end, and the mean LAS score by 1.1 SD. The LEFS exhibits good reliability, validity, and sensitivity to change in patients with lower extremity impairments secondary to stroke. Therefore, the LEFS can be a clinically efficient outcome measure in the rehabilitation of patients with subacute stroke. The LAS is shown to be a time-saving and reasonable option to track changes in a patient's functional status. Copyright © 2013 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
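    For readers unfamiliar with the agreement statistics cited above, the sketch below computes Bland-Altman limits of agreement and a simple test-retest correlation on invented scores; the data are synthetic, and the Pearson correlation is only a quick stand-in for the intraclass correlation coefficient used in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical test-retest LEFS scores (0-80 scale) for 43 patients.
        true_score = rng.uniform(10, 70, size=43)
        test = np.clip(true_score + rng.normal(0, 3, 43), 0, 80)
        retest = np.clip(true_score + rng.normal(0, 3, 43), 0, 80)

        # Bland-Altman limits of agreement: mean difference +/- 1.96 SD of differences.
        diff = retest - test
        bias = diff.mean()
        loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
        print(f"bias = {bias:.2f}, limits of agreement = ({loa[0]:.2f}, {loa[1]:.2f})")

        # Pearson correlation as a quick (less strict) check of test-retest agreement.
        r = np.corrcoef(test, retest)[0, 1]
        print(f"test-retest Pearson r = {r:.2f}")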

  19. Data Base Directions; the Next Steps. Proceedings of the Workshop of the National Bureau of Standards and the Association for Computing Machinery (Fort Lauderdale, Florida, October 29-31, 1975).

    Science.gov (United States)

    Berg, John L., Ed.

    To investigate the information needs of managers making decisions regarding the use of data base technology, the National Bureau of Standards and the Association for Computing Machinery held a workshop with approximately 80 experts in five major areas: auditing, evolving technology, government regulation, standards, and user experience. Results of…

  20. Introduction to the workshop: Electroweak symmetry breaking at the TeV scale

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1984-01-01

    As viewed from today's perspective, electroweak symmetry breaking is both the central issue to be addressed by physics in the TeV region, and the most compelling argument for the need to explore that region. While the picture may change considerably over the next decade, it seems reasonable to focus theoretical attention on this issue which is in fact very broad in terms of its possible ramifications. Such a concerted effort can help to sharpen the scientific case for the SSC and provide fresh theoretical input to the ongoing series of workshops and studies aimed at forming a consensus on a choice of SSC design parameters. To set the mood of the workshop the author reviews briefly the physics to be explored prior to the SSC as well as the motivations for exploration of the TeV region for hard collisions. He follows with an example of a possible scenario for the first manifestation of electroweak symmetry breaking at the SSC